Import AI: Issue 23: Brain-controlled robots, nano drones, and Amazon’s robot growth

by Jack Clark

Fukoku Mutual Life Insurance Co. plans to eliminate 35 jobs following the introduction of an insurance AI system based on IBM's Watson, according to Mainichi. Insurance work is a good candidate for AI automation because it deals with vast amounts of structured data, and it's easy to recalibrate strategies according to financial performance. Expect more here.

Rise of the (cheap) machines: Robots are expensive, dangerous, and hard to program. Startup Franka Emika aims to solve two of those problems with a new robot arm that costs about ten thousand dollars, significantly cheaper than similar arms from companies such as Rethink Robotics and Universal Robots. Founder Sami Haddadin once demonstrated the safety of the robot's 'force sensing technology' by having an arm try to stab him with a knife, showing how the force-sensing system prevented it from impaling him. Courage.

Private infrastructure for a public internet: "Since 2008, instead of applying leap seconds to our servers using clock steps, we have 'smeared' the extra second across the hours before and after each leap. The leap smear applies to all Google services, including all our APIs," Google says. Developers can access this Network Time Protocol service for free via the internet here.
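For intuition, here's a toy sketch (my own, not Google's code) of how a linear smear absorbs the extra second, assuming a 24-hour window centered on the leap second:

```python
from datetime import datetime, timedelta

# Toy sketch of a linear leap smear (not Google's implementation).
# Assumption: a 24-hour smear window centered on the leap second at the
# end of 2016-12-31; every second in the window is stretched slightly so
# the extra second is absorbed gradually instead of via a clock step.

LEAP = datetime(2017, 1, 1, 0, 0, 0)      # instant just after the leap second
WINDOW = timedelta(hours=24)              # total smear duration
START = LEAP - WINDOW / 2                 # smear begins 12 hours before the leap

def smeared_offset(utc_now: datetime) -> float:
    """Seconds by which the smeared clock lags true UTC at a given moment."""
    if utc_now <= START:
        return 0.0
    if utc_now >= START + WINDOW:
        return 1.0
    elapsed = (utc_now - START).total_seconds()
    return elapsed / WINDOW.total_seconds()   # linear ramp from 0 to 1 second

print(smeared_offset(datetime(2016, 12, 31, 18, 0, 0)))   # prints 0.25
```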

Pete Warden of Google kicked off a great trend with his ‘TensorFlow for poets’ tutorial. Now Googler @hardmaru has followed it up with Recurrent Neural Networks for Artists.

A drone for your pocket, sir? In Santa's stocking this year for the US Marine Corps? Hand-held 'nanodrones', which will be used for surveillance and navigation when deployed. The Marine Corps will field 286 of these systems in total. It would be helpful if we had a decent system to track the increasing complexity of software being deployed on drones, as well as hardware. How long until we can fit a trained AI inside the computational envelope of snack-sized drones like these?
… in related news, local lawmakers in North Dakota passed a bill regulating the use of drones by police forces. The original legislation forbade state agencies from using drones armed with "any lethal or non-lethal weapons". The legislation was amended before passing to ban only lethal weapons.
Drones are already being used in traditional and non-traditional warfare. Here's a video from AFP's Deputy Iraq Bureau Chief of Iraqi forces firing at an Islamic State drone that had dropped an explosive charge on one of the soldiers. Elsewhere, Israel's Air Force released a video claiming to depict fighter jets shooting down a Hamas drone.

Mind-controlled robots: In the future, we can expect armies to deploy a mixture of AI-piloted drones and human-piloted ones. This research paper, 'Brain-Swarm Interface (BSI): Controlling a Swarm of Robots with Brain and Eye Signals from an EEG Headset', describes a way to let a human control drones through a combination of thoughts and eye movements. People can steer the drones directionally (up, down, left, and right) via tracked eye movements, and also influence them through their thoughts: the system can detect two distinct mental states, one of which forces the drones to disperse as they travel, while the other forces them to aggregate. The researchers validated the approach on a handful of real-world robots, as well as 128 simulated machines.
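The paper's signal-processing pipeline is more involved than this, but the basic control mapping can be sketched roughly as follows (a toy illustration of my own, assuming the headset pipeline already yields a gaze direction and one of the two mental states):

```python
# Toy sketch of the control mapping a brain-swarm interface might use
# (illustrative only, not the paper's implementation). Assume the headset
# pipeline already yields a gaze direction and one of two mental states.

DIRECTIONS = {"up": (0.0, 1.0), "down": (0.0, -1.0),
              "left": (-1.0, 0.0), "right": (1.0, 0.0)}

def swarm_command(gaze, mental_state, positions, gain=0.2):
    """Map (gaze, mental state) to a velocity command for each robot."""
    dx, dy = DIRECTIONS[gaze]
    cx = sum(p[0] for p in positions) / len(positions)   # swarm centroid
    cy = sum(p[1] for p in positions) / len(positions)
    sign = 1.0 if mental_state == "disperse" else -1.0   # aggregate pulls inward
    commands = []
    for (x, y) in positions:
        # Shared gaze-driven heading, plus an outward (or inward) component
        # relative to the centroid to disperse or aggregate the swarm.
        commands.append((dx + gain * sign * (x - cx),
                         dy + gain * sign * (y - cy)))
    return commands

print(swarm_command("right", "aggregate", [(0, 0), (1, 2), (-1, 1)]))
```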

Input request: Miles Brundage of the Future of Humanity Institute is soliciting feedback for a blogpost about AI progress. Let him know what he should know.

Rio Tinto has deployed 73 house-sized robot trucks from Japanese company Komatsu across four mines in Australia's barren, Mars-red northwest corner. The chunky, yellow vehicles haul ore 24 hours a day, 7 days a week. Next, it plans to build a robotic train that can be autonomously driven, loaded, and unloaded.

Number of robots deployed in Amazon facilities (primarily Kiva systems robots for fulfillment center automation):
– 2013: 1,000.
– 2014: 15,000.
– 2015: 30,000.
– 2016: 45,000.

A reasonably thorough examination from Effective Altruism of the different research strategies people are using within AI safety research. Features analysis of MIRI, FHI, OpenAI, and the Center for Human-Compatible AI, among others.

Fashion e-tailer GILT is using image classification techniques to train AI to identify similar dresses people may like to purchase. Typical recommender systems use hard-wired signals and techniques like collaborative filtering to offer recommended products. Neural networks can instead be trained to infer some of the underlying features of similarity without needing them to be labelled: you show the network dresses and similar dresses, and it learns to identify the subtle features they share…
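As a rough sketch of the idea (my own illustration, not GILT's system), you can treat the penultimate layer of a pretrained image classifier as an embedding and recommend the items whose embeddings sit closest to the one being viewed:

```python
import numpy as np

# Rough sketch of embedding-based similarity recommendation
# (illustrative only; not GILT's actual system). Assume `embeddings`
# holds one feature vector per dress, e.g. taken from the penultimate
# layer of a pretrained image-classification network.

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 512))          # 1,000 dresses, 512-d features
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

def most_similar(query_idx: int, k: int = 5):
    """Return the k dresses whose embeddings are closest (cosine) to the query."""
    sims = embeddings @ embeddings[query_idx]       # cosine similarity via dot product
    ranked = np.argsort(-sims)
    return [i for i in ranked if i != query_idx][:k]

print(most_similar(42))
```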
… expect much more here as companies begin to use generative adversarial network techniques to build more imaginative AIs that offer more insightful recommendations. GAN techniques can let you manipulate certain aesthetic qualities via latent variables identified by the AI, as this video from Adobe/UC Berkeley shows. It's likely that these techniques, combined with evolution strategies such as those used by Sentient via its 'Sentient Aware Visual Search' product, will dramatically improve the range and diversity of recommendations companies can offer. Though I imagine Amazon will still notice you have bought a hammer and then helpfully spend the next week suggesting other, near-identical hammers for you to buy.
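And here is the latent-space trick in miniature (again a toy illustration, with a placeholder standing in for a real trained generator): interpolating between latent codes shifts the generated item's qualities gradually, which is what makes the recommendations more varied than near-duplicates.

```python
import numpy as np

# Toy sketch of latent-variable manipulation with a GAN-style generator
# (illustrative only; `generate` is a placeholder, not a trained network).

def generate(z: np.ndarray) -> np.ndarray:
    """Stand-in for a trained generator mapping a latent vector to an image."""
    return np.tanh(z[:64]).reshape(8, 8)

rng = np.random.default_rng(1)
z_item = rng.normal(size=100)    # latent code of the item being viewed
z_style = rng.normal(size=100)   # latent code exhibiting a desired aesthetic

# Walking from one code toward the other changes the aesthetic gradually,
# yielding a spectrum of suggestions rather than near-identical items.
for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    image = generate((1 - alpha) * z_item + alpha * z_style)
    print(f"alpha={alpha:.2f}, mean pixel={image.mean():.3f}")
```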

Tech Tales:

[2025: A protest outside the smoked-glass European headquarters of a technology company.]

Police drones fuzz the air above the hundred or so protesters. 'Stingray' machines sniff the mobile networks, gathering data on the crowd. People march on the entrance of the tech company's building with placards bearing slogans for a chaotic world: Our data, our property! Pay your taxes or GET OUT! Gentrifiers! The police keep their distance, forming a perimeter around the crowd. If the protesters misbehave then this will tighten into what is known as a 'kettle', enclosing the crowd. Then each member will be individually searched and questioned, with their responses filmed by police chest-cams and orbiting Evidence Drones.

In the midst of the crowd one figure shrugs off their backpack and places it on the ground before them, then stoops down to open it. They remove a helmet and place it on their head. Little robots the size of a baby's hand begin to stream out of the bag, rustling across the asphalt toward the entrance to the tech HQ. The protesters part and let the river of living metal through.

The robots reach the sides of the building and start to climb the walls. They flow up the stone columns flanking the two-story-high glass doors. The protester wearing the helmet stares at the entrance, raising a hand to steady the helmet as he cocks his head. The drones begin to waltz around one another, hissing, as each of them spurts a splotch of red paint onto the side of the building.

Slowly, the machines inscribe an unprintable string of swearwords onto the glass of the doors and the stone of the walls. It's on the seventh swearword that the drones stop, all stiffening at once and dropping off the side of the building, after the police grab the protester and rip the helmet off his head.

When his date in court comes he is given a multi-year jail sentence, after the police find that the paint expelled by the drones contained a compound that ate into the glass and stone of the building, scarring the words into it.