Import AI Newsletter 41: The AI data grab, the value of simplicity, and a technique for automated gardening

by Jack Clark

Welcome to the era of the AI data grab: a Kaggle developer recently scraped 40,000 profile photos from dating app Tinder (20k from each gender) and placed the data hoard online for other people to use to train AI systems. The dataset had been downloaded over 300 times by the time TechCrunch wrote about it. Tinder later said the dataset violated the app's Terms of Service (ToS) and it has now been taken down.
…AI’s immense hunger for data, combined with all the “free” data lying around on the internet, seems likely to lead to more situations like this. Could this eventually lead to the emergence of a new kind of data economy, where companies instinctively look for ways to sell and market their data for AI purposes, along with advertising?

Why simple approaches sometimes work best: Modern AI research is yielding a growing suite of relatively simple components that can be combined to solve hard problems. This is either encouraging (AI isn’t as hard as we thought – Yaaaay!) or potentially dispiriting (we have to hack together a bunch of simple solutions because our primate brains are struggling to visualize the N-dimensional game of chess that is consciousness – Noooo!).
…in Learning Features by Watching Objects Move, researchers with Facebook and the University of California at Berkeley figure out a new approach to get AI to learn how to automatically segment entities in a picture. Segmentation is a classic, hard problem in computer vision, requiring a machine to be able to, say, easily distinguish the yellow of a cat's eyes from the yellow sodium glow of a streetlight behind it, or disentangle a zebra walking over a zebra crossing.
…the new technique works as follows: the researchers take short movie clips and use optical flow estimation to disentangle the parts of each clip that are in the foreground and in motion from those that aren't. They use these motion estimates to label each frame with segment information, then train a convolutional neural network to look at individual frames and predict the segments. The approach attains nine state-of-the-art results for object detection on the PASCAL VOC 2012 dataset.
…The researchers guess that this works so well because it forces the convolutional neural network to try to learn some quite abstract, high-level structures, as it would be difficult to perform this segmentation task by looking at pixels alone. They theorize that to effectively learn to predict whether something is moving you need to understand how all the pixels in a given picture relate to each other, and use that to make judgements about what can move and what cannot.
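The pipeline above can be sketched in a few lines. This is a minimal, illustrative stand-in, not the authors' code: where the paper uses optical flow estimation, a crude frame-difference threshold plays the role of the motion signal, and the resulting binary masks are the kind of "free" segmentation labels a convnet would then be trained on. All function and variable names here are hypothetical.

```python
import numpy as np

def motion_pseudo_labels(frames, threshold=25):
    """Approximate moving-foreground masks from consecutive grayscale frames.

    A stand-in for optical flow estimation: pixels whose intensity changes
    by more than `threshold` between frames are labeled foreground (1),
    the rest background (0). Masks like these serve as free segmentation
    labels for training a segmentation convnet.
    """
    masks = []
    for prev, curr in zip(frames, frames[1:]):
        # Cast up from uint8 so the subtraction can't wrap around.
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        masks.append((diff > threshold).astype(np.uint8))
    return masks

# Toy clip: a bright 2x2 'object' moves one pixel right between two frames.
frame_a = np.zeros((8, 8), dtype=np.uint8)
frame_b = np.zeros((8, 8), dtype=np.uint8)
frame_a[3:5, 2:4] = 200
frame_b[3:5, 3:5] = 200
labels = motion_pseudo_labels([frame_a, frame_b])
print(labels[0].sum())  # 4 pixels flagged as moving (2 vacated + 2 newly covered)
```

The overlap between the object's old and new positions is correctly left unlabeled, which hints at why real optical flow (rather than raw differencing) is needed for dense, accurate masks.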

Secret research to save us all: Researchers at Berkeley's Machine Intelligence Research Institute are of the opinion that powerful AI may be (slightly) closer than we think, so they will spend some of this year conducting new AI safety research and plan to keep this work "non-public-facing at least through late 2017, in order to lower the risk of marginally shortening AGI timelines".

The freaky things that machine learning algorithms “see”: check out this video visualization of what an ACER policy thinks is salient (aka, important to pay attention to) when playing a game.

Automated gardeners: 'Machine Vision System for 3D Plant Phenotyping' shows how to use robotics and deep learning for automated plant analysis. The system works by building a little metal scaffold around a planter, then using a robot arm with a laser scanner to automate the continuous analysis of the plant. The researchers test it out on two plants, gathering precise data about the plants' growth in response to varying lighting conditions. Eventually, this should let them automate experimentation across a wide variety of plants. However, when they try this on a conifer they run into difficulty because the sensor doesn't have sufficient resolution to analyze the pine needles.
…oddly specific bonus fact: not all AI is open source – the robot growing chamber in the experiment runs off of Windows Embedded.
fantastic name of the week: the robot arm was manufactured by Schunk Inc. Schunk!

Free code: Microsoft has made the code for its 'Deformable Convnets' research (covered in a previous issue here) available as open source.
…Deformable Convolutions (research paper here) are a drop-in tool for neural networks that lets you sample from a larger and more disparate set of points over an image, potentially helping with more complex classification tasks.
…The code is written in MXNet, a framework backed by Amazon.
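The core trick is easy to sketch: a regular convolution samples a fixed grid of points, while a deformable one adds a learned fractional offset to each sampling location and reads values off the feature map with bilinear interpolation. The numpy sketch below is a hypothetical illustration of that sampling step only, not the MXNet implementation.

```python
import numpy as np

def bilinear_sample(feature_map, y, x):
    """Read a 2D feature map at a fractional (y, x) location."""
    h, w = feature_map.shape
    # Clamp to the valid area so out-of-range offsets stay well defined.
    y = min(max(y, 0.0), h - 1.0)
    x = min(max(x, 0.0), w - 1.0)
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    wy, wx = y - y0, x - x0
    top = (1 - wx) * feature_map[y0, x0] + wx * feature_map[y0, x1]
    bot = (1 - wx) * feature_map[y1, x0] + wx * feature_map[y1, x1]
    return (1 - wy) * top + wy * bot

def deformable_points(feature_map, center, offsets):
    """Sample a 3x3 grid around `center`, each point shifted by a learned offset.

    With all-zero offsets this is an ordinary convolution's sampling grid;
    nonzero offsets let the effective receptive field spread to more
    disparate locations on the image.
    """
    cy, cx = center
    grid = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return [
        bilinear_sample(feature_map, cy + dy + oy, cx + dx + ox)
        for (dy, dx), (oy, ox) in zip(grid, offsets)
    ]

fmap = np.arange(25, dtype=float).reshape(5, 5)
zero = [(0.0, 0.0)] * 9     # no offsets -> ordinary 3x3 grid
shifted = [(0.5, 0.5)] * 9  # every sample nudged half a pixel down-right
print(deformable_points(fmap, (2, 2), zero)[4])     # 12.0 (the center pixel)
print(deformable_points(fmap, (2, 2), shifted)[4])  # 15.0 (interpolated)
```

In the real layer the offsets are themselves produced by a small convolution and trained end-to-end, which is what makes the sampling pattern adapt to image content.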

The great pivot continues: most large technology companies are reconfiguring themselves around AI. Google was (probably) the first company to make such a decision, and was swiftly followed by Microsoft, Facebook, Amazon, and others. Even conservative companies like Apple and IBM are trying to re-tool themselves in this way. It’s not just an American phenomenon – Baidu chief Robin Li said in an internal memo that Baidu’s strategic future relies on AI, according to this (translated) report.

Biology gets its own Arxiv… Cold Spring Harbor Laboratory and the Chan Zuckerberg Initiative are teaming up to expand bioRxiv – a preprint service for life sciences research. Arxiv, which is used by AI people, computer scientists, physicists, mathematicians, and others, has sped up the pace of AI research tremendously by short-circuiting the arbitrary publication timetables of traditional journals.

Neural network primitives for ARM (RISC) chips: ARM announced the public availability of the ARM Compute Library, software to give developers access to the low-level primitives they need to tune neural network performance on ARM CPUs and GPUs.
…The library supports neural network building blocks like convolution, soft-max, normalization, and pooling, as well as routines for support vector machines, general matrix multiplication, and more.

What's cooler than earning a million at Google? Getting bought by another tech company for 10 million!… that seems to be the idea behind the current crop of self-driving car startups, which are typically founded by early employees of self-driving projects in academia or the private sector.
… the latest? DeepMap – a startup helmed by numerous Xooglers which focuses on building maps, and the intelligent data layers on top of them, to let self-driving cars work. "It's very easy to make a prototype car that can make a few decisions around a few blocks, but it's harder when you get out into the world," said CEO James Wu.

AI means computer science becomes an empirical science: and more fantastic insights in this talk titled “As we may program” (video) by Google’s marvelously-attired Peter Norvig.
…Norvig claims that the unpredictability and emergent behavior endemic to machine learning approaches means computer science is becoming an empirical science, where work is defined by experimentation as well as theory. This feels true to me – most AI researchers spend an inordinate amount of time studying various graphs that read out the state of networks as they're training, and then use those graphs to help them mentally navigate the high-dimensional spaghetti-matrices of the resulting systems.
…This sort of empirical, experimental analysis is quite alienating to traditional developers, who would rather predict the performance of their tech prior to rolling it out. What we ultimately want is to package up advanced AI programming approaches within typical programming languages, making the obscure much more familiar, Norvig says.
…Here’s my attempt at what AI coding in the future might look like, based on Norvig’s speech:

things_im_looking_for = ['hiking shoes', 'bicycle', 'sunsets']
things_found = []
for picture in photo_album:
    pic_contents = picture.AI_Primitives.segment()
    for i in pic_contents:
        i = i.AI_Primitives.label()
        if i in things_im_looking_for:
            things_found.append(i)
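You can approximate that future interface today by hiding ordinary models behind the same verbs. In this runnable sketch, `segment` and `label` are hypothetical stubs over toy data, standing in for real vision models; the point is the shape of the program, not the pipeline behind it.

```python
def segment(picture):
    """Pretend segmenter: in a real system, a vision model would find regions."""
    return picture["regions"]

def label(region):
    """Pretend classifier: in a real system, a model would name the region."""
    return region["category"]

# Toy album standing in for real photos.
photo_album = [
    {"regions": [{"category": "hiking shoes"}, {"category": "grass"}]},
    {"regions": [{"category": "bicycle"}]},
    {"regions": [{"category": "cat"}]},
]

things_im_looking_for = {"hiking shoes", "bicycle", "sunsets"}
things_found = []

for picture in photo_album:
    for region in segment(picture):
        category = label(region)
        if category in things_im_looking_for:
            things_found.append(category)

print(things_found)  # ['hiking shoes', 'bicycle']
```

Swap the two stubs for calls into a real segmentation and classification model and the surrounding program doesn't change – which is roughly Norvig's point about packaging AI inside familiar languages.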
… there are signs this sort of programming language is already being brewed up. Wolfram Language represents an early step in this direction. As does work by startup Bonsai – see this example on GitHub. (However, both of these systems are proprietary languages – it feels like future programming languages will contain these sorts of AI functions as open source plugins.)

Microsoft's new head of AI research is… Eric Horvitz, who has long argued for the importance of AI safety and ethics, as this Quartz profile explains.

StreetView for the masses: Mapillary has released a dataset of photographs taken at the street level, providing makers of autonomous vehicles, drones, robots, and plain old AI experimenters with a new trove of data to play with. The dataset contains…
…25,000 high-resolution images
…100 object categories
…high variability in weather conditions
…reasonable geographic diversity, with pictures spanning North and South America and Western Europe, as well as a few from Africa and Asia.
Meanwhile, Google uses deep learning to extract potent data from its StreetView trove: in 2014 Google trained a neural network to extract house numbers from images gathered by its StreetView team. Now, the company is moving on to street and business names.
… Notable example: its trained model is able to guess the correct business name on a sign, even though other brands are listed on it (e.g. Firestone). My assumption is that it has learned these brand names appear on a wide variety of signs, whereas the name of the business is unique.
… Bonus tidbit: Google's 'Ground Truth' team was the first internal user of the company's Tensor Processing Units (TPUs), due to its insatiable demand for data.
… Total number of StreetView images Google has: more than 80 billion.

A donut-devouring smile: Smile Vector is a friendly Twitter bot by AI artist Tom White that patrols the internet, finding pictures of people who aren’t smiling, and makes them smile. It occasionally produces charming bugs, like this one in which a neural network makes a person appear to smile by giving them a toothy grin and removing a segment of the food they’re holding in their hands – a phantom bite!

The Homebrew AI Computer Club: Google has teamed up with the Raspberry Pi community to offer the relevant gear to let people assemble their own AI-infused speaker, powered by a Raspberry Pi and housed in cardboard, natch.

Monthly Sponsor: Amplify Partners is an early-stage venture firm that invests in technical entrepreneurs building the next generation of deep technology applications and infrastructure. Our core thesis is that the intersection of data, AI and modern infrastructure will fundamentally reshape global industry. We invest in founders from the idea stage up to, and including, early revenue.
…If you’d like to chat, send a note to

Tech Tales:

[2032: The greater Detroit metropolitan area.]

“It’s creepy as all hell in there man you gotta do something about it I can’t sleep! All that metal sounds. I’m calling the city next week you don’t do something about it.” Click.
You put the phone down, look at your wife.
“Another complaint?” she says.
“I’m going to Dad’s,” you say.

Dad’s house is a lazily-shingled row property in Hamtramck, a small municipality embedded in the center of Detroit. He bought it when he was doing consulting for the self-driving car companies. He died a month ago. His body got dragged out of the house by the emergency crews. In his sleep, they said, with the radio on.

You arrive on the street and stare up at the house, approach it with the keycard in your hand. The porch is musty, dry. You stand and listen to your breath and the whirring sound of the house's machines, reverberating through the door and passing through the windows to you.

When you enter, a robot the shape of a hockey puck and the size of a small poodle moves from the kitchen over to you in the entranceway.

“Son,” a voice says, crackling through speakers. The robot whirrs over to you, stops by your feet. “I’m so glad you’re here. I have missed you.”
“Hey Dad,” you say. Eyes wet. “How are things?”
“Things are good. Today the high will be about 75. Low pollution index. A great day to go outside.”
“Good,” you say, bending down. You feel for the little off switch on the back of the machine, put your finger on it.
“Will you be staying long?” says the voice in the walls.
“No,” you whisper, and turn the robot off. You push its inert puck-body over to the door. Then you go upstairs.

You pause before you open his office door. There's a lot of whirring on the other side. Shut your eyes. Deep breath. Open the door. A drone hovers in the air, a long wire trailing beneath it, connected to an external solar panel. "Son," the voice says, this time coming from a speaker next to an old – almost vintage – computer. "The birds outside are nesting. They have two little chicks. One of the chicks is 24 days old. The other is 23."
“Are they still there?” you say.
“I can check. Would you like me to check?”
“Yes please,” you say, opening the office window. The drone hovers at the border between inside and outside. “Would you disconnect me?”

You unplug it from the panel and it waits till the cable has fallen to the floor before it scuds outside, over to the tree. Whirrs around a bit. Then it returns. Its projector is old, weak, but still you see the little birds projected on the opposite wall. Two chicks.
“How nice,” you say.
“Please reconnect my power supply, son,” it says.
You pluck the drone out of the air, grabbing its mechanical housing from the bottom, turn it off.
“Son,” the voice in the walls says. “I can’t see. Are you okay?”
“I’m fine, Dad.”

It takes another two hours before you’ve disconnected all the machines but one. The last is a speaker attached to the main computer. Decades of your Dad’s habits and his own tinkering have combined to create these ghosts that infuse his house. The robots speak the way he spoke, and plug into a larger knowledge system owned by one of the behemoth tech companies. When he was alive the machines would help him keep track of things, have chats with you. After his hearing went they’d interpret your sounds and send them to an implant. When he started losing his eyesight they’d describe the world to him with their cameras. Help him clean. Encourage him to go outside. Now they’re just ghosts, inhaling data and exhaling the faint exhaust of his personality.

Before you get back in the car you knock on the door of the neighbor. A man in a baggy t-shirt and stained work jeans opens it.
“We spoke on the phone,” you say. “House will be quiet now.”
“Appreciate it,” he says. “I’ll buy that drone, if you’re selling.”
“It’s broken,” you lie.