DeepMind’s Demis Hassabis. Free deep learning courses. Microsoft launches IoT services. Tarzan the robot. AI + drugs. Facebook’s dream of a neural lace.
Yesterday, we attended another fascinating meeting of the AI All Party Parliamentary Group (APPG AI). The session covered the topic of ‘the future of work’. The overall sentiment was optimistic about how AI will impact jobs, albeit with the recognition that the jobs themselves will look largely different and that (re)education will be needed to prepare us for this shift.
Richard Susskind (author of The Future of the Professions) said that “in the 20s people will have a choice: to compete with the machines or build the machines.”
I think that this quote from Richard aptly sums it all up: “2020 will be the decade of redeployment, not unemployment.”
Today, the FT (paywall) published a fascinating piece on AI by Demis Hassabis (co-founder of DeepMind and recently named one of the TIME 100). He writes about technological progress, the meaning and potential of AI, and DeepMind.
I particularly liked one quote: “As we discover more about the learning process itself and compare it to the human brain, we could one day attain a better understanding of what makes us unique, including shedding light on such enduring mysteries of the mind as dreaming, creativity and perhaps one day even consciousness.”
David Venturi, who is creating his own data science master’s program, put together a useful list of 10 free online courses on deep learning. He found 24 online courses on Class Central, 10 of them free.
The courses range from those geared towards beginners, to those aimed at experts, to specialised offerings (such as one focusing on Natural Language Processing, which was recently posted to YouTube).
Microsoft launched IoT Central yesterday, a new Internet of Things (IoT) service that gives enterprises a fully managed solution for setting up their IoT deployments without needing the in-house expertise necessary for deploying a cloud-based IoT solution from scratch. It’s basically IoT-as-a-Service.
In addition, the company is bringing its Azure Stream Analytics to edge devices, making it easier to provision new IoT devices, and it’s launching a completely new analytics service for time series data.
Jonathan Rogers and Ai-Ping Hu will use the field of robotics to create a field of robots. Their team is building machines that will hang over the crops, suspended by parallel guy-wires. The robots, fitted with cameras, will swing like gibbons along the cables, taking picture after picture of each plant. Down each row, then side to side, and back again, from one wire to another.
With Georgia Tech robots dangling over the field, UGA researchers will be able to get more frequent measurements and to avoid some laborious field work. Someday, they may be able to stay at their laptops miles away, in the air conditioning, scanning a steady stream of images and data sent back from the robots.
Atomwise, a San Francisco-based startup and Y Combinator alum, has built a system it calls AtomNet, which attempts to generate potential drugs for diseases like Ebola and multiple sclerosis. The academic labs will receive 72 different compounds that the neural network has found to have the highest probability of interacting with each disease, based on the molecular data it has seen.
Atomwise’s system only generates potential drugs—the compounds created by the neural network aren’t guaranteed to be safe, and need to go through the same drug trials and safety checks as anything else on the market. The company believes that the speed at which it can generate trial-ready drugs based on previous safe molecular interactions is what sets it apart.
As Recode reports, the company’s secretive R&D lab — known as Building 8 — is creating a “‘brain-computer speech-to-text interface,’ technology that’s supposed to translate your thoughts directly from your brain to a computer screen without any need for speech or fingertips.”
Brian Resnick from Vox spoke with Rebecca Saxe, a neuroscientist at MIT, who argues that today’s EEGs are not sensitive enough to pick individual words out of your brain and turn them into words on a screen. However, it is feasible with current technology to build keyboards and mice that could be controlled with the mind, which could be very helpful for those with paralysis or disabilities that make typing hard or impossible.
Dubbed the “Driver Attention System,” Cadillac’s solution to the handoff problem (returning control to the human driver when the computer can no longer handle a task) uses an infrared camera to track the driver’s head position while Super Cruise is active.
Spend too long looking away from the road ahead, and the car will warn you that it’s time to refocus, through a series of escalating alerts on the main instrument display, an LED bar set into the steering wheel, and even haptic rumbles from your seat. And since it checks (and asks) that the driver keeps their eyes on the road, this system can’t be spoofed the way Tesla’s can. Check out the other cool features here.
In this episode of the Data Show, Ben Lorica spoke with Reza Zadeh, adjunct professor at Stanford University, co-organizer of ScaledML, and co-founder of Matroid, a startup focused on commercial applications of deep learning and computer vision.
They discuss the rise of deep learning, hardware/software interfaces for machine learning, and the many commercial applications of computer vision.