Insurance based on a selfie and Google AI’s new sounds. https://cognitionx.com/news-briefing/
Selling life insurance has traditionally involved an in-depth assessment of the customer by a qualified underwriter using a well-worn set of actuarial models. But this job is about to be replaced. By a selfie.
The disruption of the insurance industry by AI and machine learning has already started: two-thirds of insurers already use AI-based virtual assistants. In the longer run, the bigger challenge is that machines could undermine the industry by giving customers better tools to decide whether insurance is even necessary.
Life insurers are looking not just for basic information, such as gender, but also for clues about how quickly people are ageing, their body mass index, and whether they smoke. Lapetus, a US-based start-up, believes its selfie-analysis model predicts life expectancy far more accurately than traditional methods, and the whole process takes only a few minutes. Bespoke risks will never go away, as the economy is always innovating; AI will not make risk disappear entirely, and people will always want to protect themselves against the unexpected.
Google Magenta (a subset of the Google Brain team) has released some new sounds from its AI, NSynth, and by new we mean fundamentally new. By ingesting a massive database of sounds, the neural nets in NSynth developed a representation of a range of instruments and can reproduce them. The innovation is that NSynth can blend these sounds, not by overlaying the waveforms, but by combining their mathematical representations to produce new sounds.
The full database of sounds and the algorithm used is freely available online.
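The core idea can be illustrated with a toy sketch: encode two sounds into latent vectors, interpolate in that latent space, and treat the result as the code for a new hybrid sound. The `encode` and `blend` functions below are illustrative stand-ins, not the actual WaveNet autoencoder from the NSynth release.

```python
import numpy as np

def encode(audio):
    """Toy encoder: project audio onto a fixed random basis (a stand-in
    for NSynth's learned encoder, which maps audio to a latent vector)."""
    rng = np.random.default_rng(0)                 # fixed basis for repeatability
    basis = rng.standard_normal((16, audio.size))
    return basis @ audio                           # 16-dimensional latent code

def blend(z_a, z_b, alpha=0.5):
    """Combine two latent representations, not the raw waveforms.
    This is the key difference from simply overlaying two recordings."""
    return alpha * z_a + (1 - alpha) * z_b

# Two toy "instrument" signals standing in for real recordings.
flute = np.sin(np.linspace(0, 40 * np.pi, 1000))
bass = np.sign(np.sin(np.linspace(0, 10 * np.pi, 1000)))

z = blend(encode(flute), encode(bass))
print(z.shape)  # (16,) — a single latent code representing the blended sound
```

In the real system, a decoder network would then synthesise audio from the blended code; here the sketch stops at the latent representation.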
In conversations about AI, many of the opinions expressed rest on plenty of “information” that contains little in the way of knowledge. Politicians have a tenuous grasp, if any, of the systems that are rapidly moving to change our world, and government more broadly seems unprepared for the changes that are already here. “What happens to the lawyers, accountants, medical diagnosticians, manufacturers, supply chain managers and customer service representatives when they’re displaced? Where will they go? What will they do?”
McKinsey & Co. predicts car data could become a $750 billion industry by 2030. Companies such as Delphi and the startup Otonomo are already in the market to collect and sell car data, building the platforms that the big car makers want. As self-driving platforms develop, automakers will partner with these companies because, as Intel CEO Brian Krzanich put it, data is “the next oil”.
The Partnership on AI to Benefit People and Society announced yesterday that it is welcoming 22 new members, including tech giants such as Intel, Sony, and Salesforce, and nonprofits including the Electronic Frontier Foundation and UNICEF. The partnership aims to “study and formulate best practices on AI technologies, to advance the public’s understanding of AI, and to serve as an open platform for discussion and engagement about AI and its influences on people and society.”
In addition to increasing its membership, the partnership announced plans to launch a set of activities around the challenges and opportunities within its thematic pillars. These include topic- and sector-specific working groups to research and formulate best practices; a Civil Society Fellowship program to assist people at non-profits and NGOs who wish to collaborate on topics in AI and society; a cross-conference “AI, People, and Society” best-paper award; and an AI Grand Challenges series to stimulate aspirational efforts to harness AI against some of the most pressing long-term social and societal issues.
OpenAI has created a robotics system, trained entirely in simulation and deployed on a physical robot, which can learn a new task after seeing it done once. Using an algorithm called one-shot imitation learning, a human can communicate how to do a new task by performing it in VR. Given a single demonstration, the robot is able to solve the same task from an arbitrary starting configuration.
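The shape of the idea can be sketched in a toy form: the “policy” reads the task goal off a single demonstration, then achieves that goal from a different starting configuration. The block-stacking task and all names below are illustrative assumptions, not OpenAI's actual code or algorithm internals.

```python
# Toy sketch of the one-shot imitation idea (illustrative, not OpenAI's code):
# extract the goal from one demonstration, then solve the same task from an
# arbitrary starting configuration.

def goal_from_demo(demo_states):
    """Read the task goal off the final state of a single demonstration."""
    return demo_states[-1]  # e.g. the finished stacking order ['A', 'B', 'C']

def run_policy(start_state, demo_states):
    """Rebuild the demonstrated stack from any starting arrangement."""
    goal = goal_from_demo(demo_states)
    available = list(start_state)
    stack, actions = [], []
    for block in goal:                    # place blocks in the demonstrated order
        available.remove(block)
        actions.append(f"place {block}")
        stack.append(block)
    return stack, actions

# One demonstration: a sequence of states ending in the goal arrangement.
demo = [["C", "A", "B"], ["A"], ["A", "B"], ["A", "B", "C"]]

# A new, different starting configuration still yields the demonstrated goal.
final, acts = run_policy(["B", "C", "A"], demo)
print(final)  # ['A', 'B', 'C']
```

The real system learns this mapping with neural networks conditioned on the demonstration; the sketch only conveys the input/output contract of one-shot imitation.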
Wired’s Matthew Reynolds visits Ocado’s warehouse near Birmingham to discover how the company is using AI and automation to reduce human involvement as much as possible. In its 90,000-square-foot space in Dordon, 35 kilometres of conveyor belts move 50,000 products so that 190,000 deliveries a week find their way smoothly to customers.
Ocado’s CTO, Paul Clarke, explains that each crate’s journey through the warehouse is orchestrated entirely by algorithms, which ensure, among other things, that heavy items aren’t placed on top of delicate produce. From the moment an item arrives in the warehouse, Clarke says, a human never touches it until it’s placed into a shopping bag just minutes before it goes out for delivery.
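One of the constraints mentioned, keeping heavy items off delicate produce, reduces to an ordering rule when packing a bag. The sketch below is a minimal illustration of that one rule, not Ocado's actual packing algorithm; the item data is invented.

```python
# Illustrative packing rule (not Ocado's code): order items heaviest-first
# so that lighter, more fragile goods always end up on top.

def pack_order(items):
    """Sort basket items by descending weight for bottom-up bag packing."""
    return sorted(items, key=lambda item: item["weight_kg"], reverse=True)

basket = [
    {"name": "eggs", "weight_kg": 0.6},
    {"name": "washing powder", "weight_kg": 3.0},
    {"name": "bread", "weight_kg": 0.4},
]

print([i["name"] for i in pack_order(basket)])
# ['washing powder', 'eggs', 'bread']
```

A production system would juggle many such constraints at once (crate routing, temperature zones, delivery slots), but each reduces to orderings and checks of this kind.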