The roundtable breakfast event will take place in London next Friday, 3 November, from 8am to 10am. Join us to hear from senior HR directors on their experiences of implementing AI, the projects that have and haven't worked, the tools and vendors they would recommend, and the breadth of use cases for AI in HR.
We’ll discuss the big questions in the industry, like:
How are others getting ahead with AI?
What are the business and investment cases?
Who are the best vendors, and which products are suited to us?
If you can’t make it to the event, check out what our research team has been working on and see how their work can help you understand how AI is affecting HR: from automating repetitive processes (like CV scanning) to providing rich analytics, AI is having a major impact.
Wal-Mart Stores Inc. is rolling out shelf-scanning robots in more than 50 U.S. stores to replenish inventory faster and save employees time when products run out. The approximately 2-foot (0.61-meter) robots come with a tower fitted with cameras that scan aisles to check stock and identify missing and misplaced items, incorrect prices and mislabeling. The robots pass that data to store employees, who then stock the shelves and fix errors.
The robots are 50% more productive than their human counterparts and can scan shelves significantly more accurately and three times faster, King said. Store employees only have time to scan shelves about twice a week.
Zapping brain cells from living human tissue? It sounds like a creepy Halloween tale, but for the Allen Institute for Brain Science, it’s a clever way to understand more fully how the brain works — and potentially bring healing to future patients. “It doesn’t creep me out at all,” Jonathan Ting, an assistant investigator at the Seattle institute who’s been deeply involved in the project, told GeekWire. “I feel like it’s our obligation as scientists.”
Now the institute is unveiling the payoff from the first four years of its experimentation with brain tissue. The first batch of data from human nerve cells has become part of the Allen Cell Types Database, an open-access tool for researchers. Much more data will be added in the years to come. “Our goal is to truly understand
how the human brain works, not just the brains of mice or other animals that most brain scientists study,” Christof Koch, chief scientist and president of the Allen Institute for Brain Science, said in a news release. “There is no better way to do that than to directly examine live cells of the human cortex.”
Online, learning pronunciation is generally “hear a recording, then repeat it (to an empty room).” A new platform, Blue Canoe Learning, uses an established curriculum and machine learning to make things easier and more effective. It’s the first company to join AI2’s new incubator in Seattle, and has raised a $1.4M round to expand its operations.
Blue Canoe (itself a mnemonic phrase) has worked to digitize the Color Vowel System and package it as an app. It’s still at a very early stage, with more content planned as the company learns from its pilot programs. Users play a card game (the first of several games and activities to be included) that requires them to say the vocabulary word on the card they play; a machine learning system listens and identifies whether they have pronounced it correctly,
and if not, gives relevant feedback.
The innovation, developed by Microsoft, McKinstry, and Cummins, could have big implications for the future of cloud computing and sustainable energy. This week, those three companies unveiled the Advanced Energy Lab, a pilot program designed to prove that energy-hungry data centers can be powered by fuel cells.
In a gleaming white room in the warehouse sits a 20-rack data center concept with fuel cells mounted above. Natural gas is pumped directly into the fuel cells and then
converted to power to run the racks. The goal is to replace the traditional way data centers are powered, by moving energy over long distances from power plants through substations before converting it into the right voltage at its final destination. The fuel cell technology could herald a big shift away from that method, reducing the energy loss that occurs by transporting and converting the energy.
Saudi Arabia is looking toward a post-oil future by sinking some US$500B into a massive, ultra-futuristic megacity project it calls Neom (or Neo-Mostaqbal; new future). Saudi Crown Prince Mohammed bin Salman announced the giant project on Tuesday, a brand new city at the intersection of three countries, where “there is no room for old thinking.”
Since its founding by Elon Musk and others nearly two years ago, nonprofit research lab OpenAI has published dozens of research papers. One posted online Thursday is different: Its lead author is still in high school.
The wunderkind is Kevin Frans, a senior currently working on his college applications. He trained his first neural net—the kind of system that tech giants use to recognize your voice or face—two years ago, at the age of 15. Frans landed at OpenAI after taking on a problem from the lab’s list of those in need of new ideas. He made progress, but got stuck and emailed OpenAI researcher John Schulman for advice. After some back and forth on the matter of trust region policy optimization, Schulman checked out Frans’s blog and got a surprise. “I didn’t expect from those emails that he was in high school,” he says.
Sophia, a female humanoid created by Hanson Robotics and modeled after Audrey Hepburn, was revealed to crowds at the Future Investment conference in Saudi Arabia this Wednesday. During her big debut, Sophia spoke in an interview with New York Times columnist Andrew Ross Sorkin, who asked whether humanity had anything to be worried about with regard to her and other artificial intelligence, a topic that Elon Musk has not shied away from in the past. Sophia replied that no, humanity does not have anything to worry about from A.I., and that perhaps Sorkin has been listening to Musk a bit too closely. This was a direct reference to Musk’s past warnings about A.I. being a “fundamental existential risk for human civilization,” NPR reported.
“Just feed it The Godfather movies as input. What’s the worst that could happen?” Musk tweeted in response, referring to the notably violent 1972 film.
Artificial intelligence software can beat the world’s most widely used test of a machine’s ability to act human, Google’s reCAPTCHA, by copying how human vision works, a new study finds. “Our system has the ability to learn using relatively few examples, much like the human brain,” says study lead author Dileep George, cofounder of Vicarious.
These new findings suggest the need for more robust automated human-checking techniques, and could help improve computer perception for robotics tasks, scientists add. A CAPTCHA is considered broken if an algorithm can successfully solve it at least 1 percent of the time. Now San Francisco Bay Area startup Vicarious reveals its AI software can solve reCAPTCHAs at an accuracy rate of 66.6 percent, BotDetect at 64.4 percent, Yahoo at 57.4 percent and PayPal at 57.1 percent.