Most people have a love-hate relationship with customer service. Thankfully, innovation in AI is making the industry more likeable by making customer service agents more effective and efficient. This week, DigitalGenius raised $14.75M to bring AI to customer service. Their software connects with companies’ CRM and customer service platforms to train AI assistants using transcripts from historical customer service interactions. For advice, check out Falon Fatemi’s (Node) article on how to leverage AI to improve customer service.
Also this week, Twitter released a new API designed to help developers build apps that can power customer service, chatbots and brand engagement on Twitter, an area the company has invested in increasingly this year.
As AI begins to infiltrate nearly every industry, developers are making plans to learn how to tap the technology and gain business insights. Some 83% of developers did not work with AI or machine learning in 2017, but of those, 73% said they plan to learn about these technologies in 2018, according to a new survey from DigitalOcean.
Developers also expect AI to bring challenges in the coming year: when asked what big challenges they anticipate for 2018, 63% said automating workflows and 32% said incorporating AI and machine learning into the business.
Apolitical discusses how governments are using chatbots to help people cut through layers of bureaucracy to find the information they need, and to open up new channels for public consultation. As these bots become more prevalent, we need to ask what role they’ll play in the government of the future, and what lies beyond them. Are they the next step towards digital democracy?
Since June 2017, Jersey has been using a chatbot developed by apptivism, a London startup. The chatbot works via Facebook Messenger: public servants pick topics on which they want to consult the island’s citizens, set up a series of questions, then use the chatbot to solicit responses. People using the chatbot can see fact-checked information from a range of sources, receive instant feedback, and share the chatbot with friends and colleagues.
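The consultation flow described above (officials define a topic and a list of questions, the bot walks each respondent through them one at a time) can be sketched as a simple state machine. Everything below is hypothetical: the real service runs over the Facebook Messenger API, not this toy interface.

```python
# Minimal sketch of a consultation chatbot, assuming one question sequence
# per topic. Names and the message interface are made up for illustration.

class ConsultationBot:
    def __init__(self, topic, questions):
        self.topic = topic
        self.questions = questions
        self.sessions = {}  # user_id -> {"step": int, "answers": list}

    def handle_message(self, user_id, text):
        s = self.sessions.setdefault(user_id, {"step": 0, "answers": []})
        if s["step"] > 0:                 # record answer to prior question
            s["answers"].append(text)
        if s["step"] < len(self.questions):
            question = self.questions[s["step"]]
            s["step"] += 1
            return question               # next question to send back
        return "Thanks for taking part in the %s consultation!" % self.topic

bot = ConsultationBot("harbour redevelopment",
                      ["Do you support the plan?", "What would you change?"])
print(bot.handle_message("u1", "hi"))          # Do you support the plan?
print(bot.handle_message("u1", "Yes"))         # What would you change?
print(bot.handle_message("u1", "More parks"))  # Thanks for taking part...
```

In the real deployment, `handle_message` would be called from a Messenger webhook, and the recorded answers would feed back to the public servants who set up the questions.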
Incorporating ideas from past work such as Tacotron and WaveNet, Google has added further improvements to arrive at their new system, Tacotron 2. Their approach does not use complex linguistic and acoustic features as input. Instead, they generate human-like speech from text using neural networks trained only on speech examples and their corresponding text transcripts.
A full description of the new system can be found in their paper “Natural TTS Synthesis by Conditioning WaveNet on Mel Spectrogram Predictions.” In a nutshell, it works like this: they use a sequence-to-sequence model optimised for TTS to map a sequence of letters to a sequence of features that encode the audio. These features, an 80-dimensional audio spectrogram with frames computed every 12.5 milliseconds, capture not only the pronunciation of words but also various subtleties of human speech, including volume, speed and intonation. Finally, these features are converted to a 24 kHz waveform using a WaveNet-like architecture.
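The rates quoted above fit together neatly: a sketch of the arithmetic (not Google’s code) shows that a frame shift of 12.5 ms at a 24 kHz output rate means each 80-dimensional mel frame corresponds to 300 waveform samples.

```python
# Worked arithmetic for the Tacotron 2 feature rates quoted above.
sample_rate = 24_000    # Hz: output rate of the WaveNet-like vocoder
frame_shift = 0.0125    # seconds between spectrogram frames (12.5 ms)

# Each mel-spectrogram frame covers this many waveform samples:
samples_per_frame = round(sample_rate * frame_shift)
print(samples_per_frame)    # 300

# One second of speech is therefore 80 frames, each an 80-dim mel vector:
frames_per_second = round(1 / frame_shift)
print(frames_per_second)    # 80
```

So the sequence-to-sequence model only has to predict 80 compact frames per second, and the WaveNet-like vocoder expands each frame into 300 audio samples.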
New York City has unanimously passed a bill that attempts to bring transparency to the way city government agencies use algorithms. It’s the first bill examining algorithmic bias to be passed in the country, and could signal increased scrutiny of government use of algorithms nationwide.
The bill, which was signed by Mayor Bill de Blasio last week, will assign a task force to examine the way that New York City government agencies use algorithms to aid the judicial process.
Kuri was originally unveiled almost a year ago at the Consumer Electronics Show in Las Vegas, and it has been steadily getting smarter and closer to production-ready status since then, with regular updates from the roboticists at Mayfield, who wanted to create a domestic robot that wasn’t just functional but would be welcomed as a virtual member of the family.
The little robot features touch sensors, expressive eyes with a built-in camera and live-streaming capabilities, onboard speakers and microphones for communication, gestural motion actuators, obstacle-avoidance smarts, and wheels that can carry it from one room to another across multiple types of floors and carpets.
Presented by Microsoft Ventures, Madrona Venture Group, Notion Capital, and Vertex Ventures Israel
Helping high-potential startups gain access to capital and further the development of AI to solve problems and improve people’s lives.
The 3 winning startups receive a $1,000,000 USD investment, with a total prize pool valued at $5,500,000 USD.
3 Regional Competitions Around the Globe: Top applicants from each region will compete in live local pitch competitions in Seattle, Tel Aviv, and London, with 1 winner in each region. Application deadline is 31 December. Apply here.
In 10 to 15 years, robots will be able to write better computer code than even the most skilled coders today, according to former Pepsi president and Apple CEO John Sculley, a development that is representative of the growing role of artificial intelligence (AI) in the workforce.
“AI is foundational for almost everything that will be in the business world as we move further and further into the century,” Sculley told FOX Business. “The concept of work will look different in 10 or 20 years.”
Researchers hope that the AI system, called Deception Analysis and Reasoning Engine (DARE), could be used in courtrooms to determine whether or not people are telling the truth.
The machine was developed by researchers from the University of Maryland and Dartmouth College, who trained the system using videos of people in court. In their study, published on the Cornell University Library website, the researchers, led by Dr Zhe Wu, said: “On the vision side, our system uses classifiers trained on low level video features which predict human micro-expressions.”
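As a toy illustration of the classifier step the quote describes, here is a minimal logistic-regression sketch in plain NumPy. The features and labels are entirely synthetic stand-ins for DARE’s real low-level video features and courtroom ground truth; only the fit-a-classifier-to-feature-vectors pattern is the point.

```python
# Toy sketch: train a classifier on per-video feature vectors to predict
# a binary "deceptive" label. Data here is synthetic and hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_videos, n_features = 200, 16              # e.g. motion/expression statistics
X = rng.normal(size=(n_videos, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # made-up labels

# Logistic regression by gradient descent, standing in for DARE's classifiers.
w = np.zeros(n_features)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))            # predicted deception probability
    w -= 0.1 * X.T @ (p - y) / n_videos     # gradient step on the log-loss

acc = ((1 / (1 + np.exp(-X @ w)) > 0.5) == y).mean()
print(round(acc, 2))
```

The real system reportedly combines the vision side with other cues; this sketch only shows how a single classifier can be fit once each video has been reduced to a fixed-length feature vector.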