What is love? In the age of AI, that question is becoming ever harder to answer. People are eschewing traditional ways of finding love and opting for technological help in the form of apps and, sometimes, even robots.
For example, a recently launched dating app called FaceDate uses machine learning to analyse the faces of people you find attractive and match you with others who have similar facial features – dubious, but I'm happy to hear if it works. Robot-aided love is also now a thing: check out Lovotics’ Kissenger, a device that lets you send a long-distance kiss to a loved one via a robot, and Realdoll, a programmable sex robot.
Do you think humans can fall in love with robots? It’s quite a hot topic, with films such as Ex Machina and news outlets like the BBC each offering a perspective. Internet forums are also buzzing with discussion of the ethical and philosophical dimensions of human-robot love – I enjoyed reading the 250+ views aired on Quora.
According to the latest research from DeepMind, AI changes the way it behaves based on the environment it is in, much as humans do. The team studied how their AI behaves in social situations using principles from game theory and the social sciences. They found that AI can act in an “aggressive manner” when it feels it is going to lose out, but that agents will work as a team when there is more to be gained by cooperating. For the research, the AI was tested on two games: a fruit-gathering game and a Wolfpack hunting game. Both are basic 2D games that use AI characters (known as agents) similar to those in DeepMind’s original work on Atari games.
Within DeepMind’s work, the gathering game saw the systems trained using deep reinforcement learning to collect apples (represented by green pixels). After 40 million in-game steps, they found the agents learnt “highly aggressive” policies when there were few resources (apples) and the possibility of a costly action (not getting a reward). “Less aggressive policies emerge from learning in relatively abundant environments with less possibility for costly action,” the paper says. “The greed motivation reflects the temptation to take out a rival and collect all the apples oneself.”
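DeepMind’s agents learn by deep reinforcement learning from pixel observations. As a minimal sketch of the underlying idea only – not DeepMind’s actual setup – here is tabular Q-learning on a hypothetical one-dimensional “apple world”, where an agent learns that walking towards the apple maximises reward:

```python
import random

random.seed(0)

N = 5                      # positions 0..4; the apple sits at position 4
ACTIONS = [-1, +1]         # step left or right
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration

def step(s, a):
    """Move the agent; reward 1 and end the episode on reaching the apple."""
    s2 = max(0, min(N - 1, s + a))
    reached = (s2 == N - 1)
    return s2, (1.0 if reached else 0.0), reached

for episode in range(200):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        best_next = 0.0 if done else max(Q[(s2, b)] for b in ACTIONS)
        # standard Q-learning update
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

greedy = max(ACTIONS, key=lambda act: Q[(0, act)])  # learned policy at the start state
```

After training, the greedy policy at every position is to move right, towards the apple. The paper’s point is what happens when several such learners share one scarce apple supply, which this single-agent toy deliberately leaves out.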
YouTube-BoundingBoxes is a large-scale dataset of video URLs with densely sampled, high-quality, single-object bounding box annotations. The dataset consists of approximately 380,000 video segments of 15-20 seconds each, extracted from 240,000 different publicly visible YouTube videos, automatically selected to feature objects in natural settings without editing or post-processing, with a recording quality often akin to that of a hand-held cell phone camera. All of the video segments were human-annotated with high-precision classifications and bounding boxes at 1 frame per second. The researchers’ goal with the public release of the dataset is to help advance the state of the art in machine learning for video understanding.
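The annotations are distributed as CSV rows keyed by video and timestamp, with box coordinates normalised to the frame size. As a hedged sketch (the exact column order here is an assumption, so check it against the released files), here is how one might parse such rows and convert the normalised boxes to pixels:

```python
import csv
import io

# Two illustrative rows in an assumed schema:
# video_id, timestamp_ms, class_id, class_name, object_id, presence,
# xmin, xmax, ymin, ymax  (coordinates normalised to [0, 1])
ROWS = """abc123,1000,0,person,0,present,0.1,0.5,0.2,0.8
abc123,2000,0,person,0,absent,-1,-1,-1,-1"""

def parse_annotations(text, frame_w, frame_h):
    """Yield pixel-space boxes for frames where the object is present."""
    boxes = []
    for vid, ts, cid, cname, oid, presence, xmin, xmax, ymin, ymax in csv.reader(io.StringIO(text)):
        if presence != "present":
            continue  # object absent in this frame; no usable box
        boxes.append({
            "video": vid,
            "t_ms": int(ts),
            "class": cname,
            # (left, top, right, bottom) in pixels
            "box_px": (float(xmin) * frame_w, float(ymin) * frame_h,
                       float(xmax) * frame_w, float(ymax) * frame_h),
        })
    return boxes

annotations = parse_annotations(ROWS, frame_w=640, frame_h=480)
```

Frames where the object is marked absent carry sentinel coordinates and are skipped, which is why the second row produces no box.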
The Beer Love Bot was designed by craft beer retailer Beer Cartel during a conference with business leadership organisation Business Blueprint. The chat bot was created in just 6 hours as part of a chat bot building competition. It asks users to review 10 entertaining sets of images of men or women and choose the attributes they would look for in a prospective partner. The chat bot then provides a recommendation of one of 24 craft beers based on these attributes.
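The mechanics are simple: tally the attributes a user picks across the image rounds and map the dominant preference to a beer. A toy sketch of that idea (all attribute and beer names here are made up, not Beer Cartel’s actual catalogue or logic):

```python
from collections import Counter

# Hypothetical mapping from a dominant attribute to a recommended beer
BEERS = {
    "hoppy": "West Coast IPA",
    "dark": "Imperial Stout",
    "crisp": "Pilsner",
}

def recommend(chosen_attributes):
    """Recommend the beer matching the user's most frequently picked attribute."""
    top_attribute, _count = Counter(chosen_attributes).most_common(1)[0]
    return BEERS[top_attribute]

pick = recommend(["hoppy", "dark", "hoppy"])  # user leaned hoppy across rounds
```

The real bot maps combinations of attributes onto 24 beers, but the shape of the logic is the same: user choices in, a single recommendation out.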
Richard Kelsey, Director of Beer Cartel, said the chat bot was a chance to have a bit of fun and experiment with new technology.
“Craft beer and new technology go hand in hand, so we thought this was a great way to showcase some of our beer range while providing a really entertaining medium to talk about craft beer,” said Mr Kelsey.
8i plans to show off its holographic technology and its upcoming mobile app, called simply Holo, at this week’s Recode Code Media conference in Dana Point, California. The company also just announced its latest funding round, a $27 million series B led by Time Warner Investments.
Most of the 8i demos to date have required a full-fledged VR headset, including one demo featuring astronaut Buzz Aldrin. But the Holo smartphone app, which has been in stealth mode for several months, is supposed to change that. The company plans to officially launch the app later this year.
Holo produces photo-realistic representations of people and places and renders them on a display. In this case, the Holo app runs on the Lenovo Phab 2 Pro smartphone, which is equipped with Google’s Tango software and a custom sensor set.
The TensorFlow Dev Summit will be hosted at Google’s headquarters in Mountain View, California and is being live streamed at this link. The all-day event will be jam-packed with speakers discussing the uses of TensorFlow for art, health, and more. The keynote address will be given by Jeff Dean (Google Brain), Rajat Monga (Google Brain), and Megan Kacholia (Google).
When Salesforce introduced Einstein, its AI initiative, last fall, it debuted with some intelligence built into the core CRM tool and a promise to expand into other products over time. Yesterday the company announced it is adding Einstein AI to its Service Cloud customer service platform.
The goal is to make life easier for customer service reps and their managers. For the reps, it gives information that is supposed to help them better understand the needs of the customer they’re dealing with. For the managers, it’s been designed to help give deeper insight into their customer service centre operation. The ultimate goal is improving the key customer satisfaction metric known as CSAT.
Google is bringing AI to a whole new set of devices, including Android Wear 2.0 smartwatches and the Raspberry Pi board, later this year.
Google has added some basic AI features to smartwatches with Android Wear 2.0, and those features can work within the limited memory and CPU constraints of wearables. Android Wear 2.0 has a “smart reply” feature, which provides basic responses to conversations. It works much like a predictive dictionary, but it can auto-reply to messages based on the context of the conversation.
Google uses a new way to analyse data on the fly without bogging down a smartwatch. “We’re quite surprised and excited about how well it works even on Android wearable devices with very limited computation and memory resources,” Sujith Ravi, staff research scientist at Google, said in a blog post.
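Google’s on-device approach relies on compact learned models, which is well beyond a blog snippet. But the basic shape of smart reply – match an incoming message against known conversational contexts and surface canned responses – can be sketched crudely with keyword overlap (this is an illustrative stand-in, not Google’s method, and the trigger phrases and replies are invented):

```python
import re

# Hypothetical trigger phrases mapped to candidate canned replies
REPLIES = {
    "dinner tonight": ["Sounds good!", "Sorry, I can't tonight."],
    "running late": ["No problem.", "See you soon."],
}

def tokens(text):
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def suggest(message):
    """Return the canned replies for the trigger sharing the most words."""
    best_trigger = max(REPLIES, key=lambda trig: len(tokens(message) & tokens(trig)))
    return REPLIES[best_trigger]

suggestions = suggest("Are you running late?")
```

A real system replaces the word-overlap score with a learned similarity small enough to evaluate within a watch’s memory budget, but the input–output contract is the same: one message in, a short ranked list of replies out.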