This week, it was confirmed that Apple is acquiring Shazam, the popular music discovery and recognition app. Suggestions as to what Apple gains from the purchase include: access to data about listening tastes, a counter to Google’s latest song-identifying tech, a supercharged HomePod, AR advertising (Shazam’s other offering), and more.
The Mayor of London, in partnership with the UK Business Angels Association, is seeking 10 ground-breaking, London-based, AI innovators in healthcare, retail, transport, environment, manufacturing, finance, insurance and others to participate in the first event in the Mayor of London’s TechInvest Programme.
The first event in this series will be hosted by the Mayor in London’s Living Room at the top of the iconic City Hall on 18 January 2018, and invites entrepreneurs using artificial intelligence to compete for the opportunity to pitch. Click here to apply. The deadline for applications is midnight on 20 December.
Article to Share With Your Less Data-Savvy Friends
Santa’s proprietary algorithms are making his list
Think of all the AI technology he’ll enlist
Using social media, criminal databases, Watson and checking it twice
Scoring mechanisms and machine learning to discover who’s naughty or nice
Chris Rodley doesn’t consider himself an artist or a computer scientist. But when he tweeted an image he created using a machine learning algorithm that crossed a page of dinosaur illustrations with a page of flower prints, many assumed he was both. The cleverly merged image, which looks like a horticulturist’s take on Jurassic Park, went viral. Soon after, he found himself inundated with messages asking to purchase high-resolution copies.
“It was a highly successful procrastination attempt,” says Rodley, a Ph.D. student in digital cultures at the University of Sydney. He explained that he generated the mashup on DeepArt.io, an online program powered by deep neural networks (a kind of advanced machine-learning algorithm) that identify the stylistic elements of one image and apply them to another. No artistic or coding experience is required. Check out the article for more on algorithmic art.
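DeepArt.io descends from the neural style-transfer line of work, in which the “style” of an image is commonly captured by Gram matrices of convolutional feature maps. A minimal NumPy sketch of that core computation follows; the random arrays here are stand-ins for real CNN activations, not anything DeepArt.io actually exposes:

```python
import numpy as np

def gram_matrix(features):
    """Style representation: correlations between feature channels.
    features: (channels, height, width) activation tensor."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(style_feats, generated_feats):
    """Mean squared difference between the two Gram matrices."""
    return np.mean((gram_matrix(style_feats) - gram_matrix(generated_feats)) ** 2)

rng = np.random.default_rng(0)
style = rng.standard_normal((8, 16, 16))  # stand-in: activations of the flower print
gen = rng.standard_normal((8, 16, 16))    # stand-in: the dinosaur page being restyled

print(style_loss(style, style))  # identical features -> 0.0
print(style_loss(style, gen) > 0)
```

In full style transfer, this loss is minimized over the generated image’s pixels while a separate content loss keeps the dinosaurs recognizable; the mashup is the trade-off between the two.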
Check out this great article from Jessi Hempel on Baidu and its quest for AI domination. She travelled to Beijing to chronicle this tenuous moment in Baidu’s history. At 18 years old, Baidu has built the country’s dominant search engine, a business substantial enough to make it one of the most important tech companies in all of China.
China is poised to be a hotbed of AI development in the near future. [CEO and Founder Robin] Li believes he’s setting Baidu on course to own this next revolution—one that, in turn, will vault Baidu to its rightful place in the stratosphere. Soon, Li intones, his company will deliver the AI technology that infuses everything and every system—from medicine to entertainment to cars—with intelligence. “In human history, humans invented tools, and then had to learn how to use them,” Li tells me. “In the future, devices will need to learn human.”
Evan Schnidman, a 31-year-old who set up his own firm after a Harvard University PhD dissertation on the Federal Reserve’s communications, is hoping the approach that lured $3.3M in a fund-raising round last December will work in the corporate sphere. St. Louis-based Prattle has until now focused on applying NLP to assess Fed and other central bank policy statements.
Schnidman’s play is that Prattle’s services will offer a productivity boost and help generate trading calls more quickly. “That report on the earnings call means that an analyst that’s struggling to keep up with covering 10 or 12 companies can cover 30 or 50,” Schnidman said in an interview by phone. It’s about “machines helping to aid the human decision process,” he said. “We’re still a long way off a point where machines are driving decision-making totally.”
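Prattle’s models are proprietary, so the details of its scoring are unknown. A toy lexicon-based scorer gives the general flavor of turning policy or earnings-call language into a number; the word lists below are illustrative stand-ins, not a real financial-sentiment lexicon:

```python
# Toy scorer for central-bank / earnings-call language.
# HAWKISH/DOVISH word lists are invented for illustration only.
HAWKISH = {"tighten", "inflation", "raise", "strong", "robust"}
DOVISH = {"ease", "accommodative", "cut", "weak", "slowdown"}

def score(statement: str) -> float:
    """Return a score in [-1, 1]: positive = hawkish, negative = dovish."""
    words = statement.lower().split()
    hits = sum(w in HAWKISH for w in words) - sum(w in DOVISH for w in words)
    total = sum(w in HAWKISH or w in DOVISH for w in words)
    return hits / total if total else 0.0

print(score("The committee voted to raise rates amid robust growth"))  # 1.0
print(score("Policy will remain accommodative given the slowdown"))    # -1.0
```

A production system would weight terms by context and learn the lexicon from market reactions rather than hand-coding it, but the output shape is the same: one comparable number per statement, which is what lets an analyst cover 30 companies instead of 10.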
MIT researchers have found that changing the way neural networks look at data makes them better at understanding languages. By applying a recently developed interpretative technique, the researchers could analyse the networks and assess how they work.
What they found is that the systems typically concentrated on lower-level tasks, such as sound recognition, before moving on to higher-level tasks, like understanding the nuances of sentences and the different ways they could be interpreted.
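The article doesn’t spell out the interpretative technique, but a common way to run this kind of analysis is a linear “probe”: fit a simple model on each layer’s activations and see how well a given property (say, a phonetic feature) can be read out. A minimal least-squares probe in NumPy, with synthetic activations standing in for a real network’s layers:

```python
import numpy as np

def probe_r2(activations, labels):
    """Fit a least-squares linear probe on layer activations and return
    R^2: how well this layer linearly encodes the labels."""
    X = np.column_stack([activations, np.ones(len(activations))])  # add bias term
    coef, *_ = np.linalg.lstsq(X, labels, rcond=None)
    pred = X @ coef
    ss_res = np.sum((labels - pred) ** 2)
    ss_tot = np.sum((labels - labels.mean()) ** 2)
    return 1 - ss_res / ss_tot

rng = np.random.default_rng(1)
labels = rng.standard_normal(200)  # e.g. a low-level acoustic property
# An "early layer" whose activations still contain the property,
# and a "late layer" that has discarded it:
early = np.column_stack([labels, rng.standard_normal((200, 4))])
late = rng.standard_normal((200, 5))

print(probe_r2(early, labels))  # close to 1.0
print(probe_r2(late, labels))   # near 0
```

Comparing probe scores layer by layer is what lets researchers say a network handles sound recognition early and sentence-level meaning late.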
Microsoft launched AI for Earth over the summer with an initial $2M. Since the launch, the company has given out 35 grants to groups in 10 countries. The new investment will support and expand the current projects, as well as new ventures, over the next five years. “We’re impatient and a lot of partners are too,” Lucas Joppa, Microsoft’s chief environmental scientist, told MIT Technology Review. “There isn’t enough money and resources for people doubling down on these issues.”
AI for Earth grants give groups access to, and training on, the Azure platform and other Microsoft AI products. The projects so far include producing a more precise land-cover map of the Chesapeake Bay watershed for conservation work, a mosquito-tracking program that lets researchers learn about wildlife populations through blood analysis (and doubles as an early-warning system for Zika outbreaks), and getting farmers more data to increase crop yields with fewer resources.
Medium and large enterprises are set to double their usage of machine learning by the end of 2018, according to a new report from professional services firm Deloitte. The number of machine learning pilots and implementations will double by the end of 2018, and then will double from that number by the end of 2020, Deloitte’s report predicts. Businesses spent $17 billion on the technology in 2017, and that is expected to increase to $57.6 billion by 2021.
Deloitte identified five factors that have held back machine learning growth: (1) too few practitioners, (2) high costs, (3) immature tools, (4) models that are hard to interpret, and (5) business regulations.
Off the back of a Facebook civic hackathon, David Doyle, Seattle’s open data program manager, told StateScoop: “As we enter 2018, we are encouraging all departments to think about how to potentially leverage opportunities to apply machine learning to their current and future open datasets based on the two use cases that emerged from the hackathon.”
How and where the city will apply machine learning will likely be decided in the coming months in Seattle’s 2018 Open Data Plan, Doyle told StateScoop. Looking at the larger picture, Doyle said the city also intends to begin considering machine learning as a broader piece of its operations. Also, see how the Seattle Department of Transportation expects to save $7M within a year.