Issue 166: CognitionX Data Science, AI and Machine Learning


The future of education in this AI world is a topic front of mind for many. CognitionX community member Professor Rose Luckin (UCL Knowledge Lab) has just published a new paper in which she lays out her thoughts on how we can deploy AI for assessment to replace our current ‘stop-and-test’ methods.

Do you think it is ethical for AI to assess children’s performance? Answer below and please leave a comment as well.

Yes          No          Not Sure

Best,

Tabitha UntiltheBotsTakeOver Goldstaub

Inspiration

Towards artificial intelligence-based assessment systems

We were very excited to hear that Rose Luckin, Professor of Learner Centred Design at the UCL Knowledge Lab, has published a paper on the use of AI in educational assessments (check out our panel at DLD where she discussed this, if you haven’t already).

In the paper, she discusses the fact that “‘Stop and test’ assessments do not rigorously evaluate a student’s understanding of a topic. AI-based assessment provides constant feedback to teachers, students and parents about how the student learns, the support they need and the progress they are making towards their learning goals.”

She discusses the cost of AI assessment, the potential ethical concerns, and how we can make it a reality. To do so, “it will be essential to work with educators and system developers to specify data standards that prioritize both the sharing of data and the ethics underlying data use. It is also essential that we use the older AI approaches that involve modelling as well as the more modern machine-learning techniques.”

Future of Transportation

Drive.ai is retrofitting vehicles to make them self-driving

In the race to develop self-driving cars, Mountain View startup Drive.ai is focused on making the vehicles think more like humans do, drawing on deep learning.

“When you make a self-driving car’s brain, there are two approaches. One is traditional rules-based robotics and the other is deep learning,” said CEO Sameep Tandon, who co-founded Drive.ai with seven others, mainly alumni of Stanford’s Artificial Intelligence Lab. Deep learning “can adapt to harder environments and handle more-nuanced scenarios in everyday driving — who has the right of way at a stop sign, is it safe to make a right turn on red? It’s hard to write algorithms for all scenarios.”
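To make that contrast concrete, here is a toy sketch in Python: a hand-written rule for a single scenario next to a model that learns its decision boundary from labelled examples. The features, thresholds and data are invented for illustration and have nothing to do with Drive.ai’s actual system.

```python
# Toy contrast between a rules-based check and a learned model for one nuanced
# driving decision ("is it safe to turn right on red?"). Features, thresholds and
# data are hypothetical illustrations, not Drive.ai's system.
import numpy as np
from sklearn.linear_model import LogisticRegression

def rules_based_ok_to_turn(gap_s, pedestrian, visibility_m):
    # Hand-written rule: every threshold and edge case must be enumerated in advance.
    return gap_s > 6.0 and pedestrian == 0 and visibility_m > 50.0

# Learning-based alternative: fit a decision boundary from labelled examples.
# Features: [gap to cross-traffic (s), pedestrian in crosswalk (0/1), visibility (m)]
X = np.array([[8.0, 0, 120.0], [3.0, 0, 90.0], [7.5, 1, 100.0],
              [9.0, 0, 40.0], [6.5, 0, 80.0], [2.0, 1, 60.0]])
y = np.array([1, 0, 0, 0, 1, 0])  # 1 = the turn was safe in that recorded situation

model = LogisticRegression().fit(X, y)

print(rules_based_ok_to_turn(7.0, 0, 70.0))   # rule passes only if every check passes
print(model.predict([[7.0, 0, 70.0]]))        # learned boundary generalises from data
```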

Drive.ai is creating a hardware and software kit to retrofit commercial vehicles, such as delivery trucks or ride-hailed taxis — “any business with fleets of vehicles,” Tandon said — for self-driving. In the next few weeks, it will announce partners interested in using its technology. It is still hashing out a business model, such as whether it will sell kits directly, operate a service to install them or have customers pay per mile.

Education and Advice We Rate

Data science in 30 minutes: predicting content demand with machine learning

The Data Incubator, a data science fellowship program, is currently running a Data Science in 30 minutes webinar series. Next week (9 March at 5:30PM EST) features a free webinar with Dr. Becky Tucker, a Senior Data Scientist at Netflix who specializes in predictive modelling for content demand; all you need to do is register.

The talk will cover work done by Becky and the Content Data Science team at Netflix, which seeks to evaluate where Netflix should spend their next content dollar using machine learning and predictive models.
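For a flavour of what “predictive models for content demand” can look like in the simplest case, here is a hedged Python sketch: a regression over made-up title features, used to rank hypothetical candidates by predicted demand per content dollar. Everything here (features, data, model choice) is an assumption for illustration, not Netflix’s methodology.

```python
# Minimal sketch of demand forecasting for content investment decisions.
# Features, data and the GradientBoostingRegressor choice are illustrative
# assumptions, not Netflix's actual models.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical title features: budget ($M), talent popularity, genre id, franchise flag
X = np.column_stack([
    rng.uniform(1, 200, n),    # production budget
    rng.uniform(0, 100, n),    # cast/creator popularity score
    rng.integers(0, 8, n),     # genre
    rng.integers(0, 2, n),     # part of an existing franchise?
])
# Synthetic "demand" target (e.g. projected viewing hours), purely for the demo
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 20 * X[:, 3] + rng.normal(0, 10, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)

# Rank hypothetical candidate titles by predicted demand per dollar of budget
candidates = np.array([[80, 90, 2, 1], [150, 40, 5, 0], [20, 70, 1, 0]])
predicted = model.predict(candidates)
print(predicted / candidates[:, 0])  # crude "demand per content dollar" ranking
```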

Art

Amper raises $4M to use AI to write music

Amper, a startup that offers AI-powered music composition, is announcing that it has raised $4 million in funding.

You might not expect a film composer like Drew Silverstein (who founded the company with Sam Estes and Michael Hobe) to create a product that is ostensibly competing with his own work, but Silverstein doesn’t see it that way.

Instead, he pitched Amper as a fast, affordable and royalty-free way to create the music for more “functional” projects (like commercials or short online videos), where most companies would currently use pre-written stock music. Silverstein said he’s been approached about these projects in the past but found that “the economics just don’t work” — the budgets just aren’t big enough to justify creating new music.

Business Impact of AI

S&P Global Announces Strategic Relationship and Investment in Kensho

S&P Global, a leading provider of transparent and independent ratings, benchmarks, analytics and data to the capital and commodity markets, announces that it has entered into a strategic relationship with Kensho Technologies, Inc. (Kensho), a provider of next-generation analytics, machine learning, and data visualization systems to Wall Street’s premier global banks and investment institutions.

The two parties have agreed to a long-term commercial relationship which will result in product and data collaboration between Kensho and S&P Global’s Market Intelligence division, a leader in multi-asset class research data and insights.

Through this strategic relationship, S&P Global Market Intelligence data will feed Kensho analytics platforms and serve as a basis for existing and new Kensho analytical tools. Both parties will collaborate on future product development to bring new and innovative capabilities to market. Additionally, S&P Global will have a board observer seat at Kensho.

Ethics Questions for the Day

AI being turned against spyware

Dr Eva Maia at VisionTechLab, a young cybersecurity firm in Matosinhos, Portugal, said that attacks on computer networks are not only multiplying, they are also growing sneakier.

In the EU-funded SecTrap project, VisionTechLab has been studying the market for a new line of defence that could rob malicious software of its current hiding places.

The challenge in cybersecurity is to use machine learning’s ability to learn from experience to distinguish between innocent and malicious behaviour on a computer. For this, Alberto Pelliccione, chief executive of ReaQta, a cybersecurity venture in Valletta, Malta, has found a way of educating his algorithms by experience.

ReaQta breeds millions of malware programs in a virtual testing environment known as a sandbox, so that algorithms can inspect their antics at leisure and in safety. It is not always necessary to know what they are trying to steal; simply recording the applications they open and their patterns of operation can be enough.
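As a rough illustration of that idea, the sketch below represents each sandboxed program by the actions it performs and trains a classifier to separate benign from malicious traces. The action names, samples and model are invented for the example; this is not ReaQta’s pipeline.

```python
# Minimal sketch: classify programs by observed sandbox behaviour rather than by
# what they try to steal. Action names and labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

# Each record is the sequence of actions observed for one program in the sandbox.
traces = [
    "open_browser read_config write_log",                          # benign
    "open_mail_client spawn_powershell write_registry beacon_c2",  # malicious
    "read_config write_log open_spreadsheet",                      # benign
    "spawn_powershell encrypt_files delete_shadow_copies",         # malicious
]
labels = [0, 1, 0, 1]  # 0 = benign, 1 = malware

# Bag-of-actions features plus a forest classifier: only which applications are
# touched and in what pattern, nothing about the payload itself.
clf = make_pipeline(CountVectorizer(), RandomForestClassifier(random_state=0))
clf.fit(traces, labels)

print(clf.predict(["open_browser spawn_powershell beacon_c2"]))  # likely flagged
```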

Chat Bots, yadda yadda yadda

As Messenger’s bots lose steam, Facebook pushes menus over chat

Facebook’s Messenger bots may not be having the impact the social network desired. Just yesterday, online retailer Everlane, one of the launch partners for the bot platform, announced it was ditching Messenger for customer notifications and returning to email. Following this, Facebook today announced an upgraded Messenger Platform, which introduces a new way for users to interact with bots: simple persistent menus, which can even omit the option to chat with the bot at all.

The Messenger Platform update tackles the problem that it hasn’t always been obvious how to get a bot talking, by offering an alternative to the more limited – and sometimes confusing – systems that were previously available.

Instead of forcing users to talk with a bot, developers can choose to create a persistent menu that allows for multiple, nested items, giving them a better way of displaying all of a bot’s capabilities in a simple interface.
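For developers curious what that looks like in practice, the snippet below sketches a persistent menu definition posted to the Messenger Profile API, with a nested submenu and the composer disabled so the bot is menu-only. Menu titles, payloads and the API version shown are illustrative; check Facebook’s documentation for the exact fields.

```python
# Rough sketch of defining a persistent menu via the Messenger Profile API.
# Endpoint version, titles and payloads are illustrative placeholders.
import requests

PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"  # placeholder

menu = {
    "persistent_menu": [{
        "locale": "default",
        "composer_input_disabled": True,   # menu-only bot: no free-text chat at all
        "call_to_actions": [
            {   # nested submenu keeps all of a bot's capabilities discoverable
                "title": "Shop",
                "type": "nested",
                "call_to_actions": [
                    {"title": "New arrivals", "type": "postback", "payload": "NEW_ARRIVALS"},
                    {"title": "Track my order", "type": "postback", "payload": "TRACK_ORDER"},
                ],
            },
            {"title": "Help", "type": "web_url", "url": "https://example.com/help"},
        ],
    }]
}

resp = requests.post(
    "https://graph.facebook.com/v2.6/me/messenger_profile",
    params={"access_token": PAGE_ACCESS_TOKEN},
    json=menu,
)
print(resp.json())
```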

Something to Get Involved In

Amazon Alexa Fund Fellowship

The Amazon Alexa Fund is establishing an Alexa Fund Fellowship program to support academic institutions leading the charge in the fields of text-to-speech (TTS), natural language understanding (NLU), automatic speech recognition (ASR), conversational artificial intelligence (AI), and other related engineering fields. The goal of the Alexa Fund Fellowship program is to educate students about voice technology and empower them to create the next big thing.

The program will launch with four universities: Carnegie Mellon University, University of Waterloo, University of Southern California (USC), and Johns Hopkins University. Each university will receive cash funding as well as Alexa-enabled devices and mentorship from the Alexa Science teams to develop a graduate or undergraduate class curriculum. To learn more about our first group of Alexa Fund Fellows, check out the full blog post.

I’ve been making some changes based on feedback and would love to hear from more of you. Please do click to share your thoughts!

