Google Cloud’s wide-ranging partnership with the NCAA will be on full display this weekend during the Final Four men’s basketball tournament.
Using a predictive analytics architecture and workflow built over the last few months, Google says it will produce and air real-time ads during halftime that attempt to anticipate what will happen in the second half of the Final Four games.
The end-to-end analytics architecture leverages Cloud Spanner, Cloud Datalab and BigQuery to ingest, load, and analyze observations from the first half of each game, along with decades of historical NCAA data, to make predictions about the second half. Google said the predictions will include things like the number of three-point shots each team might attempt.
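The general approach the pipeline describes — combine observations from the first half with historical rates, then extrapolate over the remaining game time — can be illustrated with a toy model. Everything below is hypothetical: the 0.6/0.4 blend weights, the function name, and the numbers are invented for illustration, not Google's actual model.

```python
# Toy sketch of a halftime prediction: blend a team's first-half
# three-point attempt rate with its long-run historical rate, then
# extrapolate over the minutes remaining. All weights and numbers
# here are invented for illustration.

def predict_second_half_threes(first_half_attempts: int,
                               historical_avg_per_game: float,
                               minutes_remaining: int = 20) -> float:
    """Estimate one team's second-half three-point attempts."""
    observed_rate = first_half_attempts / 20.0        # attempts per minute so far
    historical_rate = historical_avg_per_game / 40.0  # long-run attempts per minute
    blended_rate = 0.6 * observed_rate + 0.4 * historical_rate
    return round(blended_rate * minutes_remaining, 1)

if __name__ == "__main__":
    # A team that attempted 14 threes in the first half, against a
    # historical average of 24 attempts per full game.
    print(predict_second_half_threes(14, 24.0))
```

A real system would of course fit its weights from the decades of historical data rather than hard-code them, but the shape — observed first-half signal blended with a historical prior — is the same.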
“As halftime starts, the real work begins. We’ll have only minutes to turn our prediction into a TV spot,” Google Cloud team member Courtney Blacker wrote in a blog post. “Our creative team will take the prediction generated by our team of data scientists and data analysts and create the ad right there in the Alamodome, using a real-time rendering system built by Cloneless and Eleven Inc.”
“This is likely the first time a company has used its own real-time predictive analytics to create ads during a live televised sporting event,” Blacker said.
Beyond game play predictions, Google’s real aim here is to show off the possibilities of predictive analytics and machine learning in an example that’s relatable to the general population.
The NCAA partnered with Google Cloud last year and has been migrating 80-plus years' worth of game data, spanning 24 sports and 19,000 teams, over to Google's architecture. With the NCAA's team and game data on Google's cloud, Google's arsenal of machine learning technology has also been used to improve the tournament selection and team seeding process.
Customer service calls can be … infuriating. Part of the reason is that humans generally aren’t great at reading subtle emotional cues, especially if we only have voice to go by.
At the same time, we often inadvertently broadcast unintended emotional signals, easily leading to miscommunication and discomfort over the phone.
But an MIT spinoff called Cogito is using voice analytics to help customer service reps better understand how customers are feeling. The technology behind Cogito’s enterprise product, which can predict a customer’s emotional state by analyzing tone and voice patterns, has also been used to identify signs of PTSD and depression in veterans.
It doesn’t take a huge imaginative leap to envision the same technology giving computers and robots a simulated version of empathy.
The analytics Cogito developed arose out of conflict.
In 2001, MIT Media Lab professor Alex “Sandy” Pentland was in India to launch Media Lab Asia. “I noticed a lot of the meetings we had, particularly the board of directors, were awful,” Pentland told MIT News, which has tracked the company it helped launch.
The problem, Pentland surmised, was the way people were communicating their ideas: not necessarily the words they were using, but the tone and emphasis behind the words.
Out of that experience grew Pentland’s infatuation with quantifying how people speak, which often stands in contrast to what people are saying. That is, Pentland wanted to understand subtle cues in speech and tone, as well as body language, that have nothing to do with language.
To aid his effort, MIT researchers developed what they call sociometers: name badges with embedded sensors that track patterns in speech and body movement during conversation.
The researchers were able to predict the outcome of interactions like job interviews to an extraordinarily high degree without actually listening to the words being spoken. There is, as behavioral scientists have long held, a rich layer of communication in every interaction that happens independently of language.
Pentland’s research quickly turned to healthcare, where he found voice analytics could help detect symptoms of depression or determine whether doctors and patients really understand each other during interactions.
More recently, DARPA and the U.S. Department of Veterans Affairs have given Cogito, the company Pentland formed in 2007 with former MIT MBA student Joshua Feast, grant money to determine if the technology can be used to flag veterans likely suffering from PTSD, which could help the VA deliver services more effectively.
During a 2013 clinical trial supported by DARPA, Cogito noticed an increase in attitudes linked with PTSD among trial participants following the Boston Marathon bombing.
The money, of course, is in enterprise products.
About five million Americans work in call centers. Cogito created a product called Cogito Dialog, which analyzes voice signals to determine things like customer engagement and frustration. The software gives call center employees real-time feedback during calls, allowing them to adjust their approach.
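The kind of signal such a system might act on can be sketched with simple acoustic features. This is not Cogito's actual algorithm — its features and models are proprietary — and the frame energy feature, thresholds, and feedback strings below are invented for this example.

```python
# Simplified sketch of real-time voice-signal feedback for a call-center
# agent. The RMS-energy feature and the thresholds are invented for
# illustration; a real product would use far richer acoustic features.
import math

def rms_energy(samples):
    """Root-mean-square energy of one audio frame (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def feedback_for_frame(samples, loud_threshold=0.5, quiet_threshold=0.05):
    """Map one audio frame to a coarse real-time hint for the agent."""
    energy = rms_energy(samples)
    if energy > loud_threshold:
        return "caller speaking loudly -- consider slowing down"
    if energy < quiet_threshold:
        return "low engagement -- consider a check-in question"
    return "ok"

if __name__ == "__main__":
    quiet_frame = [0.01] * 160      # near-silent audio
    loud_frame = [0.8, -0.8] * 80   # high-energy audio
    print(feedback_for_frame(quiet_frame))
    print(feedback_for_frame(loud_frame))
```

Even this crude loudness check illustrates the product's core loop: extract a feature from the live audio stream, compare it against a model of the conversation, and surface a nudge to the agent while the call is still in progress.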
A case study with Humana, a large health insurance company, revealed a 28 percent increase in customer satisfaction and a 63 percent increase in employee engagement during calls when using voice analytics tracking.
“It’s aiding that intuitive understanding we have when we listen to people,” Pentland said, “helping people do that better.”
Cogito’s enterprise clients include MetLife and United Health.
Being able to foster communication and connection beyond language could have interesting implications in diplomacy and conflict resolution, and it’s bound to improve our experience with AI assistants and robots.
I’d certainly love to have a real-time analytical readout during political conversations at large family gatherings.