Samsung Acquires Viv, AllSeen Alliance Disbands, and Bias in NLP
Yitaek Hwang
Viv became the latest AI company to be acquired. TechCrunch reported earlier this week that Samsung has agreed to acquire Viv. Although Viv has yet to launch since its celebrated demo at Disrupt NY in May 2016, it is widely regarded as a better version of Siri, built by Siri's original creators. With Google releasing Google Home earlier this week and the other tech giants rushing to dominate the voice interface platform, what does this mean for Samsung?
Takeaway: Tech giants are figuring out that the future of IoT will ultimately be determined by how users interact with and integrate all of their connected devices into their lives. Samsung SVP Jacopo Lenzi nails it in his interview: "We do see the evolution of the customer experience being enabled by AI, particularly as we continue to add devices to their system, to IoT, and the importance of something like this to really allow you just to engage with technology in the way they really want to, which is a simple conversational interface." The next wave of IoT is not just about building chatbots or AI assistants, but about building the interface through which consumers interact with all of their devices.
+ Forbes: What Amazon gets about Smart Homes that Google doesn't yet
AllSeen Alliance, the Qualcomm-led steward of AllJoyn, voted to disband last week, according to Stacey Higginbotham. That means the end of AllJoyn, and yet another IoT device discovery and communication standard becomes obsolete. This leaves Intel-backed IoTivity and Google's Weave to try to establish a network standard that makes IoT more accessible.
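To make the "device discovery" piece concrete: AllJoyn, IoTivity, and Weave all need a way for devices on a local network to announce themselves and find each other. The sketch below is not AllJoyn's actual protocol; it is a minimal illustration using mDNS/DNS-SD via the python-zeroconf library (an assumed stand-in, chosen because Weave and many smart-home stacks build on mDNS) to show what standardized local discovery looks like in practice.

```python
# Illustrative sketch of local device discovery, the problem AllJoyn,
# IoTivity, and Weave all try to standardize. Uses mDNS/DNS-SD via the
# python-zeroconf library, not AllJoyn's own wire protocol.
import time

from zeroconf import ServiceBrowser, ServiceListener, Zeroconf


class DeviceListener(ServiceListener):
    def add_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        # A device announced itself; look up its address and port.
        info = zc.get_service_info(type_, name)
        if info:
            print(f"Found {name} at {info.parsed_addresses()}:{info.port}")

    def remove_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        print(f"{name} left the network")

    def update_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        pass  # metadata changed; ignored in this sketch


zc = Zeroconf()
# "_http._tcp.local." is a common generic service type; a smart-home
# standard would define its own (e.g. a hypothetical "_smarthome._tcp.local.").
browser = ServiceBrowser(zc, "_http._tcp.local.", DeviceListener())
try:
    time.sleep(10)  # listen for announcements for a few seconds
finally:
    zc.close()
```

Each competing standard defines its own service types and transports on top of a layer like this, which is exactly why devices from different ecosystems still can't find each other out of the box.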
Takeaway: One of the main barriers to consumer IoT adoption is the lack of standards. While consumers are more or less familiar with Wi-Fi, Bluetooth, and NFC, it's rare to find someone who understands IoTivity, ZigBee, Z-Wave, or Weave. While cellular carriers rush to build their own networks for IoT, perhaps agreeing on a standard should be higher on the priority list.
+ Wareable: Thread, ZigBee, Z-Wave: Why Smart Home standards matter
+ Medium: Three reasons carriers are building new networks for IoT
Approximately 8 million of the 319 million people in the United States read the Wall Street Journal, about 2 percent of the population. If you look at the language, standardized English, being fed into many natural language processing systems, it's based on the language of that 2 percent. And many machines literally use the venerable, business-focused newspaper to better understand the English language.
- Tonya Riley
Brendan O'Connor, assistant professor of computer science at the University of Massachusetts Amherst, points out that the way tech giants train their AI systems doesn't necessarily reflect how people actually speak: the training data lacks diversity. Many NLP tools, like Google's SyntaxNet (a deep learning language-parsing framework), fail to pick up the sentence construction of dialects. As a result, Google's search systems push websites written primarily in African-American vernacular further down the search results, a consequence of how the models were trained.
O'Connor sees this as a reflection of the bias and lack of diversity in today's AI systems. AI systems are only as good as the data we feed them. As NLP continues to improve, it may be time to employ linguists to make sure the training sets reflect not just standard English but also the slang and dialects used by the other 98% of the population.
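The mechanism is easy to demonstrate at small scale. The sketch below (an illustration, not Google's actual pipeline) runs NLTK's off-the-shelf part-of-speech tagger, which is trained largely on Penn Treebank newswire text drawn from the Wall Street Journal, over a standard-English sentence and a dialect variant. Forms the newswire corpus never contained, such as the habitual "be" or "workin", are the ones a tagger like this tends to get wrong.

```python
# Sketch: how a tagger trained on edited newswire (the Penn Treebank's
# Wall Street Journal text) handles dialect it never saw in training.
# Uses NLTK's standard tokenizer and perceptron tagger.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

standard = "He is working at the store every day."
dialect = "He be workin at the store every day."  # habitual 'be' (AAVE)

for sentence in (standard, dialect):
    tokens = nltk.word_tokenize(sentence)
    print(nltk.pos_tag(tokens))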