by Scott David, 20 October 2016, https://www.weforum.org/agenda/2016/10/wireless-earphones-fourth-industrial-revolution
To think that the new Apple AirPods are just the latest iteration of headphones misses the point. As does the idea that they are about liberation from tangled white cords. Their significance is far greater than that. These are the first signs of the Fourth Industrial Revolution, an age when computing will become an extension of the individual and part of everything we do.
This is about the beginning of mass adoption of new behaviours for communicating with computers. Ambient computing and invisible interfaces are taking hold, and changing our relationship with technology.
We are experiencing a step change rather than something new. Bluetooth headphones are already readily available, but what is different here is the AirPods' integration with artificial-intelligence-driven assistance equipped with natural-language voice recognition – Siri.
Other voice-activated services such as Google Home, Microsoft Cortana, Amazon Echo and IBM Watson Conversation are all competing in this field. What is changing is that these companies are starting to get the experience design right, from the hardware that accurately hears the voice input, to the artificial-intelligence cloud services that understand the intent, to the quality and relevance of the actions or answers performed in response. There is a lot that can go wrong between a voice command and an artificially intelligent answer, but each link in the chain is improving quickly.
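The chain described above, from microphone to spoken answer, can be sketched as a pipeline of stages. Everything below is a hypothetical stand-in for illustration only, not any real vendor's API; the function names and the canned responses are invented:

```python
# Hypothetical sketch of a voice-assistant pipeline. Each stage stands in
# for a real component (beam-formed microphone capture, cloud speech
# recognition, intent parsing, action fulfilment). If any link in the
# chain fails, the whole experience breaks.

def capture_audio() -> bytes:
    # Stand-in for the beam-formed microphone input.
    return b"fake-audio"

def speech_to_text(audio: bytes) -> str:
    # Stand-in for a cloud speech-recognition service.
    return "what is the weather in london"

def parse_intent(text: str) -> dict:
    # Stand-in for the NLP layer that extracts intent and entities.
    if "weather" in text:
        return {"intent": "get_weather", "city": text.rsplit(" ", 1)[-1]}
    return {"intent": "unknown"}

def perform_action(intent: dict) -> str:
    # Stand-in for the service that fulfils the request.
    if intent["intent"] == "get_weather":
        return f"Here is the weather for {intent['city'].title()}."
    return "Sorry, I didn't understand that."

def assistant() -> str:
    # The full chain: audio -> text -> intent -> answer.
    return perform_action(parse_intent(speech_to_text(capture_audio())))

print(assistant())  # → Here is the weather for London.
```

The point of the sketch is structural: each arrow in the chain is a separate system that can fail independently, which is why improving every link matters.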
The AirPods are filled with miniaturized sensors and features. They detect when you're talking and use beam-forming microphones to filter out noise and send a clear voice signal to the cloud, so it can hear exactly what you're asking for. This is a crucial first step in voice-interface usability. And Apple have introduced a new gesture (a "haptic", as such interactions are known) to the human-computer relationship. We've seen gestural relationships before. Point, Click, Drag came with the mass adoption of personal computers, the graphical user interface and the mouse. Tap and Swipe ushered the smartphone era and ever-present computing into our lives. This new "tap tap" is the sound of an invisible, screen-less interface being activated.
Like Amazon's Echo, a device that sits in the home waiting to seamlessly deliver cloud services on voice command, the AirPods are an exercise in the art of product design. The right human-centred design removes barriers to usage, and the result is technology that begins to feel natural, ambient and available. At what point will it feel like the headphone is doing the work, when your mobile phone rarely comes out of your pocket?
Natural Language Processing in the cloud is a gateway technology to the use of other interconnected machine-learning services. It is also a form of AI that big tech companies have started to get right, thanks to better analysis of bigger data sets, and algorithms that understand language. It explains the recent explosion in text-based chatbots being integrated into conversational interfaces, from Facebook Messenger to WeChat, Telegram and Skype. Expect to see a lot more of them soon.
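A text chatbot of the kind described rests on the same idea as the voice assistants: map free text to an intent, then to a reply. The sketch below uses simple keyword matching purely to show the shape of that loop; real conversational interfaces use trained language models, and every intent name and reply here is invented:

```python
# Minimal keyword-matching chatbot sketch. This only illustrates the
# text -> intent -> reply loop; production systems replace the keyword
# lookup with a statistical language model.

INTENTS = {
    "greet":   (["hello", "hi", "hey"],       "Hello! How can I help?"),
    "hours":   (["open", "hours", "closing"], "We are open 9am-5pm."),
    "goodbye": (["bye", "goodbye"],           "Goodbye!"),
}

def reply(message: str) -> str:
    # Lower-case and split the message, then look for any intent whose
    # keywords overlap with the user's words.
    words = set(message.lower().split())
    for keywords, answer in INTENTS.values():
        if words & set(keywords):
            return answer
    return "Sorry, I don't understand yet."

print(reply("Hi there"))             # → Hello! How can I help?
print(reply("What are your hours"))  # → We are open 9am-5pm.
```

The gap between this toy and a real chatbot is exactly the "bigger data sets and algorithms that understand language" the article points to: the lookup table is replaced by a model that generalizes beyond exact keywords.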