11/16/2019 / By Tracey Watson
Just when you think modern technology couldn’t possibly get any more invasive, one of the tech giants takes it to the next level.
The concept of machines that can understand and interpret human emotions has long been the stuff of science fiction books and movies. Now, according to insider information obtained by Bloomberg, Amazon.com Inc. is working on a gadget that can figure out your emotional state or whether you might be ill, and then make “appropriate” suggestions – like getting you to buy medicine from the Amazon.com website, which will, of course, be delivered to your home in minutes via Amazon drones.
The gadget, which is being described as a “health and wellness product,” will be worn on the wrist and will use voice recognition software to recognize and interpret human emotions.
The product is still in the development stage and may never come to market, but a patent was filed in 2017 for a system that uses voice software to determine whether a person is feeling “joy, anger, sorrow, sadness, fear, disgust, boredom, stress, or other emotional states.”
Experts are already expressing concerns about massive potential privacy issues with this type of technology, especially since hackers have been able to turn other voice recognition devices like the “Echo” into live microphones, allowing unscrupulous individuals to access sensitive information. (Related: Creepy teddy bear caught leaking kids’ private conversations online.)
Imagine what a stalker could do with this type of device strategically hidden in a victim’s home and hacked to become a microphone.
Plus, Amazon itself doesn’t have the best track record when it comes to privacy issues. (Related: Amazon Echo devices spy on you in your own home… police are now trying to acquire those recordings.)
Bloomberg reported:
The concept is likely to add fuel to the debate about the amount and type of personal data scooped up by technology giants, which already collect reams of information about their customers. Earlier this year, Bloomberg reported that Amazon has a team listening to and annotating audio clips captured by the company’s Echo line of voice-activated speakers.
As reported by Interesting Engineering, this new technology would detect “abnormal” physical conditions in wearers, including symptoms like a sore throat or a cough. It would also interpret your emotions from the way you speak, constantly on the alert for indications of excitement, sadness, etc.
The patent filing explains:
A cough or sniffle, or crying, may indicate that the user has a specific physical or emotional abnormality. …
A current physical and/or emotional condition of the user may facilitate the ability to provide highly targeted audio content, such as audio advertisements or promotions, to the user.
An example would be a person who coughs or has a raspy voice being “diagnosed” with a cold and presented with the option to purchase cough drops with 1-hour delivery. At that point, the device might also offer a recipe for chicken soup – the traditional home remedy for a cold – and even tell you to “feel better!”
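To make the flow described in the patent a little more concrete, here is a minimal, purely hypothetical sketch of how such a pipeline might turn voice-derived cues into a product pitch. Everything in it – the cue names, the lookup table, and the products – is invented for illustration and has nothing to do with Amazon's actual software.

```python
# Hypothetical illustration only -- not Amazon's actual system.
# Maps voice-derived cues (e.g. "cough", "raspy", "sad") to a suggested
# product and a canned "empathetic" message, mirroring the flow the
# patent filing describes.

from dataclasses import dataclass


@dataclass
class Suggestion:
    condition: str
    product: str
    message: str


# Toy lookup table standing in for whatever models a real device would use.
RULES = {
    "cough": Suggestion("possible cold", "cough drops (1-hour delivery)",
                        "Feel better! Here's a chicken soup recipe."),
    "raspy": Suggestion("possible sore throat", "throat lozenges",
                        "Rest your voice today."),
    "sad":   Suggestion("low mood", "comfort-food snack box",
                        "Hope your day gets better."),
}


def suggest(detected_cues: list[str]) -> list[Suggestion]:
    """Return targeted suggestions for any recognized vocal cues."""
    return [RULES[cue] for cue in detected_cues if cue in RULES]


if __name__ == "__main__":
    # In a real product these cues would come from on-device audio analysis;
    # here they are hard-coded for the sake of the example.
    for s in suggest(["cough", "sad"]):
        print(f"{s.condition}: try {s.product} -- {s.message}")
```

The point of the toy lookup table is simply to show how thin the line is, in the flow the patent describes, between offering “health and wellness” advice and serving a targeted advertisement.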
Of course, this device is still only in the early stages of development, and before Amazon can hope to market it, the company will have to find a way to convince people that it doesn’t violate their privacy or put it at risk.
Frankly, a device that pretends to be an empathetic friend/doctor so that it can pitch products to you might only appeal to the truly lonely who have literally nobody else to talk with.
Learn more about the privacy risks posed by voice recognition software at PrivacyWatch.news.