Robert Stines

Are you talking to a machine . . . maybe?



At Google I/O, we watched in awe as Google's artificial intelligence (Google Duplex) called a local hair salon to make an appointment. The crowd laughed as the machine said "Mm-hmm" and "Uhmmm," subtle verbal fillers meant to imitate a human speaking.


People commenting on this demonstration were equally amazed and disturbed that the human on the other end of the line did not know she was talking to a machine. Of course, one question was "is that legal?" Shouldn't Google notify the human that she is speaking to a computer? Presumably, Google's throng of attorneys analyzed this issue before deciding to green-light the demonstration.


The critical question is whether AI, such as Google Duplex, has to disclose itself when asked. That is, if the hair salon representative had asked "to whom am I speaking," could the AI lie? From a common sense and ethical perspective, it would seem that AI developers cannot and must not program AI to make false statements. In other words, AI should be regulated by the 'Blade Runner' rules, which would make it illegal for AI applications such as social media bots, chatbots and virtual assistants to conceal their identity and pose as humans.


According to recent reports, Google has updated Duplex to disclose that it is a machine. One has to wonder whether this decision was driven by litigation concerns. Consumers could raise misrepresentation or fraud claims if they are led to believe they are talking to a human, but are actually talking to a computer.


The Ashley Madison lawsuit is an example of a fraud claim related to AI bots. In 2015, there was litigation related to a data security breach that allegedly occurred at AshleyMadison.com, a “dating” website designed to facilitate intimate relationships for individuals who are either married or in a committed relationship. The data breach received most of the attention, but there were also allegations that Ashley Madison made extensive use of artificial intelligence “bots” and other mechanisms to create fake users (specifically, fake female users) to induce actual (predominantly male) users to make purchases. In 2017, the court approved a settlement for several million dollars.


Regulating Artificial Intelligence

There is little doubt that emerging technology is driving society into a cyber revolution. Some welcome the change; others fear a superintelligent machine. Google Duplex is a glimpse of our future and highlights the need for regulation. Unfortunately, regulation is notoriously slow and reactive. Companies like Google should continue to listen to their consumers (and their attorneys) to do the right thing.



1 Comment


Dale Largie
Jul 03, 2018

The future is scary. They always say 'movies are a figment of the truth,' and I must agree. As I read the article - and others of a similar nature - I can't help but think of movies like 'Terminator', 'The Matrix', 'I, Robot', and the like. AI is a threat. There will be those with good intentions and there will be those with bad intentions. However, both good and bad will produce AI that will change our way of life. Whether for better or worse has yet to be determined.
