Yesterday, Google announced Duplex, a new product that can conduct natural-sounding conversations over the phone, and can complete “real world” tasks while doing so.

The result is as astonishing as it is unnerving. Here’s a recording of Duplex scheduling a hair salon appointment:

Here’s a recording of Duplex calling a restaurant to book a table:

In both demos, the Duplex bot speaks to a person to make a reservation. The bot’s end of the conversation is filled with natural-sounding pauses, umms, and ahhs. When the human asks a question, Duplex replies with uncanny accuracy, casually umm-ing and ahh-ing all the while.

I share Jeremy’s creeped-out reaction, because this technology was designed to deceive humans. That’s not a value judgment, mind: the aim of the product is to sound as human as possible. What’s more, the demos above are impressive precisely because Duplex withholds the fact that it’s not human. The net effect is, for better and for worse, a form of deception: Duplex was elegantly, intentionally designed to deceive. (And given that reality’s on shaky ground as it is, I don’t think that’s the most responsible goal.)

For me, this immediately gets into a whole host of ethical and moral issues. (Note: I’m not an ethicist, nor am I a lawyer, nor am I anyone who can reliably match their socks.) Should Duplex be required to disclose its non-human status to the unsuspecting human on the line? If not, why not? What data is Duplex collecting from the human during the course of the phone call? Are there any consent or retention policies attached to the collection of that data?

More broadly, it’s hard to listen to those demos without wondering how services like Duplex will radically alter the workforce. After all, whose jobs involve making these kinds of phone calls? I think of a dear friend who works in an administrative role; I think of my sister, who’s been struggling to find a job, and has been applying to customer support positions. What happens to them, and to people like them, once technologies like Duplex reach the mainstream? I think we’re well past the point where our industry gets a pass for launching products without thinking about their second-order effects. But beyond that: if we assume this technology’s a given, what kinds of policies and protections do we need to help the folks affected by it?

Now, I should note that in the announcement, the Google team says they’ll be working on the disclosure question:

The Google Duplex technology is built to sound natural, to make the conversation experience comfortable. It’s important to us that users and businesses have a good experience with this service, and transparency is a key part of that. We want to be clear about the intent of the call so businesses understand the context. We’ll be experimenting with the right approach over the coming months.

I’ll say this: it’s telling that matters of transparency, disclosure, and trust weren’t considered important for the initial release.