With all of my doomsday blabbering about the rise of AI (Artificial Intelligence) and the certain extinction it will (probably) bring, I thought it was time to focus on some good things about AI that are right around the corner. After all, the AI future will be good – up until it starts going bad, and one of the things I’m most excited about is Google Duplex.
Naturally, Google is the leading commercial company when it comes to the practical use and development of AI, and it’s no wonder, given the massive amounts of data and money at its disposal. Google Duplex is set to be one of the biggest game changers in terms of real-world use, and one that will make consumers stop and say “whoa… that’s pretty awesome/scary/useful/Westworld-y”.
Ready to see something amazing? Check out the demonstration from this year’s Google I/O conference: https://youtu.be/bd1mEm2Fy08?t=43s
Right now, Duplex can only schedule appointments, but the way it does so is scary good. I’ll remind you, that was an actual call, and while someone who has worked with AI and AI programming would notice the subtle differences in tone and inflection between the bot and the human, most people would find no perceptible difference.
This comes from the Google Assistant feature that was rolled out years ago, much to the chagrin of folks who preferred the simpler Google Assistant and found the newer version a pointless update (I, admittedly, fell into that category for a month or so). But Google, being Google, was thinking long haul, and used the new Google Assistant as the foundation and springboard from which features like Duplex would arise. All of the usage data collected and compiled paved the way for this, the first AWESOME legitimization of all that data mining. Google Assistant will now finally act like an actual assistant, and not just some gimmicky party trick.
Siri be like:
And best of all, Duplex was trained using your voicemails! That’s right, kids. When you signed up for Google Voice, or used any Google service that transcribes your voicemails, Google’s TensorFlow technology learned all of the weird speech patterns that humans use in everyday speech (things that we don’t even think about, like peppering conversations with “like” or “um”) and used that to develop the awesome ability to sound just like us.
Just. Like. Us.
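To give a flavor of the idea, here’s a toy sketch of that “pepper the speech with fillers” trick: take a clean sentence and randomly sprinkle in disfluencies before it would be handed to a text-to-speech engine. This is purely illustrative, assuming a made-up `humanize` function and filler list; it is not how Google’s actual Duplex pipeline works.

```python
import random

# Hypothetical filler words a system might have learned from real calls.
# This is an illustration of the concept, not Google's actual approach.
DISFLUENCIES = ["um,", "uh,", "like,"]

def humanize(text, rate=0.2, seed=42):
    """Randomly insert filler words between the words of `text`."""
    rng = random.Random(seed)  # fixed seed so the output is repeatable
    out = []
    for word in text.split():
        if rng.random() < rate:
            out.append(rng.choice(DISFLUENCIES))
        out.append(word)
    return " ".join(out)

print(humanize("I would like to book a haircut for Tuesday"))
```

The real system does something far more sophisticated (learning where humans actually place fillers, plus the intonation to match), but the surface effect is the same: the output stops sounding like a robot reading a script.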
That’s what made that moment when the AI said “Mm hmm…” both funny and amazing at the same time. Implementation? Rumor has it Google may begin rolling out this feature to test groups later this year, to allow for more real-world testing and polishing. Full-scale roll-out? If the stars align, and Google keeps the project up for consumer use, I dare say next year.
When I hear anything more about it, I’ll have my people call your people.