Tuesday, May 28, 2024

How Apple and Google Will Get Us Talking to Our PCs

In the future, we’ll talk to our computers, and they’ll talk back. We’ll speak regular English, and they’ll figure out what we want and try to give it to us.

I know this is true because I saw it on Star Trek.

This future is coming sooner than you think — you could be talking to your various devices every day as soon as this year. The technology is ready. The biggest barrier to widespread voice command is not the computer, but the user.

How will the industry move us from our current habits around typing, pointing and clicking to the final frontier of talking to our computers?

The leading technology companies are and must be in the user-modification business. And let me tell you, it’s a tough racket. Humans are stubborn animals, creatures of habit. We don’t like to change the way we do things.

Ordinary companies pander to the existing expectations and habits of the popular user base. But extraordinary companies like Apple and Google lead in part by skillfully changing user behavior.

Apple and Google have discovered a secret weapon in the war on ingrained customer habit: The cell phone.

Talk to the Handheld

It’s hard to change user behavior when someone is using a desktop computer. The user is already sitting in a comfortable chair, using a keyboard and monitor of any size they want, powered by a CPU that would have been a supercomputer a decade earlier.

It turns out that people are more willing to change their behavior with a cell phone, thanks to the inherent limits of the device and the useful things made possible by doing things differently.

Apple and Google (and Microsoft and all the other major companies with serious R&D labs) know what the future looks like, more or less. They know that the PC of the future is a huge, smart sheet of glass that works like a sophisticated iPad. They know we’ll talk to computers. They know that our cell phones will inform myriad services of our exact location at all times, reporting back everything we do.

The public doesn’t know or accept all this, for the most part. The challenge for the companies who want to sell us the future is to change us — alter our expectations, behaviors and desires.

Let me give you one example each from Apple and Google about how they’ve already achieved radical change in user behavior thanks to mobile phones.

At first, nobody prefers an on-screen keyboard to a physical one. They’re awkward to use. They don’t give you tactile feedback. Apple wants to sell you one, eventually, so they can get rich off their many patents for on-screen keyboard technology, and also make your life awesome. But how?

Apple’s brilliant, long-term plan centered around the iPhone. They shipped it in 2007 with no physical keyboard option. They didn’t make one. And they didn’t let anyone else make one, either. We all wanted one. But we were denied.

At first, everyone bitched and moaned about the tiny, clumsy and slow iPhone keyboard. But eventually, users grudgingly accepted it. More than 100 million of them.

When Apple came out with the iPad, its medium-size on-screen keyboard seemed huge compared with the iPhone’s. Apple had already acclimated us. In fact, Apple feels we are so well conditioned by the iPhone’s on-screen keyboard that we’ll accept the desktop keyboard technology they just filed a patent for, which has no keys or moving parts at all. Later, we’ll embrace their desktop tablet version of the Mac as well.

Had Apple shipped no-key keyboards for desktops first, they would have been rejected. But Apple trained us first with phones, then slowly transitioned that behavior to larger devices.

Google does it, too. If you had told me ten years ago that millions of people would actively seek out technologies that tracked their location at all times, I would have thought you were nuts. But today, we all do. The benefits of location technology — from turn-by-turn directions to location-based social networking to smart coupons — are so compelling that people can’t wait to buy phones and install apps that reveal their exact location at all times to who-knows-who.

Google’s participation in this radically altered user expectation is part of a larger strategy to acclimate us to all kinds of privacy invasions. I’m not suggesting that Google’s strategy is sinister. They know we want our privacy to be invaded in exchange for a world of user benefits. Gmail servers read our e-mail and serve up advertising related to the topics of conversation in those e-mails. When we sign up for Google+, Google already knows who our friends are, and offers them up to us so we can invite them.

Google’s best tool for changing user expectations around privacy is the Android platform in particular, and mobile apps in general. If you download the Google+ app and select a feature, every picture you take on that phone is automatically uploaded to the cloud somewhere.

When you’re on Google+, those pictures are already “posted,” but privately. The click of a single button makes them instantly public. That same app lets you see the public Google+ posts of complete strangers — people you’re not following and who are not following you — based on their proximity to you.

More than Just Talk

Both Apple and Google have done it before, and they’ll do it again. They will train us to talk to our computers like Star Trek, and they’ll do it with our cell phones.

In April, Apple acquired Siri, a small mobile voice-controlled virtual assistant app. Siri lets you talk in natural language and ask for things like dinner reservations, directions and other tasks, and the app uses the phone’s location system, contacts and other apps to make it all happen intelligently.
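
To see what “figuring out what you want” means at the simplest level, here is a toy, rule-based sketch of how an assistant might map a spoken request onto a phone capability. Real systems like Siri use far more sophisticated statistical language models; the intent names, patterns and function below are invented purely for illustration.

```python
import re

# Hypothetical intents, each matched by a crude keyword pattern.
INTENT_PATTERNS = {
    "make_reservation": re.compile(r"\b(reserve|reservation|book a table)\b", re.I),
    "get_directions":   re.compile(r"\b(directions|navigate)\b", re.I),
    "call_contact":     re.compile(r"\b(call|phone|dial)\b", re.I),
}

def classify(utterance: str) -> str:
    """Return the first intent whose pattern matches the utterance."""
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            return intent
    return "web_search"  # nothing matched: fall back to a plain search

print(classify("Book a table for two at 7"))  # make_reservation
print(classify("Directions to the airport"))  # get_directions
```

The fallback is the interesting design choice: when the assistant can’t map your words to an action, it can always degrade gracefully to an ordinary search.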

According to a post on the blog 9to5Mac, Siri technology has been spotted inside iOS 5, the upcoming version of the operating system that powers iPhones.

The voice-command part of Siri is based on technology from Nuance, the leader in desktop voice command and dictation. Nuance also happens to be one of Apple’s newest and closest partners.

A report in the blog TechCrunch revealed that much of Apple’s giant new billion-dollar data center in North Carolina has been dedicated to powering Nuance voice recognition technology.

Clearly, Apple is preparing massively for a voice-controlled future, where millions of users are talking to their Apple devices.

I believe Apple’s strategy is to mirror their successful plan on keyboards — change people’s behavior with mobile phones, then move that behavior up the chain all the way to the desktop.

Voice command is the secret sauce that will make giant desktop touch tablets possible. Nobody wants to use a touch screen for every input, replacing everything we do today with mouse and keyboard. But a combination of talking and touching will prove very compelling.

Apple’s mobile virtual assistant technology will be nice, but its ultimate purpose is to enable Apple to lead the future of desktop computing: giant, voice- and gesture-controlled touch tablets.

Google also wants to steal the future of computing from Microsoft, and it has already been busy training users to talk to their handhelds. One of the coolest and most popular features of Google’s mobile app is voice search. You press a button and talk. Google does a great job of recognizing what you say, then turning it into a Google search.
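
That press-and-talk flow amounts to two steps: speech recognition (which happens on Google’s servers) and an ordinary search query. A minimal sketch of the second step, assuming the transcript has already been recognized — the function name is invented, though the query URL format is the standard one:

```python
from urllib.parse import urlencode

def build_search_url(transcript: str) -> str:
    """Turn an already-recognized voice transcript into a Google search URL."""
    # Normalize whitespace and case before building the query string.
    query = " ".join(transcript.split()).lower()
    return "https://www.google.com/search?" + urlencode({"q": query})

print(build_search_url("pizza places near me"))
# https://www.google.com/search?q=pizza+places+near+me
```

The point is how thin this layer is: once speech becomes text, the entire existing search infrastructure works unchanged.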

Google has more than just voice-command search in store for us. Late last year, then-CEO of Google, Eric Schmidt, told the Wall Street Journal:

“Let’s say you’re walking down the street. Because of the info Google has collected about you, ‘we know roughly who you are, roughly what you care about, roughly who your friends are.’ Google also knows, to within a foot, where you are. Mr. Schmidt leaves it to a listener to imagine the possibilities: If you need milk and there’s a place nearby to get milk, Google will remind you to get milk.”

While milk-loving people everywhere will love the mobile Google app of the near future, the ultimate end game, I believe, is the future of computing in general. Schmidt laid out a very clear vision for this future some four years ago. He suggested that by 2012, “The goal is to enable Google users to be able to ask the question such as ‘What shall I do tomorrow?’ and ‘What job shall I take?’”

In other words, Google wants to start as your personal assistant on your phone, then get promoted to personal advisor, counselor and consigliere. They’ll start by helping with your errands, and end up giving you career counseling.

The era of the voice-command virtual assistant in your phone is upon us. But that’s just the beginning. Once Apple and Google train us to talk to our mobiles, it’s just a matter of time before they have us talking to all our computers. Just like on Star Trek.
