Google’s computer programs are gaining a better understanding of the world, and now the company wants them to handle more of the decision-making for the billions of people who use its services.
CEO Sundar Pichai and other top executives brought Google’s audacious ambition into sharper focus Wednesday at an annual conference attended by more than 7,000 developers who design apps to work with its wide array of digital services.
Among other things, Google unveiled new ways for its massive network of computers to identify images, as well as recommend, share, and organize photos. It is also trying to make its voice-controlled digital assistant more proactive and visual while expanding its audience to Apple’s iPhone, where it will try to outwit an older rival, Siri.
The push marks another step toward infusing nearly all of Google’s products with some semblance of artificial intelligence – the concept of writing software that enables computers to gradually learn to think more like humans.
Google punctuated the theme near the end of the conference’s keynote address by projecting the phrase, “Computing that works like we do.”
Pichai has made AI the foundation of his strategy since becoming Google’s CEO in late 2015, emphasizing that technology is rapidly evolving from a “mobile-first” world, where smartphones steer the services that companies build, to an “AI-first” world, where computers supplement users’ brains.
AI unnerves many people because it conjures images of computers eventually becoming smarter than humans and running the world. That may sound like science fiction, but the threat is real enough to prompt warnings from respected technology leaders and scientists, including Tesla Motors CEO Elon Musk and physicist Stephen Hawking.
But Pichai and Google co-founder Larry Page, now CEO of Google corporate parent Alphabet Inc., see it differently. They believe computers can take over more of the tedious grunt work so humans have more time to think about deeper things and enjoy their lives with friends and family.
Other big tech companies, including Amazon.com, Microsoft, Apple and Facebook, also are making AI a top priority as they work on similar services to help users stay informed and manage their lives.
Google believes it can lead the way in AI largely because it has built a gigantic network of data centers packed with computers around the world. Meanwhile, people using its dominant internet search engine and leading email service have been feeding those machines valuable pieces of personal information for nearly 20 years.
Now, Google is drawing upon that treasure trove to teach new tricks to its digital assistant, which debuted last year on its Pixel phone and on Home, an internet-connected speaker that is challenging Amazon’s Echo. After slightly more than six months on the market, Google Assistant is on more than 100 million devices, and it is now invading new territory with a free app released Wednesday for Apple’s iPhone. Previously, the assistant worked only on devices running Google’s Android software.
Google’s assistant will be at a disadvantage on the iPhone, though, because Siri – a concierge that Apple introduced in 2011 – is built into that device.
A new service called Google Lens will expand Assistant’s powers. Lens uses AI to identify objects viewed through a phone’s camera. For instance, point the phone at a flower and Assistant will call upon Lens to identify the type of flower. Or point the camera at the exterior of a restaurant and it will pull up reviews of the place.
Pinterest has a similar tool. Also called Lens, it lets people point their cameras at real-world items and find out where to buy them, or find similar things online.
Google Photos is adding a new tool that will prompt you to share photos you take of people you know. For instance, Photos will notice when you take a shot of a friend and nudge you to send it to her, so you don’t forget. Google will also let you share whole photo libraries with others. Facebook has its own version of this feature in its Moments app.
One potentially unsettling new feature in Photos will let you automatically share some or all of your photos with other people. Google maintains the feature will be smart enough so that you would auto-share only specific photos – say, of your kids – with your partner or a friend.
Google is also adding a feature to Photos that creates soft-cover and hard-cover photo albums, with prices starting at $9.99. The app will draw upon its AI powers to automatically pick out the best pictures to put in the album.
…