Google's doubling down on AI everywhere
This week Google held its annual developer conference, I/O, and its progress in AI was at the center of everything. Its new Google AI website shows just how many public projects the company is working on, including hardware.
Google Lens, an app for your phone, identifies objects in the real world when you point your camera at them. It's not lost on me that it's just like the Not Hotdog app from this week's Silicon Valley episode, but it's legitimately useful – one demo showed the app identifying the type of flower the user was pointing it at.
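To make the idea concrete, here's a rough sketch of the kind of image recognition Lens builds on, using an off-the-shelf ImageNet classifier in TensorFlow. Google hasn't published how Lens actually works, so this is purely illustrative – the MobileNetV2 model and the flower.jpg file are my own assumptions, not anything shown at I/O.

```python
# Illustrative only: classify a photo with a pretrained ImageNet model.
# "flower.jpg" is a hypothetical local image file.
import numpy as np
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

# Load the image at the size the model expects and preprocess it.
img = tf.keras.preprocessing.image.load_img("flower.jpg", target_size=(224, 224))
x = tf.keras.preprocessing.image.img_to_array(img)
x = tf.keras.applications.mobilenet_v2.preprocess_input(x[np.newaxis, ...])

preds = model.predict(x)

# decode_predictions maps the raw output to human-readable ImageNet labels,
# e.g. "daisy" with a confidence score.
for _, label, score in tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=3)[0]:
    print(f"{label}: {score:.2f}")
```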
It's a very similar concept to Google Goggles, an app from years ago that let you search using real-world objects, but machine learning takes it to the next level.
What's interesting about Google Lens isn't that it's a standalone app, but that Google's strategy is to integrate the technology into its other apps. It's coming to Google Photos first – which has a whopping 500 million active users – so you can search for the objects and places inside the photos themselves.
Google's AI work is finally reaching a point where it's not a gimmick, and the projects it's working on are not trivial. Another announcement was the Cloud TPU, a chip that could change the AI game altogether.
Cloud TPU is a custom-designed processor built for both training neural networks and running them in production. It's a huge deal because it brings training time down from a day or more to a matter of hours – changing how quickly models can be iterated on at scale.
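For a sense of what "using it through Google Cloud" looks like in practice, here's a minimal sketch of pointing TensorFlow training at a TPU. This uses the tf.distribute API rather than whatever shipped at the time of the announcement, and the model, data, and TPU address are all toy assumptions.

```python
# Minimal sketch: train a small Keras model on a Cloud TPU via tf.distribute.
# Assumes a TPU is reachable from this environment (e.g. a Cloud TPU VM or
# Colab); the tpu argument may need to change depending on your setup.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Model and optimizer must be created inside the strategy scope so their
# variables are placed on the TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# Toy random data stands in for a real training set.
x = tf.random.normal((1024, 784))
y = tf.random.uniform((1024,), maxval=10, dtype=tf.int32)
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(128, drop_remainder=True)

model.fit(dataset, epochs=2)
```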
The chip won't be on sale – for now – but it will be available through the Google Cloud platform. With that kind of optimized processing power behind technology like Google Home, it's no wonder Google has been able to cut misrecognized words in half in just a year.
We're at an interesting crossroads in technology right now. With tech giants competing on AI at such a clip, innovation is only going to speed up. Just two years ago, the idea of a highly accurate voice assistant sitting on the bench in your home was absurd, but it's quickly becoming normal.
The contrast with Apple's approach to AI is striking. Google is accelerating away incredibly quickly because it isn't restricting AI processing to on-device the way Apple does.
We'll probably see more on this front at WWDC next month, but Apple's approach means it doesn't have the enormous datasets Google gets to learn from, and that's clearly resulting in slower progress. Siri is a great example: right now it looks incredibly basic compared to Google's Assistant.
I'm constantly reminded of this great post on understanding AI's impact on the world at large, and it really feels like we're standing on the precipice: AI is about to accelerate away from us incredibly quickly.