Google, like almost every other tech company in 2024, has decided to push AI into every product it can. That's not really surprising; Google was one of the original pioneers of consumer AI and has been hiding it in plain sight for years. Now that AI is the hot new buzzword, all bets are off.
Much of what Google showed us will be coming soon or is already here. Other things are pie-in-the-sky projects that may never see the light of day. Remember Project Loon? Both real products and ones that will never come to market have one thing in common: they are potential privacy nightmares.
This, too, is nothing new for Google. Because of how the company operates, it has to be a bit on the invasive side. When our data is the force that drives revenue, our data is going to be collected and analyzed. Nothing in life is free, not even Gmail.
Much of what’s coming will be managed locally thanks to some flavor of Gemini Nano. I don’t want Google listening to my phone calls, but as long as it’s done on my phone and not in a cloud server room somewhere, I’m not going to freak out about it. Similarly, the “new” Gmail will use AI to help manage my inbox. That data is already in Google’s cloud, so I don’t care. I might even turn it on.
Some other examples were a bit more worrying.
Google sees all your stuff and Google never forgets
Google also showed off a couple of ideas that are privacy nightmares in the making. Soon, we may have to worry about Google’s object recognition just like we (should) worry about Google’s facial recognition.
Project Astra was possibly Google's coolest announcement in its terrible I/O keynote presentation. We saw an interactive assistant that could see and hear everything we could and then act on the information it gathered.
A lot of people got laser-focused on a potential next-gen Google Glass that was shown in the demo, but I sprang to attention when Astra said, “Your glasses are on the table next to the red apple.” 🚨🚨
I’m not going to lie; this could be a helpful tool in a lot of ways for a lot of people. I jokingly remarked that we’ll never again lose the remote control, but there are real practical uses. Like, where is my EpiPen or emergency inhaler? Finding either of those is a bit more important than finding your keys in most situations.
But — and there is always a but — some serious issues could arise from this. Imagine a subpoena for Google user data asking if Astra ever saw any sort of contraband in your home. Anything from marijuana to ammunition to birth control could be incriminating, and if Google has access to this data, it will comply with a lawful subpoena. It has to.
I’m not condoning anything illegal; it’s just none of my damn business what you do inside your home. I’m also not saying a tool used in this way will always be bad; sometimes cops need all the help they can get. I’m just saying that it could happen.
Seemingly more innocent, but potentially far more dangerous, was Google knowing your car’s license plate number. My God, there are a bunch of ways this could be handy and helpful, but of course, it’s also problematic.
Google can scan all of your photos and see which ones are of cars. It can use location data to see where each photo was taken, along with how many times a specific car appears, to figure out which car is yours. Then it can read the license plate number and keep track of it.
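To make the concern concrete, here's a minimal sketch of how that kind of pipeline could work. Everything in it is an assumption on my part: detect_car, read_plate, and near are hypothetical placeholder stubs, not real Google APIs. The point is the shape of the logic, which is simple enough to be unsettling: find the cars in your photo library, keep the plates that show up most often near home, and remember the winner.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Photo:
    path: str
    gps: tuple[float, float] | None  # (lat, lon) where the photo was taken, if tagged


# The two functions below are hypothetical stand-ins for whatever object-recognition
# and OCR models would actually do the work. They are NOT real Google APIs.
def detect_car(photo: Photo) -> bool:
    """Placeholder: would ask an image model whether the photo contains a car."""
    return False


def read_plate(photo: Photo) -> str | None:
    """Placeholder: would OCR a visible license plate and return it as text."""
    return None


def near(a: tuple[float, float], b: tuple[float, float]) -> bool:
    """Crude proximity check; a real system would use proper geodesic distance."""
    return abs(a[0] - b[0]) < 0.01 and abs(a[1] - b[1]) < 0.01


def guess_my_plate(photos: list[Photo], home: tuple[float, float]) -> str | None:
    """The plate that shows up most often in photos taken near 'home' is probably yours."""
    plates: Counter[str] = Counter()
    for p in photos:
        if p.gps is None or not near(p.gps, home) or not detect_car(p):
            continue
        plate = read_plate(p)
        if plate:
            plates[plate] += 1
    return plates.most_common(1)[0][0] if plates else None


if __name__ == "__main__":
    library = [Photo("IMG_0001.jpg", gps=(40.7128, -74.0060))]
    # With the placeholder detectors this prints None; the point is the shape of the logic.
    print(guess_my_plate(library, home=(40.7128, -74.0060)))
```

Swap the stubs for real image models and you get roughly the capability described above, along with a stored record tying a plate number to a person.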
That sounds innocent until you think about how this data could be cross-referenced. A judge may sign a subpoena for photos or data about any cars that were at the location of a crime while it was being committed.
That might help catch a criminal, but it’s just as likely to get someone hauled into the police station to answer questions because they popped into Circle K for a snack or a drink right before someone robbed the place.
This type of object recognition, trained through your photos, isn’t going to stop at car license plates.
Tools like these are powerful, and that means they can be dangerous. Google is really good at developing these sorts of things, and I don’t even want to think about how this tech could be used by the military or law enforcement, but it would be in high demand.
That means Google has to do it right. And it can. Google has to spell out exactly what this tech is doing, give an overview of how it does it, and provide clear, concise information about what data is retained and whether it’s encrypted. If Google doesn’t start there, nothing else matters. Most of us probably won’t read any of it before we tap the OK button, but it has to be there in plain language.
The next step is making all of this opt-in. Unless you say, “I want it!” you’re not getting it. Google is usually pretty good about this, but there have been “mistakes” where it just dumps new and invasive things onto you.
Finally, it has to decouple these features from everything else. If there’s a problem, shut it down, fix it, then push the fix to everyone before turning it back on. We can’t afford to wait for Motorola to decide to update our phone to fix something this important.
I’m cautiously optimistic. I have no doubt that Google can pull off the cool new tricks it has shown us because it already does things that are just as intricate and amazing. I’m also certain that tools like Astra can be useful in many situations, and I’ll probably use them, provided the terms feel right to me.
Unfortunately, I’m worried that Google will have privacy blunders along the way. It wouldn’t be a Google product without them.