by Luca Passani, @Scientia_CTO , CTO @ScientiaMobile
You don’t need to be an art expert to know what Surrealism is. But if you are not too familiar with the term, don’t worry: Google is there to show us!
“a 20th-century avant-garde movement in art and literature which sought to release the creative potential of the unconscious mind, for example by the irrational juxtaposition of images.”
People take search engines for granted today, but if you, like me, witnessed the birth of the web, this is still pretty impressive.
Just another click and Google will offer great examples of the art that best “expresses that irrational juxtaposition of images”.
Yep. Exactly what I meant. All pretty surreal. The Surrealist movement made such an impact on today’s culture that “surreal” has come to mean bizarre, a great term for the outlandish situations we occasionally (hopefully rarely) encounter in our lives.
A few days ago, Google’s Chrome Dev Summit 2020 took place. Every presentation was online (and still is), courtesy of Covid. That’s what 2020 has come to mean, thanks to a pandemic no one was able to foresee (so much for 20/20 hindsight).
One of the presentations caught my attention: “Introducing the Privacy Budget”, a presentation in which Google, the company that knows everything about each and every one of us, wants to explain how they are going to cripple HTTP to make it impossible for everyone else in the industry to infer anything about users.
But let me pause and take things in order. Why was I interested in that presentation specifically, you may ask?
Some background. I am the CTO (and co-founder) of a company that offers Device Detection tools. Device Detection does NOT mean determining the identity of users. Rather, it means knowing what browser and device a user is… using to access your service. Content providers can leverage this information to tailor the user experience for those browsers and devices. Without going into too much detail, device detection relies on the time-honored HTTP protocol, and on one HTTP header in particular: the User-Agent string.
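To make that a bit more concrete, here is a bare-bones sketch of what a server might do with that header, written Node.js/Express style. The regex checks and the example User-Agent in the comments are purely illustrative; a real device-detection engine maps the string to a whole catalog of device capabilities.

```typescript
// Minimal sketch of server-side device detection in a Node.js/Express handler.
// The regex checks below are toy heuristics: a real device-detection library
// maps the User-Agent string to a full catalog of devices and capabilities.
import express, { Request, Response } from "express";

const app = express();

app.get("/", (req: Request, res: Response) => {
  // Example (hypothetical) User-Agent of an Android phone:
  // "Mozilla/5.0 (Linux; Android 11; Pixel 5) AppleWebKit/537.36
  //  (KHTML, like Gecko) Chrome/87.0.4280.101 Mobile Safari/537.36"
  const ua = req.get("User-Agent") ?? "";

  // Tailor the response to what the header reveals about browser and device.
  if (/Android/.test(ua) && /Mobile/.test(ua)) {
    res.send("Lightweight page tuned for Android phones");
  } else if (/iPhone|iPad/.test(ua)) {
    res.send("Page tuned for iOS devices");
  } else {
    res.send("Full desktop experience");
  }
});

app.listen(3000);
```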
It appears that Google wants to remove the User-Agent string or, more precisely, “freeze” it, i.e. make sure that it no longer identifies Android devices, which, without getting too technical, amounts to the same thing as removing it for the purposes of Device Detection.
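To see what “freezing” means in practice, compare a realistic-looking User-Agent with a reduced one. The reduced form below is my reading of what Google was proposing at the time, not a finalized spec, but the gist is that the device-identifying pieces collapse to fixed placeholder values.

```typescript
// Illustration only: the "frozen" form below reflects how the freeze was being
// sketched at the time (fixed OS version, placeholder model, zeroed-out browser
// build numbers); it is an assumption, not a finalized spec.
const uaToday =
  "Mozilla/5.0 (Linux; Android 11; Pixel 5) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/87.0.4280.101 Mobile Safari/537.36";

const uaFrozen =
  "Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/87.0.0.0 Mobile Safari/537.36";

// Every Android phone ends up sending essentially the same frozen string, so
// the header can no longer tell a brand-new flagship from an old budget handset.
```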
As Google explains, companies with a ton of Machine Learning/Artificial Intelligence machinery can harness the User-Agent string, along with properties exposed through JavaScript APIs, to create pseudo-user IDs that (approximately) identify users, i.e. a technique people in the Ad Tech industry call fingerprinting.
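In case you have never seen the trick spelled out, here is the idea in its most naive form, using only standard browser APIs. The choice of signals and of SHA-256 as the hash are just assumptions for illustration; real Ad Tech stacks throw in far more signals (canvas, fonts, audio, you name it) plus plenty of Machine Learning on the back end.

```typescript
// Bare-bones browser-side fingerprinting sketch, purely illustrative.
// Combines a handful of publicly readable signals and hashes them into a
// pseudo-identifier. Signal selection and SHA-256 are assumptions for the demo.
async function pseudoUserId(): Promise<string> {
  const signals = [
    navigator.userAgent,                          // the User-Agent string
    navigator.language,                           // e.g. "en-US"
    `${screen.width}x${screen.height}`,           // screen resolution
    String(new Date().getTimezoneOffset()),       // timezone offset in minutes
    String(navigator.hardwareConcurrency ?? ""),  // number of CPU cores
  ].join("|");

  // Hash the combined signals into a stable identifier-like value.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Same browser, same device, same hash on every visit: that is all it takes
// for a tracker to start treating the value as a (rough) user identifier.
```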
In essence, Google wants to make fingerprinting impossible and make sure that companies in the digital ecosystem no longer have a chance to associate users with a profile of some kind. And, in their infinite wisdom, killing the User-Agent string is the way to achieve that.
Snap back to the presentation. I hear the presenter speak and I watch the slides go by. And here too, I see a great example of Surrealism, this time delivered by Google.
To summarize:
Google,
the company that, along with Facebook, knows virtually everything about each of us,
the company that tracks our every online (and offline!) move each step of the way,
the company that has us sign in to every service they offer (each one by now so useful that we can’t live without it) on every device we own (laptop, tablet, phone, smart TV and Google Home),
is telling us that:
they don’t like that other companies can identify users too,
fingerprinting is BAD, BAD, BAD,
and they’ll go out of their way to prevent fingerprinting.
Wow. THAT’S SURREAL!
In short, if you’ll pardon a little military analogy: Google is allowed to have guns… and cannons… and bombs… and nuclear weapons. But nobody else is supposed to have anything, not even a hammer to hang a picture on the wall. And that’s because, in the parallel world that Google is depicting, they are the only good guys in town and nobody else should be trusted with even a teeny tiny fraction of the possibilities and benefits that Google itself has enjoyed over the years.
[Round of applause here for this amazing piece of Surrealist art]
There’s a big pie, the digital economy. It’s a pie that has created, and should keep on creating, tens of millions of jobs for people and companies around the planet. Google and the other big guys are already taking the lion’s share of the pie, leaving just crumbs for everyone else. But that’s still not good enough. Google wants everything!
Ok, so… If fingerprinting is so bad, why don’t we avoid Surrealism and just make fingerprinting illegal?
Because, for better or for worse, the internet is made of services that are paid for by ads bidding for our attention.
Until we find another incentive for companies to provide these services cost-free, we still need ads. Ad networks fund the internet, and for these economic engines to continue running well, they need to compete within a healthy market. Using privacy as a fig leaf, Facebook and Google have effectively monopolized the Ad Tech market.
If Google really cared about user privacy, how about they simply discarded profile information every three months and collected it anew if the user still agrees? How is that?
What about banning the use of personal information to serve ads of any kind? How about that?
Now, these seem to me like proposals from someone who genuinely cares about user privacy. Removing the User-Agent string, which holds virtually no identifying information, will just further stifle healthy market competition, to the benefit of Google alone.
Regulating user profiling and giving all actors a level playing field: that’s the way to go.