The dark patterns in "your choice" of cookies
Where deception is the rule rather than the exception
Dark patterns are magic. They work by deception. Misdirection. They are also endemic in user interfaces where the creators want (and typically get) people to do things they really don’t want to do. Deceptive Design has a long list of examples. None, however, are more common than what the perps call “your choice” of cookies.
For example, here’s a pop-over on the front of 10 Breakthrough Technologies 2023, in MIT Technology Review, as it appears on my computer in California:
The easiest thing to do—and what the site clearly wants me to do—is click on the “Accept cookies” button. The second-easiest thing is to click on the little “x” to make the popover go away (which it will). The third easiest is to click on “Cookie settings.” That gets me this:
The slider for “Sale of Personal Data” is pushed to the right, presumably meaning ON. That’s the default.
Now let’s say I click on the “+” to see what’s under “Sale of Personal Data.” When I do that, I get this:
“Sale of Personal Data” is expanded, but to where I can’t see it without scrolling down, and even then not all of it appears. First I get this—
—still with “Sale of Personal Data” defaulted to ON, and “Confirm my choices” persisting at the bottom of the frame. Scroll down the rest of the way and I get this:
This is where I learn (though I already know, because I follow this kind of thing) that the California Consumer Privacy Act (CCPA) applies here. I also need to run a gauntlet like this at every freaking website, using whatever system the website and its third parties (OneTrust in this case) alone provide.
Here is what’s key: The law, and the lawyers interpreting it for cookie gauntlet suppliers, presume that you have no more independence and agency than the sites alone provide. And this is not exclusive to California. It is the presumption behind ALL privacy law in the world as it pertains to the parties those laws variously call “consumers” (California) or “data subjects” (Europe).
Now let’s ride on a VPN out of California to some other state in the U.S. (In this case, New York.) The first popover is the same as the one up top in California. Clicking on “Cookie settings” there brings up this gauntlet:
Again, everything is defaulted to ON, but there is a choice to “Reject all,” and it’s the same color as “Confirm my choices.” So the design pattern here is less dark than it would be if “Reject all” had a blank background.
Now let’s fly our VPN to The Netherlands. Using a different browser (visiting the same page afresh), the first popover, identical again, stays on screen while clicking “Cookie settings” brings up the second popover in the same window, but this time with the cookie choices defaulted to OFF, in gray:
Interesting also to see the UK Cookiepedia Giving Consent to Cookies page in the little link that appears, and then disappears, in the bottom left corner of the page. (If I hadn’t done the screen grab I wouldn't have had a URL to type into a fresh browser tab.)
So, what would you do when confronted by a popover cookie notice in front of an article like this?
1. Click the little “x” in the corner of the first popover to make it go away?
2. Ignore or navigate around the popover?
3. "Accept cookies" in the first pop-up?
4. "Confirm my choices" in the second popup without changing "Sale of Personal Data"?
5. "Confirm my choices" in the second popup after switching "Sale of Personal Data" to OFF (pushing the blue toggle to the left, turning it gray), but without hitting the + on that choice first?
6. Hit the + on "Sale of Personal Data," to unpack what's under it, before choosing to allow the sale of personal data by leaving the toggle defaulted to ON?
7. Hit the + on "Sale of Personal Data," to unpack what's under it, before choosing to deny the sale of personal data by switching the toggle to OFF, turning the switch from blue to gray?
The best guess about where information about you goes on that page if you do anything other than #5 or #7, and have no tracking protection in your browser, appears in this PageXray, courtesy of Fou Analytics:
And maybe much or all of that happens anyway, regardless of what "choices" you make. You have no way to tell other than by looking into your cookie collection (which is no picnic) and trying to figure out what got shoved in there by the website and its third parties. You'll probably still know little or nothing about what those cookies do. Or did.
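To see why auditing your own cookie collection is no picnic, consider what even a single cookie carries. Here is a minimal sketch, using only Python's standard library, of the kind of inspection involved: parsing raw Set-Cookie headers and listing each cookie's scope and lifetime. The header strings below are made up for illustration; real pages set many more, from many third-party domains, and the cookie names tell you little about what they do.

```python
from http.cookies import SimpleCookie

# Hypothetical Set-Cookie headers captured from a page load
# (invented for illustration, not taken from any real site).
raw_headers = [
    "OptanonConsent=groups=C0001; Path=/; Max-Age=31536000; SameSite=Lax",
    "_tracker=abc123; Domain=.example.com; Path=/; Max-Age=63072000",
]

parsed = {}
for raw in raw_headers:
    jar = SimpleCookie()
    jar.load(raw)           # parse one Set-Cookie header
    for name, morsel in jar.items():
        parsed[name] = morsel

# Report each cookie's domain scope and lifetime in days.
for name, m in parsed.items():
    scope = m["domain"] or "(host only)"
    lifetime = int(m["max-age"]) // 86400 if m["max-age"] else "session"
    print(f"{name}: scope={scope}, lifetime={lifetime} days")
```

Even with the attributes decoded, all you learn is who can read the cookie and for how long; what the value means, and where the data flows afterward, stays opaque.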
This "system" for sites to comply with the CCPA and/or the GDPR (and sold to website operators by the likes of OneTrust) is pure theater. At best it’s something site operators believe they have to do because that’s what lawyers tell them to do. At worst, it’s an insincere fig leaf over the intention to continue tracking you and collecting data for the site operator and its many third parties. So they can aim advertising at you. Personally.
To be fair, Technology Review's participation in surveillance capitalism is way below average for an online publication.
For average, take a PageXray of TheAtlantic.com. For above average, dig what SmithsonianMag.com is up to:
Going left to right, that’s a distribution delta for personal information about you.
Going back to the subject of the Technology Review article, my wish for the biggest Breakthrough Technology in 2023 is an easy and standardized way for any of us to control, audit, and maintain all our agreements with websites and suppliers. That must begin with offsite (or all) tracking turned OFF by default, and with a way for any of us to make sure it stays off at scale, meaning across every site we encounter.
That kind of thing won't be coming from Big Tech (Apple, Amazon, Facebook, Google, etc.) or Big Gov (the .eu, the .us, states, or provinces). All Big Govs have done so far is limit (or try to limit) what Big Tech can do. None of their efforts have created space for developers to give forms of agency to ordinary folk, for them to use at scale on the Web.
The solutions we need—ones with the light we need to make dark patterns disappear—will have to come from those ordinary folk, plus friendly geeks, nonprofits, and standards organizations that are not in the direct or indirect employ of digital advertising or other personal data harvesting businesses.
Two examples of work already underway:
IEEE P7012 Standard for Machine Readable Personal Privacy Terms.
Customer Commons’ #NoStalking, which is the first of those Personal Privacy Terms.
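To make the idea concrete, here is a rough, hypothetical sketch of how a term like #NoStalking (whose gist is "just show me ads not based on tracking me") might be expressed in machine-readable form. The schema and field names here are my own invention for illustration only; they are not taken from IEEE P7012 or Customer Commons.

```python
import json

# Invented schema: a person proffers this term to a site, rather than
# the site proffering terms to the person. Field names are hypothetical.
no_stalking = {
    "term": "#NoStalking",
    "proffered_by": "the site visitor",
    "purpose": "Show me ads not based on tracking me",
    "permits": ["contextual advertising"],
    "prohibits": ["third-party tracking", "sale of personal data"],
}

# Machine-readable means any site's agent could parse and honor it.
serialized = json.dumps(no_stalking, indent=2)
print(serialized)
```

The point of the sketch is the direction of travel: the individual states the terms once, and software carries them to every site, instead of every site carrying its own gauntlet to the individual.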
I invite readers to help with either or both of those.