I originally wrote this on the ProjectVRM mailing list in January of 2020. I made some edits to fix errors and clunky phrasing I didn’t like. It is a rant and a series of observations and complaints derived from after-dinner chats/walks with my significant other (who is also a nerd). This is a weak-tea attempt at the kind of amazing threads Cory Doctorow puts out.

I still hold out hope (for privacy, for decentralized identity, for companies realizing their user trust is worth way more than this quarter’s numbers). But unless there are changes across the digital world (people, policy, corps, orgs), it is looking pretty dark.

TLDR: 

There is a reason why AR is a favorite technology for Black Mirror screenwriters. 

Where generally available augmented reality is taking anonymity in public is bad, and it is going to happen unless users start demanding better and the Bigs (GAMAM+) decide that treating customers better is a competitive priority.

My (Dark) Future of AR:

Generally available Augmented Reality will be a game changer for user experience, utility and engagement. The devices will be indistinguishable from glasses and everyone will wear them. 

The individual will wear their AR all the time, capturing sound, visuals, location and other data points as they go about their day. They will only very rarely take it off (how often do you turn off your mobile phone?), capturing what they see, maybe what they hear, and everyone around them in the background, geolocated and timestamped.

Every user of this technology will have new capabilities (superpowers!):

  • Turn-by-turn directions in your field of view
  • Visually search your field of view from the time you were in a gallery a week ago (time travel!)
  • Find live performance details from a band’s billboard (image recognition!)
  • Product recognition on the shelves of the grocery store (computer-vision-driven dynamic shopping lists!)
  • Know when someone from your LinkedIn connections is also in the room you are in, along with where they are working now (presence! status! social!).

Data (images, audio, location, direction, etc.) will be directly captured. Any data exhaust (metadata, timestamps, device data, sounds in the background, individuals and objects in the background) will be hoovered up by whoever is providing you the “service”. All of this data (direct and indirect) will probably be out of your control or awareness. Compare it to the real world: do you know every organization that has data *about* you right now? What happens when that is 1000x?

Thanks to all of this data being vacuumed up and processed and parsed and bought and sold, police (state, fed, local, contract security, etc.) WILL get new superpowers too. They can and will request all of the feeds from Amazon and Google and Apple for a specific location at a specific time. Because your location is in public, all three will have a harder time resisting (no expectation of privacy, remember?). Most of these requests will be completely legitimate and focused on crime or public safety. There will definitely be requests that are unethical, invalid and illegal, and citizens will rarely find out about these. Technology can and will be misused in banal and horrifying ways.

GAMAM+ make significant revenue from advertising. AR puts commercial realtime data collection on steroids.

“What product did he look at? For how long? Where was he? Let’s offer him a discount in realtime!”

The negative impacts won’t be for everyone, though. If I had a million dollars I would definitely take the bet that Elon Musk, Eric Schmidt, the Collisons, Sergey, Larry, Bezos, Tim Cook and other celebrities will all have the ability to “opt out” of being captured and processed. The rest of us will not get to opt out unless we pay $$$ – continuing to bring the old prediction “it isn’t how much privacy you have a right to, it is how much privacy you can afford” to life.

You won’t know who is recording you and will have to assume it is happening all of the time.

We aren’t ready

Generally available augmented reality has societal and civil impacts we aren’t prepared for. We didn’t learn any lessons over the last 25 years regarding digital technology and privacy. AR isn’t the current online world, where you can opt out, run an adblocker, run a VPN, not buy from Amazon, delete your social media accounts, compartmentalize browsers (one for work, one for research, one for personal), etc. AR is an overlay onto the real world, where everyone will be indirectly watching everyone else… for someone else’s benefit. I used the following example discussing this challenge with a friend:

  • 2 teens took a selfie on 37th street and 8th avenue in Manhattan to celebrate their trip to NYC. 
  • In the background of their selfie, a recovering heroin addict steps out of a methadone clinic on the block. His friends and coworkers don’t know he has a problem, but he is working hard to get clean. 
  • The teens posted the photo online. 
  • That vacation photo was scraped by Clearview AI or another company using similar tech with less public exposure. 
  • Once captured, it would be trivial to identify him. 
  • Months or years later (remember, there is no expiration date on this data and data gets cheaper and cheaper every day) he applies for a job and is rejected during the background check. 
  • Why? Because the background check vendor used by his prospective employer pays for a service that compares his photo to an index of “questionable locations and times/dates” including protest marches, known drug locations, riots, and methadone clinics. That data is then processed by an algorithm that scores him as a risk and he doesn’t get the job. 

“Redlining” isn’t just a horrible practice of the past; with AR we can do it in new and awful ways.

Indirect data leakage is real: we leak other people’s data all the time. With AR, the panopticon is us: you and me and everyone around us who will be using this tech in their daily lives. This isn’t the state or Google watching us – AR is tech where the surveillance is user generated, the byproduct of my getting turn-by-turn directions in my personal heads-up display. GAMAM are downstream and will exploit all that sweet, sweet data.

This is going from surveillance to “sousveillance”… but on steroids, because we can’t opt out of going to work, or walking down the street, or running to the grocery, or riding the subway to a job interview, or going to a protest march, or going to an AA meeting, or, or, or living our lives. A rebuttal to “I don’t have to worry about surveillance because I have nothing to hide” is that *we all* have to fight for privacy and reduced surveillance, especially those who have nothing to hide, because some of our fellow humans are in marginalized communities and cannot fight for themselves, and because this data can and will impact us in ways we can’t identify. The convenience of reading emails while walking to work shouldn’t be able to out someone walking into an AA meeting, or walking out of a halfway house, etc.

No consumer, once they get the PERSONAL, INTIMATE value and the utility out of AR, will want to have the functionality of their AR platform limited or taken away by any law about privacy – even one that protects *their* privacy. This very neatly turns everyone using generally available AR technology into a surveillance node. 

The panopticon is us. 

There is a reason AR is a favorite plot device in Black Mirror. 

It is going to be up to us. 

For me, AR is the most “oh crap” thing out there right now. I love the potential of the technology, yet I am concerned about how it will be abused if we aren’t VERY careful and VERY proactive. Based on how things have been going for the last 20+ years, I have a hard time being positive about where this is going to go.

There are a ton of people working on privacy in AR/VR/XR. The industry is still working out the “grammar” or “vocabulary” for these new XR-driven futures, and there are a lot of people and organized efforts trying to prevent some of the problems mentioned above. But we don’t have societal-level agreements on what is and is not acceptable when it comes to personal data NOW. In a lot of cases the industry is ham-handedly trying to stuff Web2, Web1 and pre-Web business models (advertising) into this very sleek, new, super-powered platform. Governments love personal data even as they legislate on it (in ways both effective and ineffective).

The tech (fashion, infrastructure) is moving much faster than culture and governance can react. 

My belief, with respect to generally available Augmented Reality and its potential negative impacts on the public, is that we are all in this together: the solution isn’t a tech or policy or legislative or user solution but a collective one. We talk a lot about privacy and what THEY (govs, adtech, websites, hardware, IoT, services, etc.) are doing to US, but what about what we are doing to each other? Yup, individuals need to claim control over their digital lives, selves and data. Yes, Self-Sovereign Identity as a default state would help.

To prevent the potential dystopias mentioned above, we need aggressive engagement by users. ALL of us need to act in ways that protect our privacy/identity/data/digital self as well as those around us. WE are leaking our friends’ identities, correlated attributes, and data every single day – not intentionally, but via our own digital (and soon physical, thanks to AR) data exhaust. We need to demand to be treated better by the companies, orgs and govs we interact with on a daily basis.

Governments need to get their act together with regard to policy and legislation. There need to be real consequences for bad behavior and poor stewardship of users’ data.

Businesses need to start listening to their customers and treating them like customers, not sheep to be shorn. Maybe companies like AVAST can step up and bring their security/privacy know-how to help users level up. Maybe a company like Facebook can pivot and “have the user’s back” in this future.

IIW, MyData, CustomerCommons, VRM, and the Decentralized/Self-Sovereign Identity communities are all working towards changing this for the good of everyone.

At the end of the day, we need a *Digital Spring*, where people stand up and say “no more” to all the BS happening right now (adtech, lack of agency, abysmal data practices, lack of liberty for digital selves), before we get to a world where user-generated surveillance is commonplace.

(Yes, dear reader, algorithms are a big part of this issue; I am only focused on the AR part of the problem in this piece. The problem is a big, awful Venn diagram of issues and actors with different incentives.)
