privacy is a value we can lose
Sometimes, I think about the fact that society at large could just stop caring about data protection and privacy, and there goes everything that I worked towards and am passionate about. Humbling.
These laws are young. It's not that people didn't want privacy before; it's that as more data was collected, recorded and then processed by the earliest information processing systems (punch card systems and early computers), more of it needed protecting. The more that is written down and stored, the greater this need becomes.
The right to privacy emerged in the US in 1890. Later, after the two World Wars, people were understandably warier of governments collecting data on citizens. Still, the world's oldest data protection law only dates to 1970. Important legal work around protecting personal data happened from then onwards - Germany's Volkszählungsurteil in 1983, the US' HIPAA in 1996, the EU's 1995 Data Protection Directive and 2002 ePrivacy Directive, and the General Data Protection Regulation (GDPR), passed in 2016 and in effect since 2018, to name just the big, well-known ones.
But governments, priorities and views change. This could be a blip in history.
You can already see a sort of resignation in many people (“They track us all anyway, what does it matter? I have nothing to hide.”), and the selling of data has become an established, widely accepted practice.
The prevailing attitude is something like: “Oh well, we want to use these services and advertise on them; they have a lot of costs associated with hosting billions of users, and we want to target our ads better. If that’s the price we have to pay, so be it.” I already wrote about data being the cookie jar, where even businesses that aren’t data brokers now want your data to sell as an additional income stream.
Everyone nowadays has an incentive to collect as much data as possible: not just to sell it to AI companies for a good sum, but also to potentially train their own AI. Businesses feel pressured to implement AI into anything they can, which raises the risk of employees entering sensitive data into it and sending it straight to OpenAI et al. for training - that is, if they aren't using unapproved, so-called "shadow AI" on shady websites or wrappers, where it is unclear who receives the data.
Data protection officers and other privacy professionals feel coerced into going along with dicey setups and risky processing activities because they can’t afford to be seen as Luddites (or technophobes, if you think the former is an unjust slur!) who advise against everything and “hinder progress”, aka cost savings via AI replacement. Even I was told I should probably make a LinkedIn account so companies would get a signal that I am not the "activist type", giving me a higher chance of being hired!
Governments in many places are also shifting further right, toward fascism, which goes hand in hand with fewer protections, deregulation, and increased surveillance and criminalization. These types of parties and leaders do not care about upholding privacy if it means they get to target groups more easily - just look at how ICE tracks people in the US.
I wrote about the EU’s Digital Omnibus a while ago, which threatens to severely weaken the GDPR. The parties backing this deregulation and even asking for more are far-right parties and fascist tech bros.
The unfortunate reality is: What would have raised eyebrows just 10-20 years ago is shrugged at now. We got used to a level of data harvesting that used to be unacceptable. I wonder sometimes if, or rather when, we will reach a point at which privacy is no longer even valued on paper.
A point at which EU governments value total surveillance, under the guise of digitalization, immigration control and the protection of kids, over heeding the EU Charter of Fundamental Rights, specifically Articles 7 and 8, which guarantee the right to respect for private and family life and the protection of personal data.
A point at which the majority sees so much value in extreme data harvesting tech like social media, smartphones and AI that no cost is too great and they’d rather give up privacy than lose access or have a slightly worse tool.
People sometimes say to me that this field is safe, since tech will only expand and become ever more integrated into our lives. I wouldn’t be so sure about the safety part; it assumes that everyone will always see personal data as worth protecting, and I don’t think that’s a given.
Privacy and control over your own data are not a natural law; they are a social and political choice that exists only as long as people care enough to defend it.