The Big Data Grab
I bought a T-shirt the other day. Pretty unremarkable, except that the assistant in the Armani store at Westfield (a very special type of Hades) was more interested in getting my name and email address than in checking I was 110% happy with my purchase. Why? The store maintained it’s to my benefit, so they can provide a personalised service and offers. But the reality is that they, like most organisations now, want to grab as much data about me as possible. However it is presented, I remain unconvinced that it is of any benefit to shoppers. If anything, it’s to our detriment – particularly when it comes to having control over what personal information is kept by others, and therefore at significant risk of being leaked or stolen.
And it’s not just shopping – every aspect of life now involves giving away private data in the name of service, security, compliance, or whatever the latest BS reason happens to be. Whichever way we look, our privacy is being eroded. But whilst we consumers work on the basis that government affords us a level of protection, the direction of travel is not good. On 10 September 2021 the UK Government launched a consultation which made clear that, although GDPR had by then been codified into UK law, changes were coming. And indeed, it has recently become clear that we will see changes to privacy laws in this country.
While this is all presented as something that will make life easier for all of us, we have to wonder whether that is strictly true – or whether it is really geared towards benefiting only some of us. While there is nothing egregious in the report, reading between the lines uncovers a slow erosion of our privacy rights. Cookies are one example. Currently, all websites accessible to users in Europe and the UK must ask for our consent before placing non-essential cookies on our machines. Under the proposed changes, businesses could once again place “non-invasive” cookies whether we like it or not.
This might not seem like a big deal to those who don’t spend much time keeping up to date with cybersecurity and the impact it can have on our personal and professional lives. Those of us who do keep abreast of these changes, however, are rather more concerned.
What’s particularly concerning is the apathy about our data that the public is slowly succumbing to, and big companies’ reliance on that apathy to operate in a way that may be easier for them but in no way benefits the public. Current privacy laws are designed to keep individuals’ data safe, and by necessity that means certain things will be harder for companies.
New iterations of privacy laws seem to have been designed to give companies more opportunities to grab as much data as they’d like, with no regard for how this may affect the public – and, perhaps more worryingly, to benefit large-scale AI models.
Very few of us understand the potential benefits and limitations of AI algorithms, or how they may change the landscape over the coming ten years. Most likely AI represents huge opportunities, but it also brings significant challenges that we must tackle – ideally before those challenges become critical.
What we do know is that AI relies on past data to train its algorithms to make future predictions. And while giving it access to people’s sensitive data will likely make its predictions more specific, the risks of giving it unfettered access loom far larger, and look far more consequential, than the potential benefits.
In the government’s response to its consultation, much reference is made to avoiding “impossible standards” of behaviour. But as someone who runs a business that does adhere to these “impossible standards”, I know they are very much achievable. Are they sometimes difficult to adhere to? Of course. But do we really want just anyone to be able to run a company that handles people’s sensitive data?
It seems to me that if some people find these standards impossible, that’s a clear sign that they may not understand the importance of keeping people’s data safe and secure at all times. And if these “impossible standards” mean that there are fewer companies who provide services, but we can be assured that they all understand the gravity of the situation, I’m okay with that.
There are so many more things to say on this topic, not least the accompanying discussion around data security, but I’ll end on this: we must remain vigilant so that we don’t let more and more of our personal data slip out of our hands almost by accident. There’s the analogy of the boiling frog: drop a frog into a pot of boiling water and it will naturally jump out. But place the frog into a pot of water that is slowly brought to the boil and it will stay put, never noticing when it passes the point of no return. Let’s not act like the frog with regard to our own privacy, our own data, and the consequences of letting companies run rampant through our private lives.