I have a little brother – but funnily enough, it's very possible I'll soon gain a Big Brother, one we'll all have in common. As governments across Europe and beyond push to introduce AI-driven surveillance tools in the name of safety, we should ask ourselves: how much are we truly willing to sacrifice for security?

It was recently announced that Luxembourg police will soon gain a new tool: automatic licence plate recognition. But it doesn't stop there. The system also records the entire vehicle, the driver, and any passengers. Yes, this technology has existed in other European countries for years – Luxembourg is only just catching up, quite a bit behind its neighbours – but I couldn’t help feeling slightly uneasy. The kind of feeling you get when you realise there’s a CCTV camera above you, even if you’re doing nothing wrong.

"Data is the gold of the 21st century." That phrase seems to have been taken to heart by companies, governments, and social media platforms alike. Almost everything about you can now be collected: your face, your name, your voice, your preferences, your sexual orientation, even whether you prefer the train or the car for your holidays. Nothing feels sacred anymore, and everything can be sold or hacked.

Companies collect data to sell you things. After all, if you're not paying for the product, you are the product. Every click, like, and search is a way to target you with ads. But now governments have picked up the same playbook, collecting data not to sell you shoes, a new TV or a holiday, but to monitor, predict, and control.

And of course, the excuse is always the same: data is collected to fight crime and protect citizens. Take the UK, for example, with its Online Safety Act. Its provisions came into force in July of this year to protect children and filter "harmful content." But the measures require age checks, forcing you to scan your face or upload identity documents. Why the quotation marks around "harmful"? Because the term can mean anything, depending on who defines it – and to what end.

Age checks are now required for all sorts of things: Spotify, blogs, even Wikipedia or your favourite fantasy football page. Places where people discuss addiction, eating disorders, or war now demand age verification too. And that means uploading your personal documents, over and over again, to random websites.

Every move you make online, every website you visit, logged and stored. Say 'bye bye' to your privacy. But don't worry, I'm sure uploading your ID, face, and passport to random sites and trusting third-party verification systems will be totally risk-free. Data breaches are so rare, after all. Your personal information will be perfectly safe in the hands of any website, right? Well, I have bad news for you. To give just one example, Discord suffered a massive data breach this week, in which almost 2 million users reportedly had their government IDs leaked...

As for the filtering itself, much of that work is done by AI – which brings its own problems. AI lacks the capacity for critical thinking, which means that even now, something as harmless as a famous painting can be flagged or censored. Heaven forbid our fragile eyes be exposed to Francisco Goya or Picasso.

Obviously, the goal is respectable. We absolutely need to protect children online, especially in a world where access to everything is just a click away and screen time keeps climbing. But what I don't trust are the companies and websites collecting this data – whether they sell it on straight away, lose it to hackers, or leak it themselves.

In Europe, there's growing talk of Chat Control. This week, European leaders once again reviewed the proposal introduced in 2022, aimed at combating the online distribution of child abuse material. In short: every photo, message, and file you send would be automatically scanned before encryption.

Supporters of the proposal insist there are enough safeguards to prevent overreach into citizens' privacy – yet critics argue it opens the door to mass surveillance.

Of course, EU politicians have exempted themselves from this surveillance under "professional secrecy" rules. Privacy is now a luxury. Chanel who? I have privacy, darling.

The intention may be noble, but the implications of such sweeping surveillance tools in the wrong hands could prove profoundly dangerous.

At a time when the far right is increasingly calling the tune in many places, and when politicians cling to control above all else – both in Europe and beyond – such a tool could easily be used to surveil and punish anyone who dares to voice a differing opinion, profoundly undermining our freedom of speech and, quite simply, our freedom.

We're already seeing hints of this mindset across the Atlantic. After the Charlie Kirk shooting, for instance, Vice President Vance stated that anyone celebrating his death should "face consequences." Of course, the attack itself was abhorrent, and gun violence affects people indiscriminately. But the principle remains: once tools like Chat Control exist, they can – and inevitably will – be used by those in power to monitor, intimidate, and silence dissent. The introduction of such a tool in Europe would normalise it across its borders, with consequences for everyone.

Dig a little deeper and these scanning systems introduce new vulnerabilities of their own: they create backdoors that criminals or hostile actors can exploit, effectively weakening encryption and compromising everyone's data security.

Even at the United Nations, child protection experts have warned that blanket surveillance is not an effective tool to protect victims.

While EU member states continue to debate the proposal, I'm relieved to know that Luxembourg stands firmly opposed. We must ask ourselves how much freedom and privacy we are truly willing to surrender in the name of security. The right to privacy is a fundamental value – one that defines who we are as a democratic society. It is not something the European Union should so readily, or willingly, sacrifice.