Technology, which often offers benefits for humanity, is always abused in service of capitalism
Jem Bendell is Professor of Sustainability Leadership, University of Cumbria, UK. He has recently published the book ‘Breaking Together – a freedom-loving response to collapse’.
Cross-posted from Jem Bendell’s blog
Do you ever feel a quietly gnawing discomfort at the direction technology is taking us? Not just a concern about screen addiction or misinformation, but a deeper unease: that a world is being built in which our presence, thoughts, and behaviours are constantly detected, catalogued and analysed, often without us even knowing? Perhaps it’s the sense that the tools of surveillance, often accepted for personal convenience or public security, are being normalised in all aspects of our lives. Perhaps it feels like a tide: as if it were an inevitable force of nature rather than a set of human choices.
I have known that feeling of uneasy resignation for some years. But recently I came across a new study which snapped me out of that torpor. Suddenly I wanted to be clearer about what I think is unacceptable, what should be resisted, and what small steps I could take. I now realise that this issue of technosurveillance should be firmly on the political agenda of any serious political party or individual politician. So I want to share with you some ideas on that, and on why it matters within my particular niche of the environment, metacrisis, and societal collapse.
The digital straw that broke the proverbial camel’s back was some news from La Sapienza University in Rome. It helped me understand that the threat of technoauthoritarianism continues to grow. Scientists at the Italian university have developed a method, nicknamed “WhoFi”, which uses the subtle disturbances we make when passing through WiFi signals to identify us with over 95% accuracy, even through walls. This is not surveillance in the form we’ve grown used to, such as cameras, facial recognition, or GPS. It is something more ambient, invisible, and insidious, especially given the widespread roll-out of 5G, which uses similar signals. We don’t need to carry a phone or walk in front of a lens. Simply by existing within the reach of a WiFi or 5G signal, our bodily presence can be sensed, learned, remembered, and tracked. Let’s call it our ‘radio-biometric data’.
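For readers who want a concrete feel for what ‘radio-biometric data’ could mean in practice, here is a deliberately toy sketch of the general idea: a person’s effect on a wireless channel is treated as a stored ‘signature’, and later observations are matched against those signatures. This is not the La Sapienza team’s method, which I understand to be far more sophisticated; every name, number and parameter below is a synthetic assumption, offered only to make the concept tangible.

```python
# Toy illustration only – NOT the WhoFi method. All data is synthetic and all
# names/parameters are assumptions, to show how a stored wireless 'signature'
# could in principle be matched against later observations.
import numpy as np

N_SUBCARRIERS = 64  # WiFi channel measurements are typically reported per subcarrier

def person_profile(person_id: int) -> np.ndarray:
    """A hypothetical stable 'radio-biometric' signature for one person (synthetic)."""
    return np.random.default_rng(person_id).standard_normal(N_SUBCARRIERS)

def observe(person_id: int, noise: float = 0.3) -> np.ndarray:
    """One noisy 'measurement' of how that person perturbs the wireless channel."""
    return person_profile(person_id) + noise * np.random.default_rng().standard_normal(N_SUBCARRIERS)

def identify(observation: np.ndarray, enrolled: dict[int, np.ndarray]) -> int:
    """Match an observation to the closest enrolled signature (nearest neighbour)."""
    return min(enrolled, key=lambda pid: float(np.linalg.norm(observation - enrolled[pid])))

# Enrol five synthetic 'people', then check how often noisy observations are re-identified.
enrolled = {pid: person_profile(pid) for pid in range(5)}
trials = [(pid, identify(observe(pid), enrolled)) for pid in range(5) for _ in range(20)]
accuracy = sum(truth == guess for truth, guess in trials) / len(trials)
print(f"Re-identification rate on synthetic data: {accuracy:.0%}")
```

Even this crude matching scheme re-identifies its synthetic ‘people’ almost every time, which is precisely why the real, far more capable techniques deserve regulatory attention.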
Ever since the Covid-19 lockdowns, which imposed restrictions on movement, people have become aware that our normal lives and freedoms can be interrupted overnight. Although there are differing views on the effectiveness, proportionality and justice of such policies, the fact that they were used means that questions of technosurveillance and basic human rights are not theoretical. Looking overseas at how geolocation capabilities are used to kill whole families deemed hostile by one military force should bring this issue further into focus.
The infrastructure for this monitoring is already nearly everywhere. Walking around my Mum’s rural village last week, I was sad to see that an old oak tree had been cut down to make way for a 5G cable. That reminded me that the roll-out of 5G has been rapid and aggressive, despite some sober science on the potential health risks. Already, most of our cities, homes, offices, and public spaces are soaked in WiFi and blanketed in 5G. As the signals are near ubiquitous, so is the potential for this new kind of ambient identification and tracking. Now that the research shows it can be done, the incentive to do it – for retail analytics, law enforcement, crisis management, private investigation, espionage, prisoner monitoring, or border control – is obvious. It appears the question isn’t whether this capacity will be deployed, but how widely and how secretly.
Despite the scale and gravity of this shift, there are almost no regulations preventing or controlling it. Even in regions like the European Union, where data protection laws are robust (such as the General Data Protection Regulation, or GDPR), enforcement has focused on visible technologies: cookies on our devices, camera feeds, and facial recognition databases. But technologies like WhoFi, which use radio-signal analytics to derive behavioural insights, currently slip through the cracks. Identifying people through our radiofrequency ‘signatures’ and movements should count as processing radio-biometric data – and regulators need to catch up.
In other parts of the world, particularly where private tech companies or authoritarian governments dominate, the regulatory framework is even weaker, or non-existent. And public awareness? Minimal. Most people don’t even know this is possible, let alone happening. I didn’t.
But it doesn’t have to stay this way. That uneasy, paranoid feeling you might carry – of being watched and indexed without your knowledge – is not delusional. It’s rational. But it can also be transformed into a sense of purpose, even solidarity. Instead of hiding in fatalism, we can move into action, and in doing so, try to restore some dignity to our relationship with the digital world.
To begin with, we can be proud of our expectations for personal privacy and political accountability. These expectations stem from our belief that every one of us deserves the freedom to think, say, and do whatever we like, so long as it is not fairly determined that others would be significantly harmed by that freedom. Therefore, any curbs on our rights need to be agreed through an open political process. This view comes from a recognition of the general goodness of people, an understanding of the dangers arising from centralised power, and a belief that free societies respond to changing circumstances more wisely and fairly. Any politics which doesn’t centre such rights and freedoms is both ethically and intellectually flawed. In my book Breaking Together, I explain that this view is as true in the environmental field as it is in any other. Our pleasure and enthusiasm for the utility of technological advances, including WiFi internet and mobile telephony, do not mean we need to drop or deprioritise any of our core principles.
With these values and aspirations clear, we can push back against the fatalistic narratives that hamper action against technoauthoritarian trends. One of those narratives is the story that the supremacy of technology is inevitable. In Breaking Together, I explain how the assumption that technologies will always spread draws upon fundamentally flawed theories. Recent histories of humanity prior to imperial conquests and colonialism show that humanity has more wisdom than to adopt and deploy every technology without restraint. In addition, we have evidence that human societies for millennia had the intelligence to steward the energy flow of the whole ecosystem, rather than simply maximise what they received from it. “In sum, we know that humans can live with an approach to power that requires collaboration with the rest of life, not its domination, and that this was the case for the vast majority of homo sapiens’ time on Earth.”
Another fatalistic narrative is that to have any concern for privacy and accountability is to succumb to the embarrassing agenda of paranoid conspiracists. I am well aware of misleading conspiracy theories that lead many people to drop their concern for environmental conservation and corporate accountability. One example is the claim that policies for ‘15-minute cities’ are not about improving the lives of the working poor but are trying to trap us in our neighbourhoods. Even my book launch in Glastonbury was targeted by one such conspiracist, who claimed in ‘Conservative Woman’ that I chose the town to challenge the anti-net-zero movement (rather than because it was near my parents). Her claim was particularly odd given that I am a critic of the carbon-centrism of mainstream environmentalism. Let’s not be deterred by purveyors of clickbait claptrap: what to do about creeping technoauthoritarianism is too important to be left to people who aren’t serious analysts, commentators or community organisers.
Another fatalistic story exists in my own field of metacrisis and collapse. In Breaking Together, I explained that there are people “who dismiss political engagement” as they “argue we can just wait for the structures of markets and government to collapse. However, that ignores how aggressive responses by large private institutions and state entities will be part of the lived experience of collapse. Although focusing on local action can provide an immediate sense of achievement, that can be delusional as global and national changes sweep away those local successes.” That’s why one of my doomster friends, John Doyle, was so passionate about his work on these issues at the European Commission before he retired. He tragically died last month.
Beyond pushing back against these fatalisms and articulating our values, we need to become specific about the kinds of policies we want to see. For instance, we need a regulatory firewall – a clear legal line that says passive identification with biometrics collected via wireless signals is not acceptable without consent. What could that imply? Prohibiting the collection and storage of signal-based behavioural or identity signatures without explicit, informed consent, and banning commercial tools that perform such sensing in public or semi-public spaces. As a start, data protection authorities in the EU and UK could begin treating signal-based identity recognition as a biometric practice requiring prior assessment, justification and consent.
National laws would not be enough. Just like climate change or cybercrime, ambient surveillance is a transnational issue. The companies building this tech may be in one country, the signals emitted in another, and the data analysed in a third. That’s why we also need an international convention – a binding agreement between states that ambient biometric sensing, especially without consent, is a violation of human rights and privacy. Such a convention could be negotiated under the auspices of the International Telecommunication Union (ITU), the United Nations Human Rights Council (UNHRC), or even as an annex to the Council of Europe’s Convention 108+. I believe the last of these is the only legally binding international treaty on data protection – which some imagine could be one reason why US governments have been undermining the economic power of the EU.
Admittedly, an international effort is a large task. But it begins, as all political movements do, with public pressure. That means you and me.
If this issue unsettles you, don’t suppress it. Let it move you. And stop leaving it to the ‘tin hat brigade’, who will mislead people and not support practical action. You could start small:
- Write to your political representative, asking them what oversight exists over WiFi- and 5G-based surveillance in your country. Link to this article or others that provide the information.
- Support or join campaigns for digital privacy and human rights, such as Access Now, Privacy International, or the Electronic Frontier Foundation (EFF).
- Contact international bodies and experts, such as the UN Special Rapporteur on the Right to Privacy, to encourage them to raise the issue of passive biometric surveillance in wireless environments.
And talk about it. With friends, family, colleagues – and even fellow doomsters in my Metacrisis Meetings. This is not just a tech issue – it’s a civilizational one. If we allow ourselves to be silently indexed and identified wherever we go, without choice or awareness, we erode the very possibility of a free society. But if we act, and demand limits, we might still hold back the extent to which technology is used to make societal disruption and collapse a nastier affair.
History is full of moments when people refused to normalise the unacceptable. This can be one of them.