Welcome to your taste of DRW's fortnightly newsletter!
This week: the growing environmental impact of AI data centres, DRW’s top podcast and article picks, a look at AI surveillance of Australian protests, and finally, a call-out to our readers.
Digital rights around the world
📍Albania - Albania has an AI-generated female MP, who has announced she is pregnant with 83 AI babies that will assist across different government departments. When questioned on the use cases for these children, the PM explained, confusingly: “this child will say what was said when [the member] is not in the hall and if [the member’s] name was mentioned and [they] have to counterattack someone who mentioned you for the wrong reasons”. Gotcha.
📍Ghana - The President has promised citizens that new misinformation and cybersecurity bills are “not meant to restrict expression, but to safeguard citizens and uphold truth in the digital realm.”
📍Kenya - Human Rights Watch has raised concerns that Kenya’s new cybercrime law could be used to punish online dissent, threatening the constitutionally enshrined right to freedom of expression.
📍Tanzania - A five-day internet shutdown during the recent election cost the country an estimated USD 238 million in lost productivity, trade and digital services.
The environmental impact of the AI boom
Australia has more than 275 data centres, and the federal government is actively pursuing the development of additional facilities, positioning the country as “the ideal location to build green data centres.” Microsoft and Amazon have each announced major Australian data centre expansions, with Microsoft investing AUD 5 billion in 2023 and Amazon allocating AUD 20 billion for 2025–2029.
Data centres process, store, and distribute vast amounts of information and are essential for keeping the internet running. But AI is reshaping their role: AI models require enormous quantities of data to train and refine, and constant access to that data to respond to user queries. This surge in computational demand is driving massive growth in data centre construction.
Data centres guzzle large quantities of water to maintain safe operating temperatures, sometimes leaving entire communities without water. Globally, AI infrastructure is projected to use six times more water than the entire nation of Denmark. AI developers such as Amazon also have a history of misrepresenting how their water is used. For a country like Australia, long affected by drought, this raises serious concerns.
The energy burden is equally significant. In 2023, data centres consumed more electricity than Saudi Arabia and almost as much as France. A single data centre can consume enough electricity each year to power 50,000 homes. Experts have warned that Australia’s energy grid is ill-prepared for the rapid increase in demand driven by AI. Consuming this much energy results in huge carbon emissions. Indeed, projected data centre energy use has already forced the Albanese government to soften its emissions reduction targets.
As Australia races to position itself as a hub for “green” data, it must also reckon with the hidden costs of powering artificial intelligence. Without clear limits on energy and water use, the promise of innovation risks deepening the country’s environmental vulnerabilities.
What is DRW reading?
How Australia’s national security chief used AI to write speeches and ‘personnel communications’
Spoiler alert: he used an AI chatbot. The privacy and national security implications of feeding such sensitive information into a black-box AI are mind-boggling.
The Australian Digital Inclusion Index
The index has been updated! While Australia’s digital inclusion score has continued to improve, around 1 in 5 Australians still experience some level of digital exclusion, a figure that doubles among Indigenous Australians.
We are reading the feedback from our AI roadshow events!
We asked attendees how democracy could survive AI, and we have been busy analysing your responses. Here are some findings:
30% of all responses mentioned regulation of AI.
15% mentioned education.
13% mentioned accountability (with a focus split between tech bro accountability, personal accountability and tech sector accountability).
13% mentioned collective action.
A big thank you to everyone who attended!
What is DRW listening to?
‘Digital Rights Activist Lizzie O'Shea Knows How to Change the Future’ - 5AM StoryTalk
DRW’s Chair Lizzie O’Shea discusses digital rights activism and how it connects to the rights of artists with Cole Haddon, a screenwriter and journalist. Lizzie provides some ideas on how artists and the public can fight back.
‘Tech Mirror’ - ANU Tech Policy Design Centre
Tech Mirror is a five-part mini-series, with a new episode dropping each Tuesday and Thursday over the coming weeks. Episodes cover topics ranging from the harms of emerging technology to the politics of tech and what comes next.
How is AI used to surveil Australian protests?
Melbourne Council has faced criticism over its plan to install 100 new CCTV cameras, 60 of them privately owned. Because Melbourne is the site of most major protests in Victoria, the proposal raises concerns that the cameras could capture and identify protesters. The council has also indicated it may use facial recognition on the footage, deepening privacy risks. Surveillance tools that collect biometric data and track protesters without consent expose political beliefs, enable disproportionate policing, and create a chilling effect that discourages democratic participation.
At the federal level, the Department of Home Affairs’ contract with Babel Street grants access to Locate X, a smartphone tracking tool capable of identifying and tracking every device present at a protest. Home Affairs has justified its use of the tool by claiming it relies on “commercially and publicly available information,” exploiting a major loophole in Australian privacy law.
Meanwhile, private companies such as Clearview AI scrape images from protests to build facial recognition databases. Despite an order from the Office of the Australian Information Commissioner requiring Clearview to cease operations in Australia and delete local data, there is little evidence it has complied.
Interested? Read more in our Submission to the UN here.
Upcoming campaign: AI scribes in Healthcare
We’re gearing up to launch a new campaign examining the effect AI transcription technology has on the efficacy of healthcare, its personal privacy implications, and issues of bias in AI systems.
Do you have lived experience of harm from AI transcription in general practice or specialist healthcare? We’d love to hear from you.
Sharing your experience will help us shape the campaign and push for positive change: upholding standards of care and limiting the privacy and information security risks posed by the adoption of AI scribe tools in healthcare.
Please reply to this email with your lived experience to help us develop our campaign.
In solidarity,
Lucinda and the whole Digital Rights Watch team
P.S. Regular donations from our supporters sustain us. If you have the means, we always deeply appreciate donations of any size. If you would rather volunteer your time and expertise, sign up here!
P.P.S. Know someone who might be interested in our newsletter? Send them this link to sign up!