Interview with Dr Miah Hammond-Errey

Digital Rights Advocate, Kate Bower

In October, I had the pleasure of seeing national security analyst Dr Miah Hammond-Errey speak on a panel about mis- and dis-information at this year’s SXSW Sydney. I was impressed by her nuanced and informed take on the topic, and by how she described the data-extractive business models of digital platforms as key to understanding, and therefore tackling, mis- and dis-information. It aligned strongly with our thinking at Digital Rights Watch: rather than relying on content moderation as a solution to the mis- and dis-information problem, we need to disrupt the business models of Big Tech and digital platforms, and the best way to do that is through strong and meaningful reform of our privacy law. In this interview, Miah reflects on the unlikely pairing of national security and privacy, the role of human rights, and how we might regulate Big Tech.


Kate: Your background is in intelligence and national security; how did you become interested in privacy?

Miah: It is a good question! As a national security analyst, I am perhaps an unexpected privacy advocate. The short answer is that I had the luxury of analysing technology and security through my PhD and leadership roles, and recognised that without addressing privacy vulnerabilities we cannot resolve the security vulnerabilities either.

The longer answer is that I spent 18 years leading federal government analysis and liaison activities in Australia, Europe, and Asia, and was awarded an Operations Medal for intelligence. During that work I saw the impact of technology on security and was offered a PhD scholarship. My research is now published as a book, Big Data, Emerging Technologies and Intelligence: National Security Disrupted. Privacy is an essential consideration in big data; as the opening of the privacy chapter puts it, ‘no book on big data would be complete without discussing privacy’!

Having the opportunity to explore technology and security in depth for a PhD, and then leading teams focused on them, gave me a privileged insight into the technology ecosystem broadly. It led me to see that without addressing tech-specific privacy concerns, we will not resolve the security vulnerabilities inherent in the technology ecosystem.

More broadly, the way new technologies are diffusing through society prompts consideration of what kind of world we want to live in and create. I established Strat Futures Pty Limited to empower and inspire leaders in security and technology for safer nations. I want to help create a safer, more secure and equitable world.

Kate: We’re experiencing a bit of a tech policy bonanza, with the federal government moving forward quickly on cybersecurity, online safety, mis- and dis-information codes (now dumped) and a social media ban, but moving at a glacial pace when it comes to privacy reform and regulating biometrics and AI. From your perspective, what are we getting right about regulating technology, and conversely, what are we getting wrong?

Miah: It was a busy end to the parliamentary sitting this year! I am very pleased to see tech policy high on the agenda, even though I don’t agree with all the proposals. The scope of reform is vast, and it isn’t possible to be across it all. In general, I think looking at legislative and policy reform through the lens of technology as an ecosystem, and developing a more nuanced understanding of it, is helpful!

The government has made immense progress on cybersecurity (the strategy and the 2024 Act) and has significantly improved online harms regulation, which I think is to be applauded. Over the longer term, I think there has been wide-ranging and impressive policy work on critical infrastructure, cyber and critical technology affairs, aspects of technology in national security, and cybersecurity.

I’d love to see tranche two of privacy reform—now—that disrupts the extractive data economy and resolves some of the privacy and security vulnerabilities. I’d also like to see a more nuanced approach to social media for young people – something I’ve covered here, arguing we should aim higher with more nuanced, comprehensive and adaptable policy levers.

We need more work on mis- and dis-information and social cohesion, as well as increased investment in research and policy development around information influence and interference, and in countering state-sponsored information operations. I’m also closely following recent competition legislation and policy announcements, and various legal actions in Australia and the US.

Domestic responses are key to establishing a distinctly Australian approach. Obviously though, given the global nature of technology, aspects of this will need to be progressed bilaterally and in multilateral forums.
Kate: National security concerns are sometimes used as an argument to restrict human rights and enable government overreach; how do you think about human rights in the context of national security?

Miah: I think sometimes, especially in politics, the intention of security gets a bit muddied. National security at its core is about the values that a society chooses to protect and invest in. Making that judgement is an inherently political act. National security includes physical safety, geopolitical security and broader national and economic interests but can also include culture, diversity, acceptance and equity. 

When I interviewed Sir David Omand for my podcast Technology & Security, he described human rights and security as coming from the same basket, which I think is a helpful analogy. He said, “it’s a balance within the basket of human rights. Where you do see nations potentially going off the rails is when they somehow see security as separate from and in a different dimension than human rights.”

In a democracy, national security concerns need to be addressed in a way that is consistent with the human rights context of that country. All democracies are different. Technologies are also shifting the context of both human rights and national security, changing their scope and increasing the pace and scale of change. This makes it an important time to clearly reaffirm and refine Australia’s position on both. I think Australia needs to be secure and a place where human rights are respected, and that the two can coexist.

Kate: The recent US election has been another example of Big Tech’s influence, particularly the way Elon Musk used his platform and his wealth to influence the electorate. Do you think Big Tech has become too big to regulate (successfully)?

Miah: This is a big question! In short, it will be very tough, but I believe still possible. 

National governments need to approach technology problems as individual issues rather than focusing on the behaviour of companies and individuals: for example, as national security threats, anti-competitive behaviour, specific social harms, data breaches, privacy, and digital service standards. I think this is happening fairly well – albeit very slowly! We can see it in legislative reform, the many cases brought before courts, and regulatory actions across the US, Europe, the UK and Australia, among others.

I’ve written about Musk’s impact on X, most recently in the context of Bluesky and previously regarding state propaganda. The platform’s decline, both in credibility and in users, changes the landscape as disillusioned users search for alternatives.

Of course, technology (development, use and regulation) is a global affair. Trump’s election will absolutely alter US and global technology development, access and regulation. Some argue it will merge tech and state power, where the interests of select technology companies become indistinguishable from US government policy. 

There are many key tech companies, individuals and agendas in the US that will seek to influence the Trump administration’s policy. What stands out for me is the uncertainty, volatility and focus on the short term (markets, investments and futures) that will likely characterise US policy for a while.

Dr Miah Hammond-Errey is the founding CEO of Strat Futures Pty Limited and host of the Technology & Security podcast. She guides organisations and advises leaders on emerging technologies, intelligence, data, national security, cybersecurity and leadership. Her recent book is Big Data, Emerging Technologies and Intelligence: National Security Disrupted. Dr Hammond-Errey spent eighteen years leading federal government activities in Australia, Europe, and Asia and was awarded an Operations Medal. She previously established the Emerging Technology Program at the US Studies Centre at the University of Sydney and ran the information operations team at the Australian Strategic Policy Institute. She is a member of the Australian Institute of Company Directors, teaches postgraduate cybersecurity at the University of Sydney, and is affiliated with the Deakin University Cyber Research and Innovation Centre.