Guide: What does it mean for a city to commit to digital rights?

Thank you to everyone who has written to their local Council urging them to sign on to be a Digital Rights City!

If you are eager to help your community transform, but are not sure what that means in practice, this post is for you! We will also be sharing it with the Councils who have responded positively (and there are quite a few!) and asked for more information.

You can download a PDF version of this page here. If you would like a high quality printable version, you can send us an email at

At Digital Rights Watch, we believe local government has a huge opportunity to shape the way we all experience human rights on a local level.

In 2018, the Cities for Digital Rights initiative was launched in Barcelona, Amsterdam and New York City with the support of the United Nations Human Settlements Programme. More jurisdictions around the world have signed on since; Sydney was the first in Australia. Others are taking it further still—San Francisco recently moved to ban all use of facial recognition surveillance.

I’m interested in getting my council involved. What are they signing up for?

To be part of the initiative, cities sign on to the UN Declaration of Cities for Digital Rights. In doing this, cities make a commitment to embed human rights into their smart city plans, digital transformation initiatives, and frankly, all other areas of activity. 

There are five key pillars:

  1. Universal and equal access to the internet, and digital literacy
  2. Privacy, data protection and security
  3. Transparency, accountability and non-discrimination of data, content and algorithms
  4. Participatory Democracy, diversity and inclusion
  5. Open and ethical digital service standards

We’ve broken down each of these areas in more detail to help you know what you’d be committing to, and start taking the first steps to transforming your city into one that is truly smart. Of course, there is no one-size-fits-all approach, and each council may uphold each pillar in different ways.

Universal and equal access to the internet, and digital literacy

The internet is a part of our everyday lives, and is often (and increasingly) the default way to access essential government services and support. The potential benefits of internet access and digital services are significant, but for those benefits to be realised, everyone must have equal and affordable access.

In practice, this means universal broadband to ensure those who wish to access the internet from home are able to. Beyond that, it means providing secure internet access in public spaces. There are many reasons why people may not access the internet from home: they may be experiencing homelessness, it may not be safe for them to browse in their home environment, or they may not have the infrastructure or income to gain access. As such, council-run public spaces such as libraries and community centres should offer free access to the internet, and steps should be taken to make that access as secure as possible to protect the safety of those using it.
Providing access is only one side of the coin—for people to benefit from equal access to the internet, they need to have a level of understanding of how to use it. Digital literacy is an important and empowering tool of modern life, and can play a part in people’s ability to exercise their own agency.

The digital divide refers to the uneven distribution of access to the internet between people and groups. Location, education, socio-economic status and disability are all factors that can influence someone’s ability to have meaningful access to the internet and digital services. If left unchecked, this gap between those who have access to information and services and those who do not will continue to widen, exacerbating existing inequalities.

Free community programs to increase digital literacy skills, offered at a range of abilities, in different locations and in various languages, are an important part of providing equal access to the internet and digital services in a meaningful way. Ideally, these programs would go beyond the basics of how to use the internet and access services to also cover how people can protect themselves online. They should be accompanied by public education programs and campaigns to raise awareness of digital literacy.

Privacy, data protection and security

Privacy is a human right in and of itself, and it’s also integral to how people are able to realise other human rights. People should have the ability to exercise meaningful control over who has access to information about them. As our lives are increasingly online, the issue of keeping our information private and secure is urgent.

When individuals interact with local governments, both online and offline, they are often required to provide information about themselves to access important services and programs. While many of these systems are digital, very real-world harm can be caused to people when their information is misused, mishandled, or falls into the hands of those who shouldn’t have access to it. People should always be able to opt to interact with the government and access services in an analogue format so they aren’t forced to share information online.

Privacy risk assessments should be completed for all digital tools and connected services. This involves applying privacy principles and determining the risk to individuals’ privacy. Privacy should be considered not only for technologies such as CCTV, apps, and online portals, but also for programs and services, such as how information is collected, used and shared at community centres. Privacy risk assessments should never be used as a tick-box compliance exercise; rather, they should genuinely play a role in a local government’s decision-making process about whether and how a tool or service should be implemented.
The information your local council holds is valuable, and should be treated as such. This means putting in processes to make sure that personal information collected both online and offline is protected. There should be a policy in place for how long certain types of data are stored, how they are stored, and who maintains access to them.
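A retention policy like the one described above can be written down as a simple schedule that systems then enforce. The sketch below is a hypothetical illustration in Python (the record types and periods are invented for this example); actual retention periods come from legislation and records-management policy, not from code.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record type -> how long it is kept.
# Real periods are set by legislation and records-management policy.
RETENTION = {
    "event_registration": timedelta(days=365),
    "wifi_session_log": timedelta(days=30),
}

def due_for_deletion(record_type, collected_on, today):
    """True when a record has outlived its retention period."""
    return today - collected_on > RETENTION[record_type]

# A 90-day-old wifi log is overdue for deletion; a 90-day-old
# event registration is still within its retention period.
today = date(2024, 6, 1)
collected = today - timedelta(days=90)
print(due_for_deletion("wifi_session_log", collected, today))    # True
print(due_for_deletion("event_registration", collected, today))  # False
```

Writing the schedule in one place, rather than scattering deletion rules through each system, also makes it easy to publish the policy alongside a privacy statement.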
If we do not know what’s happening with our information, it’s very hard to make informed choices or exercise meaningful control! In practice this means governments need to be really clear with people about what information is collected and when. Of course this means creating a plain English privacy policy (we shouldn’t need a law degree to understand our rights), but it also means making sure that council staff are trained to have conversations with people about why their information is being collected, what it will be used for and who will have access to it.

Transparency, accountability and non-discrimination of data, content and algorithms

As more connected services become essential to our everyday lives, it becomes increasingly difficult to understand how everything operates and how it may impact us.

Local governments may want to start using more advanced forms of data processing to benefit their communities, but such initiatives must have human rights by design and by default. The way that algorithms and automated processing work can be opaque to many people, so councils wishing to use this technology need to take extra steps to ensure that they are transparent and accountable to the public. That means making an effort to communicate clearly, in plain language and without technical jargon, about how their technological systems work and how they impact people.
It is important that a process is in place to increase the accountability of any algorithmic decision-making processes. This means ensuring that people have the ability to question, request a review of, or appeal a decision that impacts them. Publishing open data related to city services and technologies, and ensuring all projects are open source and properly audited, gives individuals another opportunity to understand, question and change unfair systems. Or for academics to study them! Any data publicly released must be scrubbed and carefully anonymised so that it does not contain any personally identifying information.
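As a minimal sketch of that last step, the Python snippet below strips direct identifiers from records before release. The field names are hypothetical, and real de-identification needs much more than dropping columns (quasi-identifiers such as postcode and date of birth can still re-identify people), so treat this as a starting point rather than a complete process.

```python
# Minimal de-identification sketch. Field names are hypothetical;
# real releases also need quasi-identifier and re-identification review.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address"}

def strip_direct_identifiers(records):
    """Return copies of each record with direct identifier fields removed."""
    return [
        {field: value for field, value in record.items()
         if field not in DIRECT_IDENTIFIERS}
        for record in records
    ]

bookings = [
    {"name": "Jo Citizen", "email": "jo@example.org",
     "service": "library", "week": "2024-W03"},
]
published = strip_direct_identifiers(bookings)
print(published)  # [{'service': 'library', 'week': '2024-W03'}]
```

Keeping the identifier list in one reviewed place, rather than filtering ad hoc per dataset, makes it easier to audit what a council actually publishes.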

Participatory Democracy, diversity and inclusion

Technology can be a wonderful tool when used in creative, safe, and ethical ways. Local governments should take advantage of technology and develop digital tools to help people understand their rights and participate in democracy. They can also leverage existing technology platforms to increase community participation.*

*But don’t forget about the privacy risk assessment mentioned previously!

It’s important to ensure all digital platforms are accessible. Some questions to consider may be: Is the service or communication compatible with a screen reader? Is the contrast strong enough and the font large enough that it is easily readable? Is there an audible version available? How can this be presented in other languages?

Beyond accessibility requirements, any services provided via a digital platform should remain available in an offline analogue format, and in different languages, to meet the needs of the community.

Communities ought to have the chance to express their opinions, concerns and suggestions to shape the way that local councils develop smart city plans and digital innovations. For example, if a city is planning on introducing new tools or digital services into the community, there should be an open and transparent process in place that provides people with the opportunity to voice their opinions.
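The contrast question above can even be checked programmatically. The sketch below implements the WCAG 2 contrast-ratio formula in Python (WCAG recommends at least 4.5:1 for normal body text); it is an illustration of one accessibility check, not a substitute for a full audit.

```python
def _linearise(channel):
    """Linearise one sRGB channel (0-255) per the WCAG 2 formula."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) colour, 0.0 to 1.0."""
    r, g, b = (_linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground, background):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lums = sorted(
        (relative_luminance(foreground), relative_luminance(background)),
        reverse=True,
    )
    return (lums[0] + 0.05) / (lums[1] + 0.05)

# Black text on a white background gives the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A check like this can sit in a website build pipeline so that low-contrast colour combinations are flagged before a page goes live.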

Open and ethical digital service standards

This means that regardless of whether a person is using a fancy new smartphone, a hand-me-down tablet, or a well-loved public desktop computer, they should be able to access digital services. It also means that people should have the ability to choose to interact with their government or council without the use of technology at all!
Each council will be a little bit different, which means ethical considerations will also vary! Time spent developing ethical standards that can be applied across services and technologies is a great way to reduce potential harm to individuals and develop a thriving community both online and offline.

Maintaining a public register of contracts with private entities, including technology vendors, is an important part of being transparent about digital services. Doing so enables everyone to understand what kind of technology is being used, who it is made by, and which companies are involved. We always encourage the use of open source technologies wherever possible.