Too Much Information: Dating Apps and AI

Posted on May 13, 2026

As a woman on dating apps, I have received my fair share of jarring openers like:

“What is the little shit doing tonight?”, “Can you be forced to me?”, and my personal least-favourite: “I go crazy for 5ft women with braids who make me feel like committing a felony.”

Sadly, these are all real comments.

Unsolicited sexual comments are the visible tip of dating apps’ misogyny iceberg. Between 2017 and 2022, 75% of Australian women on dating apps were subjected to online sexual violence, and 33% of female Australian dating app users experienced matches extending violence beyond the screen: lying about sexual health status, stalking, taking non-consensual pictures, or sexually assaulting them.

Dating apps play a significant role in gender-based violence by providing perpetrators direct access to potential victims.

A closer look at dating app data practices shows that dating apps expose users to additional harms by monetising their personal data, a problem exacerbated by emerging AI tools.

Some of the same systems designed to reduce gender-based harm illustrate the complex relationship between safety and privacy. Biometric face scans are promoted as a way to stop banned users from re-entering platforms. However, they also introduce privacy risks that jeopardise users’ safety.

During verification, the software scans your face and, to confirm it is not just a picture, collects your facial geometry data. Hinge retains photos from this scan and reserves the right to use them, along with its estimate of your age, in its AI models. Hinge defines your facial geometry data as a “FaceMap” and states that “FaceMaps and FaceVectors are retained for the lifetime of an account”. Simultaneously, Hinge assures users that “[Your facial geometry templates are deleted within 24 hours](https://help.hinge.co/hc/en-us/articles/10303221435539-What-is-Selfie-Verification)”. Such ambiguity is not reassuring given the sensitivity of our biometric data.

Biometric data must be kept with the strictest security. Unlike a password or a token, your face cannot be easily changed. If a faceprint is compromised, foreign governments, ex-partners, strangers, and corporations could use it to identify and track an individual. When privacy is not prioritised, protective safety features merely swap one risk for another.
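To see why a leaked faceprint is a lasting risk, consider a deliberately simplified picture of how face matching works. In the Python sketch below, a face template is just a numeric embedding vector and matching is cosine similarity; real verification systems use proprietary templates and thresholds, so every name and number here is hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def still_matches(leaked_template: np.ndarray,
                  new_photo_embedding: np.ndarray,
                  threshold: float = 0.8) -> bool:
    # A leaked password can be rotated; a leaked face template keeps
    # matching new photos of the same person indefinitely.
    return cosine_similarity(leaked_template, new_photo_embedding) >= threshold
```

The point of the sketch is the asymmetry in the last comment: whoever holds the template can keep running this comparison against any photo they encounter, forever.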

To get matches, dating app users need to provide personal information when completing their profile. Users share how often they drink or use drugs, their political views, the age brackets they will date, racial preferences, religion, sexual orientation, sexual history, kinks, and HIV status. Yet dating apps consider this information fair game for their own marketing, such as organising sweepstakes and contests.

While the apps do declare their use of users’ data, only 6% of Australians read terms and conditions. That’s understandable: it would take 14 hours a day to read all the Ts and Cs that apply to us. What many people don’t realise is that dating apps collect far more data than the “about me” information that users willingly share.

Dating apps record more than you might think.

Every swipe, hesitation, undo, typing pattern and message is recorded. 25% of dating sites collect metadata from users’ photos and videos, including where and when a photo was taken. This information is used to decipher which characteristics appeal to users. The resulting enriched dataset is incredibly valuable to advertisers, who can use it to micro-target advertising. The data reveals what sort of features a user finds attractive, and thus which face, body type or aesthetic will sell them products. It exposes a user’s attention span and hesitation patterns, which are used to determine which ads they will find engaging. And users’ political beliefs can be used to frame campaign messages to persuade them.
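To make the photo-metadata point concrete: most phones embed the time and GPS coordinates of a shot in the image file itself, as EXIF tags. The short Python sketch below reads them with the Pillow library; the file name and the example output are hypothetical.

```python
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def photo_metadata(path: str) -> dict:
    """Return the EXIF tags embedded in a photo, including any GPS block."""
    exif = Image.open(path).getexif()
    meta = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    # GPS data lives in a nested directory; 0x8825 is the GPSInfo tag.
    gps = exif.get_ifd(0x8825)
    meta.update({GPSTAGS.get(tag_id, tag_id): value for tag_id, value in gps.items()})
    return meta

# Hypothetical usage and output:
# photo_metadata("profile_photo.jpg")
# -> {'DateTime': '2026:05:13 19:42:01', 'GPSLatitude': (33.0, 52.0, 4.1), ...}
```

A few lines of code are all it takes to turn an innocent profile photo into a timestamped location record.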

80% of dating apps cash in on users’ intimate data by selling it to third parties. While “third parties” usually refers to advertisers, it can also mean governments, journalists, and even Catholic organisations outing gay priests.

Dating apps are acutely aware of the value of this data. Match Group controls roughly two-thirds of the global dating app market, concentrating vast amounts of intimate user information under a single corporate umbrella. Match Group’s CEO and Director, Spencer Rascoff, previously served on the board of Palantir, a company that specialises in data analytics and AI systems and whose software has been used to commit atrocities such as the mass deportation of undocumented migrants.

Given Rascoff’s background, Match Group’s AI transformation is unsurprising. The company envisions AI “influencing everything from profile creation to matching and connecting for dates, literally everything”.

Under the terms and conditions of the dating apps, your user data (including, in some circumstances, your face) can be used to develop new services and tools, such as training AI. Dating apps often require access to a user’s photo library in order to upload pictures. If AI tools are integrated into these apps, those tools may also be able to access the same photos. This raises the risk that personal images could be analysed, stored, or even used to train AI systems, giving the app far broader access to a person’s private photo library than the user realises.

The integration of AI into dating apps exacerbates existing ethical and privacy concerns.

AI models are trained on the Internet and therefore absorb its biases. Without deliberate safeguards, they reproduce those biases in their output. In helping users choose a profile photo, an AI assistant might favour images where the individual appears more Eurocentric or more reflective of existing gender norms. This would reinforce existing biases against minorities, both in the app and in society more widely.

Many dating apps rank users by ‘desirability’, calculated from how other users engage with their profile. This system entrenches users’ biases in the algorithm. People of colour receive less engagement than white people, so we would expect the algorithm to give them lower desirability scores. People with low desirability scores are then shown to others with low desirability scores, creating a segregated user experience, as the sketch below illustrates.
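No dating app publishes its real ranking formula, so the following Python sketch is purely illustrative. It shows how the feedback loop arises under two simple assumptions: the score is just the observed like rate, and users are mostly shown others in the same score band.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    likes: int = 0      # right-swipes received
    views: int = 0      # times shown to other users
    score: float = 0.5  # starting "desirability"

def update_score(profile: Profile) -> None:
    # Score equals the observed like rate, so any bias in who gets
    # liked flows straight into the score.
    if profile.views:
        profile.score = profile.likes / profile.views

def candidates_for(user: Profile, pool: list[Profile],
                   band: float = 0.1) -> list[Profile]:
    # Users are shown others in the same score band, so low-scored
    # users circulate among other low-scored users: a segregated feed.
    return [p for p in pool if p is not user
            and abs(p.score - user.score) <= band]
```

Nothing in this loop is explicitly about race or gender, yet biased engagement in, biased exposure out.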

AI companions, particularly romantic ones, are riddled with ethical, safety and privacy concerns. Grindr has already implemented an “AI wingman”, and we can expect to see more AI dating companions in the near future.

Grindr has called its AI “surprisingly flirtatious”. This could be because the AI was trained on some 111 billion private messages. Grindr has not shied away from its AI’s sexual appetite: reports suggest the AI will be marketed as an ‘AI boyfriend’ that users can flirt and sext with.

80% of dating apps can share and sell data relating to any user interactions with AI companions. Under the Privacy Act, sensitive information such as health information, sexual activity, or political beliefs is subject to stricter rules: it can generally only be collected with consent, and any use or disclosure must be connected to the purpose for which it was collected, unless further consent is obtained. But companies that have obtained consent, and have disclosed that data will be shared with third parties, can use this sensitive information for targeted advertising or disclose it to those third parties.

Information disclosed in conversation with a chatbot about suicidal thoughts, kinks, or political beliefs could be retained, analysed, and shared with third parties, including insurers, recruitment agencies, or other commercial partners.

This kind of data could be used in profiling or risk assessments that influence decisions about insurance premiums, employment, or access to services, creating significant risks for users who may not expect such sensitive disclosures to be used in this way.

Messages sent to Grindr’s AI wingman can be used by marketing partners to directly target users with ads. The commercialisation of this kind of intimate data is what enabled a Catholic group to identify and publicly out gay priests who were using Grindr.

Most users don’t read terms and conditions, demonstrating the need for a “fair and reasonable” test that protects us against invasive data sharing.

Dating apps’ disregard for users’ privacy is inseparable from their disregard for women’s safety.

True story: Hinge once told me I’m “most compatible” with a known neo-Nazi. How could it commit such an obviously dumb mistake when it knows so much about me and my politics? The answer is simple: the dating app business model depends on people coming back and creating more data. If mismatched pairings keep users engaged, then mismatching is exactly what the app will do, even when the cost is users’ safety. A system that prioritises profit over the rights of its customers will not hesitate to jeopardise their safety.

Users should not have to choose between sacrificing their privacy and using dating apps. That is a false dichotomy created by data-hungry platforms. Dating apps could design their services to minimise data collection and protect users by default, but they rarely do. Instead, they push the limits of what they can get away with, hiding behind unread terms and conditions and vague consent boxes.

Rather than hoping that multi-billion-dollar corporations will design ethical products, the Australian government should do what it has already promised and fix Australian privacy law.

One such promised privacy reform is the right to delete. This would allow Australians to tell companies to erase the data held about them. The right to delete would be especially valuable for LGBTQ+ dating app users who are concerned about their data outing them.

However, a right to delete poses a challenge for dating apps. A serial harasser could request the deletion of their data from a dating app, including the evidence of their behaviour. The absence of this data could result in their readmission to the app, or create evidentiary barriers should a victim of their behaviour pursue legal or police action. The Privacy Act acknowledges this complexity and provides for permitted general situations, where divergence from the Australian Privacy Principles is allowed. Lessening a threat to safety is one of these permitted general situations, and would need to be provided for in any future right to delete.

The GDPR grants Europeans a right to erasure of their personal data. However, this right is subject to exceptions, including where retention of the data is necessary for the establishment, exercise, or defence of legal claims. Harassing messages may constitute evidence of unlawful conduct and can therefore be exempted from a deletion request. This requires the dating app to be proactive in identifying and retaining harmful messages.
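What might that look like in practice? The Python sketch below is a hypothetical erasure handler, not any app’s real workflow: messages flagged by moderation or user reports survive the deletion request, quarantined for legal purposes rather than recycled into the product.

```python
from dataclasses import dataclass

@dataclass
class Message:
    text: str
    flagged_as_harmful: bool  # set by moderation tools or user reports

def handle_erasure_request(messages: list[Message]) -> list[Message]:
    """Delete a user's messages, keeping only those flagged as evidence."""
    retained = [m for m in messages if m.flagged_as_harmful]
    # Retained messages should be quarantined for legal purposes only,
    # never recycled into matching, advertising, or AI training.
    print(f"Deleted {len(messages) - len(retained)} messages; "
          f"retained {len(retained)} as potential evidence.")
    return retained
```

The hard part is not the deletion logic but the flagging: the exception only works if harmful messages were identified before the erasure request arrived.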

As stewards of this data, dating apps have an important role to play in advocating for the safety and privacy of their users, and they need to meet this challenge with a more active approach to both.

The introduction of a fair and reasonable test would reduce ambiguity around data use and potential deletion, requiring that any collection, use, or sharing of personal data is genuinely justified, not just technically permitted.

Mandating accessible, clear and understandable privacy policies would help dating app users make informed decisions about how their data is handled.

Dating apps are uniquely positioned at the intersection of safety and privacy. The challenge of balancing these concerns is real, but it is not insurmountable. Protecting users from harm and respecting their privacy are not competing goals; they are mutually reinforcing. Designing systems that minimise data collection, safeguard sensitive information, and retain only what is necessary to prevent harm is both possible and essential. Without this shift, dating apps will continue to reproduce the very harms they claim to address, leaving users to bear the cost of a system that prioritises profit over both safety and rights.