A gruesome killing in her own family inspired South African Leonora Tima to create a digital platform where people, mostly women, can talk about and track abuse.
Leonora's relative was just 19 years old, and nine months pregnant, when she was killed, her body dumped on the side of a highway near Cape Town in 2020.
"I work in the development sector, so I've seen violence," Leonora says. "But what stood out for me was that my family member's violent death was seen as so normal in South African society."
"Her death wasn't reported by any news outlet, because the sheer volume of these cases in our country is such that it doesn't qualify as news."
The killer was never caught, and what Leonora saw as the silent acceptance of a woman's violent death became the catalyst for her app, Gender Rights in Tech (Grit), which features a chatbot called Zuzi.
This is one of the first free AI tools made by African creators to tackle gender-based violence.
"This is an African solution, co-designed with African communities," says Leonora.
The aim is to offer support and help gather evidence that could later be used in legal cases against abusers.
The initiative is attracting interest from international women's rights activists, although some caution that chatbots should not replace human support, stressing that survivors need empathy, understanding and an emotional connection that only a trained professional can provide.
Leonora and her small team visited communities in the townships around her home in Cape Town, speaking to residents about their experiences of abuse and the ways technology fits into their lives.
They asked more than 800 people how they used their phones and social media to talk about violence, and what stopped them from seeking help.
Leonora found that people wanted to talk about their abuse, but they were wary of traditional routes like the police.
"Some women would post about it on Facebook and even tag their abuser, only to be served with defamation papers," she says.
She felt that existing systems were failing victims twice, first in failing to prevent the violence itself, and then again when victims tried to speak up.
With financial and technical support from Mozilla, the Gates Foundation, and the Patrick McGovern Foundation, Leonora and her team began developing Grit, a mobile app that could help people record, report and get a response to abuse while it was happening.
The app is free to use, though users need mobile data to download it. Leonora's team says it has 13,000 users and received about 10,000 requests for help in September.
At its core, Grit is built around three key features.
On the home screen is a large, circular help button. When pressed, it automatically records 20 seconds of audio, capturing what's happening around the user. At the same time, it triggers an alert to a private rapid-response call centre - professional response companies are common in South Africa - where a trained operator calls the user.
If the caller needs immediate help, the response team either sends someone to the scene themselves or contacts an organisation local to the victim that can go to their aid.
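In outline, the flow the article describes - press, record a short clip, package it with a timestamp, alert an operator - might look like the sketch below. This is a minimal illustration under assumed names (Alert, record_audio, notify_call_centre are all invented for the example); Grit's actual implementation is not public.

```python
import datetime
import uuid
from dataclasses import dataclass

AUDIO_CLIP_SECONDS = 20  # per the article: a press captures 20 seconds of audio


@dataclass
class Alert:
    """What the rapid-response centre might receive. Field names are assumptions."""
    alert_id: str
    user_id: str
    created_at: str
    audio_clip: bytes


def record_audio(seconds: int) -> bytes:
    """Stand-in for the phone's microphone API; returns placeholder bytes."""
    return bytes(seconds)


def notify_call_centre(alert: Alert) -> None:
    """Stand-in for the network call that alerts a trained operator,
    who then phones the user back."""
    print(f"Alert {alert.alert_id}: operator will call user {alert.user_id}")


def on_help_button_pressed(user_id: str) -> Alert:
    # 1. Start recording as soon as the button is pressed.
    clip = record_audio(AUDIO_CLIP_SECONDS)
    # 2. Package the clip with an ID and a UTC timestamp.
    alert = Alert(
        alert_id=str(uuid.uuid4()),
        user_id=user_id,
        created_at=datetime.datetime.now(datetime.timezone.utc).isoformat(),
        audio_clip=clip,
    )
    # 3. In a real app the alert would likely go out while recording continues;
    #    the steps are sequential here only for clarity.
    notify_call_centre(alert)
    return alert


if __name__ == "__main__":
    on_help_button_pressed(user_id="demo-user")
```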
The app was built with the needs of abuse survivors at its core, says Leonora: "We need to earn people's trust. These are communities that are often ignored. We are asking a lot from people when it comes to sharing data."
When asked whether the help feature has been misused, she admits there have been a few curious presses - people testing to see if it really works - but nothing she'd call abuse of the system.
"People are cautious. They're testing us as much as we're testing the tech," she says.
The second element of Grit is the vault, which Leonora says is a secure digital space where users can store evidence of abuse, dated and encrypted, for possible use later in legal proceedings.
Photos, screenshots, and voice recordings can all be uploaded and saved privately, protecting crucial evidence from deletion or tampering.
"Sometimes women take photos of injuries or save threatening messages, but those can get lost or deleted," Leonora says. "The vault means that evidence isn't just sitting on a phone that could be taken away or destroyed."
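The idea behind the vault - encrypt each item and stamp it with the time and a fingerprint, so later tampering is detectable - can be pictured in a short sketch. It assumes the widely used Python cryptography package, and the function name seal_evidence is invented for the example; this is not Grit's code.

```python
import datetime
import hashlib
import json

from cryptography.fernet import Fernet  # third-party: pip install cryptography


def seal_evidence(item: bytes, key: bytes) -> dict:
    """Encrypt one piece of evidence (photo, screenshot, voice note) and
    record when it was sealed. The SHA-256 fingerprint lets anyone verify
    later that the stored file was not altered after upload."""
    return {
        "sealed_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "sha256": hashlib.sha256(item).hexdigest(),
        "ciphertext": Fernet(key).encrypt(item).decode(),
    }


if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice the key would be held server-side
    photo = b"bytes of an injury photo or a threatening-message screenshot"
    record = seal_evidence(photo, key)
    # Only the metadata is printed; the ciphertext is opaque without the key.
    print(json.dumps({k: record[k] for k in ("sealed_at", "sha256")}, indent=2))
```

Because the ciphertext can only be opened with the key, a phone that is taken away or destroyed neither exposes nor erases the evidence.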
This month, Grit will expand again with the launch of its third feature - Zuzi, an AI-powered chatbot designed to listen, advise, and guide users to local community support.
"We asked people: 'Should it be a woman? Should it be a man? Should it be a robot? Should it sound like a lawyer, a social worker, a journalist, or another authority figure?'" Leonora explains.
People told the team that they wanted Zuzi to be an aunt figure - someone warm and trustworthy they could confide in without fear of judgment.
Although built primarily for women experiencing abuse, Zuzi has also been used during the testing phase by men seeking help.
"Some conversations are from perpetrators - men asking Zuzi to teach them how to get help with their anger issues, which they often direct at their partners," Leonora explains. "There are also men who are victims of violence and have used Zuzi to talk more openly about their experience."
"People like talking to AI because they don't feel judged by it," she adds. "It's not a human."
UN Women reports that South Africa experiences some of the world's highest levels of gender-based violence (GBV), with a femicide rate that is five times higher than the global average. Between 2015 and 2020, an average of seven women were killed every day, according to South African police.
Leonora is clear that the success of AI in tackling gender-based violence depends not just on engineering, but on who gets to design the technology in the first place.
A 2018 World Economic Forum report found that only 22% of AI professionals globally were women, a statistic that is still often cited.
"AI as we know it now has been built with historic data that centres the voices of men, and white men in particular," Leonora says.
"The answer is not only about having more women creators. We also need creators who are women of colour, more from the global south, and more from less privileged socio-economic backgrounds."
Only then, Leonora Tima concludes, can technology begin to represent the realities of those who use it.