A victim of child sexual abuse has begged Elon Musk to stop links offering images of her abuse being posted on his social media platform X. "Hearing that my abuse - and the abuse of so many others - is still being circulated and commodified here is infuriating," says Zora (not her real name), who lives in the United States and was first abused more than 20 years ago.
"Every time someone sells or shares child abuse material, they directly fuel the original, horrific abuse."
X says it has zero tolerance for child sexual abuse material and that tackling those who exploit children remains a top priority.
The BBC found images of Zora while investigating the global trade of child sex abuse material, estimated to be worth billions of dollars by Childlight, the Global Child Safety Institute.
The material was among a cache of thousands of similar photos and videos being offered for sale on an X account. We got in contact with the trader through the messaging app Telegram, and this led us to a bank account linked to a person in Jakarta, Indonesia.
Zora was first abused by a family member. A collection of images of her abuse has become infamous among paedophiles who collect and trade such content. Many other victims face the same situation, as images of abuse continue to circulate today.
Zora is angered that the trade continues to this day.
"My body is not a commodity. It never has been, and it never will be," she says.
"Those who distribute this material are not passive bystanders - they are complicit perpetrators."
Images of Zora's abuse were originally only available on the so-called dark web, but she now has to live with the reality that links are being openly promoted on X.
Social media platforms are trying to rid their sites of illegal material, but the scale of the problem is enormous.
Last year the US National Center for Missing and Exploited Children (NCMEC) received more than 20 million mandatory reports from tech companies about incidents of child sexual abuse material (CSAM) - illegal images and videos on their platforms.
NCMEC attempts to identify victims and perpetrators, then contacts law enforcement.
We approached hacktivist group Anonymous, whose members are trying to combat the trade in child abuse images on X. One of them told us the situation was as bad as ever.
They tipped us off about a single account on X. It used a photo of the head and shoulders of a real child as its avatar. There was nothing obscene about it.
But the words and emojis in the account's bio made it clear the owner was selling child sexual abuse material and there was a link to an account on the messaging app Telegram.
Zora told us: "I have tried over the years to overcome my past and not let it determine my future, but perpetrators and stalkers still find a way to view this filth."
When we told Zora her photos were being traded on X, she had this message for the platform's owner, Elon Musk: "Our abuse is being shared, traded, and sold on the app you own. If you would act without hesitation to protect your own children, I beg you to do the same for the rest of us. The time to act is now."