Sexting, shared nudes and sextortion are part of a worrying trend among South African and overseas kids and teens, despite big tech’s promises that it is taking steps to protect children online. Peer pressure, bullying and even voluntary participation are on the rise. But parents can do something about it.
In 2022, an audit of websites containing sexual photos or videos of children found that 92% included content that children had created themselves. Almost 60% of self-generated sexual abuse images were of children aged between 11 and 13, and more than a third were of children aged seven to 10. The youngest victim was only three.
Children euphemistically call them “nudes”. They are naked photos or videos that children take and share of themselves. They range in severity from images of their naked bodies to those involving penetrative sexual activity, including sadism and bestiality.
They have become a pervasive part of our children’s lives.
In the documentary Childhood 2.0, American teens describe how many relationships now begin with the sharing of nudes. Chatting, holding hands, going on a date, hugs and kisses, and other courtship rituals are preceded by the exchange of naked photos.
And, like a job interview, the image becomes a gateway into the relationship, or the barrier to entry.
Teens interviewed for the documentary describe how girls who share nudes are called sluts and those who don’t, prudes. Boys, who often initiate contact by sending a “dick pic”, trade the nudes they receive like baseball cards.
A study by ECPAT, a global network of civil society organisations dedicated to ending child sexual exploitation, included this testimony from a 17-year-old girl:
“I’ve been sent hundreds of dick pics I haven’t asked for. When I was younger, I chatted with dozens of paedophiles, I told them my age (11–13) and they wanted nudes… or [to] meet in real life to have sex.”
The report noted that “nude photographs of girls are viewed as a form of capital that gives status to the recipient”. This increases the pressure placed on girls “to share naked photos” and on boys “to share them if and when they receive them”.
For many of these boys, admission into peer groups is now predicated on sharing a nude or adding it to an online folder of images. These are then accessible to other boys who’ve gone through a similar rite of passage.
Peer pressure and school practices
At an upmarket high school in Pretoria, Grade 10 boys formed a WhatsApp group that boys could only join if they shared a naked photo of a girl. Girls describe being pressured to share naked photos with their boyfriends so the boys could be accepted into the group. And at a prestigious boys’ school in Johannesburg, a Grade 8 boarder spent his school holidays trying to “collect nudes” to be included in a closed group of his new peers.
During a campaign called “Because you asked”, run by Jelly Beanz, an organisation that helps abused children, 12- and 13-year-olds shared their experiences of sexting.
One Grade 7 girl described how an older boy at school bullied, manipulated and threatened her, insisting that if she loved him, she’d share nudes. When she eventually capitulated, he told the whole grade, adding that she slept around. She felt judged and mortified, but feared that if she sought help from adults, it would change their opinion of her and they would take away her phone.
Two Grade 6 girls reported sexting with “boys” on Snapchat and TikTok, who complimented them on their bodies, but turned out to be masturbating middle-aged men.
Another 12-year-old “met” a boy on Instagram who seemed cute and her age. He asked for her number, requested selfies to “see her beautiful face”, then asked for nudes. When she declined, he used Facebook and Instagram to add and message her friends and family members, instructing them to make the girl obey him, and asking her friends for naked pics. One, aged nine, complied.
Desperate, the 12-year-old thought if she sent him a picture of herself in her swimming costume, he’d stop harassing her. Instead, he posted the pic and her phone number on a pornographic site, then sent her a screenshot. After three calls from men asking to meet up, she finally sought help from her mother.
Though regretful that her child hadn’t spoken up sooner, her mother comforted her and reported the abuser to the police, who helped them remove the post and block the perpetrator from the child’s social media. But he was let off with a warning.
As this child experienced, once shared, many images and videos end up on sites containing child sexual abuse material.
Previously called “child pornography”, child sexual abuse material (CSAM) includes any images or videos which depict children under 18 engaged in sexually explicit acts.
Global internet watchdog
Global internet safety watchdog the Internet Watch Foundation (IWF) reports that 10 years ago there was no self-generated content on sites containing CSAM. But in 2022, it found 275,655 websites containing CSAM, 92% of which included photos or videos that children had generated themselves. Almost 60% of reported images were taken by children aged 11 to 13, more than a third by children aged seven to 10, and 4% by children aged three to six.
The charity’s CEO explains: “That’s children in their bedrooms, where they’ve been tricked, coerced or encouraged into engaging in sexual activity, which is then recorded and shared by child sexual abuse websites.”
Reporting an image on a social media platform is often ineffectual, leading to ongoing revictimisation. A National Center for Missing & Exploited Children (NCMEC) study found that one image was shared 490,000 times after it had been reported.
Experts explain that self-generated child abuse images can be shared intentionally, accidentally, through coercion, maliciously, or a combination of these factors.
Already common against girls, the extortion of boys for images or money is becoming increasingly widespread.
The NCMEC describes sextortion as a form of child sexual exploitation in which children are blackmailed with nude or sexual images of themselves to extract further illicit material, sexual acts or money. The US Department of Justice reported that in 2022, more than 3,000 minors, primarily boys, had been targeted globally using financial sextortion, a sharp increase from previous years. More than a dozen victims had died by suicide.
In a 2023 study by Snap Inc, more than two thirds of the 6,000 Gen Z children and young adults surveyed across six countries said that they had been sextorted. In Canada, 92% of sextortion cases in 2023 involved boys or men. In the UK, sextortion cases reported to the police increased by 390% in two years. Experts believe that approximately 100 UK children a day are falling victim.
While the perpetrators are often criminal syndicates, one of South Africa’s most notorious sextorters was a middle-aged Afrikaans housewife who preyed on boys in her community.
And, in a worrying new trend, criminals are recruiting and paying children to sextort other children.
Britain’s National Crime Agency confirms how enticing these scams can be, and that predators now use AI to make their conversations more convincing.
One affected boy is Nathan*, a Grade 9 learner, who was surprised to be friended on Instagram by an attractive teenage girl he didn’t know. She said she was new to the province and had been given his details by a mutual friend.
Nathan, who’d never had a girlfriend, was delighted, and they chatted about school, family and friends every day, initially on Instagram, and then, after she told him her mom had restricted her Instagram screen-time, on WhatsApp. Before long, they were exchanging pictures of their daily activities. Then she asked if he’d like to see her in her bikini. A week after they “met”, in a flirty conversation, she asked if they could exchange nudes.
He hesitated for only a moment before sending one.
She responded with a heart, but minutes later texted that if he didn’t pay R2,000 within 24 hours “she’d” be sharing his picture on Instagram, and with his friends and family. Horrified, and under time pressure, he emptied his meagre bank account and begged to have the photo deleted.
Instead, follow-up messages instructed him to pay more or send pics of himself in specific sexual poses. After sharing a photo of himself masturbating, he contemplated taking his own life, before reluctantly choosing to tell his mom.
Despite her initial shock, she immediately sought professional help. Internet safety experts coached Nathan about how to respond to his extorter and how to block and report “her” to the specialised police cybercrimes unit.
Weeks later, Nathan learnt that more than 50 Grade 9 boys at his school had been sextorted by the syndicate. Ashamed, humiliated, and unable to tell his parents, one boy had been hospitalised after attempting suicide.
Not all children survive, though. Ronan, 17, from Northern Ireland, died by suicide after being targeted on Facebook. Told to send the perpetrators £3,400 in cryptocurrency, he protested that he was only a child and begged for mercy. They responded: “Foolishness has a price. And you’ll pay. You have 48 hours. Time is running out.”
Tragically, Ronan saw no way out.
Sextorters routinely encourage suicide and threaten violence. One 16-year-old boy reported: “I’ve received death threats a couple of times, and they’ve also threatened to kill my family if I don’t send photographs.”
Nevertheless, not all children think sharing nudes is risky.
Nudes = in love
Unlike Nathan, Sindi*, 13, wasn’t tricked or manipulated into sexting. For her, exchanging “nudes” was a normal expression of being in love.
While only 5% of children surveyed in the South African Disrupting Harm study admitted to sexting in the previous year (compared with 50% of children in a comparable Scandinavian study, which suggests under-reporting), half said they shared nudes because they were in love, or flirting and having fun.
Confirming the Scandinavian findings, these children deemed sexting unproblematic. They saw it as a positive experience, part of a healthy relationship, good for their self-esteem, and a way of “gaining affirmation and creating intimacy with people they like”.
To quote one 16-year-old girl, “it’s happened a hundred times… usually just flirting online and then you take more and more daring photographs until the guy asks for a nude and I’m like yeah, no big deal, if we’re both fine with it.”
Children who willingly share nudes aren’t immune from subsequent harm, though.
Sindi and her boyfriend exchanged nudes for six months, using Snapchat to ensure that their parents didn’t find out. But she was unaware that her boyfriend had taken screenshots of every image and shared them with his friends.
Then, after the relationship ended, his jealous new girlfriend found Sindi’s photos, shared them with Sindi’s parents and posted them onto a grade WhatsApp group. Before they could be deleted, they’d already been reshared and saved by many other children in the grade.
Devastated, Sindi was forced to leave the school.
As the lines between intentionality, accidental sharing, and malicious sexting blur, experts have identified another growing trend where children are filming other children having consensual sex, and then forwarding those videos.
Children also document and share the sexual assault and rape of other children, with devastating impact, as seen in the documentary Audrie & Daisy. The girls’ sexual assaults were filmed and the images circulated. It ended in tragedy: first Audrie’s death by suicide, and then, after the documentary aired, Daisy and her mother taking their own lives.
While most children unwittingly share their images publicly, Tami* discovered at age 12 that she could make money through TikTok videos of herself dancing naked.
US tech Senate hearings
TikTok’s CEO has testified before the US Senate that children under 13 are protected on the app, that under-16s automatically have privacy settings restricting their sharing to friends only, and that under-18s cannot live-stream or receive payment. Despite this, Tami easily circumvented these restrictions.
Tami received thousands of likes from strangers for her publicly posted videos. Many asked if they could see more of her. As she slowly got braver, taking off pieces of clothing and making sexual moves, viewers began to tip her.
By age 15, she was paying for transport, groceries, airtime and new outfits with the proceeds of her videos.
In 2021, a survey commissioned by Unicef South Africa and the Department of Social Development, and conducted by Unisa’s Bureau of Market Research, found that one-third of South African children were at risk of online violence, exploitation and abuse, and that 70% of children surveyed used the internet without parental consent.
Preventing or minimising the harm is therefore critical. Experts, who caution parents to avoid panic policing or believing that it’s “not my kid”, emphasise that it takes a team effort to keep children safe.
Denial, they say, is particularly damaging. Your child’s developing brain, unsupported and pitted against criminal syndicates, perpetrators, manipulation and the multibillion-dollar pornography industry, isn’t in a fair fight.
Dr Marita Rademeyer, a clinical psychologist who has worked with children and families affected by abuse for 30 years, says that given the exploding numbers of under-13s sexting, parents should try to delay giving their child a smartphone and access to social media, instead opting for an old-school handset for communication.
Rademeyer recommends that if children already have smartphones, parents limit their screen time.
In addition, experts advise parents to manage when their children are online, preventing use of devices late at night, and behind closed doors in bedrooms and bathrooms.
Internet safety tools can assist with managing use, content and contact. Tech giants Instagram, Facebook, Google, TikTok, Snapchat and Discord also have parent-partnering centres where parents can manage in-app screen time, which topics their child can access, who can view their posts, and friend requests.
Wired to push boundaries
Rademeyer stresses however that while parents often rely on fear and control to police sexting, the adolescent brain is wired to push boundaries. So, although restrictions and protective apps are effective for very young children, control is often unsuccessful for older children.
A relationship is critical for protecting children, and restrictions without relationship can drive the behaviour underground.
She explains that the problem isn’t just the naked pics, nor is it helpful to instruct children to simply avoid taking or sharing pics. Instead, she says, “it is the sexualisation of children online which desensitises them to the potential impact of sharing images of themselves.”
Rademeyer advises parents to begin speaking to their children early and often about sex and sexuality, starting as young as age five, and to chase the “why” behind their child’s behaviours.
Explaining that children often share nudes because they have a need for acceptance and belonging and because they are looking for affirmation that they aren’t getting elsewhere, she confirms that the need can easily outweigh even the strongest and most sustained safety messaging. By contrast, addressing that need can help minimise risky behaviours.
The tweens who participated in the “Because you asked” campaign cautioned children to be careful about what they post online and whom they follow and friend, to prepare for “bad things”, and to confide in a trusted adult.
But only 6% of children do tell caregivers about online child sexual exploitation and abuse. Many tell peers, but most tell no one for fear of humiliation, or their devices being confiscated.
Caregivers should therefore avoid warning, shaming and blaming, and banning phones or the child’s access to their online world, which results in secrecy and children concealing harm.
How to deal with disclosure
If your child discloses that they’ve been targeted, try to stay calm and remember that your child is the victim, not the villain.
Carefully document the interactions, tell the perpetrator you will be contacting the authorities, then block and report the account. If your child’s images and videos have already been posted online, use the IWF’s Report Remove service or Take It Down to get the content flagged and removed from the internet.
Both services are focused on children; if you want an explicit image taken when you were over 18 removed, get help at stopncii.org.
Lasting change will require legislation and litigation-driven accountability from big tech for the safety of its billions of users. As in the UK, a slew of legislation is in progress in the US to hold tech companies accountable and allow them to be sued or fined if their users, especially children, are harmed.
But progress is proving slow, and in South Africa, it isn’t even on our legislative radar (proposed law reforms haven’t been actioned two years after drafting).
Millions of local children cannot wait for legislators to catch up, or for curriculum changes to educate them about the inherent dangers of online use (which sometimes outweigh the positives). What’s urgently needed is a partnership between communities, schools, families and children to keep them safe.
The time for looking away is over — if sexting is our children’s new normal, and adults remain in denial, they’ll reap the whirlwind and ultimately, so will we.
First published in the Daily Maverick 24.03.2024
Contact Childline on 116 (toll-free, 24/7) if your child needs help with sexting or sextortion.
* names changed to protect the victims.