Time for looking away is over — sexting is our children’s new normal, adults are in denial

Sexting, shared nudes and sextortion are part of a worrying trend among South African and overseas kids and teens, despite big tech’s promises that it’s taking steps to protect children online. Peer pressure, bullying and even voluntary participation are on the rise. But parents can do something about it.

In 2022, an audit of websites containing sexual photos or videos of children found that 92% included content that children had created themselves. Almost 60% of self-generated sexual abuse images were of children aged between 11 and 13, and more than a third were of children aged seven to 10. The youngest victim was only three.

Children euphemistically call them “nudes”. They are naked photos or videos that children take and share of themselves. Sexting ranges in severity from images of their naked bodies to those involving penetrative sexual activity, including sadism and bestiality.

They have become a pervasive part of our children’s lives.

In the documentary Childhood 2.0, American teens describe how many relationships now begin with the sharing of nudes. Chatting, holding hands, going on a date, hugs and kisses, and other courtship rituals are preceded by the exchange of naked photos.

And, like a job interview, the image becomes a gateway into the relationship, or the barrier to entry.

Teens interviewed for the documentary describe how girls who share nudes are called sluts and those who don’t, prudes. Boys, who often initiate contact by sending a “dick pic”, trade the nudes they receive like baseball cards.

A study by ECPAT, a global network of civil society organisations dedicated to ending child sexual exploitation, included this testimony from a 17-year-old girl:

“I’ve been sent hundreds of dick pics I haven’t asked for. When I was younger, I chatted with dozens of paedophiles, I told them my age (11–13) and they wanted nudes… or [to] meet in real life to have sex.”

The report noted that “nude photographs of girls are viewed as a form of capital that gives status to the recipient”. This increases the pressure placed on girls “to share naked photos” and on boys “to share them if and when they receive them”.

For many of these boys, admission into peer groups is now predicated on sharing a nude or adding it to an online folder of images. These are then accessible to other boys who’ve gone through a similar rite of passage.

Peer pressure and school practices

At an upmarket high school in Pretoria, Grade 10 boys formed a WhatsApp group, which boys were only allowed to join if they shared a naked photo of a girl. Girls describe being pressurised to share naked photos with their boyfriends so the boys could be accepted into the group. And at a prestigious boys’ school in Johannesburg, a Grade 8 boarder spent his school holidays trying to “collect nudes” to be included in a closed group of his new peers.

During a campaign called “Because you asked”, run by Jelly Beanz, which provides help for abused children, 12- and 13-year-old children shared their experiences of sexting.

One Grade 7 girl described how an older boy at school bullied, manipulated and threatened her, protesting that if she loved him, she’d share nudes. When she eventually capitulated, he told the whole grade, adding that she sleeps around. She felt judged and mortified, but afraid that if she sought help from adults, it would change their opinion of her and they would take away her phone.

Two Grade 6 girls reported sexting with “boys” on Snapchat and TikTok, who complimented them on their bodies, but turned out to be masturbating middle-aged men.

Another 12-year-old “met” a boy on Instagram who seemed cute and her age. He asked for her number, requested selfies to “see her beautiful face”, then asked for nudes. When she declined, he used Facebook and Instagram to add and message her friends and family members, instructing them to make the girl obey him, and asking her friends for naked pics. One, aged nine, complied.

Desperate, the 12-year-old thought if she sent him a picture of herself in her swimming costume, he’d stop harassing her. Instead, he posted the pic and her phone number on a pornographic site, then sent her a screenshot. After three calls from men asking to meet up, she finally sought help from her mother.

Read more in Daily Maverick: Attempts to stop the child-related pornography tsunami stalled indefinitely

Though she regretted that her child hadn’t spoken up sooner, her mother comforted her and reported the abuser to the police, who helped them remove the post and block the perpetrator from the child’s social media. But he was let off with a warning.

As this child experienced, once shared, many images and videos end up on sites containing child sexual abuse material.

Previously called “child pornography”, child sexual abuse material (CSAM) includes any images or videos which depict children under 18 engaged in sexually explicit acts.

Global internet watchdog

Global internet safety watchdog Internet Watch Foundation reports that 10 years ago there was no self-generated content on sites containing CSAM. But in 2022, it found 275,655 websites containing CSAM, 92% of which included photos or videos that children had generated themselves. Almost 60% of reported images were taken by children aged 11-13, more than a third were seven to 10-year-olds, and 4% of images were taken by three to six-year-olds.

The charity’s CEO explains: “That’s children in their bedrooms, where they’ve been tricked, coerced or encouraged into engaging in sexual activity, which is then recorded and shared by child sexual abuse websites.”

Reporting an image on a social media platform is often ineffectual, leading to ongoing revictimisation. A National Center for Missing & Exploited Children (NCMEC) study found that one image was shared 490,000 times after it had been reported.

Experts explain that self-generated child abuse images can be shared intentionally, accidentally, through coercion, maliciously, or a combination of these factors.

Already common among girls, extortion of boys for images or money is becoming increasingly widespread.

The NCMEC describes sextortion as a form of child sexual exploitation in which children are blackmailed using nude or sexual images of themselves, in exchange for more illicit material, sexual acts or financial gain. The US Department of Justice reported that in 2022, more than 3,000 minors, primarily boys, had been targeted globally using financial sextortion, a sharp increase from previous years. More than a dozen victims had died by suicide.

In a 2023 study by Snap Inc, more than two-thirds of the 6,000 Gen Z children and young adults surveyed across six countries said that they had been sextorted. In Canada, 92% of sextortion cases in 2023 involved boys or men. In the UK, sextortion cases reported to the police increased by 390% in two years. Experts believe that approximately 100 UK children a day are falling victim.

While the perpetrators are often criminal syndicates, one of South Africa’s most notorious sextorters was a middle-aged Afrikaans housewife who preyed on boys in her community.

And, in a worrying new trend, criminals are recruiting and paying children to sextort other children.

Britain’s National Crime Agency confirms how enticing sextortion scams are, and notes that predators now use AI to make their conversations more convincing.

One affected boy is Nathan*, a Grade 9 learner, who was surprised when he was friended by an attractive unknown teenage girl on Instagram, who said she was new to the province and was given his details by a mutual friend.

Nathan, who’d never had a girlfriend, was delighted, and they chatted about school, family and friends every day, initially on Instagram, and then, after she told him her mom had restricted her Instagram screen-time, on WhatsApp. Before long, they were exchanging pictures of their daily activities. Then she asked if he’d like to see her in her bikini. A week after they “met”, in a flirty conversation, she asked if they could exchange nudes.

He only hesitated for a moment.

She responded with a heart, but minutes later texted that if he didn’t pay R2,000 within 24 hours “she’d” be sharing his picture on Instagram, and with his friends and family. Horrified, and under time pressure, he emptied his meagre bank account and begged to have the photo deleted.

Instead, follow-up messages instructed him to pay more or send pics of himself in specific sexual poses. After sharing a photo of himself masturbating, he contemplated taking his own life, before reluctantly choosing to tell his mom.

Despite her initial shock, she immediately sought professional help. Internet safety experts coached Nathan about how to respond to his extorter and how to block and report “her” to the specialised police cybercrimes unit.

Weeks later, Nathan learnt that more than 50 Grade 9 boys at his school had been sextorted by the syndicate. Ashamed, humiliated, and unable to tell his parents, one boy had been hospitalised after attempting suicide.

Not all children survive though. Ronan, 17, from Northern Ireland, died by suicide after being targeted on Facebook. Told to send the perpetrators £3,400 in cryptocurrency, he protested that he was only a child, and begged for mercy. They responded: “Foolishness has a price. And you’ll pay. You have 48 hours. Time is running out.”

Tragically, Ronan saw no way out.

Sextorters routinely encourage suicide and threaten violence. One 16-year-old boy reported: “I’ve received death threats a couple of times, and they’ve also threatened to kill my family if I don’t send photographs.”

Nevertheless, not all children think sharing nudes is risky.

Nudes = in love

Unlike Nathan, Sindi*, 13, wasn’t tricked or manipulated into sexting. For her, exchanging “nudes” was a normal expression of being in love.

While only 5% of children surveyed in the South African Disrupting Harm study admitted to sexting in the previous year (compared with 50% of the children in a similar Scandinavian study, which suggests under-reporting), half said they shared nudes because they were in love, or flirting and having fun.

Confirming the Scandinavian findings, these children deemed sexting unproblematic. They saw it as a positive experience, part of a healthy relationship, good for their self-esteem, and a way of “gaining affirmation and creating intimacy with people they like”.

To quote one 16-year-old girl, “it’s happened a hundred times… usually just flirting online and then you take more and more daring photographs until the guy asks for a nude and I’m like yeah, no big deal, if we’re both fine with it.”

Children who willingly share nudes aren’t immune from harm afterwards, though.

Sindi and her boyfriend exchanged nudes for six months, using Snapchat to ensure that their parents didn’t find out. But she was unaware that her boyfriend had taken screenshots of every image and shared her photos with his friends.

Then, after the relationship ended, his jealous new girlfriend found Sindi’s photos, shared them with Sindi’s parents and posted them onto a grade WhatsApp group. Before they could be deleted, they’d already been reshared and saved by many other children in the grade.

Devastated, Sindi was forced to leave the school.

As the lines between intentionality, accidental sharing, and malicious sexting blur, experts have identified another growing trend where children are filming other children having consensual sex, and then forwarding those videos.

Children also document and share the sexual assault and rape of other children, with devastating impact, as seen in the documentary Audrie and Daisy. The girls’ sexual assaults were filmed, and images circulated. It resulted in tragedy, first Audrie’s death by suicide, and then Daisy and her mother taking their own lives after the documentary aired.

While most children unwittingly share their images publicly, Tami* discovered at age 12 that she could make money through TikTok videos of herself dancing naked.

US tech Senate hearings

The CEO of TikTok testified before the US Senate hearings that children under 13 are protected on the app, that under-16s automatically have privacy settings restricting their sharing to friends only, and that under-18s cannot live-stream or receive remuneration. Despite this, Tami easily circumvented these restrictions.

Tami received thousands of likes from strangers for her publicly posted videos. Many asked if they could see more of her. As she slowly got braver, taking off pieces of clothing and making sexual moves, viewers began to tip her.

By age 15, she was paying for transport, groceries, airtime and new outfits with the proceeds of her videos.

In 2021, a survey commissioned by Unicef SA and the Department of Social Development, conducted by Unisa’s Bureau of Market Research, found that one-third of SA children were at risk of online violence, exploitation and abuse, and that 70% of children surveyed used the internet without parental consent.

Preventing or minimising the harm is therefore critical. Experts, who caution parents to avoid panic policing or believing that it’s “not my kid”, emphasise that it takes a team effort to keep children safe.

Denial, they say, is particularly damaging. Your child’s developing brain, unsupported, pitted against criminal syndicates, perpetrators, manipulation, and the multibillion-dollar pornography industry, isn’t a fair fight.

Dr Marita Rademeyer, a clinical psychologist who has worked with children and families affected by abuse for 30 years, says that given the exploding numbers of under-13s sexting, parents should try to delay giving their child a smartphone and allowing access to social media, instead opting for an old-school handset for communication.

Rademeyer recommends that if children already have smartphones, parents limit their screen time.

In addition, experts advise parents to manage when their children are online, preventing use of devices late at night, and behind closed doors in bedrooms and bathrooms.

Internet safety tools can assist with managing use, content and contact. Tech giants Instagram, Facebook, Google, TikTok, Snapchat and Discord also offer parental control centres where parents can manage in-app screen time, the topics their child can access, who can view their posts, and friend requests.

Wired to push boundaries

Rademeyer stresses however that while parents often rely on fear and control to police sexting, the adolescent brain is wired to push boundaries. So, although restrictions and protective apps are effective for very young children, control is often unsuccessful for older children.

A relationship is critical for protecting children, and restrictions without relationship can drive the behaviour underground.

She explains that the problem isn’t just the naked pics, nor is it helpful to instruct children to simply avoid taking or sharing pics. Instead, she says, “it is the sexualisation of children online which desensitises them to the potential impact of sharing images of themselves.”

Rademeyer advises parents to begin speaking to their children early and often about sex and sexuality, starting as young as age five, and to chase the “why” behind their child’s behaviours.

Explaining that children often share nudes because they have a need for acceptance and belonging and because they are looking for affirmation that they aren’t getting elsewhere, she confirms that the need can easily outweigh even the strongest and most sustained safety messaging. By contrast, addressing that need can help minimise risky behaviours.

The tweens who participated in the “Because you asked” study cautioned children to be careful about what they post online and who they follow and friend, to prepare for “bad things”, and to confide in a trusted adult.

But only 6% of children do tell caregivers about online child sexual exploitation and abuse. Many tell peers, but most tell no one for fear of humiliation, or their devices being confiscated.

Caregivers should therefore avoid warning, shaming and blaming, and banning phones or the child’s access to their online world, which results in secrecy and children concealing harm.

How to deal with disclosure

If your child discloses that they’ve been targeted, try to stay calm and remember that your child is the victim, not the villain.

Carefully document the interactions, tell the perpetrator you will be contacting the authorities, then block and report the account. If your child’s images and videos have already been posted online, use the IWF’s “Report Remove” service, or Take It Down, to get the content flagged and removed from the internet.

Both services are focused on children; if you want an explicit image taken when you were over 18 removed, get help at stopncii.org.

Lasting change will require legislation and litigation-driven accountability by big tech for the safety of its billions of users. As in the UK, a slew of legislation is in progress in the US to hold tech companies accountable and allow them to be sued or fined if their users, especially children, are harmed.

But progress is proving slow, and in South Africa, it isn’t even on our legislative radar (proposed law reforms haven’t been actioned two years after drafting).

Millions of local children cannot wait for legislators to catch up, or for curriculum changes to educate them about inherent dangers of online use (which sometimes outweigh the positives). What’s urgently needed is a community, school, family and child partnership to keep them safe.

The time for looking away is over — if sexting is our children’s new normal, and adults remain in denial, they’ll reap the whirlwind and ultimately, so will we. 

First published in the Daily Maverick 24.03.2024

Contact Childline on 116 (tollfree 24/7) if your child needs help with sexting or sextortion.

* names changed to protect the victims.

Attempts to stop the child-related pornography tsunami stalled indefinitely

More than half of the nine- to 17-year-olds in South Africa have seen sexual images on a phone or online device in the past year – 8% took naked photos or videos of themselves and two-thirds of those children shared them. Some experts report up to 15 cases of sextortion a day involving children. With earliest exposure to pornography happening at preschool, why is legislation designed to protect children gathering dust in the Department of Justice’s filing cabinets?

It has been two and a half years since the South African Law Reform Commission published its report recommending amendments to key pieces of legislation related to children and pornography. 

The report contains its proposals to protect children from access to pornography, to manage sexting and sextortion, to criminalise all child sexual abuse material (previously known as child pornography) along with all aspects of live displays of child sexual abuse material, to clarify that sexual grooming can occur online as well as offline, and to make the policing of grooming easier. 

The report also recommends that all crimes listed in the Sexual Offences Act include criminal acts committed through the internet, webcams, mobile phones, and technology yet to be developed, and extends the obligation to report sexual abuse to online abuse. 

At the time of the report’s finalisation, South Africa was already struggling to manage a tidal wave of child exposure to pornography, of sexting and sextortion, online grooming and child sexual abuse material. But according to Dr Joan van Niekerk, who is part of the commission and sees daily the calamitous impact of the delays on children, the Department of Justice is yet to introduce any of the recommendations to Parliament, despite the extreme urgency of implementing them.

And given the long lead times between legislation’s introduction and its promulgation and application, the delay is likely to extend well into the term of the seventh Parliament. 

That is, if it is introduced. At present, even that seems unlikely.

Negative online experiences

The 2022 Disrupting Harm survey reports that most children (95.3%) in South Africa have access to the internet via a mobile device, and 58% of children aged between nine and 17 access the internet every day. There was no difference in use based on gender, or between children from urban and rural areas. The overwhelming majority (97%) use smartphones to access the internet.

The report found that 53% of those children had seen sexual images in the year before the study, and that more than two-thirds (67%) of those who had were exposed to them on an online device.

South African children were found to engage in risky online behaviour and have negative online experiences, which increases their online vulnerability to exploitation and abuse. Children who participated in the survey reported that during the preceding year, 40.1% of them had experienced unwanted exposure to sexual content and materials, while 20.4% had experienced unwanted online sexual advances. Between 7% and 9% of children had been subjected to online child sexual abuse and exploitation, such as having their sexual images shared without permission, or being blackmailed or coerced into engaging in sexual activity.

Marita Rademeyer, a clinical psychologist from Jelly Beanz, which is dedicated to helping children affected by online and in-person sexual abuse, sees these children daily. “Tannie Marita”, as she is affectionately known, works with children from primary school upwards. But her youngest client with a dependency on watching pornography was a five-year-old who had first seen pornography on her dad’s phone. 

A 2021/22 digital well-being survey targeting pupils in grades 4 to 11, conducted by Be in Touch Digital Marketing and peer reviewed by the Bureau of Market Research’s (BMR) Youth Research Unit, found that 60% of the children surveyed had first viewed pornography by age 10, 56% had first seen it at home and 34% had first seen it at a friend’s.

Even more telling are the stats about intentionality. Most of the children included in a 2016 BMR survey reported that they had first viewed pornography accidentally, 49.5% while they were surfing the internet for entertainment and 39% while researching content for school.

Rademeyer confirms how easily it can happen through a misspelt word, or Google misunderstanding the request. For example, a child may search for something innocent such as “Californian newt” (a kind of amphibian). Google then corrects it to Californian nude and suddenly the child is inundated with inappropriate images. She says it’s also common for children to be exposed to pornography when their parents stream movies for them while they work, and a child accidentally clicks on one of the plentiful pop-ups.

One of the Jelly Beanz clients, Josh*, now an adult, says he was first exposed to pornography at the age of seven through a boy of 12. He describes watching pornography as “like eating sushi”. He says that, “at first you don’t like it, and then you crave it”. Even at the age of seven he remembers that it “produced big feelings”.

The impact of pornography viewing on children is disturbing. Citing important studies on children’s exposure to pornography, Rademeyer explains that pornography is highly addictive, and that accidental exposure often leads to purposeful searching and can become compulsive in a short time. She says that studies show that early introduction to pornography (ages seven to 11) results in significantly more depression and less satisfaction in adulthood than those exposed later or not at all.  

Pornography also has a negative impact on children’s thinking. Those who used pornography showed increased impulsiveness, poor decision-making, memory problems and decreased learning ability.  

According to Rademeyer, pornography becomes the main source of sex education for many children, and those who view it have more sexual partners, are less likely to use contraception, are more likely to have used alcohol or other substances in their sexual encounters, are more likely to contract sexually transmitted infections or become pregnant, and are more likely to sexually abuse siblings.

Josh confesses that he thought that girls “wanted sex all of the time”. He couldn’t understand why the boys he was watching seemed to “get it right”, but he didn’t. Another former Jelly Beanz client, Simon*, who began watching pornography at the age of 10, said that in his community, viewing pornography was a daily occurrence and that the boys spoke openly about it and about pleasuring themselves while they watched.  

He said that for his peers, kidnapping girls was considered okay and that boys would take what they wanted if girls constantly said “no”. As someone who watched gay pornography, however, he was terrified of anyone finding out. He said pornography also gave his peers negative messages about homosexuality and that it exacerbated the bullying of gay children.

Kate Farina from Be in Touch, which is dedicated to helping families navigate their kids’ online world, says that in a country with such extreme levels of gender-based violence, pornography worryingly normalises violence: “Boys are growing up believing that good sex is violent and painful, while girls are growing up believing they need to be compliant and submissive.” 

Farina says that at a private boys’ high school, following a recent viewing of the Fight the New Drug documentary Brain, Heart, World, 77% of the boys surveyed agreed that pornography normalises sexual violence.

Moreover, 98% of frontline workers interviewed for the Disrupting Harm study identified access and exposure to pornography as the greatest factor making children vulnerable to online sexual abuse and exploitation.  

Jelly Beanz explained that pornography viewing among children is very difficult to police because parents are usually the last to know when their child has a problem. This is because children actively hide their tracks when using pornography and minimise their involvement in it. They also stress that caregivers cannot talk a child out of using pornography, and that punishment doesn’t change the behaviour, it sends it further underground.

This makes prevention of exposure essential.

Legislative reform

It’s one of the chief motivators for the South African Law Reform Commission’s study on pornography and children, and its recommendations to amend legislation to protect children from exposure to pornography, which is criminalised in Section 19 of the Sexual Offences Act.

The report explains that the Films and Publications Act already compels internet service providers to register with the Film and Publications Board and to take steps to prevent their services from hosting or distributing child pornography, and to protect children from any crime committed against them (which should include exposure to pornography). Further, these injunctions to shield children from access to pornography are repeated in the South African Cellular Operators Association Code of Good Practice (SA Cellular Code) as well as the Wireless Application Service Providers’ Association Code of Conduct (Waspa Code).

But current legislation has clearly had little impact.

Drawing from international best practice, the commission therefore recommends that, through amendments to the Sexual Offences Act, the government develop legislation that comprehensively criminalises the enticement of children to view or to make child sexual abuse material, as well as the making of pornography accessible to children.

The first part is applicable to all persons advertising, providing access to, distributing or enticing a child to view pornography. The amendment would criminalise all acts of exposing children to pornography and unsuitable content. 

The second part is applicable to anyone, including the manufacturer or distributor of any technology or device or an electronic communications service provider. The amendment would require a default block to be placed on all devices to prevent children being exposed to pornography. All devices (new and second-hand) would be issued with the block or would have to be returned to a default setting when sold or given to a minor. The block would prevent children accessing inappropriate content, but would include an opt-out possibility on proof that the buyer or user is 18 or older.

It would then be a criminal offence to allow a child to engage with any device, mobile phone or technology with internet access, without ensuring that the default block is activated to prevent children being exposed to pornography or child sexual abuse material. It would also be illegal to uninstall the default block.

The recommendation further criminalises the use of misleading techniques on the internet, specifically the embedding of words or digital images into the source code of a website, an advertisement or domain name, to deceive a child into viewing or being exposed to child sexual abuse material or pornography. 

Moreover, it suggests that the Films and Publications Act be amended to provide for a clean-feed regime for material deemed unsuitable for children.

Limiting exposure

The commission is aware that legislative changes are not sufficient to protect children. It also recognised that Disrupting Harm findings confirmed the scarcity of online education for children. The study revealed that less than half (only 41.4%) of the child participants had ever received information on online safety.

The commission therefore recommended that the government “work with internet access and service providers to roll out a national awareness-raising campaign, underpinned by further research, to better inform parents, professionals and the public about what pornography is; young people’s access and exposure to pornography; and responsible safe use of the internet”.

These recommendations were workshopped with key role players in the internet, mobile phone, online safety and child protection space, and the commission recognised some practical concerns about implementation raised by these interested parties.  

In response, it proposed a three-pronged strategy for limiting children’s exposure to pornography.  

First, it proposes using legislation to require a block at the point of the end user. It acknowledged that this may not be fail-safe because, for example, many devices given to children are second-hand, and enforcing the reinstatement of the block may be challenging. For this reason, its second strategy would be to persuade the major platforms to put codes in place so that mobile phones cannot link to their platforms without a pornography block when the phone is set up (again, noting that the block can be disabled if you’re an adult). Third, it recommends legislatively placing obligations on electronic communications service providers through the Independent Communications Authority of South Africa, and regulations.

The commission also presented a proposed legislative solution to children being criminally charged for sexting.  

Sexting is defined as the sending, receiving or forwarding of sexually explicit messages, images or videos via an electronic device. The Disrupting Harm survey found that 84% of children felt that sending sexual content online was very risky, and 68% strongly agreed that a person should not take these photos or videos or allow anyone else to do so.
However, in practice, 8% of children surveyed confessed to having taken nude images or videos of themselves, and 5% said that they had allowed someone else to do so.

In addition, 8% of children surveyed said they had shared naked pictures or videos of themselves online in the past year. When asked why they did it, most children said they were in love, or flirting and having fun. Others said that they trusted the person or that they were worried that they would lose the person if they didn’t share.  

A worrying 21% said they did not think there was anything wrong with sharing.  

Equally concerning are the children who shared because they were pressured by friends, threatened or offered money or gifts in exchange for the images. 

Eight percent of children surveyed confessed to having pressured someone else to share naked videos or images. 

Globally, the Internet Watch Foundation reported that most self-generated child sexual abuse images are of 11- to 13-year-olds, but it noted a 360% increase in self-generated sexual imagery of seven- to 10-year-olds from 2020 to 2022.  

Seventy-eight percent of the 255,571 webpages it flagged during 2022 contained self-generated images.

The Sexual Offences Act and the Films and Publications Act criminalise the creation, production, procurement and possession of “child pornography”, which is defined as “private sexual photographs and films” or “intimate images”.

While there seems to be legal uncertainty about whether children may distribute consensual intimate images and private sexual photographs and films, and about children self-generating content, the Law Reform Commission document states that “legally, the primary consequence for a child who voluntarily generates sexual material of him or herself is that distributing this material or making or possessing material of another child”, even with that child’s consent, “may lead to a charge being brought against the child for any number of child pornography-related offences, including possessing or exposing another child to child pornography”. 

This could “result in a conviction for a serious criminal offence, although the child would be dealt with within the remit of the Child Justice Act 75 of 2008”.

In addition, the risks of creating and sharing what children euphemistically call “nudes” are significant. Chief among those identified by the commission are the “unintended circulation” of images, images or videos being used for “bullying, revenge or extortion”, the adverse impact on children’s “well-being, reputation and future prospects”, and complications for law enforcement.

Of these risks, extortion, or sextortion as it is commonly known, is particularly rife.  

Tackling ‘sextortion’

According to Farina, sextortion of children, classified as cyber extortion under South African cyber law, occurs when predators blackmail children into sending them sexual or nude pictures and videos. Targeting children as young as 10, and usually focusing on those in their early teens, often boys, criminals use Snapchat, TikTok, Instagram and Discord to befriend and then manipulate and coerce children into sending them naked photos or videos.

Farina says that the criminal (it could be an organised syndicate or a common criminal) puts together an account on a platform being used by the child. Typically posing as a teen boy or girl, a bored housewife, or a modelling or sports agent, they follow the child and then start a chat on direct message (DM), before asking for the child’s WhatsApp details. If the child gives them their phone number, the criminal sends them an explicit image or video before requesting one in return.

Once the child has sent a nude image or video, they are told that unless they pay the criminal or produce more content (the criminal will be specific about positions and props), the images will be released to their friends and family.  

Many children do not tell their parents out of embarrassment, shame or fear of disappointing them.  
Nevertheless, in mid-2023 Emma Sadleir from the Digital Law Company reported getting up to 15 calls in a day from parents whose children had been targeted and sought help. These are often the fortunate ones. Others, crippled by shame, don’t turn to a trusted adult and, tragically, convinced there is no other recourse, take their own lives.

According to Farina, if sextortion occurs, the child’s family must screenshot the conversations on the social media platforms and on WhatsApp, because this counts as evidence for the police. Then, before blocking the predator, they should send this message: “I have spoken to my parents. They are reporting it to the police for investigation because you are in possession of child pornography.”

She counsels families that when a child shares a nude with an adult, it is classified as child pornography, so they need to report the matter to the police on the Crimestop number 0860 010 111 and ask for a detective from the Serial and Electronic Crime Investigation unit, which investigates online child pornography-related matters.

Options to manage sextortion after it occurs are limited, so awareness and prevention are key. Farina’s top tip to stop sextortion is for children to set their account to private, and for parents to help vet followers. But she and other child protection experts emphasise that the best way to stop sextortion completely is to deter children from sharing naked images. 

Decriminalise ‘consensual sharing’

While recognising children’s right to self-expression, the Law Reform Commission viewed preventing what it terms “self-generated child sexual abuse material” as an important goal. It further sought to take into account Unicef’s position that “consensual self-generated child pornography by certain children should be decriminalised for personal use between consenting children”; the United Nations Convention on the Rights of the Child’s advisory to decriminalise “consensual sharing” of images between children; as well as expert opinion warning of the “serious but unintended consequences of these images falling into the wrong hands when distributed”. 

The commission therefore recommended decriminalising a child showing a naked picture of themselves to another child within the confines of a “consensual lawful relationship”, provided it is their own image.

Children will, however, not be permitted to electronically send images of themselves to anyone, or forward images of any other child.  

It further recommended that once the child turns 18, there would be “no defence for the continued possession of the material”, so all naked self-images must be deleted. 

It also recommended education to help children understand the impact of taking and sharing naked pictures and videos. 

The commission noted children’s vulnerability and the need to protect them from being used for, or exploited through, child pornography (now termed child sexual abuse material). It cited statistics produced by the Internet Watch Foundation which, along with its partners, blocked at least 8.8 million attempts by UK internet users to access videos and images of children suffering sexual abuse in one month alone.  

This was confirmed by Europol which noted a significant increase in the demand for child sexual abuse material since the start of the Covid-19 pandemic. 

The WeProtect Global Alliance’s 2023 Global Threat Assessment Report revealed an 87% increase in reported child sexual abuse material cases since 2019, with more than 32 million reports globally.

The commission stressed that “internationally the need to define and criminalise this behaviour, using accurate terminology, has become increasingly pressing”.  

The report recommended that all offences relating to child sexual abuse material and children’s exposure to pornography should be criminalised in the Sexual Offences Act and that wording around grooming be changed to include online grooming. 

Along with self-generated child sexual abuse material, it highlighted content created for aesthetic or creative purposes and images created by parents, particularly in electronic format. It stressed that those creating the content should be “alerted to and educated about the consequence of and possibility of abuse of the images as a result of distributing such material”.  

Given the rise of sexual abuse and exploitation online, another crucial development is the proposed legal requirement for electronic communications service providers and financial institutions to report if their facilities are used in an offence involving child sexual abuse material, as is the criminalisation of all aspects of live sexual abuse displays, including live streaming, attending or viewing the displays, and procuring children to participate.

The commission’s report represents months of planning, researching, drafting, workshopping and consulting with the country’s leading experts on online and in-person child sexual abuse.

But Van Niekerk says that, incomprehensibly, there’s no plan of action to implement any of its legislative or non-legislative recommendations. 

While it gathers dust, the lives of many South African children will be defined, destroyed and even ended by exposure to pornography and child sexual abuse material, including self-generated content. As the country ends yet another 16 Days of Activism, it’s clear that even when we have the solutions for ending violence perpetrated against children, there’s no urgency to implement them. DM

* Names have been changed to protect identities.

First published in the Daily Maverick: 12.12.2023