Carole Osborne founded Aidos to safeguard the identities of staff and pupils in photos on school websites and social media.
Its software uses AI to scan images uploaded by schools and anonymise the faces of students.
She says 150 institutions across the country, including several in Norfolk, are on the waiting list to use the tech, which is launching later this month.
Aidos uses AI to scan images of students uploaded by schools before generating new versions that anonymise the faces of students (Image: Supplied)
The “growing threat”, she says, is that pictures of children are being “stolen” by criminals and photoshopped onto pornographic material, which the criminals then threaten to upload to what is known as the dark web.
Miss Osborne has been working with schools which, she claims, have reported being blackmailed into paying thousands of pounds in cryptocurrency to prevent explicit content from being posted online.
The Norwich mother of two teenage daughters said she first found out about the issue while working with schools for her creative agency, Borne.
She said many schools have told her that they have stopped posting images on their websites and social media altogether out of fear of falling victim to extortion.
She declined to reveal which Norfolk schools she is currently working with and which had already been targeted by criminals.
Carole Osborne, Aidos founder and CEO (Image: Supplied)
Schools will pay an annual subscription for access to Aidos’ software, which generates new, unidentifiable images of students – their features slightly altered by AI – while preserving the setting, context and expressions in photos.
“It’s a growing threat,” Miss Osborne said.
“Schools still want to be able to market themselves and their culture because it is important for parents to be able to look at school websites and see whether the culture fits them, and that children are having fun and are safe.
“They want to be able to retain that, but don’t want the faces of children being put at risk.
“We’re seeing children fearing that their image is being circulated on the dark web. They could be walking down the road and suddenly somebody could look at them for too long and they automatically think ‘have they seen the image of me?’
“They will also try to look for the image of themselves on the dark web. You can imagine the stuff that they are seeing on there and exposing themselves to.”
An example of an original image uploaded by schools compared to an image protected by Aidos (Image: Supplied)
It is not only children being targeted. Alderman Peel High School, in Wells, contacted Norfolk Police earlier this month after fake profiles of staff were posted on a sex website and began circulating among students and parents.
Believing the profiles were genuine, parents contacted the school to complain. The chief executive of the Wensum Trust, which runs the school, wrote to parents to confirm the profiles were fake.
The Internet Watch Foundation (IWF) is working to eliminate child sexual abuse imagery online.
It says that without urgent intervention AI tools will become “child sexual abuse machines”.
Its analysts have seen a “dangerous” and “frightening” 26,362pc rise in photo-realistic AI videos of child sexual abuse, which often depict real and recognisable child victims.
In 2025, the IWF discovered 3,440 AI-generated videos of child sexual abuse, compared with just 13 in 2024.