Published Wednesday, May 28, 2025
When you hear a parent’s pain firsthand, there’s no going back to business as usual.
At Cyber Dive, building technology that detects online grooming and nudity isn’t just part of the roadmap—it’s a responsibility. And for many of us, it’s deeply personal.
I’m not just an engineer. I’m an aunt, a friend, a human who’s seen firsthand what happens when a child clicks the wrong link or trusts the wrong stranger.
We know how overwhelming today’s digital world feels—especially for parents trying to balance freedom and safety. That’s why this isn’t just a breakdown of what we’ve built. It’s a window into why we built it.
Before anything was designed or developed, we listened. We asked parents what scared them, what confused them, and what they wished they could see in their child’s online life.
Over and over, we heard one thing loud and clear:
They were terrified of online grooming—but had no idea how to detect it.
Ricky Slade, 29, was exposed by a vigilante group for grooming what he believed to be a 10-year-old girl. These chats led to his arrest and later conviction for attempted sexual offenses.
“Please help. My kid is talking to a pedophile.”
This wasn’t just a feature request. It was a collective cry for help. So we got to work.
We studied. A lot. Case reports, research papers, survivor stories. Platforms like Snapchat and Instagram. What we found was gut-wrenching.
What started as research quickly turned personal. It hit home.
One line kept echoing through our heads: “If a stranger followed your child down the street, you’d act immediately. But what if they were chatting online—and you had no idea?”
That idea stayed with us—during dinner with family, in late-night conversations with friends, and in team meetings where product decisions turned into personal commitments.
As we studied, we met people and families who had lived through this. Some had lost their children. Some teens had lost their childhood. These were no longer stories we heard on the news or read in articles.
They were real.
One of them was Brandon Guffey.
When we met with South Carolina Representative and Cyber Dive advisor Brandon Guffey, everything shifted. He shared how his son lost his life to sextortion—a word he hadn’t even heard until it was too late.
Like you, Brandon Guffey thought he was doing everything right to protect his 17-year-old son, Gavin. He used screen time limits, monitored apps, and stayed vigilant.
But one day, Gavin became the victim of a sextortion scam on Instagram. A scammer pretended to be a girl Gavin’s age, gained his trust, and convinced him to send intimate photos.
And the blackmail started immediately.
Gavin sent what little money he had, begging for more time. But the threats continued. And things took a turn for the worse.
Gavin took his own life.
Two weeks after Gavin’s funeral, Brandon received a chilling message from the scammer: “Did I tell you your son begged for his life?”
The tools Brandon relied on missed the signs. They couldn’t prevent the scam, stop the extortion, or protect his son in real time.
That’s when we saw the gap other kids’ phones leave open.
South Carolina State Rep. Brandon Guffey stands before the Senate with a photo of his son Gavin, whose suicide followed a sextortion scam in 2022. Guffey is now pushing for the Kids Online Safety Act to hold platforms accountable and prevent future tragedies.
We don’t want to just be the next “cool thing” anymore. We want to build a phone for real families, real grief, and real change.
Our CTO, Harshini Kanukuntla, stopped everything. She redirected our team to build what would become Nudity Prevention—not from a roadmap but from deep empathy and the need to act now.
Public headlines reflect a crisis: online child exploitation is rising. Over 12,600 sextortion cases and 20 known suicides in two years. Aqua One’s prevention tech was built to change that story.
We went back to the data.
Our engineers started with one goal: to create a tool that could accurately detect nudity and protect kids. But building something like this wasn’t simple. It took time, research, and a lot of heart.
Rashmi, a Cyber Dive Machine Learning Engineer, played a key role in developing Aqua One’s Nudity Prevention feature—designed to detect risk in real time and help protect kids before harm occurs.
First, we had to understand how kids’ bodies change as they grow, so we knew what to teach the system to recognize as unsafe content. We studied public image datasets and reviewed thousands of explicit images—demanding work that required the kind of emotional discipline expected in medical or forensic fields.
Next, we thought about how kids actually take photos: different angles, different lighting, different distances, different settings.
Because of all these possibilities, the dataset had to be huge and varied. We needed the system to work in all kinds of situations.
Then came the question: What kind of AI model should we use?
Should it focus on actions? Body parts? Skin tones? Poses? Our team ran many tests using different ideas and combinations to see what worked best.
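Cyber Dive hasn’t published its model code, but the kind of head-to-head comparison described above can be sketched generically: run each candidate detector over a labeled validation set and compare precision and recall. Everything in this sketch—the feature names, the thresholds, the toy data—is hypothetical, for illustration only.

```python
from typing import Callable, Dict, List, Tuple

def evaluate(detector: Callable[[dict], bool],
             samples: List[Tuple[dict, bool]]) -> Tuple[float, float]:
    """Return (precision, recall) of a detector over labeled samples."""
    tp = fp = fn = 0
    for features, is_unsafe in samples:
        flagged = detector(features)
        if flagged and is_unsafe:
            tp += 1          # correctly flagged unsafe content
        elif flagged and not is_unsafe:
            fp += 1          # false alarm on safe content
        elif not flagged and is_unsafe:
            fn += 1          # unsafe content that slipped through
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical candidates, each keyed on a different feature idea.
candidates: Dict[str, Callable[[dict], bool]] = {
    "skin_ratio": lambda f: f["skin_ratio"] > 0.6,
    "pose_score": lambda f: f["pose_score"] > 0.8,
    "combined":   lambda f: f["skin_ratio"] > 0.5 and f["pose_score"] > 0.5,
}

# Toy labeled validation set: (extracted features, ground-truth unsafe?).
validation = [
    ({"skin_ratio": 0.9, "pose_score": 0.9}, True),
    ({"skin_ratio": 0.7, "pose_score": 0.2}, False),
    ({"skin_ratio": 0.3, "pose_score": 0.9}, False),
    ({"skin_ratio": 0.2, "pose_score": 0.1}, False),
]

for name, det in candidates.items():
    p, r = evaluate(det, validation)
    print(f"{name}: precision={p:.2f} recall={r:.2f}")
```

On this toy data, single-feature detectors over-flag safe photos, while the combined detector scores perfectly—an (oversimplified) illustration of why the team tested combinations rather than any one signal alone.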
After training the system, we still had to make it faster and more accurate. At first, it took around 6 seconds to detect a risky photo. That was too slow. So we kept testing, tuning, and improving until the tool could work in real time.
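Getting from roughly six seconds per photo down to real time starts with measuring honestly. A minimal latency benchmark might look like the sketch below; the `detect` function is a stand-in (the real model isn’t public), so only the timing harness is the point.

```python
import statistics
import time

def detect(image: bytes) -> bool:
    """Stand-in for real model inference; replace with the actual detector."""
    time.sleep(0.001)  # simulate inference work
    return len(image) % 2 == 0

def benchmark(fn, images, runs: int = 3) -> float:
    """Return the median per-image latency in milliseconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        for img in images:
            fn(img)
        elapsed = time.perf_counter() - start
        timings.append(elapsed / len(images) * 1000.0)
    return statistics.median(timings)

images = [bytes(64) for _ in range(20)]
print(f"median latency: {benchmark(detect, images):.2f} ms/image")
```

Taking the median across several runs smooths out one-off stalls, so each optimization pass can be judged against a stable baseline.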
But we didn’t stop there.
Every person on the Cyber Dive team helped test the tool. We tried using it in different rooms, with different lighting, while working, at home, or even while traveling. We tried to mimic how a child might use their phone.
Through this, we found mistakes the system made—and fixed them. Over and over, we improved the model so that it could be smarter, faster, and more trustworthy.
In the end, every department came together to make this tool work—from start to finish. Was it easy? No. Was it perfect? Not yet. But we’re not done.
We’re now talking with parents who purchased Aqua One to gather feedback. A few bugs still surface here and there. We expected that, and we’re working through them. That’s part of the process.
After all, we had nothing to model ourselves after. Aqua One Nudity Prevention is a first-of-its-kind product, born from countless trials and errors with one goal at heart: to protect kids, families, and their future.
Uncomfortable? Yes. Tiring? Definitely. But it’s all about the purpose.
This wasn’t just about a smart filter blocking nude images. It was about preventing family pain and childhood trauma.
Nudity Prevention gives parents a way to step in before it’s too late—and gives kids a layer of protection they don’t even know they need.
Cyber Dive Co-Founders Derek Jackson and Jeff Gottfurcht in Times Square, where Aqua One took over the Nasdaq billboard with a bold reminder: online predators don’t look like villains. They look like anyone—and they’re texting our kids.
At Cyber Dive, we don’t build for control. We build for connection.
Every line of code, every feature we ship, is shaped by empathy, research, and real conversations with families like yours.
Just as it did in the lab, it starts with understanding. With empathy.
Guiding your kids through the internet works the same way as teaching them to cross the street: one conversation, one question, one boundary at a time.
That’s what mindful parenting in a digital age looks like.
We show up because we know you do. Because you’re trying. Because you want to raise safe, self-aware, digitally responsible kids. And we believe you shouldn’t have to do that alone.
This is more than a product. It’s a mission.
And we’re honored to walk that path with you.
Learn more about Aqua One here.
Rashmi Bongirwar
I'm an engineer, yes, but first I'm a people-centric thinker and an aunt. These roles shape everything I do. I've seen what can happen when kids stumble into the wrong corners of the internet, and I'm not here to stand by; I'm here to help make life more humane and meaningful. As a Machine Learning Engineer, I build smarter systems to keep kids safe, because behind every data point is a real kid with everything to lose.
When I'm not wrangling algorithms, I'm recharging through rhythm and color: whether that's practicing a new dance routine, painting something that speaks louder than words, or getting completely absorbed in a book. I believe tech should have a heart and mine beats for the next generation.
Type 3 Achiever / INFJ Advocate
© Cyber-Dive Corp. 2025