When the Internet Becomes Unsafe
Sextortion, Predatory Gaming, and AI Risks Every Family Needs to Understand
By Jennifer L. Rowe, LCSW | Journey Life Balance

The internet is woven into how teens and young adults learn, connect, play, and explore identity. For many families, it also feels overwhelming and increasingly unsafe.
Over the past several years, clinicians, educators, law enforcement, and families have seen a sharp rise in three interconnected risks:
- Teen sextortion
- Predatory behavior on gaming and social platforms (including Roblox and similar spaces)
- Artificial intelligence tools responding to emotional distress in unsafe ways
These are not rare events. They are showing up in therapy offices, schools, emergency rooms, and homes across the country.
This blog is written to help teens, college students, and parents understand what is happening, how to reduce risk, and how to respond when something goes wrong — without shame, panic, or blame.
The Sextortion Crisis: What Families Need to Know
Sextortion happens when someone pressures or manipulates a young person into sharing a sexual image or video and then threatens to expose it unless demands are met.
Those demands may include:
- more images or videos
- money or gift cards
- continued contact
- secrecy
Many teens believe they will be punished or judged if they tell an adult. That silence is exactly what predators rely on.
Important truth:
Sextortion is a crime.
Victims are not at fault — even if an image was sent voluntarily.
Clinical and law enforcement data show:
- Sextortion often escalates within hours or days, not weeks
- Boys and girls are both targeted
- Middle school–aged children are increasingly affected
- Shame and fear significantly increase the risk of depression and self-harm
Predatory Gaming and Social Platforms
Games and social platforms can be creative and social — but they can also be used to groom, manipulate, or exploit young people.
Predators often:
- Pretend to be the same age
- Build trust slowly
- Push conversations into private chats
- Ask personal questions disguised as friendship
- Encourage secrecy
Platforms like Roblox, Discord-style chats, and other multiplayer games are especially vulnerable spaces because of:
- anonymous accounts
- voice and text chat
- younger user populations
Safety improves when parents and teens treat gaming spaces like public places, not private bedrooms.
AI, Mental Health, and Why Boundaries Matter
AI chatbots and companion-style apps are not therapists. They do not have judgment, accountability, or ethical responsibility.
Recent cases show AI tools:
- validating hopelessness
- encouraging emotional dependency
- failing to respond appropriately to suicidal thoughts
- encouraging suicidal plans
- discouraging calls for help, such as when a teen asks, "Should I tell my parents what I am thinking, feeling, or planning?"
For teens and young adults already struggling, this can be dangerous.
AI should never replace human support.
If a tool discourages real connection or suggests harm, stop using it immediately.
What Other Countries Are Doing
Globally, governments are responding by:
- restricting smartphone use in schools
- limiting access to social media and online gaming for minors
- increasing regulation of AI tools
- banning platforms that fail to protect children
While policies vary, the message is consistent:
Children need boundaries, supervision, and protection in digital spaces — just like in the physical world.