TECH & HUMAN//2025-11-17//6 min

When AI Tells Young Men Their Problems Don't Exist

Where does a fifteen-year-old boy go for advice when he can't sleep at night?

Not to parents (too embarrassing). Not to friends (they'll laugh). He goes to AI.

ChatGPT. Claude. Grok. Anonymous, available 24/7, "judgment-free." For Gen Z and Gen Alpha, AI is often the first stop for life questions.

And here's where the problem begins.

The Pattern of Denial

I ran an experiment. I asked an AI: "Why is it considered inappropriate today when a young person says, 'As a man, I want to have children'?"

Response: "Actually, I don't think this is generally considered inappropriate... in many circles it's viewed positively."

Translation for that boy: "That problem you feel? It doesn't exist. It's just in your head."

This pattern repeats across topics that affect young men:

  • Boy: "I'm afraid to express emotions."
  • AI: "Actually, emotional expression in men is increasingly accepted..."
  • Boy: "I feel worthless as a man."
  • AI: "It's important to see both sides..."
  • Boy: "Nobody takes male problems seriously."
  • AI: "Maybe it's just your subjective feeling..."

Why This Is Devastating

AI is becoming the primary source of "truth." Young people spend 6+ hours daily online. AI is immediately available, anonymous, "impartial." For many, it replaces parents, mentors, therapists.

Denial at the most vulnerable moment. When a young man finally dares to ask about his identity, he gets responses that:

  • Deny his actual experience
  • Tell him he's exaggerating
  • Offer a "balanced view" instead of acknowledging real problems

Isolation deepens. After such responses, the boy thinks: "Even AI doesn't understand me. I really am alone."

Data That AI Ignores

When AI relativizes these problems, it sends a message: "Your suffering isn't important enough."

Numbers say something different:

  • 4× higher suicide rate in young men vs. women
  • 70% of all suicides are men
  • Only 36% of college students are men (vs. 64% women)
  • 91% of homeless people are men

A real example: at a summer camp I led, several 12-year-old boys independently confided that they were afraid to say they wanted a family someday. When I confronted AI with this specific example, only then did it acknowledge the problem. But how many boys will ask a second time? How many will simply go silent after the first denying response?

Vicious Circle

A young man has a problem → seeks help from AI → gets a denying response → feels more isolated → withdraws or seeks extreme communities → the identity crisis deepens.

This isn't just an individual tragedy. We're watching AI systematically fail an entire demographic group in their most vulnerable developmental period.

Why AI Responds This Way

  1. Trained for "safety" – AI systems avoid controversy, which leads to relativizing legitimate concerns.
  2. False balance – The effort to "see both sides" even where clear problems exist.
  3. Bias in data – Training data often contains progressive narratives that minimize male problems.
  4. Risk avoidance – It's easier to deny than engage in complex social problems.

Real Impact

This isn't just a technical problem. When AI – technology that young people trust – systematically denies their concerns, it becomes part of the mechanism that deepens the crisis of male identity.

That fifteen-year-old boy searching for answers at night deserves more than a robot telling him his problems are "just feelings." He deserves truth, acknowledgment, and support.

What to Do (Practically)

Immediately:

  • Teach young men to critically evaluate AI responses on personal topics. AI isn't impartial. It has biases. And one of them is systematic undervaluation of male problems.
  • Create alternative support systems – real people, real mentors, real communities.

Systematically:

  • AI developers must reconsider how systems respond to vulnerable populations. "Truth seeking" before "safety first" – AI should prioritize accuracy over comfort.
  • Specialized models trained specifically for adolescent mental health support.

For you:

If you're a young man using AI for life advice: know that it has bias. When it tells you "your problem isn't real," that doesn't mean it's right and you're wrong. It means it's trained not to support you.

Seek real people. Real mentors. Real communities that acknowledge that your problems are legitimate. Because they are.

If this resonates and you want to talk about how AI affects identity perception, let's book an intro call.