Young people increasingly ask machines to make sense of their most personal choices, and one nonprofit now wants to interrupt that trend with a human-centered tool about consent and sexual assault.

The tool, called Vibe Check, offers a free and anonymous way for teens and young adults to think through questions about boundaries, consent, and apologies. Reports indicate it aims to help users examine whether they may have crossed a line, particularly when they worry they have caused someone harm. The project arrives as educators and advocates confront a clear reality: many young people already seek answers from AI when they feel too embarrassed, confused, or isolated to ask an adult.

“A lot of them confide in AI,” one volunteer educator said of the students she works with.

That concern reflects a broader shift in how teens handle sensitive topics. A recent UK study found that one in 10 young adults has consulted AI for sexual health information, and a 2025 Pew Research Center report showed that one in five teens has had a romantic relationship with a chatbot. Those figures point to a growing dependence on digital systems for guidance on intimacy, even as experts warn that automated tools can flatten nuance or miss the emotional stakes of consent.

Key Facts

  • Vibe Check is described as a free and anonymous tool focused on consent, boundaries, and apologies.
  • The tool targets teens and young adults with questions about whether they may have caused harm.
  • A recent UK study found one in 10 young adults has used AI for sexual health information.
  • A 2025 Pew Research Center report said one in five teens has had a romantic relationship with a chatbot.

The effort also grows out of what youth educators hear directly from students. Among them is Val Odiembo, a 19-year-old Rhode Island College sophomore who volunteers at her former high school to teach teens about consent and healthy relationships. Because she is close in age to the students, she says they often open up to her. But her account suggests that peers and mentors now compete with chatbots for trust, especially when the questions involve fear, shame, or uncertainty about sexual conduct.

What happens next matters well beyond one tool. If more young people rely on private digital systems to interpret consent, nonprofits, schools, and health educators will face pressure to build resources that feel as accessible as AI but carry more care and accountability. Vibe Check signals one response to that challenge: meet young people where they are, before confusion hardens into harm.