As part of a yearlong investigation into how schools are leveraging artificial intelligence to identify students struggling with mental health, The Enquirer reached out to nearly 100 individuals within the Cincinnati community.
Our outreach included an online survey and a public listening session held at a local library. Participants represented a diverse range of perspectives, including students, parents, mental health professionals, and youth workers. Close to 80 people completed the survey, while approximately 20 attended the listening session to share their thoughts and concerns.
Cincinnati residents expressed a spectrum of opinions, from cautious optimism to uncertainty and worry, regarding the use of AI to help determine if a student is suicidal, depressed, or anxious. Many voiced concerns about protecting student privacy and the potential for biased results.
“My concern would be that once AI said a student was a danger, they would be treated as such,” shared one school administrator. “But, I also see the helpful aspect of it bringing up someone who may have slipped through the cracks.”
Over Half Believe AI Can Aid Youth Mental Health Crisis
A slight majority – 54% – of respondents believed that AI could contribute to solving the youth mental health crisis. This group agreed or strongly agreed that an AI-powered app designed to detect signs of distress could be beneficial. Additionally, 51% felt that AI would enhance school safety.
Some mental health professionals who participated in the listening session recognized the potential value of AI in predicting suicide attempts.
“We’re really, really, really bad at predicting human behavior,” said Paul Crosby, a psychiatrist and president of the Lindner Center of HOPE, a mental health center in Mason. “I understand why solutions like this are being sought because we haven’t come up with good ones ourselves.”
Isaac Trice, a clinical counselor at Talbert House in Western Hills, noted that algorithms are already influencing mental health diagnoses for his clients.
“Clients come in and say, well, TikTok told me that I’m a narcissist,” he said. “So I do think that it’s important for us to be on the cutting edge and to use these tools.”
Despite these perspectives, a significant portion of respondents – 30% or more for each AI-related survey question – expressed neutrality or uncertainty about using AI to detect mental health issues. This highlights a clear lack of consensus within the community.
Gharrie Hoffman, who holds a master’s degree in community counseling, expressed reservations about using AI but acknowledged a lack of sufficient knowledge. “I don’t know enough about it,” she stated. “I want to see results proving it works before making a judgment call.”
One in Three Comfortable With AI in Therapy Sessions
The use of artificial intelligence (AI) in mental health care is sparking debate, especially when it comes to its role in therapy sessions. Survey respondents were closely divided on whether they would be comfortable having AI listen to their therapy sessions: 33% said they would be, while 27% voiced concerns.
“Having AI listen to therapy sessions feels intrusive and terrible for trust,” stated Owen Kovacic, an undergraduate at the University of Cincinnati, highlighting a key concern about privacy and the therapist-patient relationship.
Quinn Merriss, a recent graduate of Oak Hills High School, raised concerns about the potential for AI-driven mental health detection to lead to needless involuntary hospitalization of young people. “How many kids are gonna stop talking, period, as they don’t want it involved?” Merriss questioned, emphasizing the potential chilling effect on help-seeking behavior.
A clear majority, nearly 60%, expressed concerns about the potential for bias in AI systems used to analyze language for mental health or safety risks. Awa Bolling, a UC undergraduate, articulated a common sentiment: “When it comes to AI it can be made biased due to the information it is fed.”
Some respondents questioned the wisdom of relying on technology to address a mental health crisis potentially exacerbated by excessive screen time. Trice posed a pointed question: “When it comes to AI, are we chasing one pill with another?”
A Call for More Clinicians, Divergent Views on School Safety
Students from high school to graduate school overwhelmingly agreed on the urgent need for more accessible mental health services. They advocated for increased availability of mental health clinicians, reduced therapy costs, and dedicated time for students to prioritize their mental well-being. Bolling suggested that “more leeway with academic success” could alleviate pressure on stressed students.
While respondents agreed that schools could be safer, their proposed solutions diverged. Some suggested increasing the presence of police officers on campus, while others emphasized the importance of security measures like door locks and metal detectors. Several respondents highlighted the need for adult supervision in hallways to address bullying.
Sheila Nared, director of the Trauma Recovery Center at The Neighborhood House, works with young victims of violence, helping them process trauma and build resilience. Nared believes that empowering communities most affected by violence is crucial for prevention. “Those closest to the problem are those closest to the solution,” she stated, but also acknowledged that these communities are often “farthest from the resources and the power.”
Amidst growing concerns about youth mental health, a recent listening session in Cincinnati revealed glimmers of hope. Attendees pointed to increased openness about mental health struggles among young people and expanded access to mental health professionals in schools as encouraging signs.
Carlie Yersky, a clinical counselor with Greater Cincinnati Behavioral Health, highlighted the significant progress made in integrating mental health support within educational settings. “When I was a kid, we didn’t have therapists in school,” she shared. “Now, every school in Cincinnati is partnering with a mental health agency.”
The discussion emphasized the importance of community-driven solutions. Olive Weaver, a senior at Mason High School, stressed the need for a culture of care and connection. “Create a culture of people caring about other people,” she suggested, “rather than delegating any of the work to AI.”
Hope Squad, a peer-to-peer suicide prevention program implemented in hundreds of Ohio schools, emerged as a promising example. At Mason High School, Hope Squad is a class with over 80 student members, each grade having its own team led by a teacher advisor. They meet four days a week to identify peers who might be struggling and connect them with appropriate support.
“Often our kids see the issues well before the adults do,” explained Beth Celenza, one of the school’s four Hope Squad advisors. “The class is about empowering the students to be able to do something with that information.”
Prior to Hope Squad’s implementation in 2017, the school district averaged one suicide per year, according to district spokesperson Tracey Carson. Since then, the district has implemented several interventions, and the results have been encouraging.
“I haven’t had a suicide since,” Carson stated.
Cincinnati schools are increasingly turning to artificial intelligence (AI) to identify students struggling with mental health issues. To understand the community’s viewpoint on this emerging trend, The Enquirer embarked on a yearlong reporting project, employing a two-pronged approach to gather diverse voices.
First, The Enquirer distributed an extensive survey both online and in high-traffic areas across the city. Locations included the University of Cincinnati, Findlay Market, and The Neighborhood House, a vital social services organization in the West End.
Complementing the survey, The Enquirer hosted a public listening session at the Cincinnati and Hamilton County Public Library’s main branch. This open forum welcomed anyone interested in sharing their thoughts on the intersection of AI and youth mental health.
“We wanted to ensure a wide range of perspectives were represented in our reporting,” said [Name], a reporter for The Enquirer. “Hearing directly from students, parents, mental health professionals, and community leaders was crucial to understanding the complexities of this issue.”
The combined efforts resulted in a rich tapestry of voices. Survey respondents spanned a diverse demographic, with 64% identifying as students and 23% as parents. Mental health providers constituted 9% of respondents, while 7% worked directly with children in schools or community organizations. Parent respondents primarily had high school-aged children or younger and hailed from areas ranging from Fort Mitchell, Kentucky, to Cincinnati suburbs like West Chester Township.
The Enquirer’s in-depth reporting sheds light on the evolving landscape of mental health support in Cincinnati schools, exploring both the potential benefits and ethical considerations of using AI in this sensitive domain.
A new study has shed light on the alarming prevalence of suicidal thoughts among young adults in the United States, revealing a deeply concerning trend. The research, conducted by the USC Annenberg Center for Health Journalism, found that a staggering 40% of young adults aged 18 to 25 have contemplated suicide.
The study, which surveyed over 1,000 young adults across the country, also uncovered a stark racial disparity in suicidal ideation. “We found that young adults of color were considerably more likely to report having suicidal thoughts than their white counterparts,” said Betsy Kim, the lead researcher on the project.
The racial breakdown of the study participants was as follows: 37% Black, 37% white, and 23% Asian. Other respondents identified as Hispanic or Latino, Middle Eastern, North African, or mixed race.
These findings underscore the urgent need for increased mental health support and resources for young adults, particularly those from marginalized communities. “These numbers are a wake-up call,” Kim emphasized. “We need to do more to address the root causes of this crisis and provide young people with the help they need.”
If you or someone you know is struggling with suicidal thoughts, please know that help is available 24/7. Call or text 988 or chat at 988lifeline.org.
This reporting was made possible by the USC Annenberg Center for Health Journalism’s 2024 National Fellowship and its Kristy Hammam Fund for Health Journalism. The fellowship provided engagement mentoring and funding to support this reporting.
The study’s findings serve as a critical reminder of the mental health challenges facing young adults today and the urgent need for comprehensive solutions.