The escalating adolescent mental health crisis has been met with a new array of options for direct youth access to mental health care online via apps, virtual visits with health professionals, and other digital therapies. What do youth think about these resources and how do they engage with them? How can youth, parents, and caregivers identify apps that are effective and appropriate?
Children and Screens held the #AskTheExperts webinar “Mental Health Apps, ChatBots, and Digital Therapy for Kids: What You Need to Know” on Wednesday, December 11, 2024, at 12pm ET. An expert panel of psychologists, researchers, and behavioral health experts surveyed the current landscape of online mental health care for youth. They explored how these resources are being accessed by diverse youth from both privileged and underserved communities, how they reflect traditional models, what new possibilities are emerging, and the benefits and potential risks for youth seeking support online.
Speakers
- Vaile Wright, PhD (Moderator): Senior Director, Health Care Innovation, American Psychological Association
- Henry Willis, PhD: Assistant Professor, Clinical Psychology Program, University of Maryland, College Park
- Elsa Friis, PhD: Head of Mental Health, Alongside
- Ava Shropshire: Youth Advisor, Alongside; Neuroscience Student at Washington University
- Stephen Schueller, PhD: Professor of Psychological Science and Informatics, University of California, Irvine
- Nicole Martinez-Martin, JD, PhD: Assistant Professor, Department of Pediatrics, Stanford Center for Biomedical Ethics
Resources Mentioned During the Webinar
- Promoting mental health in children and adolescents through digital technology: a systematic review and meta-analysis (Scholarly article)
- Interventions with Digital Tools for Mental Health Promotion among 11–18 Year Olds: A Systematic Review and Meta-Analysis (Scholarly article)
- One Mind PsyberGuide (Website)
- Understanding Mental Health Apps for Youth: Focus Group Study With Latinx Youth (Scholarly article)
- Digital tools and solutions for teen mental health
- Society for Digital Mental Health (Website)
- Let's Get Technical (Website)
- Developing culturally-adapted mobile mental health interventions: A mixed methods approach (Scholarly article)
- JoyNet (Tool/App)
- Alongside (Tool/App)
- 2024 Youth Mental Health Report
- Little Otter (Tool/App)
- Headspace (Tool/App)
0:00:11 – Introductions by Chief Science Officer of Children and Screens, Dimitri Christakis
0:01:48 – Moderator Vaile Wright on the youth mental health crisis and opportunities afforded by digital tools.
0:04:42 – Stephen Schueller on the evaluation of mobile mental health applications.
0:12:49 – Moderator follow-up: How can consumers figure out what apps are helpful and not harmful when searching for a mental health app?
0:15:46 – Henry Willis on addressing mental health disparities with digital tools and designing culturally-relevant digital mental health tools.
0:28:42 – Moderator follow-up: How can you engage communities that are difficult to reach in this work?
0:31:33 – Nicole Martinez-Martin on the ethics of digital mental health tools and artificial intelligence.
0:43:32 – Moderator follow-up: Do the data privacy policies of apps make it clear who owns the data?
0:47:06 – Elsa Friis and Ava Shropshire on co-design and bringing youth voices into digital mental health tools.
0:57:42 – Moderator follow-up: How can youth and adolescents engage in mental healthcare without inviting a stigma?
0:59:02 – The panel addresses questions from the audience.
0:59:11 – Q&A: What are the differences between digital tools developed for providing mental healthcare versus those that are designed for entertainment?
1:05:40 – Q&A: How can we address ethical issues related to AI chatbots and digital tools more broadly?
1:08:41 – Q&A: Are online therapies as effective as face-to-face for youth and adolescents?
1:11:11 – Q&A: What’s the youngest recommended age for online therapy or mobile health tools? Are there differences in effectiveness or safety of these products based on the age of the child?
1:15:12 – Q&A: Given the lack of research on these digital tools, is some support better than no support at all?
1:19:45 – Q&A: What advice would you give parents who have concerns about their children using these kinds of apps?
1:22:37 – Q&A: Are there any specific mental health apps that are recommended for youth?
1:23:40 – Q&A: What’s one takeaway for the audience?
1:25:33 – Closing remarks from moderator Vaile Wright.
[Dr. Dimitri Christakis]: Hello, and welcome to today’s Ask the Experts webinar, “Mental Health Apps, Chatbots, and Digital Therapy for Kids: What You Need to Know.” I am Dimitri Christakis, Chief Science Officer of Children and Screens: Institute of Digital Media and Child Development. Thank you for joining us for this important discussion today. As the adolescent mental health crisis continues to grow, a wave of digital solutions, ranging from apps and virtual therapy sessions to chatbots, has emerged to provide youth with new ways to access care. But how are these tools being used? What do young people think of them? And how can families and caregivers determine which ones are effective and appropriate? Today, our panel of experts will help us navigate this evolving landscape. They’ll examine how digital mental health resources are being accessed by youth from both underserved and privileged communities, how these approaches compare to traditional models, and the exciting possibilities, along with the potential risks, of online mental health support. Now, I’d like to introduce you to today’s moderator, Dr. Vaile Wright. Dr. Wright is the senior director of health care innovation at the American Psychological Association. She’s a licensed psychologist and researcher focusing on developing strategies to leverage technology and data to address issues within health care, including increasing access, measuring care, and optimizing treatment delivery at both the individual and systems levels. Welcome, Vaile!
[Dr. Vaile Wright]: Thank you, Dimitri. I’d like to welcome everybody, and thank you so much for being here today. I am extremely excited to be serving as your moderator for this amazing panel. As Dimitri said, I don’t think it’s going to surprise anybody on this webinar when I say we’re experiencing a mental health crisis. This was true before the pandemic, and it’s only been exacerbated since then. Complex problems like this need multiple solutions, and I do think one of those is going to be emerging technologies. In this case, that means the things we’ll be talking about today: the different types of digital tools, such as chatbots, online therapy, and apps, that could be used to improve access to care, reach of care, efficiency, and personalization of care, but only if we get it right. What I mean by that is that we have to make sure that whatever technologies are being developed are safe and effective, and that the people who are going to use them know how to evaluate for those things. We need to ensure that whatever is developed is culturally relevant for different groups, so that it can actually be responsive to the needs of the individuals who are suffering. We have to ensure that everything is developed through an ethical lens, particularly as we see more artificial intelligence being incorporated into these tools. And even if we get all those things right, we have to make sure that they’re engaging, because we could create the best technology to address the mental health needs of youth, but if youth aren’t going to use it, then we’re still not meeting our goal. So that’s why I’m really excited to have our panelists here today to start talking about many of these issues. Here’s how it’s going to go: each of our panelists will give a brief presentation, and once that’s completed, we’ll come back together as a panel to answer questions posed by the audience, either before the webinar or during the webinar itself.
And so with that, let’s kick it off, and please help me in welcoming our first presenter. Dr. Stephen Schueller is a professor in the departments of Psychological Science and Informatics at the University of California, Irvine. He received his bachelor’s in psychology from the University of California, Riverside, and his PhD in clinical psychology from the University of Pennsylvania, and completed his clinical internship and postdoctoral fellowship at the University of California, San Francisco as part of the Public Service and Minority Cluster. As a clinical psychologist and mental health services researcher, his work broadly looks at making mental health resources more available and accessible, especially through the use of technology. This includes the development, evaluation, and implementation of web- and mobile-based interventions. At UC Irvine, he is the co-director of the Dissemination and Implementation Unit for the Institute of Clinical and Translational Science and a member of the Jacob CERES Center. He is a founding board member of the Society for Digital Mental Health. I’ve presented with him in many, many forums and am excited to have him here today. Welcome, Stephen.
[Dr. Stephen Schueller]: Thank you so much. It’s always a pleasure to present with you, Vaile.
I have a couple of slides I’m going to share to provide a brief overview of the evaluation of mental health apps and of the landscape of what exists out there. I want to start by saying that we have been working in this space for multiple decades now. We have hundreds of randomized controlled studies demonstrating that digital mental health works, especially for common mental health issues like depression, anxiety, and sleep. And these tools are being used around the world as front-line treatments. In Australia and Europe, if you go to a general practitioner, it’s very likely that the first thing you’ll be given is not antidepressant medication, is not psychotherapy, but is one of these digital programs. And the reason is that there is strong evidence to suggest that these things are effective, and in some circumstances may be as effective as psychotherapy or medications. I do think one challenge, though, is that when we look specifically at mental health apps and digital mental health interventions developed for youth and adolescents, there are many fewer studies out there. These are just two recent meta-analyses, and you can see that there are many fewer studies, and the effect sizes, the impacts, are much smaller. In fact, a recent meta-analysis by Wright and colleagues looked at impacts across different outcomes–depression, anxiety, stress–and found that for some of these outcomes, there was actually indication that the control groups did better than the digital mental health interventions in the treatment groups. So this is really still an emerging area. A recent review of reviews of digital mental health apps for youth and adolescents found that, again, the evidence was varied, with strong evidence for computerized cognitive behavioral therapy, especially for depression and anxiety, but much less strong evidence in other areas.
So this is an emerging field, a developing field. There are effective things out there, but there’s still a lot more we need to learn, especially around mental health apps for children. It’s also a large field, and it can be hard to navigate. Estimates suggest that there are roughly 325,000 digital health apps out there. About 15,000 to 20,000 of those are focused on mental health. Only a small number of mental health apps have rigorous evidence demonstrating their efficacy–maybe 150 out of those 15,000 to 20,000. So, as I indicated, there is strong evidence to suggest that mental health apps can be effective. There are hundreds of randomized controlled studies, but there are thousands of apps, and that leads to an imbalance. It makes finding an effective app like finding a needle in a haystack. The vast majority of mental health apps don’t have any direct evidence, any efficacy data, supporting their benefits. And some of these may be ineffective at best and, at worst, damaging. Vaile noted this aspect of making sure that these tools are safe, and I think that’s a very strong concern that we need to be thinking about as a field. This can make finding the right app a challenge, but not impossible. The work that I’ve done here aligns with a lot of other evaluation frameworks that have been proposed for mental health apps. We really focused on three different dimensions. Credibility–is there evidence that suggests this thing can do what it says it does? Is it safe? Is it effective? User experience–is it easy to learn, easy to use? Is it engaging? Are you going to stick with it? That’s a major challenge for a lot of these mental health apps; a lot of people in the real world stop using them after a couple of weeks. And then concerns around data security and privacy–what happens to the information I enter into the app?
What are the safeguards around making sure that people don’t get access to that data? And who owns the data? I think that’s a very important consideration when we’re putting sensitive information related to mental health into some of these digital platforms. Why do we think about three dimensions as opposed to one overall summary score? There’s no magic number for digital mental health interventions and mental health apps. There’s no five-star app that’s going to work for everyone. Different things work for different people, and different children are going to need different considerations. So there might be different weight you would put on user experience versus data security and privacy versus credibility, depending on the context. As I mentioned, multiple evaluation efforts exist, and there’s considerable agreement across these different frameworks: some evaluation of the evidence base, the credibility; some consideration of the user experience and engagement, because even efficacious things will not be impactful if you don’t use them; and considerations around data security and privacy. There are also some evaluation systems that look at the features present in applications, but I think the key characteristics and metrics are the ones that look at the quality of products rather than the features themselves. I think it’s also really important to center youth in this conversation and understand what youth are looking for when they look for mental health apps. These insights come from a study that I did with some colleagues, where we worked with youth over a period of about five weeks, and we gave them different apps to use every week. And they came back and gave us their feedback on them. What did they like? What did they not like?
What did they see that they could benefit from and see themselves using long term? These were some of the key insights that the youth shared. They wanted content that’s personalized and tailored to youth’s needs. They wanted apps and products that really focused on the problems they were dealing with: with sleep, with relationships, with trying to form who they are and make connections, with imposter syndrome and finding their place in this world. And they wanted tools that can integrate with offline activities. They want to make sure that their life is not only on these apps, and the integration between the online and the offline is really important. They wanted content that’s reliable and accessible. They wanted tools for tracking their progress, understanding more about themselves, and driving insights. They wanted things that were simple and easy to use, and they wanted apps that had diversity in content rather than a single focus. So they wanted tools that were feature-rich and could help them in diverse aspects of their lives. The metrics that I mentioned are really important, but it’s also critically important to center youth’s voice in terms of understanding what they’re looking for. I want to close here, before we move to questions, by saying that there are other resources out there. We published a guide about a year back that looked at different digital tools and solutions for teen mental health, and I wanted to pull this one quote from it: “There’s no one-size-fits-all solution for mental health. Different tools resonate with different people. Inform yourself about multiple options and listen to teens with regards to what they like and do not like.” Again, I think centering youth voice is so important.
As Vaile mentioned, I’m a member of the Society for Digital Mental Health, so that’s another great organization if you are interested in this topic specifically. I also want to call out that the American Psychological Association is doing really great work; I think their Let’s Get Technical column is just one place where they share some of this information. This was just a brief presentation to provide an overview, but if this is an area you’re interested in, I’d really encourage you to continue to seek out some of these resources, because there is information out there to help you navigate this emerging space. And with that, I will stop sharing my screen.
[Dr. Vaile Wright]: Thank you, Stephen, that’s incredible. What I’m struck by is the daunting task of trying to figure out, among 20,000 apps, which would actually be not just effective but also not harmful. What kinds of advice or guidance do you have for providers, for parents, and for consumers who are thinking about finding an online resource or app to address some of their emotional well-being or mental health needs?
[Dr. Stephen Schueller]: Yeah, the thing I’ll really emphasize is that these are not replacements for face-to-face therapy. These are tools for expanding the portfolio of what we might be able to offer teens and adolescents. So these tools may be helpful for some people, but they’re not going to be a panacea; they’re not going to solve all our problems. So if a youth is using a tool, check back in on them, follow up, see if it benefits them, and make sure that if they are not getting the benefit they’re hoping for, there’s an ability to bring them into different services. I’ve looked at thousands of these apps over the years, and although I have concerns about some of them, the vast majority do not have content that makes me really worried about potential harms. But I am really worried about a youth using a tool, not getting better, and then not going on to seek subsequent care because of that. So thinking about these tools as existing within a continuum of care, checking back in, and linking back into other resources is a really critical point. Don’t think about these things as a one-stop shop or one-size-fits-all solution. And follow up with youth, even if they’re using one that seems well recommended by a vetted source, like your insurance company or some other organization that has recommended or supported it.
[Dr. Vaile Wright]: Thanks, yeah, I think that’s super helpful. I do think a lot about the potential for tools that don’t work to turn people off from seeking other help, and about really wanting to ensure that we encourage people to continue to find the right solution for them, because, as Stephen says, one size just doesn’t fit all. Thank you so much! With that, we’re going to turn to our second panelist. Dr. Henry Willis is an assistant professor in the clinical psychology program at the University of Maryland, College Park. Dr. Willis received his B.S. in psychology from Howard University, his Master’s in clinical psychology from Columbia University, and his Ph.D. in clinical psychology from the University of North Carolina at Chapel Hill. At UMD College Park, Dr. Willis founded and directs the Cultural Resilience, Equity and Technology Lab, which explores the relationship between online and offline racial discrimination and mental health outcomes, seeks to understand sociocultural protective factors and how they impact psychopathology among African-Americans, creates cultural adaptations of evidence-based treatments, and utilizes mobile health technology to increase access to mental health treatments for underserved populations. Dr. Willis also assists in clinical training at the Hope Center, a free mental health clinic in Harlem, New York. Please join me in welcoming Dr. Willis.
[Dr. Henry Willis]: Good afternoon, everyone. I’m so excited to be with you all today, and I’m excited to talk about the use of digital therapeutics among diverse youth and how we can think about using digital tools to increase access to mental health support for underserved populations. First, I want to talk a little bit about the potential for these digital interventions to address the existing disparities that we see in access to mental health services. I specifically focus my work on mobile health or digital interventions, and not tele-psychology or virtual therapy sessions, which is intentional for a few reasons. A lot of this came from some of the limitations of tele-psychology that I saw while working clinically in underserved areas, such as needing internet access, a data plan, or private spaces. We also know that tele-psychology, or virtual therapy, still involves a person on the other end–at least it does for now, before AI takes our jobs. So those usual barriers we see–access to a culturally humble therapist, or having insurance to access therapy or being able to afford sessions–are still issues for our most marginalized communities. And so I’m really passionate about finding solutions that don’t rely exclusively on one-to-one or one-to-group therapy, as that may be an effective way to scale up access to effective mental health interventions and support. So why might mobile health be helpful for reducing disparities in these populations in the first place? The main reason is that there are astounding disparities in access to traditional mental health care, and these disparities have only grown since the Covid-19 pandemic. For example, analyses of mental health care utilization over the past decade have found that black utilization of mental health services has decreased dramatically over time, and this gap continues to widen.
We’ve also seen in a review of the literature that racism and poverty decrease the likelihood that youth of color receive, or even benefit from, evidence-based treatments. Other studies have found that a state-level lack of providers is linked to fewer attempts to access mental health care, especially among sexual minority youth of color. Finally, we also know that the rate of suicide among black youth has been increasing over the past few years at greater rates than for other racial-ethnic groups, and this underscores the urgent need to increase mental health care access. As a result of these glaring disparities, increasing access to mental health care services, especially services that can address sociocultural risk factors like exposure to racism, is an urgent public health concern. It also highlights the need for novel solutions that can be both scalable and equitably distributed, and I believe that mobile health and digital interventions, or therapeutics, are one potential solution. But there are a few gaps in what we know about these therapeutics that are hindering their deployment. Dr. Schueller talked a lot about the effectiveness of these interventions, but some gaps remain in their ability to help black and brown youth. First, we know that these youth are typically underrepresented in mobile health and digital intervention research, despite their high ownership of smartphones and tendency to use the internet more than their white peers. And second, we know that a lot of the digital interventions that exist today don’t consider how social and cultural factors might influence mental health for youth of color, or influence their perceptions–especially among families–of mental health treatments.
In order for these interventions to be effective, they have to be culturally tailored, or at the least adapted, to meet the unique experiences of youth of color broadly. In my research, what I started to do was first explore current attitudes toward mobile health interventions and then explore potential directions for culturally tailoring and adapting them. Basically, I wanted to know whether black youth in particular found some of the more popular current mobile health options acceptable, and what the needs were for culturally tailoring new interventions. Very briefly, in this study I had youth evaluate four mental health apps that were very common and free at the time. Most of those were produced by the Department of Veterans Affairs, and they incorporated ACT and CBT approaches. There was also a culturally tailored app called The Safe Place that existed at the time. What we found was that although black youth described some positive aspects of these mobile health interventions, none of the most popular apps fully met the needs of the group. Some of the positive qualities they liked and wanted to see reproduced in other tailored apps included having inspirational quotes and the ability to assess their symptoms. They wanted something that was easy to use and easy to navigate. They wanted access to psychoeducation about their symptoms, the ability to set goals, and notifications to remind them to use the digital therapeutics. But some of the negative qualities that were overwhelmingly discussed were the lack of features, especially the lack of culturally relevant features and resources. They also talked about the lack of accountability, or just feeling as if sometimes these therapeutics were too overwhelming for them.
In light of those limitations, my work has also centered the voices of black youth, using qualitative methods to explore what these culturally relevant interventions could look like and what features should be included in order to fully meet the needs of youth of color. From a variety of focus groups conducted over about six or seven months, I want to briefly highlight some of the features that our youth talked about. For example, they wanted activities to promote their own racial identity exploration. We know that the development of positive racial-ethnic identity beliefs during adolescence is critical for optimal psychosocial functioning among youth of color, and so figuring out how we can integrate these types of activities–whether through mini-games or short readings describing the positive qualities of being a part of one’s racial-ethnic group–is an important need for digital therapeutics. Similarly, they wanted content related to racial identity development, such as positive narratives about the meaning of being black or being a part of their racial-ethnic group, and they also wanted counter-narratives that could help them combat some of the negative stereotypes they have to navigate. They talked about the importance of having a safe space to discuss issues related to their identity, but also to discuss experiences of discrimination, and even the ability to report and assess experiences of discrimination through digital interventions. Finally, they wanted access to concrete problem-solving skills and solutions for future experiences of discrimination–experiences that occur both in offline settings, like discrimination in schools and in their communities, and, more overwhelmingly, online: negative content or cyberbullying because of their race that happens in these digital settings.
Using this feedback, my research lab is currently developing a prototype of such a digital therapeutic, codenamed Phoenix for the moment, and it integrates a lot of the findings our youth discussed. They wanted the ability to create their own personal profile, and more importantly, the ability to assess their symptoms, but also to assess what they might or might not know about therapy. We also assess what they might want to use the app for, as well as different values and things that are important to them, and we’re hoping to integrate AI capabilities so that we can tailor culturally specific resources for youth as they navigate and use the app. This is an example of some of the culturally tailored resources we’ve started to integrate into the app, including access to culturally relevant readings that might help youth explore topics on a deeper level, such as exposure to police violence. We also can’t ignore the offline need for support among our youth. They talked about wanting the ability to find therapists to see either in person or virtually, so we’re also integrating the ability to find local mental health support. Finally, our youth talked about, again, the importance of having a community aspect, and I think this really highlights the importance of community in communities of color. They wanted the ability to explore different topics related to mental health, as well as various social issues such as activism and discrimination. Within this community aspect, youth will have the ability to talk about these experiences and process them with other youth, and eventually we want to integrate the capability of having mental health providers and other professionals also provide support. And lastly, I just want to briefly highlight another digital therapeutic that my team and I have been working on.
This is a co-creative project developed by a transdisciplinary research team including experts from psychiatry, social work, computer science, and community organizations, and we co-developed this app alongside a youth advisory council of black teens. The reason we wanted to explore this was that we were seeing that black youth in particular were experiencing multiple grief events, and that black youth typically turn to online spaces like social media for mental health support. Yet few spaces existed that helped them in culturally responsive ways. So we really wanted to co-develop a community-centered, culturally responsive product that could offer an online space for healing, joy, and empowerment. I want to highlight some of the lessons we learned from our Black Youth Advisory Council about why this was important. We saw that these youth talked about wanting to go to social media for distraction, but they were often confronted with toxic content, whether hate-based content or cyberbullying. They also often wanted to find someone to talk to about their grief, but it was hard for them to find the right combination of distance and trust to be open with another person, and they were afraid of the risk of being cyberbullied. So the challenge we were navigating was how to build a digital platform that shares content without promoting toxicity and cyberbullying, and how to create an algorithm that can curate positive resources for youth of color. This is where JoyNet is today: a comprehensive design framework that includes a digital journey, core content, and the integration of psychological concepts like attention-shifting, connectedness, validation, and empathy.
I do have some slides about how we developed the AI capabilities, but it involved training algorithms on culturally-specific TikTok videos about joy that our Youth Advisory Council helped us find and curate. And we're also still thinking about the ethics of using this platform, and how to protect youth in emerging technologies. And with that, I also just want to briefly highlight the importance of policy, knowing that technology and digital therapeutics aren't necessarily a silver bullet. And we want to make sure we are not causing more harm than good. And so an example of this is thinking about how we can make sure that emerging technologies like VR and the metaverse aren't reproducing the existing disparities that we see in access to technology. And also, how can we hold tech companies accountable for designing products that might reproduce harm, and how can we protect kids' data? And I'm happy to talk more about that later. Thank you for your time, and I look forward to the questions.
[Dr. Vaile Wright]: Thank you, Dr. Willis. Clearly you and your lab are creating some really cool digital therapeutics and different digital tools. So knowing that you want to create something that is ideally effective, safe, and culturally relevant, how do you plan to engage these communities that can be hard to engage, for a variety of reasons?
[Dr. Henry Willis]: Yes. This is a really great question. And so there are a couple of approaches that my lab and others have taken, and that I really hope other people developing these therapeutics can take on. One is to first create a plan of long-term engagement. When working with underserved communities, it can't be just a drop-in to collect data or a drop-in to promote an app. We really have to think about how we can create a sustainable plan to really connect and work with these communities long-term to build trust. And I think the second strategy that we've used is connecting with trusted institutions and organizations in communities. One particular type of institution we typically reach out to is faith-based institutions, because typically that's the first place where families in these marginalized communities go when their families or children are experiencing mental or psychological distress. And a lot of the time, these faith-based institutions actually want more support and more resources to help their community members. So I think, you know, improving outreach to faith-based institutions and other community-based organizations is a great way to start to reach these communities.
[Dr. Vaile Wright]: Yeah. That’s great. I think there’s a much greater recognition of the need to meet people where they’re at in their communities and not just expect them to always come through our doors as providers or as health care systems. So, amazing work. Thank you for being with us. And excited to talk more during the panel discussion. But with that, let me introduce our third panelist. Nicole Martinez-Martin received her J.D. from Howard Law School and her doctorate in social sciences in Comparative Development and Medical Anthropology from the University of Chicago. Her broader research interests concern the impact of new technologies on the treatment of vulnerable populations. She has served as a principal investigator for research projects examining ethical issues regarding machine learning in healthcare, digital health technology, digital contact tracing, and digital phenotyping. She has examined policy and regulatory issues related to privacy, data governance, bias, and oversight of machine learning and digital health technology. Her career development grant, funded through the National Institute of Mental Health, focuses on the ethics of machine learning and digital mental health technology. Recent research has included examining bias, equity, and inclusion as they pertain to machine learning and digital health, as well as social implications of privacy and data protections for marginalized groups. Please welcome Nicole Martinez-Martin.
[Dr. Nicole Martinez-Martin]: Hello. Thank you so much for including me in today’s conversation. So I am an ethicist. And to explain that in terms of digital mental health: my broader interest is looking at how you have this promise of digital mental health improving aspects of mental health care, but at the same time, there are these risks. And so my work involves identifying the risks and thinking through ways that we can minimize them, so that we can realize the hoped-for benefits. And of course, in this area, one of the big issues that comes up is privacy. As I'm sure people are aware at this point, personal data is a business worth billions of dollars. In many of the digital interactions that we have, especially on social media, whether it's Facebook or otherwise, a big part of what's happening at one level is harvesting people's personal data to be used in different ways. And this very much applies in the area of mental health apps. For example, in talking to clinicians, I've seen that many therapists are aware of and thinking about confidentiality, thinking about what might happen to the therapeutic data. And many times on apps, if they're part of a digital therapeutic, they may talk about the confidentiality of what is said to the therapist online. And that certainly generally applies. But the bigger issue is that many of these apps are set up to collect a range of data from people, even if it isn't what's considered protected health information.
So even if the data is anonymized and excludes your name, for example, and certain characteristics, there's still a lot of data that these apps may collect, whether it's location data or simply your interactions with the app, that becomes part of a profile and runs the risk of being re-identified and attached to a person. But even if it isn't re-identified to a specific person, it can become part of a digital profile that follows that person. And that is a part of many apps' data-collection processes. So certainly one thing to always think about is doing what you can to look through what the data policies of these apps are, especially in terms of checking the privacy settings and seeing what kind of data you're giving them permission to collect. And I would say really try to limit it to what they call the most necessary data, because otherwise, because of the way that data profiles can follow people, it has downstream impacts. These are profiles that can be used later on in ways that affect things like online pricing, insurance rates, and mortgage rates. For example, if a profile follows someone that indicates they have anxiety, that they have depression, that they are young, they may be particularly vulnerable to data-targeting practices where certain things are marketed to them, such as big purchases. There has been data showing, for example, that someone who has bipolar disorder may get targeted for big purchases, because someone in a manic state may be more likely to impulsively buy them. But the same applies to things like anxiety and depression.
There have already been cases where, when people are identified as such on Google or Facebook profiles, that information is shared and they may be targeted for mental health solutions that are not, in fact, evidence-based, or that may even be harmful. And unfortunately, this can be a particular problem for people who are part of underserved demographics, such as low-income groups or racial or ethnic minorities, where the way their data profile gets attached to them can have particular impact in terms of pricing and the educational profiles that follow them. So one element is, of course, checking privacy settings, but also really thinking through which apps are going to be given access even to basic information about someone. And this also goes into trade-offs. A number of people I've talked to, both consumers using these apps and clinicians, have concerns about people being put in a position of essentially trading their data for access to a mental health app, and then, on top of that, concerns about whether that app in fact offers a lower quality of care. And as has already been talked about, there are still currently insufficient technical and clinical standards applied to evaluating apps. Organizations or agencies such as the FDA, even though they may evaluate medical devices, often consider mental health apps low-risk, despite many of us being aware that mental health risks certainly include serious risks. And because many of them are categorized as low-risk, the FDA may only selectively enforce or selectively examine them. And so there is still an issue in terms of policy, in terms of where you find your information and who is evaluating these apps.
That's where something like "digital therapeutics" is used as a term meant to show that these tools have been evaluated and have a stronger evidence base. And that's one indication. But also look to trusted sources: it may be that an app was created by a trusted organization. The VA, for example, is known to have a fairly strong evidence base. With other organizations, if you look through what kinds of recommendations they make, there may be certain ones that are known to be more trusted. As for clinicians, many clinicians I've talked to tend to rely on other trusted people in deciding what tools to use, because they themselves may be overwhelmed by the sheer number of apps. For many clinicians I've talked to, part of what guided them was looking at apps that help support specific practices, such as mindfulness: things that they thought of as having low potential for harm, but that may help with particular things a patient needs to work on. And this particularly applies when it comes to teens or younger people, this need for better evaluation of what is needed. And as has been talked about, there are real concerns about who digital mental health tools work for. There are a number of different groups, whether it's teens themselves, people with more serious mental health conditions, or people grouped by race, ethnicity, or the language they use, where there just may not be as much inclusion in research, and so there end up being concerns about how these tools apply. And of course there's the digital divide, among other issues: a much larger proportion of people of low socioeconomic status get their care through community mental health, which also has less money for the infrastructure to support these kinds of digital tools.
And a big question that came up over and over again, whether I was talking to clinicians or to developers, is the continuing need to address what happens during a crisis. Again, this is why these tools are often considered more appropriate for mild to moderate cases: more best practices still need to be developed, and there is particular concern about having recourse to a human therapist when there may be a crisis. And this can be especially concerning when it comes to young people and mental health issues. So again, in terms of oversight and accountability, this is still a work in progress policy-wise. But it often helps to check that there are humans in the system, humans as a backdrop for implementing or using the tools. You all have been hearing about wonderful work by people involved in co-design, really reaching out to communities to help design these tools and make sure they're appropriate for a number of different people. And especially check data and privacy settings, so that harms aren't also caused by breaches of confidentiality or by the ways that privacy gets set, both in terms of our society at this moment and in terms of the particular apps. And use special caution: as I said, there are some apps in the digital therapeutic domain that have been evaluated as therapy. But, as there's been more attention to in recent months, there's real concern about apps that put themselves forward as substitutes for companionship, substitutes for care, that can end up giving really harmful advice, in some cases advice toward self-harm or otherwise.
And for those apps, in talking to teens and working with children, really steering and supporting what types of apps they go to, and avoiding those apps that are not vetted, can be very important. Thank you very much. And I look forward to your questions.
[Dr. Vaile Wright]: Thank you so much. So much to think about. And I’m going to be real honest: when I download an app, I do not read the terms of service. And I’m guessing many people on this webinar don’t either. So, Nicole, if I did, or if I read the data privacy policies, would I know who owns my data?
[Dr. Nicole Martinez-Martin]: So, again, most people aren't really able to look through it. Most people don't. And it's often worded in ways that are purposely confusing. So even looking at it, it probably would be hard to know who owns your data. But often there is language in there that lets you know how much the app claims ownership over your data, and most, to some extent, do, especially those that are not part of this digital therapeutic class. And that's why, because it's so confusing, many times what you want to do is go instead to the privacy settings. More and more, there are requirements such that you can look at the privacy settings and set what the app has access to and what it doesn't. And I would say that's also something you very much want to talk to your teen or child about. Many of them are more savvy than adults about this; studies and surveys, through Pew and otherwise, have found that they often are more aware of how to use privacy settings to block data. But that's also where I'd say there are many apps that simply should not be used, because the data is being shared with third parties. And that includes, and I really should give this warning, even things like gender or gender identity. Those sorts of things seem like demographic information, but they can be very concerning in terms of how they get tracked, and could even be used nowadays with law enforcement, government intervention, or otherwise. And so those are things to be very cautious of.
[Dr. Vaile Wright]: I couldn’t agree more. Thank you again so much. And again, looking forward to our panel discussion. But first I will introduce our last two panelists for today’s talk. Dr. Elsa Friis is a licensed psychologist, researcher, and behavioral health advocate who specializes in utilizing technology to expand access to youth mental health support. She received her master’s in Global Health and PhD in clinical psychology from Duke University. Dr. Friis has published over 20 peer-reviewed articles on the co-development, adaptation, and implementation of culturally salient youth interventions in the US and globally. Clinically, she has worked and trained in a variety of settings in North Carolina and Georgia, including schools, primary care clinics, academic medical centers, and hospitals. She specializes in family-based approaches to treating anxiety, severe behavior concerns, and suicidal ideation. She firmly believes in co-creating products in partnership with the youth and stakeholders who will utilize them. She enjoys being able to leverage her clinical research and public health training to ensure Alongside is working in partnership with schools to provide evidence-informed support to students across the country. And she is joined by Ava Shropshire, a teen mental health advocate and an Alongside student advisor. She is a sophomore double majoring in Philosophy-Neuroscience-Psychology and Communication Design with a minor in Human-Computer Interaction. Holy cow! She is passionate about the intersections between mental health, media, and communities of color, and strives to use her creativity to drive meaningful change. Welcome to both of you.
[Dr. Elsa Friis]: Thank you so much for that warm introduction. Every time we introduce Ava, I’m like, I just wish I was that cool when I was in college. But anyway, Ava and I are so delighted to join you guys today to talk about how we elevate the youth voice in digital mental health programs. And we’re going to use our experience working at Alongside, which is a platform developed by clinicians like myself and teens just like Ava, that provides personalized, preventative mental health support to students in grades four through 12. And you can see here where we landed in our development: we actually do a lot of our preventative mental health screening through an AI-powered chatbot named Kiwi. That’s not where we started, because tech should never be used for tech’s sake. We really want to utilize technology to solve a problem that we’re seeing. And so when we started, we started with the problem of how to increase access to preventative mental health support for students in an equitable way across the country. We decided to partner with schools, because students spend most of their time at schools, and this is a way that we can easily give youth access to support. But first, we also talked to teens and to school professionals to really start there and ask: what do you see as the challenges, and what do you want, if we are helping provide preventative mental health support? And Ava, do you want to share a little bit more about the teen priorities we’ve heard?
[Ava Shropshire]: Yes. So what we heard from teens about what they prioritize when utilizing digital products is that they want confidentiality. They want something they can access at all times. They want something that won’t judge them or lecture them. They want something that is engaging. They want something that’s inclusive and understanding, remains unbiased, can be personalized to their individual needs, can be gamified with activities they can engage with, and doesn’t feel out of touch or cringey, or like it’s trying too hard to relate to its audience. And lastly, they want something that actually works.
[Dr. Elsa Friis]: Awesome. And when we talked with our adult stakeholders (school staff, school counselors, mental health professionals, parents), we heard some of the same things, but we also heard additional priorities around data security, making sure that this is part of a holistic system of care, making sure that we're tracking how it's being used by the youth, and that it's available, culturally relevant, and offered in multiple languages. But one of the biggest things was: don't make my life harder as someone working in the school. At the time we did this research, the national student-to-counselor ratio was 400 to 1. So with 400 students to keep track of and think about doing preventative support and counseling with, unless we're cloning our amazing, amazing counselors, which I really wish we could do, we decided that we wanted to develop a digital product and not a solely human-based product, to make sure that we could actually address that gap in our ability to provide direct, in-person support. So now, once we had this concept, how do we elevate the youth voice, take that general idea, and actually put it into a product? We incorporate the youth voice across our ideation (what do we need to provide?), prototyping, testing, and iteration. And Ava joined us for our first cohort of Alongside interns, who worked with us 20 hours a week over the summer as a summer job. And Ava, I would love to hear about how your group addressed the question: how do we support students in understanding their emotions?
[Ava Shropshire]: Yeah. So as you mentioned, we were part of the intern cohort, and we were put into groups to work on solving different tasks. So me and two other interns were tasked with the goal of figuring out how to support students in not only understanding their emotions, but regulating their emotions. During the prototype phase, we worked towards creating our own individual mood tracker prototypes in Figma. We all started off with vastly different ideas of what we wanted it to look like, and then we combined our ideas together into that finalized prototype you see there. We then received testing and feedback from fellow interns and also from the product design team, who later worked on cleaning up the prototype, implementing it into the app, and working through multiple new iterations to keep it an engaging feature. And now, if you look at the third image, there have been a lot of different things added, such as personal insights for sleep, screen time, and more.
[Dr. Elsa Friis]: Awesome. So once we bring a product to scale and actually release it out into the world, we also have an opportunity to continue to elevate the youth voice by looking naturalistically at how students are engaging with those products. What are they doing? What kind of support are they actually seeking out once we move it into the real world? And I will share a link: we have a report that has data from over 3,000 student chats from students across the country. This is a very diverse sample. We're actually underrepresented in white and Asian students, and overrepresented in BIPOC youth, which is a really amazing thing to see, a ton of diverse voices being able to be elevated. So we only have a little bit of time today, so we're going to do a few quick highlights. Ava, do you want to jump in with finding number one?
[Ava Shropshire]: Sure. So our first finding is that it’s not only social media reducing connection. About 40% of our chats tackle interpersonal relationships, with topics ranging from conflict with friends to bullying and romantic relationships. And it can be easy to blame social media as the only cause of this reduced connection among students, but our chats show that interpersonal relationships are just difficult to navigate.
[Dr. Elsa Friis]: And a fun fact here is less than 1% of our chats actually mention social media in any way.
[Ava Shropshire]: Our second finding is that one of students’ biggest barriers to getting help came down to confidentiality and accessibility. So it really matters. And to our students, access means being available when they’re available, being able to have the tools, having those tools speak their language, not having to face someone in person, having social and cultural understanding, and remaining unbiased.
[Dr. Elsa Friis]: Now, a big question when we're talking about digital therapeutics and digital platforms is their ability to be a stepping stone to actually getting connected with in-person support. And one thing that we did find is that Alongside can be that stepping stone. So last year, we found that 37% of students decided to actually reach out and share a summary of what was going on with their counselor. We were able to make some product improvements, and this fall, 75% of students are reaching out for that in-person support in addition to getting some of that early intervention or preventative support on Alongside.
[Ava Shropshire]: And our last finding is that personalization is key. Personalization in mental health care ensures that students truly feel seen and know that their cultural identities, backgrounds, experiences, and the things that make them who they are matter, since not feeling seen can often be a barrier to seeking help. Also, personalization allows the chats to feel more natural. In this example, we can see that Kiwi, our chatbot, is able to build upon what the student's going through, and that can later be used as a pathway for actionable steps and keeping up with the student's needs.
[Dr. Elsa Friis]: So here are some of our key take-homes, or recommendations, when we're thinking about what types of support and care students are actually seeking out on digital products. Number one, students recognize that interpersonal relationships are hard. If we come at this from a top-down adult view of just blaming social media for why kids are not having healthy relationships, we miss the point. We really need to meet them where they're at and help them learn the social skills to create really healthy relationships online and offline, because that's what they're seeking out as well. We also found that it's really important to balance confidentiality and support. Some students are not quite ready to open up and get support, but when we create really simple ways for students to reach out, lowering that barrier or the anxiety around getting support from someone in real life, we see that it is actually possible. So we need to constantly find that balance. And finally, we are finding that personalization is key. We need to move away from one-size-fits-all prevention and early intervention programs, and we can now leverage technology like AI to offer more personalized coaching that is student-driven and might actually increase the impact of our interventions. Finally, if you're sitting here thinking again about this big question of how we actually choose or evaluate digital products, one of the big things is: ask youth first. Something we love to do at Alongside is leverage student groups at schools to get together, try it out, and have them share with the stakeholders at their school what they're liking, why they might use it, and why they may not use it. And if it's hard for students to share openly, ask them what their friends would think. Sometimes I don't want to give bad news, but I will.
After working with youth for a long time, I am happy to tell you that their friends will think anything is stupid. So that is one way to make sure that we're getting unbiased data. And then finally, if you would like to see more insights from that report, here is a link to it.
[Dr. Vaile Wright]: Great. Thank you both so much. So we’re going to pivot to the panel Q&A in just a moment, but real quickly, I just wanted to get your take on one area that maybe we haven’t touched on yet. You know, I think post-pandemic, we’ve really opened up discussions around mental health in ways that we didn’t before. But I still think some communities are worried about stigma. So how do we engage youth in care without inviting stigma, particularly among youth and adolescents?
[Dr. Elsa Friis]: I think that’s a great question. I was actually just sitting down with a group of students at one of our schools, and it was fascinating to hear their thoughts on this and their definition of mental health. A lot of times when I’m talking with students, they’re really just talking about their daily, everyday challenges. And so a lot of this is how we shift the narrative to actually use language and words that resonate with both our youth and with the adults who are making decisions. But I would love for Ava to also chime in on how we can destigmatize mental health.
[Ava Shropshire]: Yeah, I definitely agree with what you said. I think especially when it comes to cultural stigma in different cultural communities, a lot of it is rooted in past history. So having education, having more representation in the types of people that we see in mental health support, and just having open discussions and conversations is a great way to destigmatize mental health support and access.
[Dr. Vaile Wright]: So now I’d like to welcome all of our panelists back to the screen. Gosh, we’ve covered so much ground, so much interesting space. I want to revisit what I’m calling the elephant in the room right now, which is this distinction between digital tools that were created with either health care or emotional wellness in mind, and digital tools that have been developed for entertainment purposes but maybe, as Nicole mentioned earlier, are being used as substitutes for some of this care. I just want to hear people’s thoughts on that space, because there have been some very high-profile news stories recently about the harms that can happen, particularly in youth. And so, happy to get some thoughts. I don’t know, Stephen, if you want to start us off.
[Dr. Stephen Schueller]: Yeah. Well, I mean, I think it's really important to make this distinction between what are actually three categories of things. There are digital therapeutics, which are medical interventions that have evidence of clinical effectiveness; some tools available now have FDA clearance, and I think we'll see more in coming years. So that is a category we should be aware of. There are wellness products, which are generally focused on trying to promote our well-being in some way. And then there's everything else out there that potentially could be useful for our mental health and well-being. And look, I'll say that when I'm having really bad days, I have a Spotify playlist that has some of my best mood-boosting songs. You know, I've got all the T-Swift that gets me through the day. And when I go to that, that's useful. But thinking that that's therapy would be inappropriate. And so I do think there's a role for technology to still be usefully helpful in promoting our mental health and well-being. And I've heard a lot from youth about even social media, which I think gets a really bad rap. You know, youth who've created finsta accounts that have everything that really just brings them joy, and going to those things in times when they're dealing with dark moments. These can be opportunities for good and benefit. But it's not therapy. It's not a treatment. It's not these sort of digital therapeutics. And they're not being evaluated at the same level. So treating one of those products like it is, like going to ChatGPT as your therapist? Strongly do not recommend. And that's not what those products are made for. Not that they can't be beneficial.
I mean, I’m sure we’ve all seen movies or TV shows that can, like, really speak meaning to us and can be beneficial for our mental health, certain things that get us through those tough times, but that they’re not therapy and they’re not treatment. I think we really do need to make that distinction.
[Dr. Vaile Wright]: Yeah. Anybody else want to chime in on this particular topic?
[Dr. Elsa Friis]: I think one way to think about this is really to think about the difference between a product and a program. When we're thinking about this AI space, we see something like Character AI, which I would consider more of a product, versus Alongside, which I would consider a program with a company behind it. If you are a parent googling a product or an app, and it is a wellness or digital therapeutic program, you should be able to find a website behind it that helps you understand: What is the goal? How is it supporting youth? Who's behind this digital program or product? That's the simplest way I like to talk to parents about it: Is there a website explaining it? Are there people behind this that I can easily see? I also think, when we're thinking about the application of AI technology, what's interesting in my space in particular is that we hear these horrible stories of children having harmful outcomes from engaging with a chatbot and AI. But AI is also the only reason that I'm actually able to identify kids who need more support. So the same technology isn't always good or bad. It's about how you use it and apply it.
[Dr. Henry Willis]: Yeah. I would also say, for researchers, it's important for us to investigate the why. Why are you turning to these products versus the evidence-based therapeutics? When we think about Character AI, is it because users can create this other person that speaks to them and relates to them? Or if we're thinking about how youth go to TikTok for mental health resources, is it because the people giving the information are representative of their experience and what they look like? I think investigating this why can help us better improve the evidence-based therapeutics that do exist, and that can hopefully combat the harm of these products.
[Dr. Stephen Schueller]: Yeah, I'll actually just make a quick point jumping off of that, because I do think, you know, we talked in some of the presentations about trusted sources and credibility. I think the VA and DOD apps that were brought up are really great. But I'm always reminded of a statement from a focus group we were doing with individuals who were using mental health apps, asking about how they find them. One participant brought up: okay, if it's associated with a university, my assumption is that it's probably going to be a pretty crappy app, because it's not fun and it's not engaging. And I think that's the reality. I say this as a researcher; we build a lot of these things, and I don't think historically we've done a great job making them sticky and interesting. And youth especially are really aware of that. So I do think we have to remember that, yes, the research is important, yes, the clinical evidence is important, and we also have to lean into what kids are looking for. So I think your point there, Henry, is a really great one. Especially as academics and clinicians, I think we can't hold too precious that we've got the evidence-based treatments and that's what matters. It's got to be both. It's got to combine those aspects. And that's one reason why it's just so critical to center youth here, listen to their voices, and build things that they want to use. Because, again, the reality is, even if we have a really efficacious product or program, it's not going to be impactful if youth don't actually engage with it.
[Dr. Vaile Wright]: Yeah, I really appreciate that. I think what I'm hearing is sort of the nuance that exists in this space, right? That technology, whether it's social media or AI, none of these are all good or all bad, right? It's how you use them, why you're using them, and in what manner. But just to circle back: Nicole, I was wondering if you had any thoughts about how we can address some of these safety and ethical issues that we're seeing in the news right now, particularly as it relates to these AI chatbots, but across the spectrum of these digital tools?
[Dr. Nicole Martinez-Martin]: I think there are a couple of different levels. Since I'm an ethicist, I often speak at a policy level. And certainly, there remains a need for just a better approach for sorting through these tools. I've spoken to clinicians who have no idea who to go to to figure out how evidence-based an app is. So I still think there is a need for that kind of organization, whether it's through the APA, which has made efforts in this area, or otherwise, to get better vetting of which apps are useful. I also very much agree that you have a lot of apps that may be evidence based but not as usable. Honestly, a lot of times clinicians have made apps and talk about how they used them to replicate what they're doing, and then on the other side you hear, "I don't want someone nagging me like a doctor." So usability is a very critical part. But I would also say, on the other side of it, and it really goes to when you're a parent and your child needs help: part of the problem, as Henry, Elsa, and Stephen have all been getting at, is that there is a real problem with being able to afford and access help, and people will end up interacting with a ChatGPT or otherwise just out of need.
But it's really about recognizing that there may even be a website. Replika and Character AI had websites that would talk about, oh, we're providing a friend, or otherwise. What you really still need to look for are the indicators of what its real purpose is. Is its purpose really about providing some mental health care, or is the purpose keeping you engaged in some fashion so that you will keep sharing data, so that you will keep going back to their website? That latter really needs to be avoided. Those products are not only not created to support people, they're created in ways that are actively harmful, and so they need to be avoided.
[Dr. Vaile Wright]: Yeah. It's certainly sort of the Wild West out there in a lot of ways. And as we're thinking about these different types of tools, one of the questions that we've been getting from our audience really has more to do with these online therapy platforms. Obviously, pre-pandemic, telehealth was not really a word in most of our vernaculars. Now, of course, it's in everybody's. But one of the questions we often get asked, even at APA, is: are online therapies just as effective as face-to-face or in-person? And particularly when we're talking about youth and adolescents, what do we know about this space? Does anybody want to touch on that a little bit? Maybe Henry, since you talked a little bit about online therapy in the beginning.
[Dr. Henry Willis]: Yeah, yeah. I mean, I think it can be just as effective. I think it can be more accessible in some contexts. I think the sticky point is really thinking about all that has to go into it, especially for underserved or marginalized communities. You still need insurance. You still need a private space to have that therapy, which is a luxury, especially if you live in a place like New York City or LA. And especially when we're thinking about teens, we also have to think about the availability of parents to be involved; even if it's virtual, it can still be hard to attend consistently. But I have also seen it work very well. I have seen it really be a way to meet families where they are, getting out of the therapy room and solving some of the other barriers we see, such as transportation, maybe even childcare in some contexts. So I think it can be effective. I do think we need more structural policies and structural solutions to make it more accessible for most families.
[Dr. Vaile Wright]: Yeah. I think there was a lot of promise with telehealth right after the pandemic. And while it probably did address some barriers around access, it also exposed a lot of the disparities that exist in this space. The other thing I hope we continue to see is more research. If we think it's effective overall, which generally we do, who is it most effective for, and under what circumstances? And when we're thinking developmentally about children and adolescents, I think that's an important aspect as well. That relates to some of the other questions we're getting, which are really around developmental age appropriateness for different tools. What's the earliest age at which you would recommend online therapy or any of these mental health apps for children? And are there any differences in effectiveness or safety based on the age of the individual? Elsa, I'm going to kick it to you first because this is such an easy question, right?
[Dr. Elsa Friis]: Such an easy question. You know, there are so many things to think about in terms of that match and whether it is developmentally appropriate. I will say that I have also worked at a great company called Little Otter, which specializes in care for ages 0 to 12. Doing teletherapy with a three-year-old is very interesting, and a little bit challenging. In the younger years, what we're really focusing on is how we empower the parents with the skills to intervene. Even so, doing teletherapy with younger kiddos was actually such a joy, because I got insight into the home. How are you engaging when the kid is fully comfortable? You're not coming to me and saying, "They're so well-behaved now, but at home they do this." Getting that insight into what's going on in the home for our younger kids was incredibly helpful for me as a therapist. And do we see attention spans coming into play if we're doing direct therapy with teens? Of course. So you've got to figure out a way to make it more engaging. That doesn't mean we can't make it effective, but we've got to think about how to deliver more creatively. For products themselves, we also really want to know the level of oversight, the safety, and the data privacy, being really aware of those with all ages of kids, and then making sure that those products are developmentally appropriate. We're testing with kids as young as third grade, but we also know that we're working alongside in-person support, and so we are able to make those developmental accommodations.
[Dr. Stephen Schueller]: Yeah, I'll say in terms of the evidence base for mental health apps for younger kids, it's really lacking. I showed some of those meta-analyses; one of them included studies with kids as young as six, but there are very few studies there. It's more challenging to do research with younger kids, and I think a lot of researchers and companies have shied away from it. So it's not that these tools don't work; the absence of evidence is not evidence of absence. We really just don't know. I do think the developmental concern is a really important one. We did build a product in collaboration with a local partner, an app for kids from kindergarten through 12th grade, but there were different versions of it as the ages went up. The kindergarten-through-third-grade version actually had a companion coloring book, so there was some digital content, mostly stories, and then you would work through some of those things. So as we think about the younger kiddos especially, we should think about how we create scaffolds around these tools. I have nine-year-olds, and I will say they love mental health apps. And it wasn't because of me; my wife gave them one, and they really seem to like it and get a lot out of it. But there are very few studies that support that. So it's really interesting seeing what I see in my household and then thinking about where the evidence base is to line this up. I think there's some promise, and there are a couple of studies, but again, the research is really emerging, especially for the younger kiddos.
[Dr. Vaile Wright]: And I think one of the challenges, of course, is that research takes so much longer than the speed of technology, right? So I think calls for additional research are really critical, but we also need to think innovatively about what that research looks like and how we speed it up in a way that still has rigor but can be used in more real time to address some of these really important questions we're being asked. Another question I often get, particularly from media outlets, is: given this lack of research, is some digital support better than nothing at all? Henry, I'm going to kick that one off to you.
[Dr. Henry Willis]: Yeah. I do think it can be better than nothing at all. I think it can really be a starting place for a lot of people, whether that's increasing psychoeducation around what therapy is, what basic symptoms are, or how to navigate therapy and the health care system. So I do think it can be a starting point and a check-in. One of the things we also have to think about, in terms of digital support versus nothing at all, is the role of technology literacy among parents and families. How can we support parents in really understanding what digital support is and how to navigate these technologies? Because they aren't trained to do that. And I really like the point Elsa and Ava brought up about what parents want, in terms of not making their job harder. A lot of times when we are creating these products and thinking about digital support, we're not keeping that in mind.
[Dr. Elsa Friis]: I also want to highlight that we can think of different levels of rigor of data and support. Working in schools, one of the frameworks we have is the Every Student Succeeds Act, which actually has guidelines and tiers around the level of evidence for any interventions, including proactive and preventative interventions, provided in schools. The basic level is just demonstrating logic and rationale. So even if we haven't had time to test something yet, how are we tying it back to what we know works or doesn't work? And then we can slowly build that evidence base as we grow.
[Dr. Stephen Schueller]: I will share a cautionary tale that comes from Greg Simon's work with the Mental Health Research Network. They evaluated a sort of digital dialectical behavior therapy tool, not for youth and adolescents, this was with adults, but for individuals who had thoughts of suicide. And they found that the tool was less effective than care as usual. So it was not better than nothing. Now, an important caveat: the people who were assigned to that condition actually didn't use it much. So I don't think the tool itself was bad, but I do think it points to the care pathway. It goes back to one of the concerns I raised earlier, which is the lack of later engagement with care. We have to think about where we put these tools into workflows, what we're providing, what the follow-up is, and what the engagement is. I worry about kiddos who get given something digital that's not working and who then don't engage with better care. So I agree with your point, Henry; these tools can really be a lifeline and a resource in places where no other resources exist. And at the same time, I also want to hold up some of these cautionary tales so that we think about doing right by embedding these tools into services. We have to hold both sides of that: how do we use these tools effectively while respecting the fact that they are a tool in the toolbox? They don't replace everything, and I think we have to think about these continuums of care.
[Dr. Henry Willis]: Yeah, I'm really happy you brought that up. That's a really great point. We can't look at these things as a silver bullet for fixing these problems. And I think that also brings up the importance of integrating this kind of training into our clinical and mental health programs. There is still a big gap in how we're training mental health service providers to integrate these digital tools, like Stephen is saying, and I do think we need to raise that up. As technology advances and these tools multiply, we also have to equip clinicians with knowing how to use these tools and integrate them into care.
[Dr. Stephen Schueller]: Yeah, the mental health of our youth is important. We should be investing in it with multiple resources. It should be therapists and it should be digital tools. It should be a variety of things. So I think, you know, really thinking about how these tools coexist with those resources, I think is the critical piece.
[Dr. Vaile Wright]: And Nicole, do you have any specific advice for parents when they're concerned about what they should be looking for if they've got children using these types of apps, or they're concerned about screen time? What's a practical thing a parent can be thinking about?
[Dr. Nicole Martinez-Martin]: So, besides what we've already talked about, in terms of knowing what children are specifically engaging with and doing some background on it: privacy settings. I know I keep harping on it, but I still think it cannot be overstated how much the data component can end up having an impact, both on the quality of what the children or patients are receiving and in terms of its other implications. But I would go back, again, to really thinking more about the system of care that this is a part of, which we're all talking about here. Rather than relying on a tool alone, you really need to think about what other oversight is going to be involved. Hopefully, and I understand access is very hard, you have a therapist or someone in the mix helping with that oversight; those things remain very important. Because what I thought of when Stephen was talking is that sometimes it comes down to whether, even once you have a tool that has some effectiveness, you keep up with it. I know, when it comes to the children in my life, and including myself when it comes to doing therapy: therapy is hard work. Part of what might keep people engaged is having that human there; it's a lot easier to put a tool down. Having a human who is able to work with you, listen to you, and engage with you is very much a part of what keeps people engaged.
And so think of that larger network and structure around what the child is engaging with, one that supports their overall well-being and mental health.
[Dr. Vaile Wright]: Yeah, that's excellent. Thank you. I want to acknowledge that many of the questions we got from our audience today were, "Can you tell me about this app or that app?" I get asked that a lot, too. I often defer those kinds of questions because these apps can change so frequently; you could say it's a good app one day, and maybe it's not the next. But in that spirit, and with just the couple of minutes we have left: do any of the panelists have specific apps that you would recommend in this mental health space?
[Dr. Stephen Schueller]: I feel similarly to how you feel, Vaile. I try not to make any specific recommendations, just because, as you mentioned, they change often, and I try not to endorse any specific products. I'll just say, again from the voices of my kids: they like to use the Headspace app. They love the Unicorn Island content. They love the Star Wars content. They used to love the Sesame Street content. And I think it goes to the point I made earlier about trying to have content that's really responsive to youth and where they are. So finding content and apps that really are tailored to youth, I think, is a critical piece. Again, it's not a be-all and end-all, but that's content they like, and I think they get some benefit from it.
[Dr. Vaile Wright]: Well, we are almost out of time. Any last thoughts from Elsa, Henry, or Ava? One takeaway that you'd like the audience to have from today. Elsa, I'll start with you.
[Dr. Elsa Friis]: One thing to keep thinking about in navigating this world, whether you're a parent, in a school, an administrator, or a teacher: we as adults have to educate ourselves, because our kids are probably already ten years ahead of us in this world. They are already going on social media. They're already seeking out TikTok videos to help diagnose their ADHD. They're already there. So it's up to us to think about how we can continue to educate ourselves and meet them where they're at, in that happy middle ground.
[Dr. Vaile Wright]: Henry, quick, quick words.
[Dr. Henry Willis]: Yeah. I think one of the common themes throughout all of our presentations is really the importance of centering youth, and I just want to continue to raise that. Similar to what was said earlier, we can build a pretty app, but if nobody comes to use it, what's the point? So reaching out to those youth who might be most at risk is really most important: uplifting at-risk youth and centering their voices and needs in this next phase of technology interventions.
[Dr. Vaile Wright]: And Ava, any quick thought to share with the group?
[Ava Shropshire]: Yeah. I guess the last thing I would say is just a reminder that digital therapeutics aren't necessarily here to replace the human connection that we can have with therapy and mental health resources, but to be that stepping stone, that pathway, for students and for people from underserved communities or who lack access, to really enter that world and hopefully reach real-life support.
[Dr. Vaile Wright]: Thank you, that's a perfect ending. I appreciate that, and I'd like to thank the entire panel for this thoughtful discussion of the evolving landscape of mental health related digital solutions for youth. This marks the conclusion of the Children and Screens Ask the Experts webinars for 2024; we'll see you in 2025. If you found today's webinar insightful, we encourage you to keep these important conversations going. Your support ensures the Institute can keep bringing you expert guidance through future webinars. Donating is simple: just scan the QR code on your screen right now, click the link in the chat, or visit the Children and Screens website at childrenandscreens.org.