The growing ease of access to social media and the internet has, unfortunately, also created increased opportunities for bad actors to groom, exploit, and abuse children online.  What do these experiences look like, especially with the rise of Generative AI, and how can we build both caregiver and child awareness to recognize situations and interactions that require extra vigilance or total avoidance?

Children and Screens held the #AskTheExperts webinar “Child Sexual Exploitation and Abuse Online: Virtual World, Real Victims” on Wednesday, September 18 at 12pm ET. Panelists from the Department of Justice, the National Center for Missing & Exploited Children, and other experts in youth experiences with online exploitation and abuse provided parents and caregivers with the information they need to help youth avoid the most common exploitation and abuse scenarios they may face online.

Speakers

  • Jennifer Newman

    Executive Director, National Center for Missing & Exploited Children
    Moderator
  • Steve Grocki

    Chief of the Child Exploitation and Obscenity Section (CEOS), Department of Justice
  • Dorrian Horsey, CIPP/US

    Attorney, Minc Law
  • Elizabeth Jeglic, PhD

    Professor of Psychology, John Jay College of Criminal Justice
  • Melissa Stroebel

    VP Research & Insights, Thorn

00:00:00 – Introductions by Chief Science Officer of Children and Screens, Dimitri Christakis 

00:01:14 – Jennifer Newman on the importance of parent education about child sexual exploitation to effectively engage their children early on this issue

00:11:43 – Melissa Stroebel on the prevalence of online child exploitation incidents, with survey results from young people

00:22:13 – Follow-up: What age should we start talking to children about online safety and exploitation?

00:24:15 – Elizabeth Jeglic on typical behaviors and characteristics of online sexual grooming and tips for open communication in families

00:37:03 – Dorrian Horsey on the issue of financial sextortion and related tips for parents 

00:47:27 – Follow-up: What recommendations would you give to mitigate risks from online child sextortion and exploitation?

00:50:51 – Steve Grocki on a holistic societal approach to tackling the issue of child exploitation online and new developments in AI-generated images

01:03:44 – Follow-up: How can we understand the different apps on the market, and are some more dangerous than others?

01:10:02 – The panel addresses questions from the audience.

01:10:16 – What are specific risks and considerations for the neurodiverse community?

01:12:20 – How can we keep kids safe online while also allowing them to explore?

01:15:50 – How does law enforcement tackle abuse taking place within the home and children who have access to sexual materials? 

01:27:40 – What are final thoughts on how parents can tackle issues of prevention and messaging with their children?

01:32:43 – Wrap-up with Children and Screens’ Chief Science Officer Dimitri Christakis

[Dr. Dimitri Christakis]: Hello and welcome to today’s Ask the Experts webinar, “Sexual Exploitation and Abuse Online: Virtual World, Real Victims.” I am Dimitri Christakis, Chief Science Officer at Children and Screens, filling in today for Executive Director Kris Perry. The growing ease of access to social media and the Internet has unfortunately also created increased opportunities for bad actors to groom, exploit and even abuse children online. What do these experiences look like, especially with the rise of generative AI? And how can we build both caregiver and child awareness to recognize situations that require extra vigilance or even total avoidance? What can parents do if their child becomes a victim of harassment or sextortion? Today, we’ll be exploring these critical issues with a panel of experts who will provide the information you need to help youth avoid the most common exploitation and abuse scenarios they may face online. Now, I would like to introduce you to today’s moderator, Jennifer Newman. Jennifer is an executive director at the National Center for Missing & Exploited Children, the nation’s largest and most influential child protection organization, where she oversees its Austin, Texas, branch office. Welcome, Jennifer.

[Jennifer Newman]: Thank you so much. Appreciate the introduction, Dimitri. And really, thank you to Children and Screens for providing this opportunity. And most importantly, thank you to you all for joining today or listening to this after the fact. Because really, education and awareness are what is going to keep our kids safe as they navigate the digital space: understanding what kinds of harms and risks they may encounter. As a parent, or as someone who cares about the children in their lives, it’s important for us as the adults in this situation to be well-educated so that we can help guide them through this space. So, again, thanks to Children and Screens for the opportunity, and to you all for participating. Before we get over to our panelists, I just wanted to set the stage a little bit in terms of what we’re talking about today when we talk about online child sexual exploitation. For many of the experts who are with us today, we all remember maybe even five, ten, 15 years ago when child sexual exploitation really wasn’t talked about. It was something that, naturally, the public and media wanted to avoid talking about. It’s an awful topic, but not talking about it doesn’t change it or make it better. So, the good news is that we’re starting to see a lot more coverage, discussion, and awareness around online child sexual exploitation. The bad news is that, as we look at some of these headlines, which are from this year, there are so many terms when it comes to online child sexual exploitation that when you’re trying to educate yourself or talk to your kids about it, you have to navigate all of this as well. Terms such as sextortion, sexting, online grooming, child sexual abuse material, and deepfakes, now that, as Dimitri mentioned, we’re talking about generative AI and that technology. What is all of this?
How is it a threat to our children, and what steps and guardrails can we put in place to help protect our kids as they start living more and more online? You know, the phrase that we’ve often heard a lot is digital natives. Our kids are growing up with tablets and phones in their hands, and that’s not a criticism; it’s just the way our world is evolving toward technology. And with that, we just have to be able to respond and also, again, help our kids navigate that space. So, again, lots of different topics out there. We’re going to address some of them today. And you’re also going to hear about some great resources, some tips and tricks that you can put into place right now when you’re talking to your kids, and also just help inform you on what’s out there. Because ultimately, this is a long game. You have to keep talking to your kids about this. This is not a one-and-done where you just sit down on a Saturday, talk to your child about online safety, and then kind of check the box. The reality is that the technology is changing, and how kids are interacting with technology is changing, which means that we always have to follow that to make sure that they’re staying safe. In terms of trends that we’re seeing: at the National Center for Missing & Exploited Children, we operate the CyberTipline, which has been in existence since 1998. This is where we receive reports of suspected child sexual exploitation, whether from members of the public or from electronic service providers. You can see that last year alone we received over 36 million tips involving suspected child sexual exploitation. Of particular note, you can see that in the past few years, the volume of online enticement reports has increased 300%. And again, this is really, you know, probably some of COVID and the pandemic and how we transitioned so much of what we did in real life into online.
And we’ve kind of stayed in that pattern, between schooling being online and a lot of interaction being online. And then certainly, again, with all these great apps and platforms that are out there, you know, they’re intended for good. They’re meant to connect kids, they’re meant to connect people together. But unfortunately, they are also exploited by these offenders, who use them as ways to target children. And so, we just really need to prepare ourselves. Think about sending your child out into the real world. Before your child heads off to the mall to go shopping for the day, or off to the movies, we sit our kids down and we talk to them. We say, “Don’t take the shortcut through the dark alley.” We say, “Stay where it’s bright and it’s lit. Stay around people.” We tell them what to do if someone pulls over and tries to get them into their car. We are accustomed to that. It is instinctual for us to prepare our child that way. But unfortunately, sometimes we don’t do that same thing when we talk about the digital world. All these platforms are a different place your child is going. And just because your child is at home and sitting next to you on the couch, they are in a different space. They’re in a different landscape. They are navigating a world that they need to be just as prepared for, and that’s really what we’re going to talk about today: not necessarily stopping them from going online, the same way you wouldn’t stop your child from leaving your house, but how to equip them and how to prepare them for that. And so, you know, I think this is the first step. One thing we at the National Center always talk to parents about is that you need to educate yourself first. Because the other thing, too, is kids know a lot about technology, and oftentimes they can know more than their parents.
But that shouldn’t be a blocker. Take it as a challenge: “I am going to learn as much as I can about it. I want to be able to talk to my children about this, and those kids in my life.” Then, you need to talk to your kids. That is absolutely the best tool in the toolbox: communication with your children. You know, talking to them about what they’re seeing online. What have they experienced online? What about their friends? What’s going on in their school? Kids talk to each other a lot about this, and it’s time for us to join that conversation with them. But this isn’t always about just policing or using technology to track your children. It’s really opening that line of communication, which then leads to engagement, which means, as I mentioned before, constant, continual, regular engagement with your child. Check in with them. Are you on any new gaming platforms? Are you on any new social media? Where are all the kids meeting these days online? Make sure that you keep that conversation going and open on an ongoing basis. What’s also really important is to prepare yourself to be a good parent or role model or adult if a child does come to you because something did happen. You know, one thing we really try to talk to parents about is that it’s very easy to react with fear and anger, which, of course, is just human reaction. But that will scare or deter a child from disclosing to you or asking for your help. And so it’s also: how do we put ourselves in the right frame of mind so that if a child does come to us and say, “Hey, this person reached out to me online,” or “I got this weird direct message on this app,” we know what is in place, what can be done, and how we go from there, instead of just reacting? Obviously you will react, as you do as a caring adult, but really just make sure that those kids know you’re who they can go to for help and that there is help out there.
Before we turn it over to Melissa, who’s our first panelist, I just want to share two things that may be of interest and kind of thematic throughout the rest of our time here today. At the National Center, we offer “NetSmartz,” and this is all free programming: educational materials and information about all the issues that kids could be facing online, from cyberbullying, to sharing and disclosing too much data, to sextortion and online grooming. They’re all geared for the appropriate age ranges between kindergarten and 12th grade. So, really fun games for the younger kids, where they can be characters and cartoons, and then for those older kids, more discussion guides: how to start these conversations with your kids so you can have meaningful conversations with them about Internet safety and online safety. So, I wanted to make sure that I covered that. And then also, this is a new service that we’ve had. It’s about a year old now, and it’s called “Take It Down.” What this is, is a new tool that we’ve released. It’s anonymous. And what we’ve heard from kids is that they just want content down. When the nudes that they’ve shared have been distributed without their permission, when things have been posted online, when they’ve lost control of the distribution of an image, they come to us and they just say, “I just want it down.” And so this is a way that we can do that. You know, you can certainly find more information at the website. But again, this is anonymous. The images are not sent in to us. They stay right on that device. But we work with partner companies such as Instagram and WhatsApp and Snapchat, where they can look for those images and try to take them down. So we’re really changing that narrative. About ten, 15 years ago, the narrative was that the Internet is forever: if it’s posted online, it’s posted online. That narrative is just different now.
We are able to say there are things we can do, there are resources that we can use, to try to help stop some of that distribution. So with that, you know, really trying to just set the stage for what we’re going to be talking about today, I’ll now turn it over to Melissa to share her screen while I tell you a little bit about her. I’ve had the pleasure of working with her for over 15 years now, close to 20. She’s the Vice President of Research and Insights at Thorn. Melissa has spent nearly 20 years combating the online sexual exploitation of children as a victim identification analyst, product manager, and researcher. In her current role, she serves as a subject matter expert and drives strategic research examining emerging threats and trends in online exploitation. Take it away, Melissa.

[Melissa Stroebel]: Thanks so much, Jenn, and our hosts at Children and Screens for inviting me and Thorn to join you in the conversation today. To our audience, whether you’re watching us live or coming back to review this recording again: echoing Jenn’s sentiment, it is so important just to start with education and learning about the topic. So when you think about what you can do, you’ve already taken the first step in doing that. Thank you for joining us today. As Jenn said, my name is Melissa Stroebel. I lead our research initiatives at Thorn. Thorn is a nonprofit that’s building technology to defend kids from online sexual exploitation. Now, to ensure that we’ve got the best chance of success in combating this issue and safeguarding our young people from the risks they may encounter online, we conduct ongoing research to learn from those on the front lines. So who’s included in that? It is investigators working these cases every day. It is the trust and safety teams. It’s caregivers, adults in communities like this. But also, really importantly, it’s the kids who are confronting these things head-on day in and day out. Because if we don’t understand their perspectives and their experiences, we are doomed to fail to meet them where they are. And that sets us up for a much steeper uphill battle in terms of equipping them with the knowledge that they need. So what are we learning? Well, the reality is that the majority of hands-on child sexual exploitation does still occur within communities. It is those community members who are trusted. It is family members, unfortunately, and neighbors. But the reality is that the Internet creates some new opportunities there. And we have to evolve our thinking to not only anticipate where risk might show up in the offline world, but also how it might show up in the online world. It’s also changed really what growing up means, and honestly, it’s changed what socialization means for all of those kids and adults.
There’s no difference there. And when we think about what growing up means, the reality is that also includes sexual exploration. It is normal and healthy for kids to be thinking about flirting and exploring their sexuality as they get into puberty and older. But what that conversation looks like in the digital age is something a little bit different. So, what have we heard from kids directly? Well, one in seven say that they’ve already shared their own nudes. This is 9 to 17 year olds, and about a quarter also think that it’s normal for kids their age to be sharing nudes. So, this is not an outlier experience for them, or in their perception of how their peers interact. In fact, we’ve had kids actually tell us outright that sexting is kind of the “new second base.” Now, keep in mind, this is not something only teens are doing. We can’t just think about this from a “teens being teens making reckless decisions” type of perspective, because it sets us up for failure in the conversation. This might be surprising to everybody on the call, but we’ve done research with caregivers as well, and a third of them have also shared intimate images. So when you think about how kids are thinking about flirting and dating, if that many caregivers are using their phones as a form of flirtation, it’s unsurprising that kids are starting to do it as well. But not all risk is created equal, and it’s not just limited to teens. We see pre-teens who are also starting to engage in these behaviors. We’ve had nearly one in 12 9 to 12 year olds tell us they’ve also shared nudes at some point. Now, some of this is more innocent flirtation. As an example, let’s say your 16 year old has come home and is thinking about how they’re going to keep flirting with the new kid in their lab class. It’s no longer that flirting ends when the school bell rings; it comes home with them. It’s in their pockets at night, depending on what your household rules are.
But it is an extension of that in-school flirtation that may be happening. Not all cases are like that, though. Half of the kids who have shared nudes actually tell us that they shared them with somebody who they only knew online. So, this is not a person who they can already verify: “I’ve met this person face to face. It’s a teenager in my community. It would not be uncommon for us to be socializing in this way.” A large number are also sharing images with people that they’ve never met outside of being online. And it’s not only kids they’re sharing these images with. We’ve actually had kids tell us that they have sexual interactions with people they believe to be adults at about the same rate as they’re having sexual interactions online with other kids. About one in four say they’ve had some type of sexual interaction. That means sharing nudes, it means flirtatious texts, it’s being asked by someone they’ve never met before if they can share nudes or potentially go on camera for a more explicit stream. When we think about these different dynamics, we can see that there are very different pathways leading to this really risky situation. And the risk that can result looks different depending on the situation as well. What that means is that we need to be responding accordingly, so that it is not just a blunt response of “Just don’t talk to strangers,” because then we miss the real context of some of what they’re actually experiencing. Now, worse, on top of all of this, is that a lot of kids who find themselves in one of these situations choose not to share it with anyone. They are in that position for whatever the reasons are. We hear that for some, it’s because they don’t believe it’s that big of a deal. For others, they’re worried about the consequences that are going to come, either in terms of losing friends at school, or losing access to their technologies at home, or getting grounded.
Because of some of those consequences, rather than succeeding entirely at deterring the behaviors themselves, it’s actually serving as a deterrent to coming forward and asking for help. Now, as we’re continuing to catch up with the reality of what kids are facing online just by the mere advent of the Internet and its integration into mainstream everyday life, technology is racing ahead. And that’s something we see within our research: we have to, yes, think about how we equip them to deal with what is more mainstream today, but we also have to look to the future and anticipate what is coming. The reality is that oftentimes young people are some of the earliest adopters of new technologies, and that means they’re sometimes encountering risk well before the adults in their lives anticipate that that’s going to happen. So, what are some of the risks that we’re keeping our eye out for? I’m just going to give you a quick, high-level snapshot, because I know some of our other speakers today are going to dive into these topics a little bit more deeply. One of the things we’re watching is financial sextortion. As Jenn indicated, we have seen a significant increase in this type of online threat over the last couple of years. And it’s really distinct from what we’ve seen in the past, in that we have organized offender groups and a very focused victim group of teenage boys, who historically we haven’t seen targeted as much as teenage girls and younger kids. We’re also trying to keep up with generative AI and what that means in terms of both the opportunities and the risks that young people are facing. Generative AI does create new ways for bad actors to create abuse imagery of minors, sometimes from completely benign images that are available online. But worryingly, we’re also seeing these same types of abuse behaviors taking place between peers within schools and within their communities.
It’s something we need to evolve and respond to very directly. The last is that we are now at a place where young people anticipate encountering some of these things online, and they are proactively looking to those around them to help support them in navigating, avoiding, and responding to these risks. Platforms play a critical role in that conversation, and kids are telling us they’re looking to platform safety tools as a means of staying safe online.

[Jennifer Newman]: That’s great. Thank you so much, Melissa. I do have a quick question for you, but before we get to that, you know, on your slide, you had “SG-CSAM.” Can you just quickly define what that is for the group?

[Melissa Stroebel]: Thank you. Thank you. So “SG-CSAM” is the acronym for the term we refer to as self-generated child sexual abuse material. Put very plainly, it is sexual images, abuse images, of a minor where there is no apparent other person in the images to be taking them. Now, sometimes we would just call that “nudes”; that might be how we think about it. But what’s important to remember is that just because we don’t see a perpetrator in the images themselves, it doesn’t mean that that child has taken that image of their own volition. Yes, there are cases where nudes are being shared for exploratory purposes. But it’s also very true that those images are being created because kids are being extorted, they are being groomed, or other types of harms are at play.

[Jennifer Newman]: That’s great. Thanks. And this may be a little bit of a loaded question to answer in a short amount of time, but one thing I remember you saying when we were at that Interpol conference a few years ago was, “Our kids are already talking about sex. We just need to join the conversation.” And I think this data is so powerful and insightful and shows that, especially with the age of nine as the youngest in your study. So, I guess my question for you, again pretty high level because we need to move on to Elizabeth: what age should we start talking to kids about this? What is appropriate for learning levels and understanding and comprehension? What would your recommendation be?

[Melissa Stroebel]: Yeah, so the way we think about it is early and often. Number one, it is happening sooner than we think. Number two, we also need to keep in mind that if we come in cold to a big conversation about “have you ever been asked for nudes” or something along those lines, it’s going to be very abrupt. This is a hard conversation. Anything touching on kids’ bodies is hard for adults. It’s also hard for the kids. But we don’t need to start the conversation about online safety with that specific risk of sexual exploitation. There’s a full world of runway that is age appropriate for much younger kids, five, seven, nine, where we’re talking about things like autonomy, who you turn to for help if you don’t feel comfortable about something, and what respect and healthy interactions look like in online places. And quite frankly, just sitting down with them while they use technology creates a really open conversation that is much lower pressure than the types of higher-risk conversations that need to happen as they start entering puberty, really ten, 11, 12 years old.

[Jennifer Newman]: That’s great. Thank you, Melissa. Thank you so much. I’m sure we’ll be back with you towards the end for some questions. So, thanks. Elizabeth, while you share your screen, I’ll go ahead and read your background. Dr. Elizabeth Jeglic is an internationally renowned expert in sexual violence prevention, grooming, and child sexual abuse. She’s a licensed clinical psychologist and professor of psychology at the John Jay College of Criminal Justice in New York, and an expert witness and consultant as well. She has authored over 160 journal articles and book chapters, as well as six books, including Protecting Your Child from Sexual Abuse: Sexual Grooming and Sexual Violence. Dr. Jeglic aspires to use her research to keep kids safe and prevent childhood sexual abuse. So, you are absolutely an expert on this panel, and we look forward to hearing from you. Take it away.

[Dr. Elizabeth Jeglic]: Thank you so much, Jennifer. So I’m going to be talking about online sexual grooming, and I would like to thank Children and Screens for having me here today, because I think it’s such an important topic to educate individuals about. So, online sexual grooming is often confused with online sexual solicitation. And the big difference between grooming and solicitation is that sexual grooming requires the formation of a relationship and generally happens over a period of time, although that period can be short, whereas sexual solicitation is kind of like a one-time contact. While we do know that online grooming happens much more quickly than in-person grooming, there usually is a period of time over which that relationship takes place. Interestingly, some new research has suggested that 80 percent of kids who experience online sexual grooming also know that person offline. So, we’re seeing a lot of crossover between in-person sexual grooming and online sexual grooming, as these are people who are in their social sphere. Often they are, you know, significant others, friends, siblings of friends. They kind of get to know them through their DMs. They might be familiar through school, things like that. So, it’s really important, when we think about prevention initiatives, to recognize that it’s not just strangers who might be engaging in these types of behaviors; it could also be somebody who is known to the child. And David Finkelhor just published a study where they found that two thirds of kids who are abused online, through both solicitation and grooming, knew the person offline as well. And so we have to kind of think about that when we think about our prevention efforts. One of the things that we know is that a lot of kids who are harmed online, and specifically with sexual grooming, are minorities. That could be because they have a disability like autism.
It could be because they identify as a gender or sexual minority, or it could be because they’re a racial or ethnic minority. Oftentimes, these kids struggle to find peers in person, and so they find communities online who understand them better. But this also makes them vulnerable, and sometimes they’re taken advantage of through these communities. My colleague Georgia Winters and I have developed a framework for understanding online sexual grooming. There are multiple different models out there; this one mirrors the work that we’ve been doing on in-person sexual grooming, and we see a lot of the same characteristics online, starting with victim selection. Perpetrators will kind of lurk in chat rooms, and then they will see who potentially is maybe not being supervised, who is using more sexualized language, who has a sexy screen name. Those are the kids they might target, because they might be vulnerable. They then gain access to them and try to get them alone and isolate them. So if it’s like a group chat, they’ll try to take it private, and they’re encouraging the minors to hide the relationship from their family. The main crux of the grooming is the trust development, and that’s where they share interests and hobbies and they develop a friendship. And especially with teens, we see a lot of the flirty, dating-type relationships. We did a study where we saw that a lot of adolescents were, you know, engaging in chats with strangers online and viewed these individuals as kind of boyfriends or girlfriends. So, they view this as a relationship, and it’s important to recognize that that’s kind of what perpetrators are aiming for. And then right before the abuse happens, whether that be the exchange of CSAM, meeting the person offline, or sexual talk, they introduce sexualized language, and that’s the big red flag.
Online, we see this much more quickly than we do in person, and our research has suggested that perpetrators who are seeking to engage in this type of online grooming will introduce sexualized language within the first 24 hours. And so when we think about talking points for kids, these are things that we want to talk about. One of the things that we see perpetrators do, too, is that they will often engage multiple young people in conversation at the same time, because unlike in-person grooming, you can access multiple people at once. And so when they throw that sexualized language out there, they see kind of who will respond back. If a child stops having the conversation, they’ll just stop communicating with them, but they’ll pursue those who continue to engage. Post-abuse maintenance also looks different online, because if the child doesn’t have the person’s identity, the perpetrator can just cut off contact and it’s easy to kind of lose touch, or they might use threats or blackmail to prevent the person from disclosing. So, how do we, as parents and educators and community members, prevent online sexual grooming? The biggest thing, and what most of my colleagues have highlighted already, is that we really need to talk to our children. We have to have these conversations frequently. We have to start them young, and we have to have them, you know, in various different ways that are age appropriate for the child. When they’re younger, we make sure that we have all those protocols in place that protect them. We also want to make sure that they have fun online. It is fun to be online, to meet people, and to have community; especially during COVID, that’s how my children had their community. They were online with their friends. But we want to make sure that young children never share their name, age, or location with anyone. They’re never to share a picture online. They shouldn’t chat with adults or strangers online, and they should never agree to meet anybody that they meet online.
So, one of the things that I really encourage parents to do is to get your children to think critically, because we can put all the bells and whistles in place to protect them, but there are going to come times when we are not going to be there, or they’re going to find a way around it. So, we want them to think critically about these situations. So, explain why they should never share their name, age, or location with anybody, because the more they understand the rationale behind these rules, the more likely they are to internalize them. I also suggest using hypothetical situations or critical thinking scenarios. So, you know, something may have happened at school, and you talk about it at home and you say, how would you handle that? Because if they kind of run through how they would handle those scenarios, they’re more likely to be able to use those strategies and techniques when the situation takes place. The big thing with online offending is that we have to let our children know that we are there for them no matter what. So, we shouldn’t think “if they make a mistake,” we should think “when they make a mistake,” because they’re going to make mistakes online. Hopefully they’re not very serious mistakes, but we want them to be able to come and talk to us so that we can work it through with them. Because if they think that we’re going to get mad at them, then they’re less likely to tell us, and then the situation can escalate. We also need to talk to our kids about, you know, sexualized content. We heard about sexting. That’s quite common among young people these days. 
But if somebody that they don’t know online is introducing sexualized conversation or sexualized language or imagery, that suggests that they are trying to groom them, and they should immediately contact, you know, a trusted adult. And really just let them know that you are there for them and they’re not going to get in trouble, even if they’ve already made a mistake. You know, we’ve heard NCMEC can help get images down. There are different things that we can do as adults to help them if they make these kinds of mistakes, because we want to make sure that they do share this information with us. And as we will learn, it’s important that, you know, the children know that we can help them, should something happen. Obviously, you know, you hear about social privacy settings, things like that. You want to make sure, especially when children are young, that those are set to the maximum, so that they can’t really engage with strangers. You’re going to have to give your teens a little bit more leeway. You know, if you keep them in this little hole, they’re going to try to escape it. Right? And so it’s really important that you give them a little bit more leeway as they start to get older, but that you teach them those critical thinking skills and that you reinforce them when they adhere to, you know, the rules and regulations that you set for the family. One of the things that we did in our home, especially when the children were little, is that we only allowed the use of Internet-enabled devices in the common areas of the home, and that way we were more able to oversee what was going on. I could hear my children talking to their peers, especially during COVID. You know, if they’re having private conversations in their bedroom, I mean, that’s normative; when we were growing up, we would talk on the phone to our friends. But you want to know who they’re talking to and what they’re talking about. 
So, kind of just stop in their room, and if you see that they’re hiding things or doing things they shouldn’t be doing, then once they’re off, you know, talk to them about it. Who are they talking to? What were they talking about? And just keep those lines of communication open. One thing that some parents do is have a kind of cell phone contract. Obviously, a contract is not going to prevent a perpetrator from abusing them. But it does set your family’s ground rules for the use of the phone. And it’s important to have that conversation. So, if nothing else, this contract enables you to have that conversation with your child. And so you might want to have that contract when they first get their phone, when they’re younger. And then again, when they’re adolescents, you might want to renew that contract and maybe change some of the policies, because when they’re younger, they might have to have everything set to the maximum. But as they get older, that’s going to change, and you can negotiate that. So while, again, this contract is not going to really prevent anything, it’s a way for us to communicate with our children and keep those lines of communication open. One thing that’s really on a lot of people’s minds is, you know, how do we handle kids with special needs online? And we know that kids who have autism spectrum disorder might be particularly vulnerable. I do have a child with autism spectrum disorder myself, so these are things that I think about. And so it’s important that, similar to a lot of the things we were just talking about, you discuss this with your child with autism and come up with Internet ground rules together, especially when they’re younger. You make visual reminders, and you can continue these when they’re older. So you have a poster of Internet rules they’re expected to follow, and you want to make it explicit, simple, and concise, very concrete. 
Again, you know, simple rules like: no sharing personal information, your address, the name of your school, or images, and if they feel uncomfortable or something happens, they alert an adult. You want to educate them appropriately about online sexual abuse, that it does happen. And we know that kids with autism might have difficulty, you know, identifying social cues, and the same thing happens with online sexual abuse. They might have difficulty identifying abuse cues. So, be explicit and concrete, especially when they’re younger, and use parental control software. You know, depending on their level of autism, they may not want to engage in chat rooms, and so in that case, you can have it set to the maximum, so they cannot access pornography or child sex abuse material online. And you want to make sure that you’re checking their online messages, browsers, histories, and chats, because they are vulnerable and they’re more susceptible to being manipulated online. One of the biggest things that I encourage parents to do is not allow Internet-enabled devices in bedrooms at night. Our research has shown that perpetrators tend to engage with young people after 11 p.m. at night, when parents are asleep, and so that’s when a lot of this grooming behavior happens. That’s when the pictures are exchanged. Kids are more apt to make poor decisions when they’re tired. It also promotes healthy sleeping habits. So you want to keep devices in a common area, and that way, you know, you’re sure that they can’t access their devices overnight. My colleague and I have written a book where some of these issues, along with protecting your children from sexual abuse, are discussed. And so if you’re interested, you can get more information here. So thank you very much.

[Jennifer Newman]: So, Dorrian Horsey is an attorney at Minc Law where she focuses her practice on representing individuals and businesses in matters pertaining to defamation and online privacy, cyberbullying and content removal from numerous online platforms, among other Internet defamation issues. She has established a track record for successfully representing clients before state and federal courts, as well as various administrative agencies, including the Federal Communications Commission. So, thanks for joining us Dorrian.

[Dorrian Horsey]: Hi. Thanks, Jenn. So, you know, I often refer to our firm as the Internet’s emergency room, because we see some very serious issues. But the most devastated and emotionally fragile clients that we work with are always the clients who are victims of sextortion, and financial sextortion specifically. So, what is the number one mistake that parents make when dealing with this issue, or any of the issues that we’re going to talk about today? And I think that we’ve kind of covered it, but I want to say it again. It is the mistake of believing that it will never happen to my child. And I call this the arrogance of disbelief, you know, not my Sunday schooler, not my athlete, not my scholar. But what we know, as has been stated already, is that, yeah, it could be your kid. And if it’s not your kid, it could very well be one of their friends. So, we want to remove those blinders and just know that it could happen to any of us. Now, what is financial sextortion? Well, financial sextortion is a type of blackmail in which the victim is coerced into paying money to prevent the sharing of their intimate images. And we often see with young people that they’re told to pay with gift cards, but we’ve seen, you know, all forms of payment demanded, whether it’s Cash App, PayPal, Zelle, Venmo, or Bitcoin. This is really a numbers game. Their sole objective is to steal money, and they will say anything to get it. So, who is most likely to be targeted? Well, we’ve talked a little bit about this already, and this is kind of a trick question, because all of us are targets. And I know that we are speaking today about young people, but I’ll tell you, the youngest person I work with is 12, the oldest person well into their eighties, and every age and demographic in between. This is a real problem that we’ve seen skyrocketing in recent years. Now, according to NCMEC, boys between the ages of 14 and 17 are the prime targets, among those who have reported their gender and their age. 
But the real thing that you have to keep in mind is that you just have to be vulnerable and let your guard down with the wrong person. Now, where are the perpetrators? Well, the perpetrators are all over the world. I can tell you, we’ve dealt with extortionists all over the globe. However, when we talk about young people and financial sextortion, we see a lot of this coming out of West Africa, particularly the Ivory Coast and Nigeria, and we see it coming from the United States and also the Philippines. I can tell you that there are playbooks being passed around online openly. There’s a group called the Yahoo Boys in Nigeria that just very blatantly publishes and posts things on social media about their activities, but there are other groups that publish their activities as well, and this is being shared openly to scam people. And this is a very simple scam, but it is very effective. So how does this scam unfold? Let’s talk about it. Typically they’ll use a photo stolen from an attractive woman online, or just an A.I.-generated image, like this one. And they’ll use that as bait to lure in the victim. And we see this scam unfolding often on Instagram and Snapchat. I would say those are the two primary platforms that we see in this age group, when we’re talking about teenagers. You know, the perpetrators go where the victims are. So, when we’re talking about older victims, we might be looking at a platform like Facebook. But typically what we see is that they’ll follow someone on Instagram or, you know, ask someone to add them on Snapchat and just strike up a conversation. This is an actual chat from one of our clients. They’ll just kind of, you know, make small talk, and then they’ll steer the conversation towards something very sexual. And, you know, the victim thinks that they’re a peer. They think that this is someone, you know, in their age group. 
So they kind of drop their guard, and once they have shared something sexual, the person might say, “Hey, I’ll share my picture, you share yours.” Once they share something sexual, then the perpetrator flips a switch and becomes very, very threatening, and this is what you can see. I’m just going to show you some of the things that I hear from victims. You know, the common thing that I hear people say often is that the sextortionist said they were going to ruin their life. They’re going to, you know, tell their parents or tell law enforcement or share these images with everyone they know. And that can be very terrifying. So, what are some tips for talking to your child, or a child that you know, about sextortion? I think we’ve talked a little bit about this already, but we want to approach the subject as a conversation and not a lecture; that is so important. We want to make sure that we talk to our kids about how they’re interacting with social media, you know, go through their privacy settings with them, and make sure they’re set at the highest levels. And I think the most important thing I can say is you need to reassure them that they can always come to you. That is the most important message that you want to get across with anything that we’re talking about today. Reassure them that they can always come to you. Now, what should you do if this does happen to your child? Well, first thing: do not panic. Harder to do, I know, than to say. But do not panic. Do not pay. If you pay them, they’ll ask for more. Preserve the evidence; that could be taking screenshots of, you know, what’s transpired between your child and the extortionist. You want to block contact so they don’t have the ability to communicate with you or your child, and set their online accounts to private or even deactivate them. Maybe it’s time for them to take a break from social media or any of the platforms that they were on. 
And then you can also report the sextortion to the proper authorities, such as the FBI or the CyberTipline with NCMEC. And I didn’t put this here, but, you know, obviously you can contact an experienced attorney, and they can guide you as well. Now, the number one thing that I want to stress is be supportive. Be supportive. I can’t tell you how many parents I’ve talked to who have said to me, I can’t believe my kid did this. I’ve talked to him. I told him not to send out pictures. And I always tell parents when we are in that situation that we’ve already won, we’ve already won, because we’re having this conversation and not a much more difficult conversation. So if you find your kid becomes a victim of something like financial sextortion, it’s so important that you encourage them and thank them for telling you, because that is half the battle right there. And I want to say that, you know, this presentation is dedicated to all of the families who have been tragically impacted by financial sextortion. We’ve seen an uptick in suicides, from young men particularly, who have been victims of sextortion. I’ve seen stats of anywhere between 20 to 40 kids who have committed suicide in the last few years. And this is the Woods family and the DeMay family. James Woods was a young man here locally in Ohio, and Jordan DeMay was a young man in Michigan. Both of them were 17 years old, had everything to live for, and were targeted with sextortion and took their own lives. And I’ve highlighted these two families because they’ve both taken their story and, you know, done a lot to inform other people about what happened to their child. James’s parents have started a foundation, “Do It For James,” which does amazing work. They go around and speak to schools and other organizations around the country. So if you’re looking for someone to come to your local school or your organization, I highly recommend them. They are amazing people. 
And then you have John DeMay here, whose son Jordan was tragically taken from him. And fortunately, in that case, they were able to extradite the perpetrators from Nigeria, and they pled guilty and just recently were sentenced to 17 years in prison. So, that’s not going to bring him back, but at least they were able to get some recourse for that tragedy. So that’s just a very high-level overview of financial sextortion. You know, our firm works with sextortion victims on a regular basis. You can visit us at MincLaw.com to get more information.

[Jennifer Newman] Great. Thank you so much, Dorrian. I wish, I think if I was on camera, you would have just seen me nodding the entire time, because everything you said was so spot on. You know, just for the audience, NCMEC, along with Thorn, where Melissa is from, actually just put out a report in June on data from our CyberTipline related to financial sextortion. And so much of that is echoed and supported by what you said, Dorrian. And I think what I love, too, is how you said make it a conversation, not a lecture, because I think, you know, with kids, we’re just always telling them what to do. And the reality is we need to have this conversation together. I think there’s also just the reality of online safety tips and what kids are going to do. I mean, to Melissa’s point earlier, we of course tell kids, “Don’t send nudes.” Kids are sending nudes, as we saw. So, you know, on the prevention front, when it comes to financial sextortion, what are some tips that parents can put in place, or start talking to their child about, in terms of not getting into this situation to begin with? And we always asterisk that by saying we’re still talking about kids. You can tell the smartest kid, you know, what to do online, and don’t do this, and do that instead. They’re still children. They’re going to make a mistake. And so, you know, always the asterisk: if prevention doesn’t work, make a report. But in terms of trying to mitigate risk, what recommendations do you have?

[Dorrian Horsey] Yeah, I mean, I think some of the things that I would say have already been shared. It’s really, really important to have the conversation, so that your kids even know what sextortion is. I speak to a lot of teenagers, and they don’t even know what it is until I talk to them about it. And, you know, it’s important to let your expectations be known. We know that kids are still doing it, but, you know, they do listen to parents. So, you know, talk to your kids about not sharing this kind of content, and actually be aware of where your kids are online. You know, I spoke about Instagram and Snapchat, but those are not the only platforms where we see this start. And so, you know, give your kids an awareness of what the red flags are. If someone takes the conversation to a sexual place, that’s a red flag. I often hear kids say, I saw that that was a red flag, but I kept going, because maybe I was bored, I just wanted to see what happened. And so I think it’s important to let them know and educate them. And then on the back end, you know, being supportive. And I will say this: as I said, I talk to a lot of kids, and when I first started speaking to kids, you know, we just emphasized prevention and then said, you know, if this happens, here’s what you do. But what I’m realizing is that in doing that, I get a lot of feedback from schools saying, hey, you know, we’re glad that you were here, because we’ve had several students come to us and say, it happened to me, but I knew that I could talk to a trusted adult. So, we’re not going to stop everybody from doing this. But if we can get the message across to our kids that they can come to us or a trusted adult, as I said earlier, we’ve already won, because now they have someone in their corner who can help them manage the situation.

[Jennifer Newman] I think that’s exactly right. And I think you’ve made another really good point about the apps, because I think it’s maybe human nature, or easy, to vilify a particular app and say, you know, it’s this one, and if my child’s never on it, my child’s absolutely going to be safe. And that’s just not true. You know, I think the theme that we’re all talking about is that this is child sexual exploitation and abuse facilitated by the Internet. So at the core, there’s still an offender and there’s still a victim. The app, insert whatever app name you’d like, is the tool for how they’re connecting and how that harm is being caused. So I think it’s an important point to say, yes, there are some where we see more prevalence of it. We also see those companies engaging and doing more to protect their space and to lock down those opportunities for offenders. But I really appreciate that. Thank you for the overview of financial sextortion. So, Steve, I invite you to the conversation now. I know you don’t have slides, but as you come on camera, I’ll just introduce you. I’ve worked with Steve for a long time as well, so it’s always nice to be on a panel with you. Steve was selected to serve as the Chief of the Department of Justice’s Child Exploitation and Obscenity Section in October of 2015. In April 2023, Mr. Grocki was appointed as the department’s National Coordinator for Child Exploitation Prevention and Interdiction. During his 19-year tenure with CEOS as a trial attorney, Assistant Deputy Chief, Deputy Chief, and now Chief, Mr. Grocki has investigated and prosecuted numerous nationally and internationally significant child sexual exploitation cases, receiving the Attorney General’s Distinguished Service Award on two occasions. So again, an expert of the experts here with us right now. So, Steve, you know, you’ve heard kind of the evolution of this conversation, Melissa really laying out what our kids are doing online, what we’re seeing online. 
You know, we heard a little bit from Elizabeth about starting that prevention conversation. Dorrian took us through a deep dive on financial sextortion. I know one thing that the Department of Justice is focused on right now is generative AI, deepfakes, and what that looks like, especially the unique position of that being a peer-to-peer abuse, where, yes, it’s certainly happening with adult offenders and children, but also children to children. And what does that look like from a prosecutorial standpoint? So, I’ll really open it up to you for your thoughts before we move over to the general Q&A.

[Steve Grocki] Yeah, thank you, Jenn, very much. And thanks to Children and Screens for the invitation. I’ve been a prosecutor a long time, involved in the investigation and prosecution of these types of crimes, as you heard, for more than 20 years, actually, because I was a military prosecutor before that, focusing to some extent on online crimes going back to the 1990s. So I’ve really kind of witnessed the evolution of this particular crime type. And one thing that I want to start with that is for certain, and as a prosecutor I say this with no reservation, is that investigation and prosecution is not the solution. It will not be the solution to ultimately turning the tide on children being exploited and abused in online environments or offline environments. It takes a whole-of-society approach. So, this type of engagement with parents and caregivers, teachers, whoever is out there listening, is critical to gain a collective understanding of the problem and a collective resolve to respond to that problem. So, I’m very happy to be here. What I want to do, Jenn, I only have 8 minutes, which is why slides are not good for me, because I would start on the first slide and probably never get to the second. So what I want to do is pull back the aperture a little bit and just give a very 30,000-foot view of child sex abuse material and online child exploitation crimes in general. We’ve been very focused, which we should be with this audience, on things like enticement of minors online and financial sextortion and grooming; that focuses the age range of what we’re talking about, typically, to probably 8 to 17 or so. But I want to emphasize that the problem of child sex abuse material online actually spans the full range of child ages. We see imagery of infants and toddlers distributed and produced and trafficked online, all the way up to 17. And the nature of that changes throughout that spectrum. 
So just very, very quickly, Jenn, you touched on it in your presentation, and NCMEC has the best data that reflects this, but I talk about this problem of online crimes against children in terms of three key areas. That is, the volume of the problem, just the sheer scale of it and how that exists in the United States and how that exists globally. Number two, the complexity of it, and how that has changed and continues to change, and how A.I. certainly is the latest example of that. And then the dangerousness of it. I agree with Jenn that Dorrian’s description of financial sextortion was excellent, and it highlights the dangerousness, and the evolution of the dangerousness, that kids experience online. If we were having this conversation 2 to 3 years ago, we likely wouldn’t have been talking about boys committing suicide as a result of just the utter desperation that they were experiencing from being targeted. We’re talking about thousands in the United States alone that are being targeted in this way for a financial motivation, not a sexual interest in that child, but purely financial. And so that adds dangerousness and complexity. So, volume first and foremost. You know, Jenn talked about the number of online enticement reports. She put up slides about the number of images. Those numbers only reflect U.S.-based providers that are required to report to NCMEC, and amongst that core group of U.S.-based providers, we’re talking about a few very big, well-known names like Meta, which has Instagram and Facebook, and other big platforms like Google, you know, all the heavy hitters that we know here in the United States. But there are many, many more out there across the world that aren’t necessarily reflected in those numbers. And so the scale of CSAM, the scale of online enticement, is extraordinary. 
That creates incredible difficulty for investigation and prosecution, and the need for these types of prevention efforts to stop it before it begins. But volume is not just a United States issue. It is something that we grapple with, our foreign partners grapple with, and state and local investigators and prosecutors grapple with. And it’s something that we are, and I’ll talk about the response in just a second, you know, trying to deal with not just through collaboration among the various agencies in the U.S., NGOs like NCMEC and others, and other global partners, but also by trying to make legal change so that others bear some responsibility for responding to this problem. On complexity, just ticking through the technology. I’m going to talk about A.I. in a second, but we can’t forget about other areas of technology that are incredibly difficult and complex. Encryption: there are over 200 messaging apps on the Apple App Store that are encrypted. You know, they’re not unique any longer, and they create serious problems for investigation and identification of these offenses. The dark web: you know, there was a question posed before I came on about that. These are anonymous networks, communication platforms that exist globally, and the ability to deal with them, and the offenders that gravitate to them, is incredibly challenging. We have increases in commercial motivation, as I mentioned with financial sextortion, and then the ubiquity of mobile devices, more devices in everyone’s pockets that take pictures. And when devices take pictures, kids are used to people taking pictures of them all the time, every day. People take pictures in grocery stores, you know, pictures that I send when I go looking for yellow onions that I have no idea what those are, and I’m trying to find them for my wife. That’s just part of our culture now. 
But that also amplifies the complexity of how often this type of offense is happening and how little opportunity it takes. So, you know, for parents out there, just think about that: for a person with a phone in their pocket, the opportunity that they need to potentially take a compromising image of a child, either in secret or openly, is very, very brief. And then there’s just global access. Dorrian mentioned the access of individuals in the Ivory Coast and Nigeria to our kids. You know, that is a problem. When I was growing up, my parents were worried about people in my neighborhood. They worried about people in my schools. They weren’t worried about people in the Ivory Coast or Nigeria reaching out to me to try to get images of me, and then me being desperate about that and potentially harming myself as a result. So, you know, the global complexity is massive. And then dangerousness. We talked about the suicides, but I just want to highlight that the dangerousness has multiple faces. Certainly, as offenders communicate with one another, which they do online, and share best practices, which they do online, they are getting better at engagement. And they do so both at incredible volume, just as a matter of numbers, engaging as many kids as possible, and they do so individually as well. And they’re very, very sophisticated at that. On A.I., just a couple of points. Jenn mentioned the peer-to-peer issue. So A.I. kind of breaks down into a couple of different areas. We have fully generative A.I. We saw an image in one of the presentations. It looks pretty real, right? It has a soft focus, but it’s fully generative. There’s no minor that’s involved in the production of that. You can produce CSAM, child sex abuse material, using the same type of technology. 
The bigger issue that I think we’re encountering is deepfakes, where innocent images of individuals, sometimes images that are, you know, years old, decades old, are being turned into CSAM. And that means a child whose parent has done everything they could to keep them offline can still be harmed. Any one of their friends posts a picture on Instagram or something, that picture then becomes source material, and suddenly they become a CSAM victim. And so that is, you know, an incredible new technology-enabled offense that everyone is grappling with around the country. The peer-to-peer area that Jenn mentioned is a more difficult problem. You’re not going to take a 14-year-old who is in eighth grade, who thought it was funny to basically create nudes of his classmates, and put him in jail. They’re not going to be prosecuted as an adult. They may not be prosecuted at all, but there’s a response that’s necessary. And so developing consistency in that response is important, and it’s incredibly challenging legally. To pivot to legislation in response to A.I.: one thing that I’ve been trying to emphasize across the country is that at the federal level, A.I.-generated imagery and deepfakes have been illegal for a very, very long time, even if the technology hasn’t been there. Our laws have covered computer-generated images for over 20 years. And so we have a legal framework to deal with that. At the state level, it is different, and right now we are working with state legislatures to try to build that response, to form laws that they can use to then prosecute individuals. In addition, there’s the response in Congress for A.I. 
and other forms of online safety. There are many, many bills that have been pending for many, many years to try to require Internet providers to focus more on online safety, to require Internet providers to do more when it comes to detection, at least globally, and to provide a better framework for parents to have access to information, transparency about what providers are actually doing online when it comes to safety. Providers have come some distance, and Jenn stressed this: they are doing more than they were a year ago. They’re doing more, certainly, than they were five years ago. But I would argue they have some distance to travel still. As a parent, I have a 14- and a 16-year-old at home. I grapple with the same issues as all of you. So what I would say, just on a practical level, and I’m going to close on this, is that I struggle to know what the technologies are and what the apps are doing, but the information is out there. You can look for it, you can talk to your kids. And what we are struggling with, legislatively, is to try to get companies to put that information into areas online that parents can access. And so we will continue to fight those battles, and hopefully those online safety measures will become law within the next year or so. Jenn, I know I’m probably done because you came on screen.

[Jennifer Newman] No, you’re great. You’ve mentioned so much; I think it’s great. Can you talk a little bit, since you mentioned it, about which apps and platforms are the most dangerous? Again, acknowledging that some have made headway and are making change while some still have a way to go, is there a way to speak at all to the more dangerous ones out there, or are there any that parents and adults should be flagging for their kids?

[Steve Grocki] So yes, I think the simple answer is yes. In terms of apps, I mentioned how I have my own difficulties, and I don’t forbid my kids’ access to these technologies. I’m not shutting off their phones. My daughter has TikTok; she has Snapchat. My son uses Discord for gaming. We know that, particularly during COVID, to shut off that access meant you were shutting them off from their friend network. These kids really use these technologies to interact with one another; that’s the way they communicate. We’re not going to change that culture that’s already baked in, so I’ve got to adjust to that, not make them adjust back to the way I grew up. But I group the apps in terms of “What does the app do?” That’s a question I would ask. When you look at your kids’ phones or have the conversation, ask them, “What does this do for you?” Take TikTok: how are they using that particular tool? TikTok is a consumption tool. Typically, I’m going online and scrolling through, and it might be for 4 to 8 hours a day, which is another problem, though not for this briefing. But do they also post on TikTok? Do they have that capability? You can restrict that, so that’s a really great measure. TikTok restricts direct messaging if the user is a known minor, so are they a known minor on that platform, or did they sign up as 18? Now, juxtapose that with sites like Instagram and Snapchat, because those serve different functions. On Instagram, you are kind of the product: I am putting images onto Instagram, I am showing how many followers I have, and I am trying to get people to follow information about me. So if you’re a minor and that’s accessible, you’re kind of a sitting duck in some ways, and you want to look at those settings. Are they private?
Who can access it? You can test it out: try to access your own kid’s account, not as yourself but as another individual, by logging in or creating a separate account. You can do that in Snapchat, and Snapchat is kind of the third category, a hybrid. They’re texting one another, and those texts disappear, so I can’t see them. That’s a little concerning. But they also have stories now that kids use, and the tools evolve over time. Those three are very popular. Clearly Instagram, as Dorrian mentioned, is a gateway: that’s how a stranger is going to access my child. Discord is much the same, as are Roblox and other gaming platforms, and then they’re pairing off onto other messaging platforms. Snapchat in particular can also be used to find kids, because it has tools that let individuals search by typing in “gymnast” or something like that, and you might find a kid who has that in their profile. Knowing those things about these various apps and understanding their functionality really helps a parent understand where the potential problems might lie over time. Those are the platforms where, in my experience, we’re seeing the most cases right now, but there are others out there.

[Jennifer Newman] Yeah, I was going to say, I think that’s what we’re seeing reflected in CyberTipline reports as well, so I think you’re spot on. The other thing that I love about what you’re saying, Steve, and I think it’s also the theme of this webinar, is that this is going to take work. You have to be engaged in what your kids are doing online. You have to learn yourself; to your point, you have to understand how the app works. And then, going back to what Dorrian was talking about, make it a conversation and not a lecture. Sit down with your child and ask: Why are you on this? Which of your friends are on it? How are you using it? Again, it has to be collaborative, and you do need to put work into it. So those are great commentaries, and I think that’s a good assessment of the main platforms that we’re seeing.

[Steve Grocki] And I do want to say one thing to parents out there: I advocate giving parents a break, because this cannot rest solely on their shoulders. It evolves too rapidly, and the threats change too frequently, for any parent to keep pace with it. So in addition to parental knowledge and engagement, as you’re suggesting, and I’m not disagreeing, we also need better understanding and reaction from the Internet community, which has a better feel for what those devices can do. Being able to look in the App Store and see safety information is a dream that I still have and one we are continuing to shoot for.

[Jennifer Newman] And along those lines, one thing that all of us in this space work with the Internet companies and providers on is this concept of “safety by design”: as a tool or feature is being built, there is acknowledgment of how it could be exploited by offenders, and guardrails are built in. That’s really what we try to talk to the companies about, ensuring that as they build this technology that’s great for all these reasons, the ways it could go south are mitigated as they build it. So you’re right, it takes everyone: it takes schools, it takes parents, it takes friends helping and educating each other. This is definitely a team approach to Internet safety and online safety. So I’ll invite the other speakers to come back on screen as we move to the general Q&A. We have about 20 minutes, which is great, and we’ve had a lot of good comments and questions come in. Elizabeth, we’ll start with you, since I know your presentation didn’t work out the way we had hoped, but we certainly still want to hear from you. In terms of the questions we’ve received, one thing you’re a great expert to comment on is risk and prevention, especially within the neurodiverse community: understanding this group being online and the ways they may be more susceptible to victimization. And any other thoughts you’d like to give us generally as we go through the Q&A, but especially for the neurodivergent.

[Dr. Elizabeth Jeglic] Yes, thank you so much, and I apologize for my Internet troubles. I am a mother of a child with autism, so this is an issue that I think about a lot. You have to think about your child’s individual use of the Internet, and start when they’re young. Kids who are neurodiverse are drawn to the online world for various reasons. It’s stimulating, but a lot of them also find their community there, because they’re not necessarily fitting in in their classrooms. They find their community online, but they’re particularly vulnerable to being targeted there. So really, starting when they’re young, make sure you have all those safety pieces in place, and I would err on the side of being a little bit more conservative. Also understand that, just as they may not understand social cues in the in-person environment, they might also not understand when somebody is grooming them or engaging in inappropriate conversations with them online. So you have to be a little bit more concrete. When they are younger, you can spell things out: these are the things you can do; if this happens, then this is how you handle it. And make sure you’re monitoring what’s going on with them, starting from a young age. Obviously, they shouldn’t be chatting when they’re really little, but as they become teenagers and want that independence, you want those safeguards in place. My son is not verbal, but he does enjoy being online a lot, and my concern is him finding pornography online and how that might impact him. So having those things blocked at the highest levels, and being aware of what he is watching online, is really very important. And keep those lines of conversation open, as we hear again and again.

[Jennifer Newman] That’s great, thank you. Melissa, I want to go back to some of the research that Thorn has done, and this also addresses some of the questions that have been coming in: the use of technology and these apps for the exploration that kids are doing. You made reference to sending nudes being today’s equivalent of second base. How do we balance that, as parents and as adults, with also giving our teens privacy, respect, and trust? What does that look like when it’s not micromanagement of our kids online, but it is some sort of oversight? And speaking to what Thorn has seen in terms of that exploration: whether or not we agree with it or like it, it is normative behavior now for kids to do this. So I’ll just open that up to you for comment.

[Melissa Stroebel] Yeah. The way we have been thinking about it is: how do we balance exploration and exploitation? Because that’s really what we’re grappling with here. A way that I like to think about it is going to the playground with your little ones. We never just dropped them at the playground and said, “Figure it out. If you can reach the highest monkey bar, by all means, four-year-old, you reach that highest monkey bar.” We were there with them, and what their play looked like on the playground progressed with us side by side with them, and with teachers side by side with them. Initially we are there holding their hand and hovering underneath them in case they slip, but letting them climb higher; we have that kind of proximity. But by the time they are 10, 13, 15, they are getting dropped off at soccer practice, they are going to the mall with friends, they are taking themselves to socialize. The same thing needs to be applied in online environments. We need to help them grow, understanding what risk looks like and what their ability to navigate that risk should be, knowing that sometimes we’re going to be there to dust their knees off, even if they thought they were totally equipped to handle it. We’re not going to say, “How dare you climb that high.” We’re going to brush off their knee, and the next time they try to go that high, we might say, “How about you do this instead?” So I try to take that approach, being very mindful of the importance of building their skills to navigate risk and make healthy decisions. Because, going back to that concept: we’re not raising kids, we’re raising healthy adults. How do we help them grow to be healthy adults, knowing that technology is where they are going to be growing up?

[Jennifer Newman] That’s great, thank you. I really like what you said: “We’re not raising kids, we’re raising healthy adults.” It again goes back to the theme of the day, which is engagement, check-ins, and involvement, and not a one-and-done conversation. Dorrian, switching over to you, some of the questions that have come in have been about what a parent can do if it’s the other parent who is possibly putting the child at risk, whether by exposing them to adult content or even CSAM, say in the context of gaming and things like that. What does that look like when some of this harm is being caused inside the house?

[Dorrian Horsey] Yeah, that’s a challenge, right? You’ve got two parents who both have a right to that child, and from a legal standpoint, I’m going to defer to Steve on that point. But I think part of it also goes back to educating that child and trying to shield that child as much as you can, and making the other parent aware that you have standards. I talked about making your standards known; now it’s important to make them known to your partner as well, because different people have different opinions about what’s acceptable. And if it’s something that requires legal intervention, obviously you could report it to law enforcement and go from there.

[Jennifer Newman] That sounds great, Steve, did you want to add anything to that?

[Steve Grocki] Sorry, I was looking at one of the Q&A questions while I was listening. Would you just repeat it real quick, Jenn?

[Jennifer Newman] Just about when the victimization is coming from inside the house: if it’s another parent who is exposing the child, maybe to adult pornography or borderline CSAM, or to some of the gaming apps that are overtly sexual, what can a parent do?

[Steve Grocki] Yeah, those are red flags, certainly from a legal standpoint. As I mentioned when we talked about offline child sexual abuse, particularly at younger ages, more typically it’s not a stranger. What we’re seeing are people within the home, people who have access: an uncle, a friend, even a parent. Those kinds of red flags, where you see inappropriate behavior and have knowledge of it, depending on what it is, might need to be reported to law enforcement or to child protective services. Some type of agency at the state level is probably going to have the fastest response time; the F.B.I. or other federal agencies could engage, but I think state agencies are going to be more responsive to that kind of scenario. So it really does depend on what we’re talking about. A kid walking in on an adult using a device and maybe seeing adult pornography isn’t outside the realm of possibility. That’s very different from an adult in the home sitting down with the child, where you see evidence that they’re actually showing them something like child pornography. That’s also grooming, and now you have potential offenses in play. That is a huge red flag for me and something I would look at reporting to law enforcement or, as I said, some other agency that can deal with intra-familial forms of abuse within the home. If it’s a non-custodial parent, that type of scenario, then you can also get the courts involved on visitation, restraining orders, and things like that.

[Jennifer Newman] And on the reverse side, a question that I know is asked a lot: when you think about financial sextortion or online enticement, or what we’re hearing about with kids and self-production, they take a nude and send it to one person, and then all of a sudden it spreads beyond their control. They do want help; they do want to go to law enforcement and ask for it. What is the general consensus? Are kids being charged over child sexual abuse material? What happens when someone goes to law enforcement and says, yes, I do have CSAM and it’s of me, or my child has child sexual abuse material of themselves? How is law enforcement going to receive that?

[Steve Grocki] So there’s “what should happen” and “what might happen,” and those aren’t always the same thing. At the federal level, we are working hard at the moment to really impress upon all federal prosecutors and investigators, and working with our state and local counterparts to do the same, that a child in the circumstances you just described should never be under threat of prosecution. We know that at times that type of messaging has been used, even in engagements with kids in schools: don’t do this, because you’re violating the law and you will get arrested. And then we see offenders actually using that messaging to further their exploitation online: “Hey, you sent this to me already. If you report me, you’re also going to get in trouble, because what you did was illegal; you self-produced this.” We see that as a tactic for grooming and exploitation, and we’re really trying to turn it around. Right now, in most places, it falls under a concept of prosecutorial and investigative discretion: technically, you could prosecute someone, but you really should not. You don’t want to foreclose every possibility, because you certainly have minor-on-minor crime; you have individuals who are 17 producing images of an eight-year-old, and that’s a completely different scenario. So you can’t foreclose the possibility altogether, but you’re really trying to condition the culture of investigators and prosecutors so that this is not where we head in this scenario. It’s what I mentioned even in the A.I. context, when we were talking about a 14-year-old producing a nude of a classmate using a nudify app. That’s a cultural problem: that kid not recognizing the real, serious harm that it is going to inflict, and the fact that it is, in all likelihood, illegal.
So we need to change that paradigm and ensure that kids understand that those apps, unfortunately available to them, really can cause incredible harm, and that they need to change their approach to accessing and using them.

[Jennifer Newman] Well, I think that’s incredibly helpful. There are a lot of parents and families who hesitate to reach out because they know the child created imagery of themselves, and they do hear those threats, that kind of rhetoric: you’re going to get arrested, you’re going to go to jail, because you took a picture of yourself. So clarity on that is incredibly helpful. Thanks for that.

[Steve Grocki] Yeah, and one more thing, Jenn, just to tie in to NCMEC: there has been a recent federal law change under the REPORT Act, just passed in August, where a child, parent, or caregiver who sends an image that might be explicit to NCMEC has immunity for doing so. That is not illegal, and it is actually covered and protected by federal law. So there are some protections, and we see the culture evolving around that idea and concept. That’s a great example of where there actually are very, very concrete protections for doing something like that.

[Jennifer Newman] That’s a great point. Thanks, Steve. So, Elizabeth, I’ll ask you one more question, and then we’ll quickly go around and ask everyone for closing thoughts. A lot of the comments we’ve gotten are around prevention messaging in schools. We talked a little bit about how it can’t just be parents and kids; it is a “takes a village” kind of approach. In schools, obviously, at NCMEC we offer our NetSmartz curriculum for schools to employ. But it’s not being adopted at the district level or the like, which we’re always hoping for; we’d like to see the Department of Education, and even state and federal levels, mandate this as they do other types of curriculum in schools. We do offer that, and we offer presentations that parents or any adult can use at church groups, at PTA meetings, at book clubs; any opportunity is a time to educate. But any final thoughts on prevention and messaging before we go around the room? I know parents and adults are listening here because there are kids in their lives they care about. On the prevention side, what would you like to highlight?

[Dr. Elizabeth Jeglic] It takes a village, obviously, and the parents here are probably the parents who are looking out for their kids. But a lot of the kids whose parents are not here are the kids who are vulnerable, right? So maybe we want to advocate for those kids as well, not just our own: making sure things like Erin’s Law, which mandates sexual violence prevention education, are passed in all states. I think it has only passed in 39 states so far. And making sure that its content includes online sexual solicitation, CSAM production, all these topics we’re talking about today, because that will get it into the schools. We have to start talking more about sexuality, healthy sexuality, and sexual abuse in society, because keeping it quiet is what gives perpetrators the ability to engage in these kinds of behaviors. And we have to continue, as parents, to advocate for our children, and to advocate to video game producers and online distributors, so that they make safety software available and give us the information we need to keep our kids safe.

[Jennifer Newman] That’s great, yeah. And just to go around the room for final thoughts, what I would initially say, because I think we want to leave people with hope: it’s very easy to fall into despair, throw up your hands, and toss every technological device into the closest river. But we can do this. We can navigate this, we can make it happen, and we can keep our kids safer online. To what you were just saying about talking about it: that’s what I mentioned when we opened. Having been in this business for so long, you didn’t used to see conversation about it; you didn’t see headlines about it. And we hear from survivors who say: not only was I a victim of a crime, I was a victim of a crime no one wanted to talk about. It was so taboo that trying to find resources or help was so hard, because everyone just wanted to say, “we don’t talk about that.” So I think shouting about it from the mountaintop is the first step. Having Children and Screens hold a webinar like this, more education, more normalizing the issue of child sexual exploitation so it is not so taboo, so it can be as much a part of a conversation as anything else, is really important. And, as Steve said, he’s got kids and they’re allowed on apps. It’s really about balancing all the things out there that make our lives so cool, easy, fun, and connected with preparing ourselves, like we would for any other kind of risk landscape. So, Melissa, I’ll start with you for some quick reflections.

[Melissa Stroebel] You stole my thunder, Jenn; I wanted to echo all of that sentiment. There was a question earlier in our conversation today about the role of technology in combating this issue as well. It is integral. There are a lot of things we are rightly scared about, some of the unintended consequences of the technology, and sometimes we overlook how core that same technology is to how we are scaling our response to these issues and doing better at getting further upstream to prevent them from impacting our young people in the first place. Even this webinar: 15 years ago, we would not have been able to reach several hundred or several thousand people in a single conversation, with experts from around the country and interested, caring adults from around the world, with this type of information. Technology is absolutely something we need to be thoughtful and intentional about; we need to think very directly about the unintended consequences to mitigate those harms to young people. And it is going to be critical in the solution as well.

[Jennifer Newman] That’s great. Thanks. Totally agree. Okay: 30 seconds of final thoughts. Dorrian and Steve, then Elizabeth.

[Dorrian Horsey] Well, I just want to say I agree with everything that’s been said. Our kids are already talking about sex. When we were kids, there was this big birds-and-the-bees conversation; those days are gone. We’ve got to start informing them early. And I would just say, let’s not let these private companies off the hook. Let’s keep applying pressure to the Metas of the world, the Snaps of the world, the Googles of the world. I think the change we’re seeing most recently is because pressure has been applied; they’ve been sued, and that’s what it’s going to take. Talking to your lawmakers about this and letting them know that you care about it, all those things are important, and I just wanted to make sure I mention that.

[Jennifer Newman] That’s a great point. Steve? You’re on mute.

[Steve Grocki] I’ll just use myself as an example of how I talk to my own kids about this. I try to be technology neutral, because I don’t know what platform they’re going to be engaged on; I just expect it to happen. Maybe that’s because I’ve experienced it on the other end so frequently. Both of them know that. I say to them all the time: I get emails at work training me on security, trying to trick me. I get emails in my private email all the time trying to trick me, and text messages from people I don’t know. We all have that happen to us on a daily basis, and sometimes I’m still not sure; I’ve worked this job for 20 years, and I’m still not sure. If that’s happening to me and I can’t tell, then I certainly know my kids are going to struggle with a very targeted, educated person who is very adept at trying to engage them. So I try to use myself as the guinea pig and say, “Hey, look, this is the way it happens to me. Don’t be cocky about being able to prevent this, because you may not see it coming.” That’s a good way to engage, and you’ve got to engage. And Dorrian said this too: you’ve got to give them the opportunity to come to you when that mistake inevitably happens. They need to know that this isn’t taboo, this isn’t secret. No matter what, they’ve got to have a place to go when something they don’t understand is happening, because that’s when we can really make the biggest difference.

[Jennifer Newman] Great. Thanks, Steve. And Elizabeth.

[Dr. Elizabeth Jeglic] I just want to reiterate what Melissa said. Our kids are living in a world where they’re going to be engaging with these devices and platforms, and we want to give them the skills to think critically, because we cannot protect them from everything; they’re going to have more access than we do. So when they inevitably encounter things that are challenging for them, they will think critically about them, and hopefully think twice. And they’ll know that there are trusted adults in their environment who will help them when and if they do make mistakes. This is here to stay, and we as a community need to help protect our kids.

[Jennifer Newman] Yep. And I would just say: if anything happens online, you can make a report to us at the National Center at cybertipline.org, and it will get out to the appropriate law enforcement agency. And certainly, you can always reach out to law enforcement directly if something is a more immediate threat. We know that there are many other questions and comments we haven’t gotten to; Children and Screens will send out an email with resources that address many of those. We see them, we hear them, and we will address them, so stay tuned for that. And now I’ll turn it over to Dimitri to wrap us up.

[Dr. Dimitri Christakis] Thanks, Jennifer, and thank you to the entire panel for sharing these valuable insights about child sexual exploitation and abuse. It was, in fact, quite sobering for me, someone who is also an expert in this space, to hear both your advice and the questions and concerns raised in the chat. But it is our hope that by equipping parents, caregivers, and youth with the right tools and knowledge, we can better guide and support our children, creating a safer environment for everyone. And I really think it’s important, as all of you said, that we are open to discussing this, both amongst ourselves and with our children. That said, please save the date for our next Ask the Experts webinar, “ADHD: Children and Digital Media,” on October 9th at 12 p.m. Eastern Time. Finally, if you found today’s webinar valuable, please consider making a donation to support future Ask the Experts webinars and other free educational resources. To donate, you can scan the QR code on the screen, click the link in the chat, or visit our website at childrenandscreens.org. Thanks, everyone, and take care.