On Wednesday, September 30th, 2020 at 12:00pm EDT, Children and Screens held the #AskTheExperts webinar “Privacy Please: Protecting Kids’ Identity and Personal Information Online.” This discussion was moderated by Dr. Sonia Livingstone, Professor of Social Psychology at The London School of Economics and Political Science, and featured an interdisciplinary panel of experts. Together, they explained what third parties really know about children and provided actionable information to help protect children’s online presence and data.
Speakers
- Sonia Livingstone, PhD (Moderator): Professor of Social Psychology, The London School of Economics and Political Science
- Serge Egelman, PhD: Research Director, Usable Security and Privacy Group, International Computer Science Institute; Director, Berkeley Laboratory for Usable and Experimental Security, UC Berkeley
- Angela J. Campbell, LLM, JD: Professor of Law and Co-Director, Institute for Public Representation, Georgetown University Law Center
- Jeff Chester, MSW: Executive Director, Center for Digital Democracy
- Ariel Fox Johnson, JD: Senior Counsel for Privacy & Policy, Common Sense Media
- Stacey Steinberg, JD: Master Legal Skills Professor and Director, Gator Team Child Clinic, University of Florida; Associate Director, Center on Children and Families
07:38 Dr. Serge Egelman, Research Director of the Usable Security and Privacy Group at the International Computer Science Institute, explains how mobile apps track an individual’s data. He describes the role of persistent identifiers, which follow users across platforms to establish a profile that can be sold to advertisers. Egelman describes other technical aspects of data collection, including features that complicate the task of protecting children’s privacy.
21:43 Angela Campbell, Professor of Law and Co-Director of the Institute for Public Representation at Georgetown University, discusses children’s privacy from a policy perspective. She describes the Children’s Online Privacy Protection Act (COPPA), outlining its benefits as well as ways in which it currently falls short. Specifically, she describes a need for greater protection for children over the age of 12.
31:49 Jeff Chester, Executive Director for the Center for Digital Democracy, describes tactics that the tech industry uses to ensure their financial success, sometimes at the expense of children’s privacy and wellbeing. He outlines additional benefits of COPPA, as well as the need for further legislative action, which he believes can achieve bipartisan support. Chester urges parents to stay engaged and to advocate for stronger privacy laws.
43:38 Ariel Fox Johnson, Senior Counsel for Privacy & Policy at Common Sense Media, highlights California laws that grant extra privacy rights to residents. Under the California Consumer Privacy Act (CCPA), she explains, individuals may ask businesses about their data collection practices, and may request that those businesses refrain from selling their data. Johnson describes additional details about this act, and the precedent it sets for future privacy laws.
53:18 Stacey Steinberg, Master Legal Skills Professor and the Director of the Gator Team Child Clinic at the University of Florida, explains how parents’ online behavior can expose children’s personal information. In their zeal to post cute pictures and events from their kids’ lives (a phenomenon called “sharenting”), parents may leave their children vulnerable to security threats. Steinberg offers guidance on how parents can protect their children’s privacy without disconnecting entirely.
[Dr. Pamela Hurst-Della Pietra]: Welcome, and thank you for joining us for this week’s Ask the Experts workshop. I am Dr. Pamela Hurst-Della Pietra, president and founder of Children and Screens: Institute of Digital Media and Child Development, and host of this popular weekly series. Children and Screens is a leading convener, funder, and curator of scientific research, and a public educator, on the topic of digital media and child development. Today’s workshop will explore potential threats to your family’s online privacy, what is being done to protect children’s digital data, and what parents can do to help. Our esteemed panelists will cover everything from phishing attacks to the routine collection of personal information by social media companies and others, and will offer practical tips for combating these issues as a family. Our panelists have reviewed the questions you submitted and will answer as many as possible during their presentations. If you have additional questions during the workshop, please type them into the Q&A box at the bottom of your screen and indicate whether you’d like to ask your question live on camera or prefer that the moderator read it. Please know that we may not be able to answer all of your questions, but we’ll address as many as time permits. We are recording today’s workshop and hope to upload a video to YouTube in the coming days. Tomorrow you’ll receive a link to our YouTube channel, where you’ll find videos from our past webinars as well. It is now my great pleasure to introduce our moderator, Dr. Sonia Livingstone. Dr. Livingstone is a professor at the London School of Economics and Political Science and an international expert on children’s internet safety, privacy, and rights in the digital environment. Welcome, Sonia.
[Dr. Sonia Livingstone]: Thank you, Pam, and welcome to everyone who’s participating in this webinar, and to our exciting speakers today. Children are encountering such complex technological innovation these days, which brings all kinds of risks and opportunities, and I think it’s only recently that we’ve really begun paying attention to the question of privacy. As we know, everything children do online today is in some way the subject of data collection: what they do is collected, tracked, and monetized according to a complex data ecology, which today’s speakers are going to tell you about. I myself am a researcher; I spend a lot of time with children at home and in school, so that’s my perspective on these dilemmas, and the research I’ve done and the research I’ve read has given me a very clear sense of the dilemmas and uncertainties that I know parents are facing in the United States, in my country, Britain, and in many parts of the world. What we also know from research is that children and families really do care about their privacy online and in the digital world, and they want the chance to decide what information of theirs is shared and with whom. Parents care about these issues, as do children, but these are really complex and difficult issues to understand, especially now that children use the internet so early in their lives and, increasingly, everything is getting connected. I also see from research that very often children and parents think about going online in terms of safety, in terms of the risk to their person, so it can be puzzling to understand how online activities generate data, and that our likes and our searches and our visits are all tracked and that information gets sold. I’ve seen in my own research that when children understand something of this data ecology, they find it creepy, and in many ways they are outraged. Yet at the same time they are puzzled, because they’re not quite sure what the harm is. It’s not so easy to understand where the problem lies in relation to this enormous marketplace of data, compared with some of the very real safety threats that people feel and have become more familiar with. I know that parents are meant to protect and guide children, but there’s plenty of research showing that parents are both concerned and confused, and they too struggle to deal with all of the privacy options and questions we are always being asked: do you want to share this, do you want to allow this or that privacy setting? I know parents feel unsupported in the dilemmas and difficulties they face. So many different organizations, including those representing parents and children, are now calling for information and for solutions: solutions from business and solutions from government. Today I think we’ll both come to understand the situation for parents and get a sense of the options available to businesses and governments as they try to respect children’s rights to privacy, and discover the possibilities for treating children’s privacy and data more respectfully in this digital age.
So today we have five amazing speakers to help us address these issues, answer these questions, and give us a better sense both of what parents can do and of the challenges for governments and business. I’m aware that there are parents listening to this webinar, but parents are sometimes also activists or lawyers, or they might be in positions where they have some say over how children’s data are treated, so we’re going to try to provide ideas to inform everyone as we proceed over the next hour and a half. I’m going to introduce each of our speakers in turn; they’ll make their presentations, and then we’ll have a question from the audience for each speaker so that we break up the conversation, and I hope we’ll also have time at the end to address any additional questions. So first I’m going to introduce Dr. Serge Egelman, who is the research director of the Usable Security and Privacy Group at the International Computer Science Institute and leads the Berkeley Laboratory for Usable and Experimental Security at UC Berkeley. Dr. Egelman’s research focuses on the intersection of privacy, computer security, and human-computer interaction, with the specific aim of better understanding how people make decisions surrounding their privacy and security, and creating data-driven improvements to systems and interfaces. So Serge, the floor is yours.
[Dr. Serge Egelman]: Thanks for the introduction. Yes, I’m Serge Egelman. I actually don’t have a background with children; my background is in computer science, and I look at human factors to understand how people make privacy and security decisions. One of the things my research group has been doing recently is a lot of work on tracking in the mobile app ecosystem, so I was going to walk through how some of this actually happens. When you have a mobile app, there are several different what we call persistent identifiers. A persistent identifier is just a unique number that can identify you and your device, similar to, say, the license plate on your car. There are lots of these unique identifiers that can be used, and frequently are: when apps on your device contact servers on the internet, they share these identifiers. So let’s take a look at how this works. Imagine you’re playing an app, Angry Birds. Your phone sends its serial number to an advertiser, along with the name of the app, so now that advertiser knows that the phone with this serial number plays Angry Birds. You can imagine that over time, as this one third party, ACME Advertising, receives data from lots of different apps, it can tie lots of different behaviors to you as an individual. If you play Angry Birds, use Twitter, and play Speed Car Racing on your phone, ACME Advertising receives that information and uses it to decide which ads are most likely to be clicked on or interacted with by the type of person who plays Angry Birds, uses Twitter, and plays Speed Car Racing. All of this happens automatically pretty much whenever you use apps, and they use this data to make inferences about the types of things that are likely to motivate you, your demographics, and so forth. It’s all in the name of trying to increase the odds that you’ll either make purchases inside the app or continue using the app for longer, because it’s intentionally made to be addictive, and the longer you play the more likely you are to make in-app purchases or click ads. The problem is that many of the companies receiving this data are augmenting it with data from lots of other sources. Every machine on the internet has an IP address, which is another type of unique identifier. So ACME Advertising might contact another data broker and say, hey, I just received some traffic with this particular serial number, does anyone know any more about this user? And the data broker might respond: previously I saw someone with this IP address because they filled out a form on a website; we know they reside at this physical address, and here’s some other web traffic that we associate with this IP address.
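To make the mechanics above concrete, here is a minimal sketch in Python of the kind of request an ad SDK embedded in an app might send home. The endpoint, field names, and identifier values are all illustrative assumptions, not any real company’s API:

```python
import json
import urllib.request

# Hypothetical payload an embedded tracking SDK might assemble at app launch.
# The endpoint, field names, and values are made up for illustration.
payload = {
    "app": "com.example.angrybirds",                   # which app is in use
    "serial": "R58M123ABC",                            # hardware serial (non-resettable)
    "ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",   # resettable advertising ID
}

req = urllib.request.Request(
    "https://collect.acme-advertising.example/v1/events",  # fictional tracker endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # one such beacon per app, per session, is enough
#                              # for the tracker to link behaviors to one device
```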
Another data broker might have another persistent identifier that matches the device, and so forth. In this manner a third party can use these different data sources, the apps directly as well as other third parties, to aggregate a lot of information about individual users, and all of this happens pretty much opaquely, without you knowing when or whether it’s even occurring. To step back: this type of tracking occurs on the web too; it isn’t unique to mobile apps. What’s unique to mobile apps is that there are a lot more persistent identifiers to choose from. On the web it’s generally cookies, another type of persistent identifier stored in your browser. All current web browsers allow you to periodically clear your cookies, which essentially stops companies from tracking you most of the time. That didn’t exist for mobile apps, so around 2013 or 2014, as this started to become more and more of a pervasive problem, both Google and Apple created a new identifier called the advertising ID, and by policy they mandated that any third parties tracking users for advertising and profiling purposes must use only this one identifier, which users can reset, similar to clearing your cookies in the browser. So this is how it works instead: the app contacts the advertising company and says, this ad ID is playing Angry Birds. The user might decide they don’t want to be tracked anymore and go reset the ad ID. In theory, the advertiser would then say, we’ve never seen this ad ID before because it’s brand new, and therefore we can’t tie it to an existing profile. But if that ad ID is sent alongside other identifiers that can’t be reset, that negates the whole purpose of this privacy interface. Another question I get from a lot of parents is, how can I make sure that the apps my child uses are safe, that they obey the various privacy settings and don’t collect lots of additional identifiers, and so forth? The answer is simple: you just need to instrument the Android permission-protected APIs, disassemble app binaries, and perform deep packet inspection. That is of course a joke, but it is the technical process one must follow to answer this question, because the tools don’t exist for the average user to figure out privacy implications. Instead what we have is the notice-and-consent framework, which basically just means: read a privacy policy. So let’s look at how that works. If I’m going to download an app from the Play store, I can scroll to the bottom and see there’s a link for the privacy policy. I can click that, I get the privacy policy, and it’s just a matter of reading it to figure out what’s happening. Simple, huh? The point is, these have actually been studied for a long time; the privacy policy is the only tool consumers have to make decisions about privacy, and a study 10 years ago showed that this is completely unreasonable for most consumers.
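Returning for a moment to the ad-ID reset point: here is a sketch, again with made-up names and an in-memory stand-in for a tracker’s database, of how sending a non-resettable identifier alongside the resettable one lets a tracker silently re-link a freshly reset ad ID to its old profile:

```python
# Server-side re-linking that defeats an ad-ID reset, assuming apps also leak
# a non-resettable identifier such as a hardware serial number. All names and
# the in-memory "database" below are illustrative.
profiles_by_serial = {"R58M123ABC": {"apps": ["Angry Birds", "Twitter"]}}
serial_for_ad_id = {"old-ad-id-1234": "R58M123ABC"}

def lookup_profile(ad_id: str, serial: str) -> dict:
    # A fresh ad ID alone would look like a brand-new user...
    if ad_id not in serial_for_ad_id:
        # ...but the serial number ties it straight back to the old profile.
        serial_for_ad_id[ad_id] = serial
    return profiles_by_serial.setdefault(serial_for_ad_id[ad_id], {"apps": []})

# The reset accomplished nothing: the old profile comes right back.
print(lookup_profile("fresh-ad-id-9999", "R58M123ABC"))
```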
Back to privacy policies: this one is from TikTok. At the time I took this screenshot it was over 5,000 words, written at a high school reading level; the average reader in the US would take more than 20 minutes to read it. There was a study about 10 years ago showing that if you were to read all the privacy policies you encounter when interacting with the web, it would basically be a full-time job for several months. Obviously, in the years since, this problem has probably gotten worse due to the proliferation of online services, but it illustrates the point. The last point I’ll make, because I’m at the end of my time, is about third parties, one of the questions consumers are really interested in. Usually if you want to read a privacy policy, it’s because you want to figure out whether a company is going to share your data, and if so, with whom. Going through the framework, app privacy policies often say the data will be shared with various third parties but don’t actually name them, and the theory is that those third parties have their own privacy policies and you can just go find them. The problem is that unless you’re decompiling the app or inspecting traffic, you have no idea who the third parties are and therefore can’t reasonably go and locate their privacy policies, and there’s nothing in the CCPA, or the GDPR in Europe, that actually corrects this. Other speakers are going to talk about this a little more, but there’s no requirement that companies actually name the third parties that will be receiving data. So, to wrap up: my group has been doing a lot of research. We have our own build of Android, and we’re doing some work on iOS now too, where we’ve instrumented phones and we just run apps to see what they do. We put together a service, AppCensus; we’ve since spun this out as a startup and we’re doing work for regulators, but we also have a free database online where you can search for the behaviors of free apps. The database is out of date right now, but we’re aware of that and are planning to get back to it in the next few months. We briefly looked at COPPA compliance, which others are going to talk about. COPPA is the only real comprehensive privacy law in the US that protects children. There are requirements that parental consent be obtained before personal information is collected, the same goes for conducting behavioral advertising, there are strict requirements for what actually counts as parental consent, and data that is transmitted must be protected with reasonable security measures. We find that, by and large, most apps specifically targeted at children don’t appear to be following the law here. And worse, parents really have no way of knowing. For 20 years now there have been pushes for industry self-regulation; there are programs where apps are certified as COPPA compliant by private entities, but by and large we find these certifications aren’t really useful, because the apps that are certified don’t appear to be doing anything better in terms of privacy than apps that aren’t.
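(As a brief aside, the reading-time figure above is easy to sanity-check with back-of-the-envelope arithmetic; roughly 240 words per minute is an assumed average adult reading speed, not a figure from the talk:)

```python
# Back-of-the-envelope check of the reading-time claim above.
policy_words = 5000        # approximate length of the policy shown
words_per_minute = 240     # assumed average adult reading speed

minutes = policy_words / words_per_minute
print(f"{minutes:.0f} minutes to read one policy")  # ~21 minutes
```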
And so that’s why my conclusion is that this is firmly a policy problem, you know, technology alone just doesn’t solve this. Thank you.
[Dr. Sonia Livingstone]: Thank you Serge, that was a rapid-fire but really insightful account of just what goes on when you use an app or sign up to a new service. One thing that really struck me from what you’re saying is that you presented it as parents deciding, parents reading the privacy policies (I’m sorry, I’m from London, I say “privacy” differently, you’ll get over it), but of course very often it’s the child who is saying yes to the privacy policies or signing up to the app. I know in Europe, in relation to our General Data Protection Regulation, and I think also in the US in relation to COPPA, there’s been a lively debate at various points in time about just how much children can decide, up to what age it might be appropriate for parents to take that responsibility, and at what point we say, okay, the children are now ready to make their own decisions. What’s so extraordinary, from reading the terms and conditions and recognizing the decisions to be made, is that there is no age at which even adults can really understand the situation they face, so the idea that somewhere around age 13 children “get it” is, well, this is why the regulations are being reviewed, and I think some of the other speakers are going to address this. But from the point of view of families, it raises some interesting questions about the dynamic within the family: how parents manage the environment for their children’s use of devices, and at what point they say, okay, you’re on your own, you’re ready to go. I think that’s very much a challenge that parents face, and it takes us to a question that we have. I think we have Pallavi Shah here, who’s going to ask Serge a question. Pallavi, thank you and welcome.
[Pallavi Shah]: Thank you for the seminar and for having me here. My question is that, given the nature of the internet, I think we are leaving cookie crumbs everywhere, so is privacy even possible?
[Dr. Serge Egelman]: Sure, yeah, absolutely. We have the technology; there are various technologies that could solve many of these problems, in terms of protecting the identifiers from being transmitted to begin with, or using various encryption techniques so that even if the data is collected, it’s not useful for purposes beyond the purpose of collection. The problem is that someone needs to adopt these technologies and use them, and that requires investment, deciding that it’s worth paying the money to change your tooling to better handle privacy. Right now the incentives for companies to do that don’t exist, and that’s why I think this is fundamentally a policy issue: it’s the role of policy to create the incentives for companies to handle this responsibly. So thank you, thank you for the question.
[Dr. Sonia Livingstone]: And actually, Serge’s answer takes us very neatly to our next speaker, because in the policy world there are of course the regulators and the state, but there are also those working on behalf of civil society to try to bring about change and to get people’s privacy respected in new ways. In that context, it gives me great pleasure to introduce Angela Campbell, our next speaker. Angela chairs the board of the Campaign for a Commercial-Free Childhood and is Professor Emeritus at Georgetown University Law Center, and for over 30 years she directed a clinical program at Georgetown that worked to protect children’s privacy, promote quality children’s content, and prevent unfair and deceptive marketing to children.
[Angela Campbell, JD]: Thank you, Sonia and Serge, both of you, for a great introduction. Let me get this corrected. So, I’m going to give you information about the one children’s privacy law that we have at the federal level in the United States, and it’s important to note, as Serge did, that we don’t have a general privacy law; adults don’t have these same protections, only children do, and only children under age 13. The way this works, and I’ll give you a little more detail than Serge did, is that COPPA prohibits the collection, use, or disclosure of a child’s personal information by a website or online service unless the operator notifies parents about what personal information is collected, how it is used, and with whom it is shared. The operator must give direct notice to parents and post an online privacy policy that’s supposed to be easy to find and easy to understand, and it has to obtain verifiable parental consent before it collects any information from the child. COPPA also gives parents the right to review their child’s personal information and to stop any further use of it, and it prohibits conditioning a child’s participation on disclosing more information than necessary. Operators are required to protect the confidentiality and security of data, and they may not retain the data longer than necessary. This sounds great, but there’s a big enforcement problem. Parents and children have no right to sue to enforce the law; only the Federal Trade Commission and the states’ attorneys general can bring enforcement actions, and these are generally small organizations with lots of responsibilities. In the 20 years since the rules took effect, the FTC has brought only 34 cases, all of which have been settled, and states have brought even fewer. In addition to the lack of enforcement, COPPA itself has a lot of limitations, as I’ve already mentioned. It doesn’t provide any protections for children over the age of 12. Another important limitation is that it doesn’t apply to all websites or online services used by children; it applies only in one of two circumstances: the service is directed to children, or the operator has actual knowledge that the user is a child. I’m going to talk just about the first one, because we don’t have time for both. So what does it mean to say something is directed at children? The FTC has provided guidance listing the factors it will look at, but it’s a whole bunch of factors, no one of which is determinative, so it’s very subjective: you can’t necessarily look at a service and say, oh, clearly this is child-directed, because someone else might look at it and say no, it’s not. I’m going to illustrate why this is a problem with an example. Let’s say you have an eight-year-old daughter who loves Barbie and you’re looking for a game for her, so you go to the Google Play store, and there are lots of games to choose from. Let’s say you choose the second one, Barbie Fashion Closet by Mattel, and you read the description. It doesn’t say anything about whether it’s intended for children or not; it says it’s for everyone. So then, because you’re a very diligent parent, you go and read the privacy policy, which Serge showed you how to find. In this case the privacy policy is a little under 2,000 words, but it’s still pretty long and written at a 12th-grade level.
And that’s really just part of the privacy policy; you have to read the rest of it to really understand it, and you won’t fully understand it anyhow, because they use language and terms that aren’t defined and that people don’t know the meaning of. Let me give you just one example. They say, in the second paragraph, that they take special precautions to protect the privacy of children on Mattel services directed to children. That sounds good, right? Okay, but how do you know if a service is directed to children? Later they say: if you have a question about whether a particular Mattel service is directed to children, please use the child-directed website request form. Now, I know most parents aren’t going to do this, but I tried; I submitted my request yesterday morning, and I still have not heard anything back from them. This gets into another problem that, again, Serge touched on: the whole way COPPA is set up is based on parents being the gatekeepers, parents reading the notices, understanding them, and making rational decisions about the risks and benefits. This places an unrealistic burden on busy parents, on any parents, and none of us really has the knowledge to understand and assess the risks, because the whole process going on behind the scenes is not transparent. It’s also very hard for anyone to keep up with rapidly changing technology and marketing practices; even for regulators it’s difficult. In 2013 the Federal Trade Commission did amend the COPPA rules in some very important ways, but those rules are already out of date, and the FTC is currently considering additional changes. However, some changes will require new laws, which I think Jeff is going to talk about next. If you’d like more information or want to know how to get involved, I encourage you to visit our website at commercialfreechildhood.org. Thank you.
[Dr. Sonia Livingstone]: Brilliant, thank you very much Angela. It is slightly depressing to hear your dissection of the key law that is there to protect children’s privacy in the US, and in fact in many countries around the world, because so many companies are headquartered in the US and follow COPPA even when operating in my country or many others, so it is a very important law, and it is very worrying when we see the difficulty. We have a question from the audience which in a way intensifies the challenge, because, as implied by the idea of a commercial-free childhood, some parents might want to run away and say, I don’t want any of this, let’s keep my child free of so many of these commercial platforms. And yet, as somebody in the audience has asked, in the age of COVID children are on their screens even more than before, often young children under the COPPA age of 13 as well as older ones, so the screen has become really crucial for their education, their communication, and their connection with the world. And it isn’t just businesses, it’s also school. Parents are wondering, in another question, whether one can say no to the Zoom classroom or to the data collection that goes on with that. So COVID makes it all much harder. How do we achieve balance?
[Angela Campbell, JD]: Well, it is hard, and obviously it depends on the family situation and the age of the children. Schools play a very important role here; in some cases they can give consent, and I’ve seen some of the consent forms they send to parents, and they’re just as impenetrable as the worst privacy policy I’ve seen. So that’s really hard. Obviously your child needs to be educated; if that’s their only option, then don’t beat yourself up about it as a parent. But I would look very carefully at which apps and websites are actually useful for and helpful to my kids. If they want to have a FaceTime with grandma, no problem. If they want to do TikTok for half an hour or an hour a day and they’re under 13, well, you’re not supposed to be on TikTok under 13, and there’s all kinds of scary stuff on TikTok; I would say that’s something where the parent has to say no: you’ve spent all your hours so far on school, go outside and play, do something different. Basically, it’s just being a good parent and doing what you can to make sure your child lives a balanced life.
[Dr. Sonia Livingstone]: Thank you, thank you very much. In the preparation for this webinar, when the speakers met, we had a discussion among ourselves about how much parents can really do, what responsibility they can take on, and how much the responsibility must lie with regulators, the state, and companies. Our next speaker is a very strong advocate that more should be done, so it’s a great pleasure for me to introduce Jeff Chester, Executive Director of the Center for Digital Democracy, a Washington, D.C. non-profit organization that advocates for citizens, consumers, and other stakeholders on digital privacy and consumer protections online. Jeff is a renowned expert on, and advocate for, privacy in the digital world. So, Jeff.
[Jeffery Chester]: Thank you, Sonia, and thank you, Pamela, and your colleagues for inviting me and for having this important panel. I applaud all of you parents and others concerned about children who are listening in today for being concerned and doing what you can. But I’ll ask you to step back, as I’m sure you often do, and think about our responsibility as parents, and our responsibilities in terms of all the children in the United States, if not the world, because our system really helps determine what goes on around the world, as Sonia said. Think especially about children from low-income homes, and often children of color, who don’t really have the resources and can’t afford the devices to protect their privacy. They are heavily targeted here, especially as the country becomes more diverse and these important groups are online even more; in many ways youth of color are engaged in very advanced uses of these digital media, and companies really take advantage of them. We need to come up with a set of practices and policies, legislation, regulation, and corporate good practices, that protects all children. But I want to say that I personally think it’s very, very hard for a parent to do. After all, when I said to my daughter there would be no Barbies, well, now she’s grown up and we have a warehouse of Barbies, so hopefully you’re better parents than I am. It’s very, very hard. As others have said, and this is part of the industry strategy, children are literally growing up with these devices. Busy parents, well, I used the television so I could cook dinner, right, but today parents often have to hand the mobile phone to their kid. Just think about what we’re up against. We’re up against streaming video. We’re up against virtual games. We’re up against smart toys and smart speakers. We’re up against mobile phones and social media and artificial intelligence and machine learning and virtual reality, all of it now run by super-smart computers that make decisions about you and your child in less than 20 milliseconds. What kind of person is this? How do you target them? Where are they located? What do they like? How can we get them to buy? How can we get them to act? That system is operating 24/7, and it’s only becoming more powerful. And I do think it’s very hard for parents. I don’t know how many of you decline to use, say, your discount coupons at the grocery store, or the mobile discount coupons. When you click into that discount coupon at the grocery store, in the UK or in the United States, you don’t know that that information is being tied into information about your home, about whether there are children in the home, about what you stream and what you view, so you can be targeted on all your devices, anytime, anywhere. That’s the system that’s been created. Now, somebody should let me know when my time is up, because I forgot to set the timer. Look, when this system first emerged back in the mid-1990s, we knew it was going to be the most powerful advertising and marketing system ever made, and we decided to regulate it, working with Angela and other U.S.
consumer groups. We were fought, the President at the time, Clinton, opposed COPPA, but we got the United States’ only online privacy law passed. Unfortunately, once you turn 13 in the United States you have no more privacy; if you’re 12 or under you do have some privacy, and that’s COPPA. And I want to say something about COPPA here, because I do track the market. If you’re under 13, you are not subjected to the kind of far-reaching data collection and tracking system that is dominant in the United States and the world, it’s called programmatic advertising, you’re not part of that. Yes, they do track you, and yes, companies get away with things, and part of the problem is that up until recently everybody gave Google, Facebook, and everybody else a free pass. But the fact of the matter is that the kinds of techniques that track your child once they turn 13, and that track us all, are not as powerful in the youth market. That’s been a tremendous, tremendous success, because if we had not passed COPPA, which went into effect in 2000, and frankly if we had not worked with Angela to get the rules updated in 2012, which took the privacy protections the Europeans had created and brought them in to protect children, we never would have been able to win a case last year against Google/YouTube, which is the most powerful platform targeting children in the United States and the world. We forced them to admit something they were lying about: that they were in fact tracking children. And now, as a result of this legislation and this law, there’s no more data-driven advertising directed to children on YouTube, in the United States and throughout the world. I won’t get into the details, but the legislation, the regulation, the advocacy, the public shaming, and the exposure have helped create a system that can protect your child and children in the United States and around the world. Now, we have a real opportunity here. Is it almost time, or were you saying hello? We have a really, we have...
[Dr. Sonia Livingstone]: It’s almost time.
[Jeffery Chester]: I know, you’re doing your job. Do I have 30 seconds?
[Dr. Sonia Livingstone]: Yeah.
[Jeffery Chester]: 30 seconds, okay. So look, children’s privacy has been a bipartisan issue in the United States. We would never have gotten COPPA without the late Senator John McCain, and we have Senator Hawley and Senator Markey introducing very effective proposed legislation that would better protect not only children but also teens, and more and more Republicans and more and more Democrats agree; for example, they’ve expressed the need to regulate TikTok. So we have an opportunity now, especially as the government and the states go after Google and Facebook and try to limit their power. We have an opportunity to create some rules of the road at the national level, at the state level, and at the corporate level to respect the interests of our children, and we hope that you will join us and all the groups on this webinar and support their work, because we can get something done in the next few years. Thank you.
[Dr. Sonia Livingstone]: Thank you, thank you very much Jeff. There are several questions coming in, and one is a question that is often asked: I wonder what your thoughts are on whether we’re always behind the curve. Isn’t policy always too slow? The state is always too slow to act. Wouldn’t it be better to call for better self-regulation by companies, since they are the ones who know the leading edge of innovation?
[Jeffery Chester]: Look, it’s a very tough issue, because as you all know these are very, very powerful interests, and they give a lot of money to both political parties, which is one reason why we haven’t had much success for the most part in the last few years. But we have reached a transformative moment in the United States since 2016, and now you have members of both parties very critical, for different reasons, of Google and Facebook and Amazon and their data collection practices. So there is a real opportunity now to get something done, because now they’re being attacked not just for their privacy practices, which is what we’ve been doing, but because they have become monopolists, and there is a political moment here to make some change. We’re also taking advantage of the push the Europeans are making, like in the UK with food marketing. This is a global system, and we can go after the companies in a global way; that’s one of the things that my colleagues at the Campaign for a Commercial-Free Childhood and CDD are doing with allies around the world. This is a global campaign to protect our children, and I hope you will join us.
[Dr. Sonia Livingstone]: You’ve mentioned what’s happening in Europe, and many things are happening in Europe and other parts of the world, not always all that we would hope for. But I do think parents and those listening might be interested to hear about other possible models, other ways of managing this data ecology than the one currently dominating in the United States, and I know our next speaker will talk about what’s happening specifically in California. I just wanted to mention that in Britain we have what we like to think of as a field-changing regulation called the Age Appropriate Design Code, now being dubbed the Children’s Code, which in this country is a new statutory instrument about to come into force that promises to really put children’s privacy and children’s interests at the heart of data protection. It’s very new and we’re still waiting; we can’t be sure exactly how it’s going to work out, but it does address some of the points you have variously raised as concerns. One is that it provides a higher level of privacy to everyone under 18, not just those under 13, and it provides various protections in terms of default settings: children’s location cannot be shared by default, and it should be clear what location and privacy settings are in place. There should be no behavioral advertising to children and no nudging them towards too much sharing. Companies should adhere to their own published community guidelines, be clear about the age of the child using the service and whether there’s a risk to a child on that service, and so forth. So it is new and exciting here, and maybe at the end we’ll have a chance to compare the various models in play, because I know a lot of legal brains around the world are struggling with exactly these dilemmas. But I want to introduce our next speaker at this point. Ariel Fox Johnson is Senior Counsel for Privacy & Policy at Common Sense Media. Her work seeks to enhance families’ experiences with media and emerging technology, to strengthen student educational privacy, and to promote robust consumer protections in the online world. She frequently advises policymakers, experts, and the media on questions around children’s privacy, and has helped develop laws on consumer privacy, student privacy, children’s privacy, and the internet of things. Ariel, your slides need to be turned on to presenter view, and then I think you’re good to go.
[Ariel Fox Johnson, JD]: Hmm, is that working? If not, we’ll move on. Okay, sorry guys, you don’t get to see the beautiful slides; they’re mostly green and white with black images. I wanted to talk about some state privacy laws, particularly state privacy laws in California, that give special rights to kids and teens and in some cases give rights to everyone. The first one is the California Consumer Privacy Act. That law was passed in 2018, and it gives a number of rights to families and kids. It’s the first broadly applicable consumer privacy law in the United States: it doesn’t apply only to children, or only to, say, health data or baby data, but applies to everyone. And it applies to a large number of companies doing business in California: if they have over certain revenues, if they process the data of over 50,000 individuals, or if over half of their money is made from processing data, collecting it, buying it, or selling it, then it applies to them. It also defines personal data very broadly; it covers online and offline data, and that’s important because, as the other panelists have discussed, data is collected in all different manners and combined from all different places nowadays. It gives a number of key rights to everyone. It gives you the right to know what information is being collected about you, though often, as Serge and others have noted, that right lives in small text in a privacy policy. It gives you the right to access your information and to download and port it, similar to the right we see in the GDPR. It gives you the right to have your information deleted, though that applies only to information you yourself have provided to a company. And it gives you the right to say no to the sale of your data. Now, the special rights for minors are for children under 16: they don’t have to say no to the sale of their data; companies are supposed to get affirmative consent, so you’re supposed to opt in before they sell your data. The default is that you’re protected. For kids under 13 it’s supposed to be the parents who consent, and for 13-, 14-, and 15-year-olds it’s supposed to be the teen who consents for themselves; we see this as sort of a training-wheels situation, though I’m cognizant of other presenters’ point that even adults don’t really know what they’re consenting to. I think it’s also important to understand what it means to sell data. In the CCPA, sale is broadly defined to include sharing data with a third party for monetary or other valuable consideration, so it is supposed to encompass the sort of ad-network ID sharing that Serge discussed at the beginning. Some companies have taken the position that that is not a sale of data, and they’re trying to come up with complex workarounds and contracts to get out of that definition; there’s in fact a pending ballot initiative partly driven by that. But the intent behind the law is that you’re able to say, no, stop selling my data, or, if you’re a kid, that the data is not sold unless you opt in. I also think it’s important to mention, as Angela talked about, how companies can avoid being covered by COPPA: they might say they don’t have actual knowledge of kids, or that they’re not directed to children; we saw that with the kids’ toy website saying it’s not directed to children.
Under the CCPA, you have to treat someone as a kid if you have actual knowledge that they’re under 16, but actual knowledge includes willfully disregarding the user’s age. So in many instances, for example a toy website, you couldn’t say you didn’t have actual knowledge; you would at least be willfully disregarding it, and you would therefore fall under the law. For families who want to exercise CCPA rights, Common Sense worked with some others to set up a site: you can go to donotsell.org and make requests to companies. Now, in California there’s also another privacy law that applies to kids which I think is important in this remote-learning context, and that is SOPIPA, the Student Online Personal Information Protection Act. It’s the first law in the country to directly apply to ed tech providers collecting data from school children, and it’s pretty strong: it says you can’t sell students’ data, you can’t use it to commercially profile them, and you can’t use it to target them with ads. But it only applies to ed tech providers that are primarily marketed for, and directed to, kids, and as we know there are a lot of general platforms and general consumer devices in schools now, and there are questions about whether it applies to them. With all these laws there are also issues with enforcement: they’re primarily enforced by the Attorney General in California, who obviously has limited resources at this time. So for people who are interested in protecting their privacy, I think, as others have said, it’s in many instances a policy question at this point; the technology is there, but is there the will from the companies? A handful of other states are also considering broad consumer privacy laws: they’ve been introduced in Maryland, New Jersey, and New York; Texas has a task force; Oregon is looking at this. If you’re interested in getting more privacy in your state, one thing you can do is contact your state legislature; another is to contact companies themselves, because if they think consumers don’t care about privacy, they’re certainly not going to do anything. You can use our donotsell.org tool to tell a company not to sell your information. They might say, you don’t live in California, so no; but they might say, okay, you know what, we’re going to extend these rights to everyone, we’re going to try to be good corporate citizens now that we think consumers care about this. So those are things you can do to try to expand protections. It’s not perfect, but these are small steps you can take.
[Dr. Sonia Livingstone]: Brilliant. Thank you, Ariel. A question that has come in echoes something that will be on many people’s minds: when you’re using an app or a device or a service, how does the company know if you’re 9 or 13 or 15 or 25? How do they even know that you’re a child? I wonder if you could give us a sense of how things are developing there. Are there technologies that can detect the age of the child?
[Ariel Fox Johnson, JD]: Sure. In many instances the company doesn’t know, and a regulator might basically look at what kinds of content the page is putting out: is it reasonable to expect that a child is your audience, is a child your intended audience? But in another sense, if companies are really trying to figure out the age of the user, and we see this as a problem not just with children’s content but with people who want to look at adult content or make certain purchases online, COPPA gives a number of ways that parents can give consent, but there are also creative ways that companies are trying to tell whether they’re dealing with children. One that I’ve been reading about in the last couple of years involves taking a quick picture of your face, sort of like using Face ID to log into a phone. It looks at your face, does a scan, and figures out what age it thinks you are, and then the privacy-protective implementations will delete everything; all that’s saved is something like “estimated 15” or “estimated 11,” with no other information. I’ve also heard of people developing approaches where, if I was going to download an app and they wanted to figure out how old I was, they would look at the other apps on my phone: children would have different apps than adults, and they could make some sort of guess or estimate about your age. None of these things are perfect, and there are ways to do them in more and less privacy-protective ways. Facial recognition, including for kids, has gotten better, especially in the last couple of years, but as companies get better at telling age, we also need privacy protection more than ever.
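To illustrate the privacy-protective face-scan flow Johnson describes, here is a minimal sketch; the age-estimation function is a hypothetical stand-in for an on-device model, and the cut-offs reflect the COPPA and CCPA age thresholds mentioned earlier:

```python
def estimate_age_from_image(image_bytes: bytes) -> int:
    # Hypothetical stand-in for an on-device age-estimation model.
    return 11  # stubbed result for illustration

def age_gate(image_bytes: bytes) -> str:
    estimated = estimate_age_from_image(image_bytes)
    # Keep only the coarse age bucket; a real system must also ensure the
    # image itself is never persisted or transmitted off the device.
    del image_bytes
    if estimated < 13:
        return "child"   # route to a parental-consent flow (COPPA threshold)
    if estimated < 16:
        return "teen"    # teen may consent for themselves (CCPA threshold)
    return "adult"

print(age_gate(b"...raw selfie bytes..."))  # -> "child" with the stubbed model
```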
[Dr. Sonia Livingstone]: Right. If I imagine my children taking a photo and offering it to companies to prove their age, that seems worrying to me, but then of course their images are out there already. And actually, our next speaker has been researching the way in which parents themselves are putting images of their children out there online, so there are some questions for parents to reflect upon as they too become one of the means by which data about their children is shared. But let me introduce our next speaker, who I think is going to offer not despair but some practical tips and suggestions for the parents who are listening. Stacey Steinberg is the supervising attorney for the Gator Team Child Juvenile Law Clinic at the University of Florida, and she also serves as an associate professor at the Center on Children and Families. Her research explores the intersection between a parent’s right to share online and a child’s own interest in privacy, and she’s a leading scholar on what we’re calling “sharenting.” So, Stacey…
[Stacey Steinberg, JD]: Thank you, thank you so much; it’s great to be here and to be on this panel. I’m going to go ahead and get my screen shared. All right. So, thank you very much for having me. Today I wanted to talk a little bit about how we as parents can continue to share, or whether we should continue to share, despite the risks it poses to our children. I started this research because I recognized that while there are many dangers that come when we put ourselves online, many parents are still choosing to do so. Parents benefit from sharing on social media in a lot of different ways: they benefit by being connected with one another, by being able to get advice and share insights. But kids also benefit from, and need, having their privacy protected. I’ve noticed in my research that this is not necessarily something parents are doing maliciously; we know they’re not trying to take advantage of their kids’ privacy. But parents act in a unique role: they serve as the gatekeepers, the people charged with keeping their children’s information private under rules like HIPAA and FERPA, but they are also the ones opening the gate. They are the gatekeepers and the gate-openers, and they haven’t yet considered the importance of a child’s digital footprint. So while I’m a law professor, and I set off to do this work trying to find a legal solution to the conflict between a parent’s right to share and a child’s interest in privacy, I walked away recognizing that this is better looked at through a public health lens, because well-informed parents are really best suited to make sharing decisions on behalf of their kids. I got into this work because it found its genesis in self-reflection. Along with being a former prosecutor and child welfare attorney, I have taken pictures for many years, including pictures of a lot of kids who have gone through horrible medical conditions. Their families have asked me to take their pictures, and they’ve shared their children’s pictures with wide audiences because they feel there is a benefit to doing this. It benefits the family and it benefits the kids, because it brings community into the hospital rooms where they are, and it helps raise awareness for important research: when Humans of New York shared information about a prominent pediatric cancer doctor, millions of dollars poured in to support important cancer research. So I recognize that there is power in sharing, but there are also significant risks. We know that when parents share online, the information can end up in dangerous hands, hands that parents have no idea could possibly get it. One of the most shocking statistics I heard when I started this work came from the Australian eSafety Commissioner, which investigates and prosecutes people for child pornography: 50 percent of images shared on pedophile image-sharing sites had originated on family blogs and family social media sites. These weren’t pictures of children engaged in sexual acts, or even naked pictures; these were ordinary pictures that predators and pedophiles had downloaded and were sharing with other people who shared their interests. We also know that images can be stolen, and they can be morphed.
Many of us think of deepfakes in the context of adults; the idea comes up often in politics, with deepfake videos of celebrities or other people appearing to do things they never did. The reality is that this can happen with children as well, and it is called morphed child pornography. State laws have not necessarily been updated to deal with the harm of morphed child pornography, because even if children are not harmed in the creation of the image, they are certainly harmed by its distribution. Luckily, the PROTECT Act, a federal statute, does encompass these morphed child pornography images, but many states, including the one I live in, have not updated their child pornography laws to account for them.

So when I look at this issue, I spend a lot of time trying to identify what the risks are and researching how prevalent they are. The tangible risks include the ones I just talked about, the risks of data collection, which I don't have a slide for but which others on this panel have presented, and even the risk of data theft. But I focus a lot on the intangible harms as well: how our kids feel about sharing, and how it can affect their well-being in the future. Research does show that kids are sometimes embarrassed by what we post about them online, but research also shows that some kids feel their parents are empowering them, so maybe they're not quite as upset as some of the more fearful research might suggest. We have research on both sides: some kids are okay with it and some aren't. What I think is critical is that we include our children in the decision-making process. We need to help them feel empowered by how we approach online sharing, because this is really their training-wheels time for their own sharing when they become teenagers. We can't expect our kids to understand privacy or consent when they take on their own social media feeds unless we teach it, talk about it, and model it for them while they are young children, and when we bring them into the conversation and talk about what we're sharing and why, we're helping to create that pattern.

Sharing on social media also has other risks we need to think about: it takes us outside the moment and makes us think more about our news feed. We also have to think about how we want our kids to remember the memory; we want them to remember being there, not what the photograph looked like. I'm going to skip over the right to be forgotten, but you can always reach out to me if you'd like more information about the idea that kids might have a right to have information posted about them deleted later on, when they become adults. It's all about balance: on the same day I was interviewed on this topic, my son was on the news for winning a chess competition. What I tell parents is that we are the first generation to raise kids alongside social media, and our kids are the first generation to grow up being shared online, so the number one rule is that we need to give ourselves some grace.

A few quick tips that I'll run through quickly. Avoid using your child's full name. Consider nicknames. Remove your maiden name from your social media profile.
Don't share birthday posts on the day of the birthday; you're giving away information. Don't share naked or near-naked pictures. Ask your kids' permission before posting, and give them veto power. Be good social media role models. Consider how kids will feel, now and years into the future, about the information you're sharing. Review old posts at least once a year, and delete things that no longer serve your family. For further information, here are my email address and my website. I really thank you for the opportunity to be on this panel.
[Dr. Sonia Livingstone]: Thank you so much, Stacey. It's heartening to feel that there are things parents can do, and those are very practical pieces of advice that will ring true with the decisions and dilemmas I know many parents are facing. I wonder, however, what about all the other people who know your child? How do we deal with the grandparents, or the parents of a child's friend who post pictures of the birthday party, or even the school? Can you give us a sense of how parents can protect their children's privacy more widely?
[J.D. Stacey Steinberg]: I think it's critical that sharenting becomes a central part of child-rearing discourse. Just as we talk to our peers, our friends, and teachers about how we choose to discipline our kids, how we choose to educate our kids, and what we've found works or doesn't work at mealtime, we need to be talking about our online sharing habits as well, and we need to be comfortable informing the people we entrust our kids to about what those practices are. We would never send a child to school without telling the school about an allergy. We would never leave a child with a teacher without making sure they knew about the child's behaviors. When we entrust other people with our kids, I think it's also critically important that we give them the information to help protect the child's digital footprint in the way we would want it protected if we were in charge of that child ourselves at that moment.
[Dr. Sonia Livingstone]: Yeah, thank you. We are getting more questions coming in, and we have 20 to 25 minutes in which all the panelists can address them, so I invite everyone to keep posting your questions; we are paying attention to them all, and we'll try to get to as many as we can. Maybe one question that several panelists may want to address is the idea that data collection can perhaps be beneficial. App developers might want to collect data and process it to improve the functioning of the app, to tailor it better to the needs of its users, children or others. And in some areas, like ed tech, collecting the data is of course designed to improve educational outcomes. So how do you think about the balance between collecting data for beneficial purposes, which might also be commercial, and locking it down to keep children's data private? Does anyone want to offer a view? Jeff?
[Jeff Chester]: Yeah, I can start. I mean, going back to the legislation, we now have some very new thinking about how to govern all this.
[Dr. Sonia Livingstone]: You're a little quiet, Jeff.
[Jeff Chester]: Can you hear me now? I said we have some new ideas about how to govern all this. Part of the problem is that in the system that has emerged, you give up the information and it then goes into a big dark hole and is used for all kinds of different things, things you never would have agreed to. Under legislation proposed by United States Senator Brown, which is really having a huge impact on the privacy debates in the United States, there would be real limits: you'd only be able to use the data for the permissible use. So if it was to improve health, welfare, or education, that's okay; it would have to be monitored and evaluated, but it couldn't be used for anything else. So you're absolutely right, it's an ecology; it's not all bad and it's not all good, like anything in life. But there have to be limits, and the problem has been that there have been no limits, because the industry has had a hold on the politics of the privacy debate in the United States. So we think that's a good way of moving forward. And finally, there's another movement, global in nature, around algorithmic accountability and transparency, and there's legislation there too: okay, you say you're going to use the data for one thing, but we have to make sure you really only use it for that one thing.
[Dr. Sonia Livingstone]: Right. Does anyone else want to come in on that point? Yes, Ariel.
[J.D. Ariel Fox Johnson]: Sure, I would echo that, and I think we have to be careful when we're talking about collecting data for beneficial purposes: beneficial to whom? Often it's the companies, and they're not thinking about the kids or putting their interests at top of mind. I know, and Sonia knows this better than I do, that the UK and Europe are looking into changing more. But, for example, one of the student privacy laws in California allows the use of anonymized and de-identified information to improve products, which will then also have benefits for students, while the children's privacy and personal well-being are still protected. So I think there have to be ways of being able to use data while always keeping children's interests top of mind.
[Dr. Sonia Livingstone]: I think for many parents and children that's a very comprehensible approach, because people understand: I have this app for this purpose, so use the data to improve that purpose, but don't sell it to somebody I've never heard of for some other reason. So, we have a question from the audience that I think goes back to what we were discussing before about age. Ximena, can I call on you to ask it yourself?
[Ximena Audience Question]: Thank you, Sonia, for this opportunity, and thank you to all the panel members. I'm an independent researcher, passionate about human rights, and I'm now doing research on children's rights. I'm very interested in age verification, so my question to the panelists is: what do you know, or what examples can you give, about the work companies are doing on age verification? I would appreciate it, thank you.
[Dr. Sonia Livingstone]: Thank you. Who would like to take that one first? Maybe Serge would like to tell us something about the technical challenges and prospects for age verification.
[Dr. Serge Egelman]: I'm curious too, actually; that's not an area I know a whole lot about. What I can tell you, from the apps we've looked at, is that it's clear most of the apps targeted at children are not obtaining verifiable parental consent. That much is known. In terms of which methods are best, I don't have a strong opinion. I can just say that what's happening now doesn't appear to involve getting consent, and there's a lot that can be improved there.
[Dr. Sonia Livingstone]: Right. I guess the question is whether the technological solutions really are available and the companies simply aren't using them, or whether those solutions aren't really available at all?
[Dr. Serge Egelman]: Of course those solutions are available. As I said in response to the first question I got, we know how to do these things; it's just that most of them require some amount of investment, and companies have to be willing to make it. So long as screening out children results in lower revenues, because companies can't target ads as well, they're going to be resistant to doing it, and that's again why it's a policy problem.
[Dr. Sonia Livingstone]: Okay, thank you. Angela and then Jeff both want to come in on this, so I'll turn to Angela first. Thank you.
[J.D. Angela Campbell]: Yeah, I was just going to give an example. Companies vary: some don't ask for age at all, and a lot of them ask you to enter your date of birth, which is not very effective. We know that because, especially if it's a service the kids want to use and they know they have to be 13, they have every incentive to lie. A good example is TikTok, where, if a kid in the U.S. enters an age under 13, they will only be able to use the service to create videos they can't share with anyone. But they can also just immediately go back, put in a different birth date, and get onto the full service. And then we found out that TikTok actually knows; it has its own ways of using some of the methods you've heard about, facial recognition, inferences, and algorithms, to classify ages, and according to The New York Times, a third of its users, 18 million of them, are age 14 or under, and they're not supposed to be on it under 13, at least in the United States. So they know; they have a pretty good idea who the kids are. A lot of the time you can see there's a kid right there in the video. But they don't use that information either to remove those users from the service or to protect their information in any way. So it really comes back to this: we've got to pressure the companies to be more responsible, because otherwise it's simply in their economic interest not to protect children.
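To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of date-of-birth age gate Angela describes and one way it can be hardened. Everything in it is hypothetical and illustrative (the device identifier, the storage, the function names); it is not TikTok's or any other company's actual implementation. The idea, consistent with FTC guidance on neutral age screens, is that a birth-date prompt only means something if a failed attempt is remembered, so backing out and re-entering an older birth date does not bypass the gate.

```python
from datetime import date
from typing import Optional

# Hypothetical, illustrative age gate -- not any real company's implementation.
# It addresses the evasion Angela describes:
#   1. the prompt is neutral (it doesn't announce a minimum age), and
#   2. a failed attempt is persisted, so immediately retrying with an
#      older birth date does not bypass the gate.

COPPA_AGE = 13
_flagged_devices: set = set()  # stands in for durable, device-scoped storage

def age_in_years(birth: date, today: date) -> int:
    """Whole years elapsed between birth and today."""
    return today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))

def passes_age_gate(device_id: str, birth: date, today: Optional[date] = None) -> bool:
    today = today or date.today()
    if device_id in _flagged_devices:
        return False  # an earlier under-age answer can't be undone by re-entry
    if age_in_years(birth, today) < COPPA_AGE:
        _flagged_devices.add(device_id)
        return False  # route to a limited mode or a verifiable-parental-consent flow
    return True

# A child who first enters 2010 cannot simply retry with 2000 on the same device:
assert not passes_age_gate("device-1", date(2010, 5, 1), date(2020, 9, 30))
assert not passes_age_gate("device-1", date(2000, 5, 1), date(2020, 9, 30))
assert passes_age_gate("device-2", date(2000, 5, 1), date(2020, 9, 30))
```

The persistence step is the whole point: without it, the gate reduces to exactly the re-enter-a-birth-date ritual described above.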
[Dr. Sonia Livingstone]: So, Jeff, we've heard that the technological solutions are available, and from Angela that the companies are not using them.
[Jeff Chester]: No, I don't think that's true; we think the companies are using them. Following up on Angela, because our organizations actually worked together on this: we've thought for a number of years that the industry does in fact know when there's a child in the audience, both through its measurement systems, and this is on a global basis, not just in the United States, through the work it has done in the last several years on child safety to identify children in the audience, and then through the brand safety systems. The brand safety systems are incredibly important, and part of them is designed to identify children and protect brands from transmitting content to children in ways that would place the brands' reputations in jeopardy. Now, about what Angela just described: a whistleblower first called us and our colleagues at the Campaign for a Commercial-Free Childhood and told us about this internal system, and indeed this internal system comes out of TikTok's brand safety system. They know how many children are on there, how those children are using it, and whether or not they're being monetized. We've given that information to the Federal Trade Commission and, even though it's hard for me to say, yes, to the Department of Justice, to investigate exactly how they are using this system. In addition, there was recently a very important settlement involving children's apps, and as part of it the companies have in essence agreed to institute measures to identify when children use an app, which to me illustrates that the industry knows these solutions exist. So yes, the technological and business means of identifying children in the marketplace are there, and the problem has been that advocates and parents can't force the companies to disclose, which is why we want the governments to come in. And frankly, this shows you how important privacy regulation is, because you could get the data protection commissioners of the UK and Europe to force the companies to turn over all of this data: what exactly are they collecting on young people in terms of brand safety metrics, how is it being used, and what does it all mean?
[Dr. Sonia Livingstone]: Yeah, thank you. Ximena, thank you so much for your question; I hope the answers were helpful. Another question has just come in that, I have to say, puzzles me as well. Serge described the way data comes in through a device or an app and then goes on a journey to the other data brokers and so forth. It feels as if we're talking very much about what parents need to understand about the app they see their child using, and whether they can get TikTok to provide appropriate age or privacy settings. But what can we do about that longer journey the data goes on, the way it gets aggregated and profiled downstream, as it were, resulting in other kinds of marketing? Is there any role for parents there, or is that now really down to the regulators? And what happens to the data subject's rights there? Ariel, do you want to come in on whether parents can exercise any kind of data rights in relation to those big profilers and aggregators?
[J.D. Ariel Fox Johnson]: I mean, if you can identify who the profilers and aggregators are, which as discussed is stumbling block number one, then depending on where you live you could ask them, certainly if you're in California, what data they have about you, or likewise if you're in Europe. I also see some education questions coming in, which we'll answer later, but under FERPA, the federal education privacy law, you have a right as a parent to know what educational data is being collected about your children. So there are ways. But when we're talking about the big data aggregators, it's almost impossible for people to find out where the data has been, which is why a lot of the focus, as you're hearing us discuss, is on using the data only for that one app or that one purpose or that one company, because once it gets out there it's almost impossible to trace, follow, or get back.
[Dr. Sonia Livingstone]: Right, and as you said, the California act gives parents, or children, everybody, the chance to get back the data they provided, but not the data that is inferred about them.
[J.D. Ariel Fox Johnson]: Yes, with a slight distinction: under the California law you have the right to know about all the data a company has about you, including data it has gotten from another source. The right to delete, though, covers only data you have given them. It's not the right to be forgotten as it exists in Europe.
[Dr. Sonia Livingstone]: Okay, so maybe we can come back to Stacey and the idea of the right to be forgotten. I know that when I do interviews with parents and children, they really believe that somehow, when they reach the age of 18, it should all be wiped and they get to start over; that's a very gut reaction I think they have. But Stacey, is there any prospect of this happening?
[J.D. Stacey Steinberg]: So I kind of geek out over this question, because the right to be forgotten is my favorite thing to talk about. For those on the call who don't know, the right to be forgotten is a principle that doesn't really exist in the United States but does in Europe. It says that as time passes, the value of certain information dissipates, and that people have a right to reclaim their reputation and their name. So in Europe, as I understand it, if a person sees something in a Google search result that is unfavorable to them, and they can argue that the information is not accurate, or even just no longer relevant to who they are right now (it doesn't necessarily have to be inaccurate, only not reflective of them), they have a right to have the information taken off the Google search results. That doesn't mean the article where the information is posted goes away, but the link between that person's name and the search result is in essence broken.

Now, I have argued in my work that perhaps what the United States can do to cure the conflict between a parent's right to share and a child's right to privacy is to recognize that while children are young, the information parents share about them is speech, and our free speech protections are the reason the right to be forgotten hasn't taken root here. But over time the value of that speech dissipates and has to make way for the child's competing interest in privacy. So my argument is that when kids get older, that information is no longer the parent's expressive speech but is instead data that has been collected about the child, and therefore children should have a right to be forgotten regarding information shared during their childhood. I don't know of anyone else making that specific argument in the States, and I would love to hear if anyone wants to comment on it, but it's something I really like to think about, and right now it's more mental gymnastics than anything else. So that's the overview of where we're at.
[Dr. Sonia Livingstone]: I think the entire issue is mental gymnastics for all of us, actually; we are all struggling to understand it. Jeff, did you want to come in on that?
[Jeff Chester]: Just that Senator Markey, who is really the country's foremost privacy champion in Congress, and who worked with Republicans to get COPPA through, has a bipartisan bill, which we hope will pass in the next Congress, to strengthen children's and teens' privacy. He calls it the "eraser button," and it is in essence the right to be forgotten for young people. I think it now covers those 15 and under, because of a compromise, but that legislation has been around for two years, and it will move through, and eventually we'll have a right-to-be-forgotten framework for young people in the United States.
[J.D. Stacey Steinberg]: I think it's critical that any right to be forgotten doesn't focus only on what children themselves put on the internet about themselves, but instead also covers what others put online about those kids. My focus has really been on other people sharing about kids, as opposed to kids sharing about themselves.
[Dr. Sonia Livingstone]: Right, and I think that brings us back to the point Ariel made about schools. Schools are a huge channel through which data is collected about children, and indeed shared about children, even though I know parents would like to be able to trust the school. So Ariel, I don't know if you wanted to add anything, or perhaps say what hopes you see for the FERPA law when it really takes hold.
[J.D. Ariel Fox Johnson]: What I was going to say is that we also got the eraser button provisions passed in California, from Senator Markey, who, as Jeff said, has been trying to pass this in Congress for years, decades. It passed in California for kids under 18, but again, it's limited to what the kid themselves posts. And when we got it passed, not every company would allow you to delete things you posted; now most upstanding companies will at least allow you to delete them from public view. And at Common Sense we're very supportive of Stacey's notion that kids should have some say in what their parents post, though I would probably encourage kids and parents to talk about what is appropriate and to delete things together, as a faster mechanism than seeking legal recourse.
[Dr. Sonia Livingstone]: Thank you. I'm interested that nobody has really talked about questions of security and data breaches, but I know that over here we frequently see headlines saying that even when systems are designed, they're not designed well enough to hold kids' data securely. And we have had a question about sharing or storing data in the cloud: how secure is that? How much should we be locking down our devices, and is that something parents could perhaps do more about? I don't know if anyone wants to take that. Okay, Jeff.
[Jeff Chester]: Well, you know, I was thinking about the previous question, what can parents do, and this is almost impossible to do, but you can look at the apps that you and your kid have: Coca-Cola, McDonald's, the music companies, your phone provider, all of those companies are transmitting data about your kids, and about you, through their apps, and it's a huge system that's been put into effect. If you're going to be in business today, you're a data broker: Coca-Cola and McDonald's and Honda are all data brokers, with huge internal data operations. Apps are a new way for them to collect data directly, and then there's an outside system as well, so it's mind-boggling even to think about. I do want to talk about data security, but I also want to talk about the pandemic, because what the pandemic has done is accelerate the collection of young people's information across all these spheres, including streaming video and gaming. What's happened, according to the industry, is that they have reached a level of maturity in data collection and audience that they did not expect to see for three or four years. So now children are at even greater risk, because all this information is in play, including now their school information, which is why a number of us raised the alarm a few weeks ago that on educational sites kids were seeing junk food advertising while their data was being collected, because the schools don't have any control anymore, given what they're having to do with online learning. So the pandemic really underscores the need to look at this whole issue comprehensively, and if we can come up with global standards, that would be very, very important.
[Dr. Sonia Livingstone]: I think we would all be very excited about that. I think Angela wanted to quickly say something about security before we wrap up.
[J.D. Angela Campbell]: Yes. As I mentioned in my talk, the law says you're supposed to provide reasonable data security, and you're not supposed to retain data longer than is necessary for the intended purpose. Well, the more data that's collected, the more possibilities there are for breaches, and yet this is something that, as far as I know, the FTC has prosecuted in only a small number of cases, one being VTech, a toy company. But Jeff is right: they're collecting more and more data. Another case we looked at and complained about, which the FTC hasn't acted on, had to do with Amazon's Alexa speakers that were specifically designed for children. Their default is to record everything the child says after the wake word and keep those messages in the cloud forever, unless the parents go to the trouble of deleting them, and that's not easy; they make it really hard to figure out how to do it. That's an example of what we call nudging, or persuasive design, and I think addressing it is another way of tackling some of these problems: there have to be defaults that are privacy friendly, and companies shouldn't put the burden on the parents; they should make it easier for parents when they need to intervene.
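As a concrete illustration of the defaults problem Angela raises, here is a minimal sketch in Python; the settings object and its fields are hypothetical, not Amazon's or any other product's actual API. The substantive difference between the two configurations is a single default value, but it determines whether data expires on its own or sits in the cloud until a parent finds the deletion flow.

```python
from dataclasses import dataclass

# Hypothetical settings for a children's voice assistant -- illustrative only,
# not any real product's API. The point: with retention on by default, the
# burden falls on parents to find the deletion flow; a privacy-friendly design
# flips the default so recordings expire unless a parent deliberately opts in.

@dataclass
class VoiceSettings:
    retain_recordings: bool
    retention_days: int  # -1 means "keep forever"; ignored when retention is off

# The default Angela describes: everything recorded is kept indefinitely.
burden_on_parents = VoiceSettings(retain_recordings=True, retention_days=-1)

# A privacy-friendly default: nothing retained unless a parent opts in,
# and even then only for a bounded window.
privacy_friendly = VoiceSettings(retain_recordings=False, retention_days=0)

def should_delete(settings: VoiceSettings, age_days: int) -> bool:
    """True if a stored recording should be purged, given its age in days."""
    if not settings.retain_recordings:
        return True
    return settings.retention_days >= 0 and age_days > settings.retention_days

assert should_delete(privacy_friendly, age_days=1)             # purged at once
assert not should_delete(burden_on_parents, age_days=5 * 365)  # kept forever
```

Flipping the default moves the burden of action from the parent to the company, which is exactly the shift privacy-friendly design asks for.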
[Dr. Sonia Livingstone]: Right, and insofar as the burden is on parents, there will be some parents who are better positioned to make those complaints and requests, and some parents who won't have that capacity, so there are some inequalities there. I'm really sorry to say that our time is up, and Pam has popped back up, I know.
[Jeff Chester]: Can we thank our moderator, Sonia Livingstone, for a fabulous job. Thank you, Professor.
[Dr. Sonia Livingstone]: Thank you, Jeff. I didn't give everyone a final word because I really wanted to prioritize the questions that were coming in, and there have been some great questions, so I thank the panelists and the audience. Pam, back to you.
[Dr. Pamela Hurst-Della Pietra]: If you'd like to give a final word, Sonia, please do so.
[Dr. Sonia Livingstone]: I'm happy to. I'm all for the idea of a global collaboration to bring about change, because I think parents and children everywhere, and teachers and schools, so many folks, are really dismayed at the seeming takeover of their data and their lives. So I think there's a lot of scope for action, and I really hope we get to see it.
[Dr. Pamela Hurst-Della Pietra]: And hopefully you'll share that. So, on that note, thank you so much, Sonia, and Angela, Serge, Stacey, Ariel, and Jeff, for an enlightening, fascinating, and important conversation. And thank you to our attendees for your participation and wonderful questions. We hope that you can use what you've learned today to work toward establishing safe and secure online practices for your family. To continue learning about this topic, be sure to visit our website, where we will post additional insights in the coming days. We will also be posting a YouTube video of today's workshop, which we encourage you to share with your fellow parents, teachers, clinicians, researchers, and friends. For more from Children and Screens, please follow us on social media at the account shown on your screen. Our discussions about digital media use and children's well-being will continue throughout the rest of the year. Next Wednesday, October 7th, at noon EDT, we will be discussing persuasive design, a set of sophisticated tactics used by technologists to manipulate users' behavior and keep them glued to their screens. Panelists will explain how and why these techniques work and will outline strategies to mitigate their effects. The following Wednesday, our workshop will cover the topic of screens and differently abled students during COVID-19. When you leave today's workshop, you'll see a link to a short survey; please click on the link and let us know what you thought of today's webinar. Thanks again, and everyone stay safe and well!