Can the outdated laws meant to protect children in the digital world keep up with rapidly advancing technology? Join us as we engage in an enlightening discussion with Kris Perry, a passionate advocate for early childhood safety. Kris shares her unique journey from growing up in a family of educators to becoming a social worker and child abuse investigator. We explore the critical need for updated legislation to safeguard children online, delving into the shortcomings of the Children's Online Privacy Protection Act and the potential impact of new measures like the Kids Online Safety Act. With insights into state-level efforts in New York, California, Maryland, and Vermont, Kris paints a vivid picture of the legal and political hurdles in creating a safer digital environment for our youth.

We tackle the complexities of technology's role in childhood development, balancing it with the necessity for healthy practices. From screen time guidelines for the youngest among us to the importance of media literacy in adolescence, Kris and I discuss how to navigate this digital landscape. Her background as a child abuse investigator brings a unique perspective on the pervasive data collection practices that threaten children's privacy. Listeners will find valuable strategies for fostering healthier relationships with technology, from delaying personal devices to establishing screen-free zones. This episode equips parents, educators, and policymakers with the knowledge to better protect and prepare children for the digital challenges ahead.

Read the transcript 

Marnetta: Hi, listeners. It's me, your host, Marnetta Larrimer. As always, we like to kick off our conversation by asking, what's impacting the classroom? Today we have with us, Kris Perry. Hello, Kris.

Kris: Hi, Marnetta.

Marnetta: How are you doing?

Kris: Great. Really excited about the fall, how beautiful it is, and what a great time of year it is.

Marnetta: I love that you said that. It is my favorite time of year, but I'm angry. I live in Louisiana, so it's still not boot weather for me yet. Some people are having boot weather, I'm not.

Kris: Got it. Soon.

Marnetta: Soon. Tell us a little bit about yourself.

Kris: I grew up in a conservative agricultural part of California. My parents were public school teachers, and I had the great opportunity of growing up surrounded by their friends and watching them work in communities, serving children and families. I went off to college, then to social work school, and started my career as a child abuse investigator in the Bay Area. I learned quickly that poverty was the main contributor to the struggles that parents were having with providing for their children, not in any way that they didn't love them, but that they didn't have all the resources they needed to provide what they wanted to.

Over many years, I worked with families and tried to bring innovative programs in, whether they were state or federally funded. In California, around 1998, we passed a tobacco tax that was channeled into a public charity for children birth to five. I left the child welfare world, and I joined the early childhood world, where I have spent the better part of the last 25 years of my career in various positions of program, policy, advocacy, even politics at times. I've learned a lot from researchers, providers, clinicians, and parents over the years. My career has taken many turns, but some of my very fondest, happiest memories are of working directly with families at the beginning of my career.

Marnetta: That was a beautiful story. So much depth there, and welcome to the field. I thought about social work, and I wasn't built for it. It takes special people to work in that field. That had to be very hard. You have this broad experience in our field, but I want to tap into the expertise of why we're here today. Can we talk a little bit about your work and how you got into what you're working on at this moment?

Kris: I'd love to. In fact it's very connected to my earliest position in child welfare and child protection. In those days, we were very concerned about homes, communities, environments, schools, and all the things we could do to make sure children were safe in those different environments. Here we sit, 12 years, 13 years into the social media smartphone era, and children are not safe online.

There are clearly hazards and problems associated with their time online, and there are very few, if any, protections in place to protect them in their digital life. My commitment for the last 30 years has been making sure children live a safe, just, and prosperous life. I'm seeing that their online digital lives are interfering, frankly, with their optimal development.

Marnetta: You said a lot in there. It is scary. The access, it's everywhere. There's not anywhere you can go that you don't see an infant with a phone in their hand, ranging all the way up through our high schoolers. I can only imagine just how busy you might be trying to enforce and align those protections for them. To date, what laws have been passed to protect children?

Kris: You said it perfectly. Screens are everywhere. They're integrated into our lives completely. Whether it's education, play, or even social relationships, everything seems to involve screens at this point. We're even seeing that in many cases, we've become entirely reliant on screens to be able to conduct parts of our life. Children have been very impacted by that transition from us all having real lives to us having real and digital lives.

It has presented a number of challenges. Some of those challenges have been addressed by lawmakers at the local, state, and federal levels. But even though many people are talking and starting to work on what laws to pass to protect children, very few laws exist that do protect children online at the federal level, which we know would protect all children in the United States.

There really haven't been any new laws passed since 25 years ago, when the Children's Online Privacy Protection Act (COPPA) was passed, which set limits on what data could be collected about children and how that data could be used. It only protects children under 13, and we know from the vast amount of reporting on this issue that children 13 and older are struggling with their digital lives and their screen use, so we don't believe that this very old law is nearly sufficient to protect children 18 and under. It's not just about collecting their data, it's far more than that. We know there are really serious design issues that affect the way children use screens and that keep them on screens.

What we hope for is an update to COPPA and an additional law, KOSA, the Kids Online Safety Act. Both could be passed as soon as possible to protect children's data, extend that protection to children 18 and under, and address some of these really tough design issues that we know have created an attention economy around children and led them, in many cases, to problematic use.

The last thing I'll mention is that there are states working on this at the same time that Congress is working on this. New York and California have passed restrictions on personalized feeds for minors, but those have not gone into effect yet. California, Maryland, and Vermont have passed age-appropriate design codes. California is currently in a legal battle over the law that it passed, Vermont's law was vetoed, and Maryland's code, which was passed recently, did go into effect on October 1st, so Maryland may be the first state to be able to begin the process of protecting children online. We're watching that very closely.

Marnetta: Wow. As a mother of young men, I was very old school. They were surrounded by children who had phones. They're just like, oh, I need a phone. I'm like, no, the teacher can let me know. You can go to the office. You don't need it, you want it, but you don't need it. It became increasingly hard when, in the school system, so much of their work was tied to technology and having to have it, right? It made it a little more difficult. I still hung in there, that they had to wait till they were a certain age or whatever, but it definitely was challenging.

I will also say there was a struggle as we're thinking about that. Even with Facebook or Instagram, social media things you have to be a certain age to get into, they will ask for your date of birth, but there's really no way to make sure that it's accurate. You have children who are able to bypass the system just by knowing math. I know what over 21 would be, I know what over 13 would be, without having to prove and verify that they are actually of that age. Is there any work happening around things like that?

Kris: You bring up a great point that the technology is very smart and so are children. They do figure out how to get around what may be some of the tools that have been put in place for parents to help protect them. Age verification is a really interesting issue because as I mentioned earlier, we're really concerned about children's data, children's privacy. One of the things you have to do to verify your age is to provide data, which can intrude upon your privacy. We're hoping that the platforms along with researchers and other experts can come up with a creative solution for validating, verifying the age of children so that the most minimal amount of data is collected and used in that process in service to protecting them from content that may be harmful on their platform.

That is still a process that's ongoing. It is not resolved at all. There are a number of other design features that we know from the research, from the science that really cause maladaptive behavior in children that are particularly problematic. I referred to that earlier as design issues that create this attention economy for the platforms, where the longer you're on, the more data they collect, the more able they are to modify what they're feeding you through an algorithm, and then the longer you're on.

We know that there's this give and take relationship between the user and the platform, but when you're a child, if your brain is not yet fully developed and you haven't necessarily been trained in how to use media safely, you might find yourself very consumed by these products and not even aware of the risks that you're taking. We've really been thinking about what some of those design changes could be, age verification being a big one, the entry point, but also thinking about how children's data could be protected from the algorithm. In other words, their feed would be chronological versus algorithmic. That could go a long way in reducing their interest in their feed over time because it's just their friends; they've seen everything, they're bored, and they can put the phone down.

You also mentioned school, and I really want to talk about that for just a second because as a policy matter, schools have started to put school phone bans in place as a way to prevent children from being distracted during the school day. You also mentioned as a parent that you made a decision at some point that it was okay to provide your child with a smartphone, as most parents do, because they want to have access to contact with their child if necessary, but there's this double-edged sword where once they have the smartphone, which is a supercomputer in their pocket, they do find themselves using it at all times, including the school day. Schools have become increasingly frustrated by the children using the phone during the day for the reasons I've already noted.

We're hoping that some of those school bans will be studied, and we'll see through the research whether they have created the impact the schools wanted. In other words, have children's test scores gone up, have their satisfaction levels gone up, or has their individual performance improved? That's what we're anxious to see: whether the bans have even accomplished that.

Marnetta: Yeah. Luckily, with my children, the rule was 16, with extracurricular sports and needing access afterward. They knew the rule. You lose it at school, it's lost. I'm not going to pick it up because you're not supposed to be on it. I'm curious, would you give your child a smartphone? If so, what age would that be for you?

Kris: You know what I love to do with this question is go to the very, very beginning of a child's life. They're zero to two, two to five. The AAP says children under two shouldn't have any screen time at all. Obviously, I wouldn't be handing them my phone, a tablet, or any device before two. After two, between two and five, you're really supposed to be very, very careful, no more than an hour a day. It flies by if you're a busy parent and you're trying to figure out how to entertain your child or help them with something while you're cooking, cleaning, or doing something you have to do.

Let's get all the way to school age. Now children are five, and we know how very curious and important this stage of development is. School age is when you have many developmental tasks to accomplish to reach your full potential. The device is, in a sense, disrupting or interfering with all of these developmental tasks that you have to complete, much of which is done in relationship to others.

Something that many people want to talk about is when kids should have a screen, but I want to make sure we talk about parents and caregivers too. They're on their screens, which is also interrupting that relationship with the child, which is also somewhat problematic. I'm talking right now about kids, but I hope your listeners are keeping in mind that while they're on their device, they're not interacting with their child, and that's causing a similar disruption as if the child were on it.

Let's get to teens, where most parents feel by the time the child's 12 or 13, getting close to high school, it's becoming more and more necessary to provide them with some kind of device because they're so busy and they're out and about. This is where it's a very personal decision for a family, but in order to make that a successful decision, you will have wanted to work with your child up to this point around their own media literacy, their digital literacy, so that when they are provided a smart watch, a smartphone, or a dumb watch or a dumb phone, they're prepared for this moment. They've been working through all these questions with you. Now they know what's going to happen.

I am providing this provider my data. They're using this data to market things to me. They're sharing it with other platforms. You want them to be a really good consumer of those platforms and products. You want them to cooperate with you in putting controls in place so that they're protected and you have some visibility or line of sight into what they're doing online, and that's for their protection.

Many people think 13; waiting till 8th grade is another threshold I've often heard used by parents who want to at least get the child to high school. I understand that. We're not going back to a non-digital age. We're not going to be in a situation where a child could reasonably get through high school without being online, whether it's on the device the school provides them, the one their parents provide, or both. The best thing you can do is put it off as long as you can, prepare them over many years for this moment, and then stay very close and involved with them as they explore their digital life, so they feel like they can come to you and ask for help if things occur that are problematic.

Marnetta: Yeah. You touched on some things I would have asked about. No screens before two, the developmental challenges when children are introduced to technology too early, what happens with building relationships, and all those things. I want to add a thing to it as an early childhood educator of preschoolers. Not giving them access to technology in preparation for kindergarten was a fail on my part, because in kindergarten they're having to input things on the computer, whether it be their name or whatever. There's this preparation you have to do in order to not fail them as an educator, which doesn't really align with these screen time limits or standards. What do you say about that?

Kris: I have a lot of empathy for you, early childhood educators and parents because we really are still experimenting with technology. We're all trying to understand its proper role and place in our lives. There really is something necessary about learning about technology, learning how to use a keyboard. That's okay. Learning how to use a mouse, learning how to do something in partnership with a teacher or a friend, those are all really useful skills that will benefit the child over the course of their life. I wouldn't feel badly about having a child enter their name into a computer, but it is a slippery slope.

Again, coming back to those early childhood basics, child development basics: young children learn best from one on one interactions, and screens are a distraction from those one on one interactions. Frankly, the earlier children use them, the more likely it is they'll become dependent on them as a way of soothing themselves versus learning other ways of soothing themselves and seeking help for regulating their emotions. These are some of the major tasks young children have to manage to be ready to go to school and to grow up.

What you're potentially doing by introducing a screen and using it instead of those one on one interactions is that the child may take longer to develop those skills and may find themselves having more and more trouble as they get older with some of those basic, fundamental social skills that you cannot learn online. They aren't going to come to you through an app. They're not going to come to you through a video game. They are only going to come to you through one on one interaction with others.

For the early childhood education field, again, I have a huge amount of empathy, because there hasn't been much guidance provided. I'm really glad we're talking about this, because I think caregivers, providers, and teachers really haven't been given enough evidence-based information about child development and the proper time and place for technology given a child's age or stage.

We have several resources on our website at childrenandscreens.org, and yet this is bigger than us. As a nation, we should be talking about the appropriate time to introduce technology into children's lives. How can we ensure that they aren't being subjected to harm, getting back to child protection again, through the device at school? We know that during the pandemic, many schools provided children with tablets or laptops that were then connected to the internet, and their data was being used through EdTech companies, through Google, Amazon, whatever the search engine might have been. It seems so innocent and innocuous to give a child a laptop to do their homework, but in fact, it's a portal into this entire world that is not yet regulated.

Marnetta: Yeah. I will say my sons might argue with you about the video game part because when they game, they game with other people. They might argue.

Kris: I know. You're right, Marnetta. It's complicated. These are big decisions parents have to make based on their child's developmental stage and the family's values. But when you see a technology being used for good, in other words, they're playing a game but they're also socializing, that's good. I think that's actually a great way to support your child, since they're doing both at the same time.

Marnetta: As long as they have real people friends. Yes, you have your online friends, but you also need friends you're breathing the same air with, friends you can touch. That can't be your only place to get social engagement. We've been talking a lot about child protection, and you gave some examples of exploitation. Do you have any extreme examples of exploitation? What should parents be concerned about when their children are online?

Kris: I do. In fact, there are new stories every day, and they're deeply disturbing and tragic. There is a case of a Florida boy who died by suicide after engaging extensively with an AI chatbot. There is litigation pending in California right now against the company that had the chatbot.

And that's just one case. There are dozens of other cases against most of the platforms, many led by families who have lost children to what started out as an online relationship but ended up in a real-life crisis or harmful situation: children being provided with drugs, meeting up with adults, the things that happen in real life that started online. There are many of those cases. Parents are doing a remarkable job of advocating for regulatory change as a result of those tragedies.

At the same time, there are numerous cases of children who have suffered from something called problematic use, where they're so preoccupied, so consumed by the platforms, by the dopamine hit that comes from the notifications and the likes, that they find it hard to do almost anything else, including eating and sleeping. We've talked about child development, much of it about children in their educational setting, but let's not forget some of these basic physical needs that children have, and sleep is a huge part of a successful childhood. So much of the reorganizing and digestion of information occurs while you're asleep.

If children aren't getting the sleep they need, and in some cases 9, 10, or 11 hours a day is a healthy amount of sleep for a child, because they're online instead, then not only aren't they sleeping, they're perhaps being subjected to content that's harmful, or they're feeling anxious, negative, or self-critical because they're not feeling included in a group. They're not getting the likes they want. They're following a streak, they break a streak. These sorts of experiences are very stressful and can lead children to become depressed and anxious.

We are aware of many studies where those kinds of experiences have been documented over many years across thousands of children. We know that certain children are more at risk than others, but we are following closely the cases that are traveling through the courts at this point, because that's where the evidence can be shared. The courts can weigh in on whether or not children were able to interact with a harmful product.

Marnetta: Do you expect to see a rise in litigation related to the mental health effects of technology and using technology?

Kris: Yes, I do. I think in the end, what will probably happen is that these issues we're talking about today, whether it's children's privacy, their exposure to harmful content, or the connection between their online and real lives, will be tried in courts. We have three branches of government, and the legislative branch is not moving quickly enough to address these issues. Children are growing up, and have been for the last 12 years, without the protections they've needed.

We're reaching a crisis point. We probably are at that crisis point, where it's been long enough, and it's enough children and enough parents, that there seems to be no faster way to get resolution on some of these issues than to go through the courts. That's not fast, but it may be faster than waiting for congressional action.

A third prong here is that when states have taken action, because they do seem to move faster, the platforms have sued them, saying their claims aren't accurate, aren't reasonable, aren't fair. That's what's caused some of the litigation. Some things start as legislation and are signed into law, but before they can go into effect, the platforms sue to say it shouldn't go into effect because it's unfair to them as companies. That's why you're seeing more and more cases in court, whether it's parents proactively suing or platforms suing states and saying the law isn't fair.

Marnetta: It's crazy. It's just crazy. I'm just wrapping my head around all of that. It just gives me, as a parent but also as a consumer, other perspectives. I can visualize everything you're saying because I took a social media hiatus for a couple of years.

I came back, I was on one of my social things, and I laughed at a video once. Now that's all the algorithm feeds me, these things I laughed at once. I didn't even like it or anything, but the algorithms are addicting and can put you in this rabbit hole of just time suck. Deterioration is what it can lead to.

Kris: Absolutely. Imagine if you were a teenager, how at that stage of development teenagers, especially teenage girls, are wired to do something called upward social comparison. This is a very normal stage of development, where it's not uncommon for 13- to 15-year-old girls to experience a heightened awareness of others. It can be around appearance, it can be around accomplishments, it can be around friendships.

That feeling I described earlier, being online and starting to feel like there's something wrong physically or emotionally, or that your friendships aren't good enough, causes that downward spiral emotionally. That is very tied to the algorithmic feed: it knows there's a vulnerability or an interest on the part of the user and then pushes further on that vulnerability, to the extent that we've seen very tragic stories, especially of young girls taking extreme actions around body weight. We're deeply concerned about their safety when it comes to these algorithmic feeds.

Marnetta: We were talking about children, their online presence, the data that's collected. Can you tell us a little bit about what is being done with the data that we should care about?

Kris: We have touched on a few of the things. Data is used to feed the algorithm so that you stay online as long as possible. It's also used to create a profile of you that will follow you forever. That profile is used by many other companies that they will sell your data to, whether it's for products, ideas, or places that you might want to go. It's used, frankly, to make the platforms more money and get you to engage in behaviors that benefit them.

It's really almost so vast, it would be hard to tell you all of the different ways. But as you said earlier, just think about what happens if you like something. Even if your phone simply hears you say something, it knows that you just talked about video gaming. Marnetta, it's possible, since my phone's sitting here, that when this interview is over I will have something in my feed, and you will too, that has something to do with video gaming, as a result of us talking about it for a few minutes.

Algorithms are so sophisticated that they're able to siphon out keywords, attach them to your profile, and then sell that data to companies that would use it to get you to want something that they make associated with that word. We talked just for a second about ed tech in schools and devices in schools. Remember also, when your child has a smartphone at school, it's tracking your child's location. It's tracking what the teacher's saying, what other kids are saying, what the child might search during the day.

Whether they're on a laptop at school, on a phone at night, or on a tablet in the middle of the day, if they're logging into these same accounts, it's pulling data from all of their locations and all of their activities and updating the profile on the child to benefit the companies, not the child. I think it's just really important sometimes to step back and think. We talk about phones like that's the only screen, but there are multiple screens that children interact with during a day, starting with television, perhaps moving on to a tablet, to a computer, to a phone, to a video gaming device. All of those are collecting data, and all of those are creating your profile online.

Marnetta: My eyes got really big because you made a very explicit connection. Yes, they're using things at school, but the tracking is happening everywhere. It's not even just at my house, it's anywhere they may be connected. We create this line to them.

I will say it is eerie. I will be mindful of what I say. Thanks for the reminder to be careful what I say so my phone doesn't pick it up or whatever. It is interesting, because as soon as my youngest turned 18, the flood of things came. There was nothing till he turned 18, and then there was a flood of outreach, products, just all of these things. They waited till he was 18. It was literally the day of his birthday. Random phone calls, all that kind of stuff started.

Yes, there's obviously been this compilation and collection of his journey, and the floodgates opened as soon as he turned 18. Luckily I prepared him for those conversations, otherwise he'd be picking up calls and there's no telling how that would have gone. Some parents and educators may think, especially after listening to this, that Pandora's box is already open, you can't go back. What would be your advice for people who want to rein in that screen time usage at any age?

Kris: I'm so glad we got to this point, because we've talked a lot about the harms and a lot about the risks. They're out there. One of the most important things to do, as you're starting your parenting life or thinking about the educational setting where you work, is to remember that the primary objective of your relationship with a child is those one on one interactions. What you want is to protect as much of that time as you can each day.

The screen is a disruption and a distraction from that primary goal that you should have every day. Therefore, when you're deciding what the number of hours or minutes of screen time should be, do it keeping in mind that it shouldn't displace the quality time that the two of you are going to have, the class is going to have, or your family is going to have.

Ways of doing that are pretty simple and straightforward. You asked about how old. I think putting off giving your child their own device as long as you can is a great way to put off so many of these smaller decisions about age verification and settings. Put it off as long as possible.

The second one is to create screen free zones and screen free times. This can apply to classrooms and/or home. They're also simple because they're black and white. This is a screen free zone. The dinner table, screen free zone. Bedroom, screen free zone. Car, screen free zone. You know how children are; we're all creatures of habit. The sooner you put those habits in place, the more they're going to stick. The child might carry that habit with them into adulthood, which would be great.

Screen free times. I talked about sleep, and I cannot emphasize enough the importance of sleep for children, which is why I said bedrooms should be screen free zones. Children using their phones in their bedrooms at night leads to tremendous sleep disruption, which can lead to drops in academic performance, to obesity, and to depression.

There are lots of negative effects caused by sleep deprivation, so I think you want to be really mindful at home that you're preparing your child for the next day by protecting their sleep, but also making sure they had time to interact with you around meals, and perhaps that you didn't let them be on a screen all evening, so they could also interact with you at story time, on a walk around the block, or walking the dog, whatever you might do, so that they're not at home on their screen while you're out doing all these healthy, happy things, leaving them to their own devices, no pun intended.

Marnetta: Yeah. I love that you keep mentioning interactions. We are an interactions company. We understand. We know what the research says about spending time together, building relationships, and those things happen in the moment face to face without those distractions and disruptions. Even when done intentionally, we don't get enough of them. We definitely don't want to create any barriers to any of those.

I'm trying to think of what I wanted to ask next. Speaking of interactions, that's where I wanted to go. Earlier you mentioned that you can't interact with technology the way you can with a person; you can't build social emotional skills, those types of things. That always leads me to AI, because there's this big fear that AI is going to take over our education system. Let's talk a little bit about AI, your thoughts on it, and how it might affect our young children.

Kris: I'm glad we're getting to this, because we spent quite a bit of time on the device, and we talked a little bit about algorithms and social media. That has dominated the space of children and technology for a long time. You're right, over the last few years AI, particularly generative AI, has captured all of our interest. There again seems to be a bit of breathless excitement in the education space that we'll all learn faster and do more thanks to AI.

I think it's important to be especially cautious. Based on everything we just learned with the social media phenomenon, we have to really make sure any product we introduce to children that's AI or generative AI is very safe. I mentioned a minute ago a child who was connected to an AI chatbot, was experiencing depression, and over the course of time was talking to the chatbot about suicide. The chatbot essentially reinforced that the child should explore that, and that's what the child did. It's a very dangerous product if not controlled more than it is right now.

Young children in particular cannot understand that AI is not human. It would be almost impossible for them to distinguish the difference. We talked a little bit about media literacy earlier; we have to incorporate AI literacy into any programs we develop for kids and parents. I don't think parents or adults even understand exactly what AI or generative AI is. We also talked about privacy and safety, and I could not emphasize enough how much more important those issues are as we embark on a generative AI era.

Keep in mind that these AI products are not designed with kids in mind at all, that they are experiments, and that this content and these chatbots have already been deployed across almost all the social media platforms. It's almost hard to distinguish anymore whether you're talking to a person or a chatbot. We all know this. When we contact our cable company or the bank, we don't know whether we're talking to a person or a chatbot. Children don't know the difference. We barely know the difference.

There are also already some documented situations where AI has essentially replicated the same bias that exists elsewhere online. Children are exposed to those same biases when they're interacting with an AI product. That's not great.

There really are some disturbing consequences of the use of AI at this point. There's the suicide case I just mentioned. An eating disorders hotline has already deployed an AI chatbot, which seems really problematic. Generative AI is producing child sexual abuse material at an alarming scale. We're really worried about the way AI is being used so far.

Marnetta: Yeah. I'm more worried now. I was worried before, but again, this has been a very enlightening conversation. Wow. You gave us what to be cautious about, how it might affect young children, and we had a really great and heavy conversation. I think my next question would be, what are some of your favorite resources for our listeners?

Kris: I really believe that our website, childrenandscreens.org, has a tremendous number of resources, because we really do value translating research into actionable steps for parents and providers. We've updated our website and created a learn and explore tab where it's really easy to search by age, stage, or topic, whichever you prefer, and go straight to what those resources might be. It could be a webinar or a podcast. It might be a one-page tip sheet. It might be some research at a glance, as a way to give the various people coming to the site the material in a way that suits them best.

There are great resources through Common Sense Media. There are great resources through the California Partners Program. There are many websites; in fact, it's become more and more common for organizations to provide these resources. We like to think at Children and Screens that we've done a really nice job of seeking out everything we can find and putting it into an easily searchable format.

Marnetta: Wonderful. Thank you for that. Any final thoughts for our listeners?

Kris: Yes. This conversation has been wonderful, and it's really important for parents and kids to be aware of their screen use and how it might interfere with their relationships and other activities. We should always ask if screens are enhancing our experiences and creating opportunities, or if they're distractions. It's also important to acknowledge that what healthy digital media use looks like for one family can look very different for another family, and that children's needs, their parents' needs, and their teachers' needs have to be considered when making decisions about children's digital lives.

Be realistic with yourself and engage your kids or students as much as possible in developing your screen use rules. Remember that this is a two-way street: healthy screen use for kids is also healthy screen use for adults. Adults can be distracted and, in a sense, send a message to their children that screen use is okay, being distracted is okay, not interacting with each other is okay. Parents, be really mindful of your own use and your own distractibility, and be sure you're not modeling for your children that that's okay.

Marnetta: I love that you said that word, modeling. I was just thinking that, and I was like, we have to remember that we are modeling behaviors. Saying one thing and doing something else is a mixed message and creates challenging behavior for the child. It's conflicting. They don't work well with conflicting information.

Kris, this has been great. It went by really fast. Listeners, I hope that you enjoyed today's conversation, and we hope that you follow along for another great season. You can find today's episode and transcript on our website, teachstone.com/podcast.

We care about what you think. Let us know if you liked this episode by leaving a like and a comment on Apple Podcasts or whatever streaming platform you're using, so we can continue to make great content that brings you back to us, so we can spend time together. You can also send us an email at podcast@teachstone.com. As always, behind great leading and teaching are powerful interactions. Let's build that culture together.