  • AI chatbots are becoming more prevalent everywhere you look, including in kid-friendly apps.
  • There’s not a lot of research about how kids and chatbots interact. Kids might tend to overshare.
  • Some parents are concerned.

Companies are rushing to add AI chat elements into their consumer apps and services — including ones aimed at kids and teens.

We don’t yet fully understand how younger people interact with AI chatbots, or what the social and emotional implications of regular use might be. And some parents are concerned, especially for younger kids who may not be able to tell what’s real and what’s not.

Chris, a mom in Los Angeles who asked not to use her last name out of concern for her children’s privacy, told me she recently had an alarming encounter with her 10-year-old daughter and an AI chatbot.

With her permission, her daughter had downloaded an app that gave her extra emojis to use on her iPad’s keyboard. One day, the app suddenly added an AI chatbot with an “Ask AI” feature suggesting kid-friendly searches about Pokémon or Skibidi toilet memes.

Chris’s daughter had been talking to the chatbot. She had given it a name, told it her own name, and the bot was addressing her by name in conversation. She told her mom the AI chatbot was her “friend.” Unsettled, Chris deleted the app.

AI also offers an opportunity for kids

Of course, there’s also a big opportunity for AI chatbots to be useful for kids — for learning and in school settings, for amusement, and for emotional and therapeutic situations.

In late 2023, the singer Grimes partnered with toy makers to sell a line of AI chatbot plush toys, voiced by Grimes herself, with a speaker and microphone inside. It could chat with kids. (I immediately bought one — although my kids pretty quickly lost interest.)

Another AI robot, Moxie, which cost $800, touted its ability to help with social and emotional learning. The robot launched early in the pandemic but eventually lost funding and shut down. Parents whose kids had become attached to their robot friends were distraught, and the company released an open-source solution so owners could keep their robots running after official support ended.

Research on AI chatbots for kids is limited

Large language models, or LLMs, like ChatGPT, are still very new, and there hasn’t been a huge body of scientific or academic research into how teens and kids use AI chatbots or might be affected by them. Other than limiting sexual or violent content, there isn’t universal guidance on how to design a chatbot for kids.

Dane Witbeck of Pinwheel, which makes kid-friendly phones, said squeezing AI chatbots into apps for kids and teens is a bad idea. “When we give kids technology that’s not designed for kids — it’s designed for adults — we’ve already seen there are real harms, and the downsides can be severe.”

A researcher at the University of Cambridge published a paper this past June urging that LLMs aimed at kids be designed in a child-safe way, especially given what it called the “empathy gap”: chatbots simulate empathy without actually having it, a distinction kids often fail to pick up on.

Ying Xu, assistant professor of AI in learning and education at Harvard University, has been studying how AI can help elementary school-age kids with literacy and math. Xu sees good potential for educational settings. (She cited the Khan Academy Kids app as an example of AI being used well for kids.)

Xu told Business Insider that although there is already research on how kids use voice assistants like Siri and Alexa, researchers don’t yet fully understand how kids interact with the newer, more complex LLMs.

“There are studies that have started to explore the link between ChatGPT/LLMs and short-term outcomes, like learning a specific concept or skill with AI,” she said over email. “But there’s less evidence on long-term emotional outcomes, which require more time to develop and observe.”

James Martin, CEO of Dante, an AI company that creates chatbots for various uses, including educational ones for kids, told Business Insider that parents’ concerns are justified.

“Oversharing isn’t just possible, it’s inevitable,” he said. “Kids tell AI things they wouldn’t tell parents, teachers, friends. The AI doesn’t judge. It doesn’t guide. It just responds.”

How adults can think of AI for their kids

When you consider children still young enough to believe in Santa Claus, you can imagine how chatbots that talk like humans can be confusing. It’s hard enough for some adults, who have formed romantic attachments to AI chatbots.

There’s also concern about how AI chatbots are being used for mental health support: LLMs tend to reinforce what you’re saying rather than challenge you, as a human therapist might.

Tatiana Jordan, CMO of Bark, a company that makes parental-control monitoring software and phones designed for kids and teens, said no one yet knows for sure how AI chatbots affect young people emotionally.

“We’re just getting studies about what the past 15 years of screen time has done to kids,” she told Business Insider.

Nearly all the industry watchers I spoke to agreed on one thing: AI chatbots are here to stay, and parents should think about how to teach their kids to use them safely rather than avoid them entirely and hope they’ll go away.

“None of us can stop what’s coming with AI,” Jordan said. “We have to educate our kids that it’s a tool. It can be a positive tool or a harmful one.”
