
AN AI ROBOT WANTS TO BE YOUR CHILD'S BEST FRIEND by John Grabowski


 

At first I laughed. Then I cried.


I looked up from my dinner and saw a TV commercial for a friend you can buy for your child.

That’s right. You can buy your child a friend. But that’s not the part that made me laugh. And then cry.


Cost for this friendship? $1,500. But that’s not the part that made me laugh. And then cry.

This friend is a robot.


And a bit of a creepy one in my opinion. Moxie is a robot doll that is intended to give your small child (ages 5-10) the emotional satisfaction of having a best friend.

Whoa.


Now, I get it. I had stuffed animals and even a G.I. Joe when I was little. But I imbued these things with my imagination. I was G.I. Joe, only without the beard. (I never really liked the beard. I imagined his government rations getting stuck in it whenever he ate.) My plastic toys and my flesh-and-blood friends were very different in my childhood universe. As they ought to be.


Here’s an ad for the Moxie robot, which I should stress is a different toy from the Moxie Girlz doll that’s been around for quite a while. This spot is intended to be sweet and heartwarming, but to me it is … sad at best and slightly creepy at worst.



The emergence of the Moxie robot is not entirely surprising, though. For several years now, there have been artificial entities we talk to in our cars and homes, and now there are AI apps that dish out fully formed thoughts and essays, even though, to me, the best of them still feel stilted and lacking in real insight. So, is this robot doll merely the next step? Why shouldn’t we give our kids artificially intelligent playthings?


Some years back, a movie came out called Her, about a man, played by Joaquin Phoenix, who falls desperately in love with his phone’s very human OS, voiced by Scarlett Johansson. The film shows how the line between human feeling and simulation is very easy to blur.

So I had to wonder: would growing up with companionship like this change children’s development in some way? Make them uncomfortable around real people, who might not be as accommodating as their toy?

I’m not a psychologist. So to get a professional’s perspective, I talked to Dr. Elizabeth Burns Kramer, a child psychologist who counts anxiety and relationship concerns among her specialties. She had several concerns.

"I... could not see a clear explanation as to what the doll is trying to offer through the AI aspect. Some testimonials mention ‘helping my child with emotions,’ but I have no sense of how the AI specifically is doing that.”

“Our brains develop social and emotional skills through relationships: interactions with caregivers, interactions with siblings, and interactions with peers… [Children] also engage in imaginative play.”

“Is the purpose of the code to understand the child's imagination and enter those make-believe worlds with her? If so, does the child lose the opportunity to have to negotiate with a playmate around how to blend their imaginative play?”


The robot’s purpose seems to be, at the very least, a supplementary friend and, at most, a substitute friend, which makes me wonder if it might backfire: Might a child wonder why their closest companion is fake while their classmates play with “real” people?


“I would worry that the child might begin to develop a false story that there was something different about them or that they could only connect with their doll friend.” Dr. Burns Kramer adds, “I really have to believe that the parents would prioritize peer friendships and interactions over time playing with the AI doll. If not, honestly, I don't want to play that out in my mind.”


She’s also unclear, as I often am when I see the two capital letters “A” and “I” together, about what the term actually means. Lately, it seems to be applied to everything technological.

“I read through the website a few times and could not see a clear explanation as to what the doll is trying to offer through the AI aspect. Some testimonials mention ‘helping my child with emotions,’ but I have no sense of how the AI specifically is doing that.”

Perhaps it is soothing your child by being his or her friend? Promoting good feelings? Again I thought about how that could actually backfire. Can you do anything to make the doll dislike you? If not, the child could grow up thinking any behavior is okay, because their first friend allowed it. I admit I have never interacted with Moxie, nor have I watched anyone else do so; maybe its makers have all of this covered. But I think Dr. Burns Kramer would agree with me that children learn important boundaries from other children. And just as psychologists see adults’ instant-gratification devices leading to impatience and antisocial behavior, I have to wonder whether a robot friend might promote the same in children.


I also think there’s reason to be concerned about something else Dr. Burns Kramer touched on: children growing up isolated. Because, novelty aside, why would you buy a fake friend other than because your child doesn’t have enough real ones? Is a robot a cure? A stopgap? Could it actually increase a child’s withdrawal from the real world?


Of course some people will say—and perhaps they’re right—that I’m over-thinking all of this, that I see doom everywhere. As I said in a past article, everything from rock-n-roll to Dungeons & Dragons to video games has been cited as The Beginning of the End for kids.


On the other hand, there’s widespread agreement among psychologists that children are growing up with more anxieties than ever, and that fewer of them are engaging in old-fashioned, carefree play on jungle gyms and in sandboxes. Supplying them with yet another artificial stimulus instead of a bat and ball and some wet grass is probably not doing them any favors. Maybe that’s all they need: a bat and a ball. Or better yet, water rockets! Remember water rockets? And flashlight tag on summer nights. That sure was fun…



John Grabowski is a San Francisco Bay Area writer specializing in tech—specifically AI and chatbots, real estate and real estate tech. He has worked in PR, television news and advertising. He is also the author of two novels and a collection of short fiction. His latest novel, Made in the U.S.A., will be published by Arbiter Press early next year.
