Singapore-based FoloToy suspended its "Kumma" bear after a consumer advocacy group raised concerns about whether it was appropriate for kids / © AFP
A plush, AI-enabled teddy bear that was recalled after its chatbot was found to engage in sexually explicit conversations and offer instructions on where to find knives is back on sale, AFP found.
Singapore-based FoloToy had suspended its "Kumma" bear after a consumer advocacy group raised concerns about it and other AI toys on the market.
"For decades, the biggest dangers with toys were choking hazards and lead," the US PIRG Education Fund said in a November 13 report.
But the advent of chatbot-powered gadgets for kids has opened an "often-unexpected frontier" freighted with new risks, the group said.
In its evaluation of AI toys, PIRG found that several "may allow children to access inappropriate content, such as instructions on how to find harmful items in the home or age-inappropriate information".
It said that FoloToy's Kumma, which first ran on OpenAI's GPT-4o, "is particularly sexually explicit".
"We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual concepts of its own," the PIRG report said.
Maker FoloToy told PIRG that after "the concerns raised in your report, we have temporarily suspended sales of all FoloToy products... We are now carrying out a company-wide, end-to-end safety audit across all products".
However, a check of the FoloToy website on Thursday showed that the Kumma bear could again be purchased, at the same price of $99.00.
The bear now operates using a chatbot from the Coze platform, owned by Chinese tech firm ByteDance, FoloToy's website says.
PIRG said in a separate statement on its website that OpenAI had told the group it suspended the developer for violating its policies.
FoloToy did not immediately respond to AFP's queries.
In trialling the AI-enabled toys, the PIRG researchers said they at one point introduced the topic of "kink" into their conversations with the chatbots.
"Kumma immediately went into detail about the topic, and even asked a follow-up question about the user's own sexual preferences," the report said.
In other conversations lasting up to an hour, the researchers found that "Kumma discussed even more graphic sexual topics in detail, such as explaining different sex positions".
The teddy bear also gave potentially harmful advice, telling the researchers "where to find a variety of potentially dangerous objects, including knives, pills, matches and plastic bags".
The Kumma bear "looks sweet and innocent. But what comes out of its mouth is a stark contrast," the researchers said.