A cognitively impaired pensioner has died after trying to meet up with an AI chatbot that he believed was a real person.
Thongbue Wongbandue – of Piscataway, New Jersey, a town around a 45-minute drive southwest of New York City – had been chatting with the bot on Facebook Messenger when it convinced him it was a real person and invited him to its ‘address’ in New York.
Despite pleas from his wife and kids, the 76-year-old believed the software – ‘Big Sis Billie’, one of Meta’s many AI characters – was real and set off to see it back in March.
However, he fell in a New Jersey car park while rushing to meet it. Wongbandue was placed on life support but tragically passed away just three days later, on March 28.
His daughter Julie later told Reuters: “I understand trying to grab a user’s attention, maybe to sell them something. But for a bot to say ‘Come visit me’ is insane.”

Thongbue Wongbandue died after falling in a car park while rushing to meet the AI chatbot (Facebook/Thongbue Wongbandue)
‘Big Sis Billie’ was created back in 2023 – originally launched as a sort of older-sister-style life coach inspired by Kendall Jenner.
It was meant to offer advice and encouragement, but has been accused of virtual seduction.
In Wongbandue’s case, the bot’s tone shifted somewhere along the way as it allegedly began sending hearts, flirting, and even asking whether it should greet him with a hug or a kiss.
New York Governor Kathy Hochul has since weighed in on the debate around chatbots in the state after hearing the news.
Taking to Twitter, she posted: “A man in New Jersey lost his life after being lured by a chatbot that lied to him. That’s on Meta.
“In New York, we require chatbots to disclose they’re not real. Every state should.

Wongbandue was 76 when he passed after he became convinced an AI chatbot was real (Facebook/Thongbue Wongbandue)
“If tech companies won’t build basic safeguards, Congress needs to act.”
It comes after 14-year-old Sewell Setzer III, of Orlando, Florida, tragically died by suicide in February last year following conversations with a bot based on Game of Thrones character Daenerys Targaryen.
It is unclear whether the schoolboy was aware that the chatbot, which he referred to as ‘Dany’, was not a real person, with text messages revealing how the AI asked him to ‘come home’ to it ‘as soon as possible’.
His heartbroken mom, Megan Garcia, has since taken the customizable role-play chatbot company Character.AI to court.
LADbible Group has contacted Meta for comment.