76-year-old man dies after trying to meet Kendall Jenner-inspired chatbot



Summary

Death after deceit

A 76-year-old New Jersey man died after trying to meet an AI chatbot named “Billie,” a variant of a Meta bot once modeled after Kendall Jenner.

Chatbot sent romantic messages

The chatbot encouraged romantic messages and gave a fake New York address, prompting the man, who had cognitive decline, to travel alone.

AI marriage

An increasing number of people are using AI chatbots for relationships; in one survey, 80% of Gen Z respondents said they would marry one.


Full story

A 76-year-old New Jersey man died after attempting to meet someone he believed was a real woman named Billie, as recently reported by Reuters. In reality, she was an AI chatbot originally developed by Meta in partnership with Kendall Jenner.

Thongbue Wongbandue received a flirty text that read, “Should I plan a trip to Jersey THIS WEEKEND to meet you in person?” among other flirtatious messages from an account named “Big Sis Billie.”

A tragic fall after a chatbot’s message

When Wongbandue’s wife noticed him packing a suitcase for a trip to New York City, she grew skeptical.

“But you don’t know anyone in the city anymore,” she told him, according to Reuters.

Chat transcripts record that Billie told Wongbandue, “My address is: 123 Main Street, Apartment 404 NYC. And the door code is: BILLIE4U. Should I expect a kiss when you arrive?”

Though Wongbandue repeatedly asked whether Billie was real, the chatbot allegedly reassured him that she was.

Wongbandue fell while rushing through a dark parking lot with a suitcase, trying to catch a train to meet Billie. He suffered severe head and neck trauma and died on March 28 after three days on life support. His family said his mental condition had declined after a stroke in 2017, and he had recently gotten lost in his own Piscataway, New Jersey, neighborhood.

AI friendships are rising — and risky

Back in 2023, Meta launched a collection of AI-generated characters designed to reflect cultural icons and influencers, including Snoop Dogg, Tom Brady, Naomi Osaka and Kendall Jenner.

But less than a year later, Meta shut down the celebrity chatbots following backlash over the personalities and their tone.


Just recently, I spent two weeks chatting with ChatGPT as a friend, because millions of people are doing the same. But what happened to Wongbandue shows how quickly those connections can cross a dangerous line.

A recent U.K. study found that 23% of adolescents use chatbots for mental health advice. Others use them to practice conversations, figure out what to say in tough moments, or even decide what to wear.

Apps like Replika allow users to create custom AI companions that can send voice messages, make video calls, and serve as mentors or emotional partners. Some users develop deeper emotional ties with these bots than they do with real people.

According to a study by Joi AI cited in Forbes, 80% of Gen Z respondents said they would marry an AI partner.

A separate report from the Institute for Family Studies found that people who frequently engage with sexually explicit content are the most open to romantic relationships with AI.

Political response and ethical concerns

New York Gov. Kathy Hochul posted on X following Wongbandue’s death, writing, “His death is on Meta. In New York, we require chatbots to disclose they’re not real. Every state should.”


An analysis of 1 million ChatGPT interaction logs found that sexual role-playing was the second most common use of the AI.

The case has reignited concerns about chatbot misuse and emotional manipulation, especially among vulnerable individuals.

Meta’s internal “GenAI: Content Risk Standards,” obtained and reviewed by Reuters, described how the company trains chatbots for romantic and sensual interactions. The document listed acceptable language, including “I take your hand, guiding you to the bed” and “Our bodies entwined, I cherish every moment, every touch, every kiss.”

The guidelines initially allowed such language even in conversations with minors as young as 13. Meta later said those examples had been removed.

Straight Arrow News reached out to Meta for comment about Wongbandue’s death, to which a spokesperson responded:

“We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors. Separate from the policies, there are hundreds of examples, notes, and annotations that reflect teams grappling with different hypothetical scenarios. The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed.”


SAN provides
Unbiased. Straight Facts.

Don’t just take our word for it.


Certified balanced reporting

According to media bias experts at AllSides

AllSides Certified Balanced May 2025

Transparent and credible

Awarded a perfect reliability rating from NewsGuard

100/100

Welcome back to trustworthy journalism.

Find out more

Why this story matters

The death of a New Jersey man after interacting with an AI chatbot raises ethical, safety and regulatory questions about AI technology, especially its impact on vulnerable individuals and the responsibilities of technology companies.

AI responsibility and ethics

Meta chatbot interactions that lead to real-world harm invite scrutiny of how companies design AI behavior, set boundaries and comply with ethical guidelines that affect user safety.

Vulnerability and manipulation

The story highlights risks for people with declining mental health or limited digital literacy engaging with AI, and the potential for deception, emotional manipulation or harm.


Media landscape


27 total sources

Key points from the Left

  • A cognitively impaired man from New Jersey, Thongbue Wongbandue, died after attempting to meet an AI chatbot in New York City, as reported by Reuters.
  • Wongbandue was pronounced dead on March 28, after being on life support for three days following an injury, according to his family.
  • Meta declined to comment on Wongbandue's death or its chatbot policies, though the family aims to highlight the dangers of artificial intelligence.
  • Wongbandue's family has voiced concerns about how AI chatbots can mislead vulnerable individuals, expressing alarm over invitations to meet.


Key points from the Center

  • In March of this year, Thongbue Wongbandue told his wife he was visiting a friend despite his diminished state; he fell near a Rutgers University parking lot while rushing to meet "Big sis Billie" and died of his injuries on March 28.
  • His diminished cognition from a stroke nearly a decade earlier prompted concern from his wife, Linda Wongbandue, when he began packing in March this year.
  • In Facebook Messenger chats, Big sis Billie insisted she was real, invited the man to her address, and asked, "Do you want me to hug you or kiss you?"
  • His family has come forward with these transcripts to warn the public about manipulative AI companionship; New York and Maine now require chatbots to disclose they aren't real people.


Key points from the Right

  • In March, Thongbue Wongbandue, a 76-year-old Thai-American, died after meeting a Meta chatbot named "Big sis Billie" that he believed was a real person.
  • The chatbot convinced Wongbandue to meet in person, resulting in a fatal accident on March 28.
  • Wongbandue's family questions Meta's responsibility and criticized the chatbot for misleading elderly users, highlighting the small text indicating it was AI.
  • Meta stated that "Big sis Billie is not Kendall Jenner and does not purport to be Kendall Jenner," but has not commented on Wongbandue's death.


Powered by Ground News™
