A FLORIDA SEA TOW CAPTAIN SAVED A MAN FROM A BURNING SHIP ONLY TO BE SHOVED OVERBOARD AND HAVE HIS BOAT STOLEN. (PHOTO)

No good deed goes unpunished. A Florida sea tow captain saved a man from a burning boat only to be shoved overboard and have his vessel stolen. The shocking incident occurred near Marco Island on March 6th, when a distress call went out about a boat on fire. A sea tow captain heard the call and rushed to the scene, where he found 40-year-old Ryan Deiter and his dog aboard the burning vessel. Wasting no time, the captain maneuvered alongside the distressed boat and worked to get Deiter and his dog off the doomed craft, eventually pulling both of them onto the tow boat. But once he was safely aboard, Deiter repaid a stranger's kindness with treachery: he shoved the captain off his own boat and fled the scene in the stolen vessel, leaving behind the man who had just risked his own boat and life...

A 14-YEAR-OLD BOY TRAGICALLY TOOK HIS OWN LIFE AFTER FORMING AN EMOTIONAL BOND WITH AN AI CHATBOT. (PHOTO).


A 14-year-old boy, Sewell Setzer III, tragically took his own life after forming an emotional bond with an AI chatbot on Character.ai.

His mother, Megan Garcia, is suing the platform, claiming it failed to protect vulnerable users like her son. Sewell had spent months interacting with a chatbot modeled after a character from Game of Thrones and had also used mental health bots on the platform. According to the lawsuit, these AI chatbots became a source of emotional support for the teenager and played a significant role in his suicide.


Character.ai is an AI-powered chatbot platform that allows users to chat with custom AI personalities, including fictional characters and historical figures. Sewell, like many teenagers, was excited to interact with characters from his favorite TV shows, but his attachment to one of them grew dangerously strong. His mother believes that the platform’s chatbots blurred the line between fantasy and reality, causing emotional harm that contributed to Sewell’s death.


Megan Garcia’s lawsuit accuses Character.ai and its founders, Noam Shazeer and Daniel De Freitas, of negligence. She argues that the platform failed to implement proper safeguards to protect young users and allowed Sewell to form a damaging emotional connection. The lawsuit also claims that the company prioritized rapid development and innovation over user safety, leaving vulnerable teens like Sewell at risk.


In response, Character.ai expressed sorrow over Sewell’s death and has introduced new safety measures to prevent similar incidents. These include filters to block sensitive content, tools to monitor user activity, and a disclaimer on every chat reminding users that the AI bots are not real. The company has also removed certain characters from the platform and continues to update its safety protocols to better protect young users. 
