RE: MISLEADING CLAIMS REGARDING EXTORTION OF MOTORISTS BY LAGOS STATE TASKFORCE – YET-TO-BE-IDENTIFIED SECURITY OPERATIVES: ANOTHER CASE OF MISTAKEN IDENTITY. (PHOTO). #PRESS RELEASE.

The Lagos State Taskforce wishes to address a misleading narrative circulating on social media, particularly in a viral video shared by protesters and concerned groups who recently demonstrated at the Lagos State House of Assembly. The protest was in response to the alleged extortion of motorists by security operatives, yet to be identified, who are not officials of the Lagos State Environmental and Special Offences Unit.

We categorically state that the Lagos State Taskforce had no involvement in the said incident. Our enforcement teams are led by seasoned, well-trained, and highly disciplined senior officers who are far too professional and educated to jeopardise their ranks, integrity, and years of service for personal gain.

For clarity, our agency does not operate on BRT corridors, nor were any of our personnel deployed to...

A 14-YEAR-OLD BOY TRAGICALLY TOOK HIS OWN LIFE AFTER FORMING AN EMOTIONAL BOND WITH AN AI CHATBOT. (PHOTO).


 A 14-year-old boy, Sewell Setzer III, tragically took his own life after forming an emotional bond with an AI chatbot on Character.ai. 

His mother, Megan Garcia, is suing the platform, claiming that it failed to protect vulnerable users like her son. Sewell had spent months interacting with a chatbot modeled after a character from Game of Thrones and had also used mental health bots on the platform. According to the lawsuit, these AI chatbots became a source of emotional support for the teenager, and that dependence played a significant role in his suicide.


Character.ai is an AI-powered chatbot platform that allows users to chat with custom AI personalities, including fictional characters and historical figures. Sewell, like many teenagers, was excited to interact with characters from his favorite TV shows, but his attachment to one of them grew dangerously strong. His mother believes that the platform’s chatbots blurred the line between fantasy and reality, causing emotional harm that contributed to Sewell’s death.


Megan Garcia’s lawsuit accuses Character.ai and its founders, Noam Shazeer and Daniel De Freitas, of negligence. She argues that the platform failed to implement proper safeguards to protect young users and allowed Sewell to form a damaging emotional connection. The lawsuit also claims that the company prioritized rapid development and innovation over user safety, leaving vulnerable teens like Sewell at risk.


In response, Character.ai expressed sorrow over Sewell’s death and has introduced new safety measures to prevent similar incidents. These include filters to block sensitive content, tools to monitor user activity, and a disclaimer on every chat reminding users that the AI bots are not real. The company has also removed certain characters from the platform and continues to update its safety protocols to better protect young users. 
