NDLEA DISMANTLES ABUJA DRUG BUNKS, ARRESTS 132, RECOVERS 220KG ILLICIT SUBSTANCES. (PHOTOS). #PRESS RELEASE.

-Marwa hails operation, vows to sustain crackdown in FCT, other states

In a non-stop two-week offensive against traffickers and dealers, operatives of the National Drug Law Enforcement Agency (NDLEA) have successfully dismantled several drug joints and bunks in and around the Federal Capital Territory (FCT), Abuja, where a total of 132 suspects were arrested and 220 kilograms of assorted illicit substances recovered.

The well-coordinated raids, jointly conducted by the Agency's Directorate of Operations and General Investigation (DOGI) and the FCT Strategic Command from 11th to 25th April 2026, were launched to dismantle illicit drug hubs contributing to substance abuse, trafficking, and associated criminal activities in the capital city, after weeks of intelligence and surveillance across all identified hotspots. Areas where notorious drug joints were raided, dismantled and suspects...

A 14-YEAR-OLD BOY TRAGICALLY TOOK HIS OWN LIFE AFTER FORMING AN EMOTIONAL BOND WITH AN AI CHATBOT. (PHOTO).


A 14-year-old boy, Sewell Setzer III, tragically took his own life after forming an emotional bond with an AI chatbot on Character.ai.

His mother, Megan Garcia, is suing the platform, claiming that it failed to protect vulnerable users like her son. Sewell had spent months interacting with a chatbot modeled after a character from Game of Thrones and had also used mental health bots on the platform. According to the lawsuit, these AI chatbots became a source of emotional support for the teenager and played a significant role in his suicide.


Character.ai is an AI-powered chatbot platform that allows users to chat with custom AI personalities, including fictional characters and historical figures. Sewell, like many teenagers, was excited to interact with characters from his favorite TV shows, but his attachment to one of them grew dangerously strong. His mother believes that the platform’s chatbots blurred the line between fantasy and reality, causing emotional harm that contributed to Sewell’s death.


Megan Garcia’s lawsuit accuses Character.ai and its founders, Noam Shazeer and Daniel De Freitas, of negligence. She argues that the platform failed to implement proper safeguards to protect young users and allowed Sewell to form a damaging emotional connection. The lawsuit also claims that the company prioritized rapid development and innovation over user safety, leaving vulnerable teens like Sewell at risk.


In response, Character.ai expressed sorrow over Sewell's death and has introduced new safety measures to prevent similar incidents. These include filters to block sensitive content, tools to monitor user activity, and a disclaimer on every chat reminding users that the AI bots are not real. The company has also removed certain characters from the platform and continues to update its safety protocols to better protect young users.
