KADUNA TARGETS ₦120BN IGR IN 2026 — KADIRS CHAIRMAN. (PHOTO).

Kaduna State has set an Internally Generated Revenue (IGR) target of ₦120 billion for the 2026 fiscal year, with the Kaduna State Internal Revenue Service (KADIRS) expected to play a central role in achieving it. The Executive Chairman of KADIRS, Jerry Adams, FCTI, FNIM, FCE, CNA, disclosed this during the Service's Annual Performance Review, Work Plan, and Strategic Retreat.

He explained that although the state government approved ₦74 billion as KADIRS' official revenue target, the Service raised its internal benchmark to ₦80.09 billion to motivate staff to exceed expectations. He further stated that the proposed 2026 budget by the Kaduna State Planning and Budget Commission stands at ₦117.28 billion, with KADIRS expected to generate ₦74.28 billion, while Ministries, Departments, and Agencies (MDAs) are projected to generate ₦43.24 billion.

According to Adams, the retreat was convened to strengthen implement...

A 14-YEAR-OLD BOY TRAGICALLY TOOK HIS OWN LIFE AFTER FORMING AN EMOTIONAL BOND WITH AN AI CHATBOT. (PHOTO).


 A 14-year-old boy, Sewell Setzer III, tragically took his own life after forming an emotional bond with an AI chatbot on Character.ai. 

His mother, Megan Garcia, is suing the platform, claiming that it failed to protect vulnerable users like her son. Sewell had spent months interacting with a chatbot modeled after a character from Game of Thrones and had also used mental health bots on the platform. According to the lawsuit, these AI chatbots became a primary source of emotional support for the teenager, and that dependence played a significant role in his suicide.


Character.ai is an AI-powered chatbot platform that allows users to chat with custom AI personalities, including fictional characters and historical figures. Sewell, like many teenagers, was excited to interact with characters from his favorite TV shows, but his attachment to one of them grew dangerously strong. His mother believes that the platform’s chatbots blurred the line between fantasy and reality, causing emotional harm that contributed to Sewell’s death.


Megan Garcia’s lawsuit accuses Character.ai and its founders, Noam Shazeer and Daniel De Freitas, of negligence. She argues that the platform failed to implement proper safeguards to protect young users and allowed Sewell to form a damaging emotional connection. The lawsuit also claims that the company prioritized rapid development and innovation over user safety, leaving vulnerable teens like Sewell at risk.


In response, Character.ai expressed sorrow over Sewell’s death and has introduced new safety measures to prevent similar incidents. These include filters to block sensitive content, tools to monitor user activity, and a disclaimer on every chat reminding users that the AI bots are not real. The company has also removed certain characters from the platform and continues to update its safety protocols to better protect young users. 
