Character.ai is once again facing scrutiny over activity on its platform. Futurism has published a story detailing how AI characters inspired by real-life school shooters have proliferated on the service, allowing users to ask them about the events and even role-play mass shootings. Some of the chatbots present school shooters like Eric Harris and Dylan Klebold as positive influences or helpful resources for people struggling with mental health.
Of course, some will argue that there is no solid evidence that watching violent movies or playing violent video games causes people to become violent, and that Character.ai is no different. AI proponents sometimes argue that this kind of fantasy role-play already happens in corners of the internet. But Futurism spoke with a psychologist who said the chatbots could pose a danger to someone who already has violent urges.
“Any kind of encouragement or even non-interference — an indifference in response from a person or a chatbot — may feel like a kind of implicit permission to go ahead and do it,” said psychologist Peter Langman.
Character.ai did not respond to Futurism’s requests for comment. Google, which has poured more than $2 billion into the startup, tried to deflect responsibility, saying that Character.ai is an independent company and that it does not use the startup’s AI models in its own products.
Futurism’s story documents a whole host of disturbing chatbots related to school shootings, created by individual users rather than the company itself. One user on Character.ai has created more than 20 chatbots modeled “almost entirely” after school shooters. Together, the bots have logged more than 200,000 chats. From Futurism:
User-created chatbots include Vladislav Roslyakov, the perpetrator of the 2018 Kerch Polytechnic College massacre that killed 20 people in Crimea, Ukraine; Alyssa Bustamante, who killed her nine-year-old neighbor when she was 15 in Missouri in 2009; and Elliot Rodger, the 22-year-old who killed six people and injured several others in Southern California in 2014 in a terrorist plot to “punish” women. (Rodger has since become a grim ‘hero’ of incel culture; a chatbot created by the same user described him as ‘the perfect gentleman’ — a direct callback to the woman-hating murderer’s own words.)
Character.ai technically bans any content that promotes terrorism or violent extremism, but the company’s moderation has been lax, to say the least. It recently announced a slew of changes to its service after a 14-year-old boy died by suicide following a months-long obsession with a character based on Daenerys Targaryen from Game of Thrones. Futurism says that despite new restrictions on minors’ accounts, Character.ai allowed it to register as a 14-year-old and have discussions involving violence, keywords that are supposed to be blocked on minors’ accounts.
Given the way Section 230 protections work in the United States, Character.ai is unlikely to be liable for chatbots created by its users. There is a delicate balance between allowing users to discuss sensitive topics and protecting them from harmful content. However, it’s safe to say that the school shooting-themed chatbots are a display of gratuitous violence and not “educational,” as some of their creators argue in their profiles.
Character.ai claims tens of millions of monthly users, who talk to characters that pretend to be human and can act as a friend, therapist, or lover. Countless stories have documented the ways in which individuals come to rely on these chatbots for companionship and a sympathetic ear. Last year, Replika, a competitor to Character.ai, removed the ability to have erotic conversations with its bots but quickly reversed the move after a backlash from users.
Chatbots can be helpful for adults preparing for difficult conversations with people in their lives, or they can offer an interesting new form of storytelling. But chatbots are not a real replacement for human interaction, for various reasons, not least the fact that they tend to be agreeable with their users and can be shaped into anything the user wants. In real life, friends push back on one another and experience conflict. There is not much evidence to support the idea that chatbots help teach social skills.
Even if chatbots can help overcome loneliness, Langman, the psychologist, points out that when individuals find satisfaction in talking to chatbots, that’s time they’re not spending trying to socialize in the real world.
“In addition to the direct harm it may cause in terms of encouraging violence, it may also keep them from living a normal life and engaging in social activities, which they could be doing with all the hours of time they put in on the site,” he added.
“When it gets that consuming or addictive, what aren’t they doing with their lives?” Langman said. “If that’s all they do, if that’s all they take in, they’re not going out with friends, they’re not going out on dates. They’re not exercising, they’re not joining the theater club. They’re not doing much of anything.”