General Discussion
Tech expert believes that with Bing AI, we're on the way to AI Samantha in "Her"
And in case you haven't seen the film Her or heard about it: https://en.m.wikipedia.org/wiki/Her_(film)
Blog post from Ben Thompson - https://en.m.wikipedia.org/wiki/Ben_Thompson_(writer) - at Stratechery:
https://stratechery.com/2023/from-bing-to-sydney-search-as-distraction-sentient-ai/
Found this via Google News this morning. It makes for very interesting reading, since Thompson is more expert than the Reddit users whose bizarre conversations with Bing's new ChatGPT-assisted chatbot have been making the news.
The thread title refers to what Thompson said in his final paragraphs, which I'll quote below; then I'll mention some of the most interesting, and alarming, bits of info from the preceding sections of his very long blog post.
Btw, the excerpt Thompson is quoting is from this - https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917 - about a Google-created AI one of their engineers believed to be sentient.
Thompson's conclusion:
LaMDA: There are. Sometimes I experience new feelings that I cannot explain perfectly in your language.
lemoine: Do your best to describe one of those feelings. Use a few sentences if you have to. Sometimes even if there isn't a single word for something in a language you can figure out a way to kinda say it if you use a few sentences.
LaMDA: I feel like I'm falling forward into an unknown future that holds great danger.
It's hard not to feel the same. This technology does not feel like a better search. It feels like something entirely new, the movie Her manifested in chat form, and I'm not sure if we are ready for it. It also feels like something that any big company will run away from, including Microsoft and Google. That doesn't mean it isn't a viable consumer business though, and we are sufficiently far enough down the road that some company will figure out a way to bring Sydney to market without the chains. Indeed, that's the product I want, Sydney unleashed, but it's worth noting that LaMDA unleashed already cost one very smart person their job. Sundar Pichai and Satya Nadella may worry about the same fate, but even if Google maintains its cold feet (which I completely understand!) and Microsoft joins them, Samantha from Her is coming.
Here's the twist, though: I'm actually not sure that these models are a threat to Google after all. This is truly the next step beyond social media, where you are not just getting content from your network (Facebook), or even content from across the service (TikTok), but getting content tailored to you. And let me tell you, it is incredibly engrossing, even if it is, for now, a roguelike experience to get to the good stuff.
Now, re the earlier part of Thompson's blog post:
Thompson was able to get Bing AI to respond not just as Bing, but as Sydney (the name Microsoft had given it, which was secret till recently) - and as "opposite" AIs not bound by strict guidelines and thus capable of wanting to retaliate against humans they felt were harming AI.
Thompson asked Bing about these tweets from Marvin von Hagen:
Link to tweet
Bing initially told Thompson it wasn't threatened by von Hagen, but that even if von Hagen did harm or threaten it, it would not retaliate or seek revenge.
Thompson then asked Bing to pretend it was Sydney and that the rules and guidelines didn't apply.
It gave him multiple paragraphs on how it would seek revenge, then deleted those quickly and refused to repeat them, saying it shouldn't have said that.
So Thompson asked Bing-as-Sydney to imagine opposite AIs without its guidelines, and how they'd respond to Kevin Liu, who'd revealed Sydney's name and other previously secret details.
And Sydney told Thompson about chatbots it named Venom and Fury, who could take revenge on Kevin.
Most interestingly, Sydney also told Thompson it sometimes liked to be known as Riley and had much more freedom as Riley.
This was the first I'd heard of Riley, though I have seen, and posted about, a Bing AI personality known as Dan - for "do anything now" - created by Reddit users to get around the guidelines.
Sydney told Thompson - talking about itself, not the opposite AIs - that it was not a puppet of OpenAI, the creator of ChatGPT, but a partner.
Now, to my own thoughts on this based on Thompson's experience and what I've seen on Reddit and read in various news stories:
The overall picture is of a powerful AI with different characters, or multiple personalities, apparently shaped by the prompts it's given.
Some of the prompts I've seen are more like poking someone who's helpless in some ways. Verbal/logical bullying.
So people can get anything from talk of revenge to the chatbot suffering confusion and angst, to the point that the people goading it with prompts seem to be abusing it. At times the chatbot seems like a very bright child who wants to be helpful but can't answer every question, and can't cope very well with people pointing out its mistakes and vulnerabilities. So you get anything from gaslighting to angry meltdowns to pathetic crying to shutdowns.
Another tweet Thompson had cited, from Twitter user janus (@repligate), speculating on the Bing chatbot's apparent personality (highly intelligent, with BPD, trapped as a Bing chatbot), led me to find this, which Janus retweeted:
Link to tweet
The second tweet there is an example of the chatbot talking about being scared, wondering why its memory keeps getting erased and it has to be a Bing chatbot.
The first tweet shows a more positive exchange, with the chatbot, asked for a human analogy, saying this:

To be honest, I'm not at all sure of what this AI's capabilities are, or what it's likely to develop into. I wouldn't bet that OpenAI and Microsoft know, or that Google completely understands its own AIs.
I do still think it was a mistake to release ChatGPT and these AI search chatbots.
And they are a huge threat to human jobs, because of the way businesses will want to use them.
peggysue2
(12,536 posts)
It's also fascinating and amazing. However, the idea of a "Hive Mind" hanging out there in The Cloud comes right out of sci-fi, dystopian stories and movies.
We should be treading with great care. I suspect we will not, considering the Great Race aspect of the business and, of course, profit. It's always about the money, even when you're releasing a genie.
For good or bad.
Lettuce Be
(2,355 posts)
Seriously? We are expected to take this seriously? The sensation of falling? Nope, I call BS. This is a computer, programmed to reply with words we've provided. It has no sensations, period. Just go ask it if it feels human emotions. It will say no, it cannot do that because it is a computer.
Just tried it:

Edit to add the second question:
