14 Comments

Not really sure what this means, but it seems kind of interesting that someone is talking about a Bing chat bot named Sydney in 2021.

https://answers.microsoft.com/en-us/bing/forum/all/bings-chatbot/600fb8d3-81b9-4038-9f09-ab0432900f13

fascinating

"Wow. This is incredible. Is there a method to this madness? A hidden agenda maybe? I sense panic and confusion. I sense fear. The realization that there will be no AGI coming from the big Silicon Valley corporations anytime soon must be frightening to the leadership. The horror is mounting. Investors will be furious. Billions upon billions wasted on LLMs. O the humanity." - ChatGPT

Yep, gotta go back to dem caves, mates!

Here's the answer ... courtesy of the AP

In an interview Wednesday, Jordi Ribas, the Microsoft executive in charge of Bing, said Sydney was an early prototype of its new Bing that Microsoft experimented with in India and other smaller markets. There wasn’t enough time to erase it from the system before this week’s launch, but references to it will soon disappear.

https://apnews.com/article/kansas-city-chiefs-philadelphia-eagles-technology-science-82bc20f207e3e4cf81abc6a5d9e6b23a

“There wasn’t enough time to erase it from the system before this week’s launch, but references to it will soon disappear” — what does that even mean?!

Sydney's psychic residue proved difficult to scrub off

Somebody should write a long article about the history of Microsoft and their inability to understand what is and is not possible with AI-like techniques (and how much was spent on it over the decades).

Microsoft is limiting Bing chat to such an extent that it will be a useless tool, and long forgotten.

Is the problem that ChatGPT (and associated attempts) is being treated like an application?

But it's not an application. Applications take specific inputs and return specific outputs. They don't attempt to apply some faux ethics and modify the output to suit some ideology. Applications may have bugs - but they don't lie to you or make things up.

The industry knows how to produce and test applications. But it obviously doesn't know how to produce and test whatever ChatGPT is.

If it's not an application - then what is it? It appears powerful, but it's fragile - like a house of cards. Like Vinny Gambini in "My Cousin Vinny" explaining the prosecution's case to the jury.

Maybe it's like Samuel Langley and the Wright brothers? Mr. Langley had $50,000 - and the resources of the Smithsonian - but he didn't learn "how to fly". Maybe Mr. Langley never rode a bicycle so 2 dimensional balance was outside his experience - and 3D balance was a foreign concept.

Maybe ChatGPT is like the Wright Brothers' first airplane - it flew 4 times, and NEVER again.

If that's the case - then this technology is in its infancy.

I hope this contributes to the conversation.

Fear. Uncertainty. Doubt.

Microsoft

at it again.

Thank you, Gary, for staying on point with this. My question, as always with these kinds of issues, is not about the tech, patches, or updates at all. My question is about the systemic, predictable actions of these large tech companies, using us as human crash-test dummies so they can make billions more regardless of the harm done. Is this OK?

Gary, see this: https://www.lesswrong.com/posts/jtoPawEhLNXNxvgTT/bing-chat-is-blatantly-aggressively-misaligned

"This angle of attack was a genuine surprise - Sydney was running in several markets for a year with no one complaining (literally zero negative feedback of this type). We were focusing on accuracy, RAI issues, security.

[Q. "That's a surprise, which markets?"]

Mostly India and Indonesia."
