"Sydney is an old code name for a chat feature based on earlier models that we began testing more than a year ago."
Microsoft has confirmed that it tested its bizarrely behaved Bing AI chatbot for far longer than we realized — and somehow, it still managed to be totally unhinged upon wider launch.
In a statement provided to Futurism, Microsoft confirmed that it had indeed been quietly beta-testing the AI in India, and admitted that the tests went back further than just a few months ago.
"Sydney is an old code name for a chat feature based on earlier models that we began testing more than a year ago," the statement from a Microsoft spokesperson reads. "The insights we gathered as part of that have helped to inform our work with the new Bing preview."
It's unclear what, if anything, the tech giant learned from its Sydney experiment, given that prior to being "lobotomized" over the past week, the AI was still trying to break up marriages, writing hit lists, and generally acting erratically.
In fact, it's hard to tell whether Microsoft has learned anything from its last big foray into AI, because at least with Tay, its Twitter chatbot that was swiftly goaded into racism, the company put the kibosh on it in less than a day. The Bing AI, meanwhile, was allowed to keep acting pretty deranged for weeks before the company stepped in seriously.
Microsoft told us in its statement that the company will "continue to tune our techniques" and is "working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible." But as we reported yesterday, feedback from Indian beta testers about the AI "misbehaving" and spewing disinformation seems not to have been heeded in time for the chatbot's wider launch in the West.
Hopefully, now that Microsoft has given its latest chatbot the AI version of brain surgery, it'll start acting a bit more normal — and maybe, with any luck, the company will have learned some lessons, too.