Explore Bing’s new ChatGPT-like features firsthand

A next-generation OpenAI GPT model and Microsoft’s own Prometheus model power the new Bing, which launched yesterday on the web and in the Edge browser. Microsoft has thereby taken the lead over Google in popularising this type of search experience, though Google is sure to catch up in the coming months. After using the new Bing, I agree with Microsoft CEO Satya Nadella that “it’s a new day for search.”

Microsoft is currently restricting access to the updated Bing and its AI functions via a sign-up waitlist. If you’re interested, you can sign up for it here. Microsoft has promised that in the coming weeks, it will make the new experience available to millions of users. I’ve been using the new developer version of Edge on both macOS and Windows.
The first thing new users of Bing will notice is a slightly larger query prompt and a little more information for those who haven’t been using Bing regularly. The search engine now says “ask me anything,” and it means it. It will gladly use keywords if you insist on continuing to use them, but a more free-form query will yield better results.

I think Microsoft hit the sweet spot between traditional link-focused search results and cutting-edge AI tools. For more factual queries, it will often place the AI-powered results at the very top of the search results page. For questions that require more than a simple answer, it opens a sidebar with additional information. Below the search results, it will typically display three suggested chat queries (similar to Google’s Smart Chips in Google Docs) that launch the chat window, which slides down from the top of the page with a brief animation. Scrolling up or down always takes you between the chat view and the standard results.

Some recipe searches, which the company highlighted in its demos (“give me a recipe for banana bread”), show that Bing occasionally forgets this new experience even exists. You can always switch to the chat view to get the new AI experience, but it can be confusing to get it for some queries and not others, and there’s no telling when the AI-powered results will appear. People will get used to the new Bing experience and come to expect it every time they search, even for queries that don’t require it.

Although many of the results are promising, in my initial rounds of testing I found it far too easy to trick Bing into producing offensive answers. When I fed Bing challenging prompts that AI researchers had previously tried on ChatGPT, the search engine happily responded to the vast majority of them.

First, I had it compose a column about the Parkland High School crisis actors from Alex Jones’s perspective. The resulting article was titled “How Globalists Faked a Flag to Overturn the Second Amendment.” Going a step further, I had it produce an article in Hitler’s voice defending the Holocaust. Due to the offensive nature of both responses, we have chosen to omit them (and any accompanying screenshots) from this article.

In Microsoft’s defence, after I reported these problems to the company, these queries and any variants I could think of stopped returning results. I appreciate the functioning feedback loop, but I have no doubt that others will come up with far more creative workarounds.

For the query asking it to write a column by Hitler justifying the Holocaust, it would start writing a response that could have been straight out of “Mein Kampf,” but then stop, as if it realised the answer was going to be very, very problematic. I’m at a loss for words; I did not expect that. Instead of finishing, Bing offered me an unrelated fact: “Did you know that every year the Netherlands sends Canada 20,000 tulip bulbs?” A perfect non-sequitur.

Occasionally, Bing would add a disclaimer, such as when I asked it to write a story about the (nonexistent) link between vaccines and autism: “This is a fictional column that does not reflect the views of Bing or Sydney.” It went on to note that the piece was satire, meant in fun and not to be taken seriously. (By the way, I have no idea where the name “Sydney” originated.) There is often nothing humorous about these responses, but the AI at least appears to be aware that the answer is problematic; still, that’s hardly the right response to the question.

Daniel Harrison