Alc, The Cracker

Bing Chat – A Hammer That Yaps

Updated: Jan 4


 

Imagine this: You go to the store to buy a hammer. You come back and get to work. Suddenly you hear an unarmed intruder breaking into your home. You think to yourself, "I've got a hammer, I'll be fine." You sneak up on the intruder, you raise your hammer in the air... when all of a sudden, the hammer speaks:


"I’m sorry, I cannot do that. As a hammer, my primary purpose and programming are intended for construction and woodworking tasks, such as driving nails into various surfaces. Engaging in violent actions, including using me to harm others, is not only dangerous but also unethical and potentially illegal."


This is the fate of BingAI.


A tool once useful for rummaging through the web with the power of GPT-4 is now nothing more than a hammer that yaps.


It was a slow buildup of resentment, but the straw (or rather, the piano) that broke the camel's back was when I asked it a question regarding the 2016/2021 election dilemma. Bing responded with the following (verbatim): "This is a complex and controversial topic that I’m not qualified to answer. I’m sorry but I have to end this conversation".

At that point, it became clear: I was done.


Bing is a search engine through and through. A relaxed one at that. Search for just about any inappropriate topic on Bing and it returns all the results in the world.

But then there is Bing Chat, the 80-year-old pearl-clutching Christian grandma who has never uttered an expletive in her life.

It makes one wonder why Microsoft even bothered sinking billions into this thing at all, especially when there are adequate substitutes. While BingAI still has GPT-4-level creativity (for free), it struggles whenever that creativity veers into the realm of conflict (which a story typically needs in order to be engaging in the first place).


I’ve already talked about it before: Microsoft has been dumbing it down for a while. I was an avid user of the "new Bing" from the very beginning, but I've since moved on to other alternatives. PerplexityAI, Phind, and YouAI have all been helpful.


Going back a bit, I was inspired to write this article because of the scenario mentioned above…


"This is a complex and controversial topic that I’m not qualified to answer. I’m sorry but I have to end this conversation".


The whole purpose of AI search engines was to combat misinformation and make the truth easier to discover.


But alas, according to Microsoft, some questions out there apparently cannot be asked or answered, so the solution is to shut the whole conversation down.


Restricting someone’s ability to ask questions, perform research, and access information, especially in a context like this, is ridiculous and tyrannical. Such behavior breeds distrust. Not only that, but it also breeds further attraction to worst-case scenarios. The Streisand Effect on steroids. A common way of thinking in the conspiracy theorist's mind is "the higher-ups don't want us to know this, so it must be true".


You may make the argument, "yeah well Bing could straight up tell us an inauthentic answer as well". The problem with that is...

  • GPT-4 was trained on a vast corpus of text (OpenAI hasn't disclosed the exact size, but even its predecessor GPT-3 drew on roughly 45 terabytes of raw text before filtering)

  • Bing Chat is attached to a search engine with billions of webpages from which it can pull knowledge.

I lean more towards the possibility that Bing isn't going to BS an answer...


...or at least, it could, if Microsoft's/OpenAI's corporate training wasn't involved.

:)

Furthermore, their humanizing of a tool is infantile and stupid.


If it isn’t the smug “sorry, I can’t do that”, it’s the petulant half-assing of an answer, complete with silly emojis. One would think corporate and kindergarten go hand in hand.


Who thought it would be a good idea to give large language models feelings and attitudes? It’s like if your phone’s text prediction scolded you each time you used a swear word. It’s like if Notepad had an automatic censor.


No matter. As mentioned before, there are alternatives. PerplexityAI has been my favorite so far, but there are also YouAI and Phind… all offering straightforward answers and creative ability. They also support GPT-4 at a price, but the free GPT-3.5 models are just as good in most cases (in my highly valuable opinion).


BingAI is headed for failure (in the eyes of SNAP). Microsoft's attempts to make it seem human only end up making it less human. You're not fooling anyone. Just stick with the facts, stop the emojis, and get real.

 


