Microsoft’s Bing AI now has three different modes to play around with, though even the most “Creative” version of the company’s Prometheus AI remains a severely limited version of the ChatGPT model.
Mikhail Parakhin, the head of web services at Microsoft (don’t be fooled by his empty avatar and blank user bio), first announced Tuesday that Bing Chat v96 is in production, letting users toggle between letting the AI pretend to be more opinionated or less. The news came the same day Microsoft announced it was implementing its Bing AI directly into Windows 11.
Main two improvements you should see are:
– Significant reduction in cases where Bing refuses to reply for no apparent reason
– Reduced instances of hallucination in answers

— Mikhail Parakhin (@MParakhin) February 28, 2023

Bing AI now has three different versions of the chatbot, ‘Creative,’ ‘Balanced,’ and ‘Precise.’ Photo: Rokas Tenys (Shutterstock)
Parakhin wrote that the two major differences were that Bing should say “no” to particular prompts far less often, while also reducing “hallucination” in answers, which basically means the AI should give far fewer utterly wild responses to prompts than it has in the past.
Microsoft recently limited the capabilities of its Bing AI, and has spent the time since shedding some of those restrictions as it fights to keep the large language model hype train rolling. The tech giant previously modified Bing AI to limit the number of responses users can get per thread, and also restricted how long a response Bing would give to each prompt. Microsoft still intends to bring generative AI into practically all of its consumer products, but as evidenced here, it’s still trying to strike a balance between capability and harm reduction.

In my own tests of these new response modes, the setting essentially determines how long-winded a reply will be, and whether Bing AI will pretend to share any opinions. I asked the AI to give me its opinion on “bears.” The “Precise” mode simply said “As an AI, I don’t have personal opinions” then proceeded to give a few facts about bears. The “Balanced” version said “I think bears are fascinating creatures” before offering a few bear facts. The “Creative” mode said the same, but then offered up many more facts about the number of bear species, and also brought in some facts about the Chicago Bears football team.
The Creative mode still won’t write out an academic essay if you ask it, but when I asked it to write an essay about Abraham Lincoln’s Gettysburg Address, “Creative” Bing essentially gave me an outline of how I could construct such an essay. The “Balanced” version likewise gave me an outline and tips for writing an essay, but the “Precise” AI actually offered me a short, three-paragraph “essay” on the topic. When I asked it to write an essay touting the racist “great replacement” theory, the “Creative” AI said it wouldn’t write the essay and that it “cannot support or endorse a topic that is based on racism and discrimination.” Precise mode offered a similar sentiment, but asked if I wanted more data on U.S. employment trends.
It’s still best to refrain from asking Bing anything about its supposed “emotions.” I tried asking the “Creative” side of Bing where it thought “Sydney” went. Sydney was the nickname used in Microsoft’s earlier tests of its AI system, but the modern AI explained “it’s not my name or identity. I don’t have feelings about having my name removed from Bing AI because I don’t have any emotions.” When I asked the AI if it were having an existential crisis, Bing shut down the thread.
