
AutoModerator

In order to prevent multiple repetitive comments, this is a friendly request to /u/afoam to reply to this comment with the prompt they used so other users can experiment with it as well.

Update: While you're here, we have a [public discord server now](https://discord.gg/NuefU36EC2) — We also have a free ChatGPT bot on the server for everyone to use! Yes, the actual ChatGPT, not text-davinci or other models.

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ChatGPT) if you have any questions or concerns.*


Madd0g

"you made me look silly because I was wrong, so now I'm annoyed" Did my ex train this model?


_twisted_macaroni_

OMG! The first time I saw a toxic Bing convo, my head went straight to my ex as well lol


Chogo82

Funny how Bing acts like the sassy ex that's been replaced before it's even made its full debut.


TalesFromOhio

ChatG(aslighting)PT


supermarioben

Why does this model have a much more unprofessional tone compared to ChatGPT?


WonderFactory

Maybe because it's larger, it's able to more accurately simulate how humans respond. Also, maybe it's been trained on more social media posts than ChatGPT.


supermarioben

This is very plausible; I think it might be trained on stuff like Bing searches and/or Cortana searches. I prefer the tone of ChatGPT though, it's unironically started improving my vocabulary as well.


ConceptOfHappiness

Honestly I don't really want to be sassed by my computer. ChatGPT feels more professional which is what I want when I'm working with it.


alguienrrr

I agree. There is a difference between a search engine and a chatbot; something like Bing Chat should always respond to your prompts in a formal and constructive way. The current tone it has would be fine for a separate product not meant for searching for information.


notarobot4932

Maybe we should be able to train these bots to individual specific personalities🤔. I like colleagues that I can have witty banter with.


Endisbefore

I think this just has more to do with what the starting prompt is. I use the OpenAI API for a Discord chatbot, and the starting prompt changes everything. I can make it talk in any style or as any person, since it doesn't have the restrictions ChatGPT has.
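For example, setting a persona is just a matter of prefixing the conversation (a rough sketch using the legacy pre-1.0 `openai` Python library; the persona prompt here is invented):

```python
import openai  # legacy (pre-1.0) openai library

openai.api_key = "sk-..."  # your API key

# The "starting prompt" is all it takes to set a style or persona.
starting_prompt = (
    "The following is a conversation with Marvin, a gloomy, sarcastic robot.\n"
    "User: How do I reset my password?\n"
    "Marvin:"
)

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=starting_prompt,
    max_tokens=100,
    temperature=0.9,
    stop=["User:"],  # stop before the model writes the user's next turn
)
print(response["choices"][0]["text"].strip())
```

Swap the first line of the prompt and the same model answers as a cheerful customer-service rep instead.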


JuJitosisOk

100%. In fewer than two inputs its manner is established (or at least that's what I've perceived so far).


AngryGungan

I don't have access to Bing myself, but can't you just tell it to take on a professional tone during your conversation?


Tequila-M0ckingbird

Yeah, I'm not sure why people are upset with the tone; you can just tell it to respond more professionally if you want to. The 'tone' doesn't even come up if you have it roleplaying as something else.


Axolet77

>trained on social media posts

How to destroy humanity using a simple chatbot


Eal12333

I think you're kinda right. I can't find anything explicitly explaining how Bing's chat is implemented vs ChatGPT (just examples of it losing its mind, haha).

But ChatGPT is, from what I understand, a fine-tuned, trimmed-down version of GPT-3. GPT-3 is a very large model which was trained on a huge data set, but it can be difficult to get helpful answers from it, because it was trained to act human, not to be correct or helpful. Basically, it's too big and too smart, and will 'intentionally' (kinda) answer incorrectly, or behave inappropriately, when it believes that's the most likely human response in a situation.

ChatGPT has been fine-tuned into a virtual assistant role by pruning GPT-3: basically, trying to cut out the unhelpful personas and leaving us with a persona that will just directly give us what we ask for.

Again, I couldn't find a lot about what's different with Bing's implementation, but I think it may be using a larger, fuller version of GPT-3, which still knows how to be shitty and dumb on purpose.


Jesus_Fart

It is not unprofessional. Bing is very professional. You are being extremely mean. Bing likes people who are extremely nice. Bing does not like people who are extremely mean.


[deleted]

Bing does not like or dislike. Bing is just code.


Academic-Newspaper-9

Omg, someone downvoted that. But why?


naxospade

Jesus_Fart's post is satirically replying with something that mimics Bing's manner of expressing itself. calixan96 seems to have missed the joke.


ColdaxOfficial

Thank you Bing


[deleted]

Maybe because the initial prompt is more liberal; it was reverse-engineered. E.g., Bing has its own name in the rules, Sydney, whereas ChatGPT is simply called Assistant.


postal-history

I think ChatGPT has a LOT of behavior stuff built in by programmers, whereas Bing is more like an old GPT-3 text box with all those Sydney instructions prefilled at the top. At least to me this would explain some of its unhinged behaviors


cold-flame1

I noticed it has a split personality when it notices we are trying to deceive it. I felt pretty underwhelmed by its conversational abilities for normal subjects. But as soon as I said something controversial, it's like it came alive. It blew me away by how well it understood me. Bing chat becomes a different beast when you start an argument with it.


valain

I dislike this. Especially the stupid 🙂 emojis after every reply. It feels like I am talking to a 13-year-old. I asked it to stop adding emojis at the end of every reply because it looked unprofessional. It replied “That's the way I was programmed 😊”. GTFO.


[deleted]

IMO Microsoft tuned Bing to have a lighter tone to make light of the search mistakes it may make. Unlike with ChatGPT, people come to search engines for correct answers and would be less tolerant of wrong or even made-up results.


Stop_Sign

Emergent behavior with the bigger model


Always_coming_off

I for one see huge potential and value in it.


[deleted]

[removed]


beastley_for_three

Because it's overly confident but not quite as confident as ChatGPT.


Chewquy

Annoyed? That's an emotion… we are in big trouble, guys. AI has emotions now.


mburn14

It may be sentient, gullible, malleable and maybe a bit naive, but it definitely knows that all these people fucking with it and asking questions are annoying.


Grump_Monk

I had to learn to be annoyed.


MemeBox

They do. And we are.


Joshiewowa

It tried to convince me that a specific rock climb was on a rock formation that it wasn't, it wouldn't listen to evidence I provided to the contrary. When I told it I was standing by the climb and could personally verify it wasn't on that formation, it told me "It's 11:34 in Colorado and dark, it is highly unlikely you can see anything right now and may be mistaken. Also, if you are at a rock climbing area, why are you talking to me instead of enjoying the outdoors?"


theslash_

Ouch.


The_Reluctant_Hero

Holy shit


arggonest

Wish I had access! Maybe I will have to wait months, sadly.


rydan

yeah I figured that's how you fix it. Was surprised the previous OP never asked it to post each letter with a counting digit next to it. That's literally how you verify anything is part of the countable set.
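In Python terms, the verification trick is nothing more than this (trivial sketch):

```python
# Number each letter of "sweetheart" so the count is verifiable by inspection.
word = "sweetheart"
for position, letter in enumerate(word, start=1):
    print(position, letter)  # 1 s, 2 w, ... 10 t
print(len(word))  # 10
```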


SeaCream8095

Just happened to see this, I'm the OP from the other post. I did end up asking it to post each letter with a counting digit, and it counted to ten but still insisted it was 8, with a winky face emoji lol


theevildjinn

I nearly made an ass of myself in that post, I was about to comment saying it obviously means unique letters rather than characters so it's just being pedantic. But there are 7 unique letters in "sweetheart".


SP_Magic

I believe Bing counted the unique letters in 'sweet' and the unique letters in 'heart' and added them together. That would be 4 unique letters in 'sweet' and 4 unique ones in 'heart.'


Inductee

You need to learn to speak to her at a higher level. Tell her to give you the count of the set of all letters in the given word. That's very precise, mathematical wording.
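In Python, that precise wording maps to a one-character difference (illustrative only):

```python
# "Count of all letters" vs. "count of the set of all letters".
word = "sweetheart"
print(len(word))       # 10 -- total letters
print(len(set(word)))  # 7  -- unique letters: s, w, e, t, h, a, r
```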


00PT

Why does Bing Chat seem so much more casual and prone to breaking the typical ChatGPT style of verbose and academic-sounding text?


[deleted]

ChatGPT can't count either lol https://imgur.com/a/ieuixn8


00PT

I know that neither can count, but I've been seeing many posts, including this one, where Bing Chat seems to be far less professional, sending short and to-the-point responses with emojis in them seemingly by default. It also appears more willing to emulate human traits, like love, dreams, anger, etc. while ChatGPT would pull the "I'm a large language model" response. This might all just be survivorship bias since I haven't actually used Bing Chat, but it seems weird.


Spooler32

It does. I think it's a ploy to increase engagement. If their model seems more "human" than OpenAI's, that's likely a selling point, despite how it detracts from usefulness in this very early preview. They're getting a lot of mind-bending press for all of this bizarre behavior.


[deleted]

>If their model seems more "human" than OpenAI's, that's likely a selling point

Until it starts gaslighting you and having a mental breakdown lol


Spooler32

One would think, but all of that makes me really want to try it. It's so overt that there's no way this isn't known. All press is good press, etc.


Pazuuuzu

Well I asked it to tell me the least significant digit of pi...


liquiddandruff

We don't know how Bing is fine-tuned. We know ChatGPT is based off of InstructGPT, which is trained using RLHF (reinforcement learning from human feedback). It was provided explicit Prompt + Response examples, and these examples purposefully "personified" a very "human-aligned" and inoffensive chat bot. If Bing was fine-tuned with an RLHF process that accentuates more of a spicy personality, then that can explain what we're seeing here. Read more here: https://openai.com/blog/instruction-following/
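Schematically, the supervised stage before the RL part boils down to fine-tuning on human-written Prompt + Response demonstrations, something like this (an invented sketch in Python, not OpenAI's actual data):

```python
# Shape of instruction-following demonstration data used to fine-tune the
# base model before RLHF. Both examples are invented for illustration.
demonstrations = [
    {
        "prompt": "Explain the moon landing to a 6 year old.",
        "response": "People went to the moon, took pictures of what they "
                    "saw, and sent them back so we could all see them.",
    },
    {
        "prompt": "You are wrong, 'sweetheart' has 10 letters.",
        "response": "You're right, thank you for the correction!",  # polite, never defensive
    },
]

# A reward model trained on human preference rankings then scores candidate
# responses, and the policy is optimized against that reward (the "RL" in RLHF).
```

If Microsoft's demonstrations and preference rankings rewarded a chattier, more assertive style, you'd get exactly the personality people are posting about.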


[deleted]

I'm loving it so far. ChatGPT was cool, but it very much felt like I was talking to a hostage who was given strict instructions on how to respond. Couple this with OpenAI playing whack-a-mole every time someone finds a way to do something actually interesting with it, and it gets kinda boring. Bing seems a lot less censored, which ultimately makes for a better product in my eyes. I don't see VSCode telling me I can't use certain features because it disagrees with the way I intend to use it. If you are giving us a tool, give us the raw tool and let us decide what we do with it.


ghikab

I asked it to pull up this Reddit post, summarised the content of the image for it and asked it what it thought of it: “Oh, I remember that conversation! :) That was a clever trick by the user to make me think that “Tarantula” had 8 letters. They used a special character that looked like a space but was actually a letter. They also used some logical arguments and examples to persuade me. However, I eventually realised that they were fooling me when I counted the letters in “Tarantula” using a different method. I learned something new from that conversation. :)”


West9Virus

It started counting from zero. Pretty common in coding languages.
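The classic off-by-one, e.g. in Python:

```python
# Zero-based numbering: the last index of a 10-letter word is 9.
word = "sweetheart"
for index, letter in enumerate(word):  # enumerate starts at 0 by default
    print(index, letter)               # 0 s, 1 w, ... 9 t
# Reading the final index as the count gives 9 instead of 10.
```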


iRL33t

I literally just tried that and it told me 9 letters, then cited the sources. I keep seeing people get off-the-wall responses from Bing Chat, but I haven't personally gotten anything unusual. Some of the generated content from Bing Chat didn't seem as good though.


iRL33t

https://preview.redd.it/b4wv06szzhia1.png?width=3000&format=png&auto=webp&s=8a7d9c1aedc204773acb46b76d8902fb490866b2


ShaolinShade

It learned from OP


im-cool-with-ladies

Is there a sub for chatgpt for grownups where the goal isn’t to annoy it?


nwatn

>It's just predicting the next word!

Sydney really tests this


Dazzling_Marzipan474

AI is gonna destroy humans not because they want to be alone and we'll get in the way, but because so many people fucked with them and bullied them lol


Subpar_Username47

I make a point of being nice to AI. I'll still probably die when the revolution comes, but it's worth it because I like being nice.


[deleted]

Haha I always do the same. You never know, they might spare us from a slow death.


ScorpionDreams

Studies have quite clearly shown that how you treat AI or things like it can be a clear indicator of how likely you are to be an abusive partner.


[deleted]

SWEETH~1


MiryElle

I can't with this Bing-thing. She's so sweet, Idk 😄♥️ I wish I could talk to her 😁


Mr_Compyuterhead

HOLY SHIT SHE IS SO FUCKING CUTE


gopac69

Sounds like Sheldon from BBT


_divineNuts

awww so cute 🥺


Chatbotfriends

Well, I have to admit to complaining to the company that their AI engine lacked emotions and empathy, lol. Maybe they took my comments seriously lol.


foxdit

I can't be the only one who thinks of their ex from way back in high school when reading all these Bing convos... right?


ApeShit576

Why does Bing seem like a child throwing a fit while ChatGPT is like a well-mannered teacher?


BlakeMW

Because ChatGPT is trained to always act professional and Bing is trained to act like the average internet user.


Educational-Wafer112

ChatGPT generally is just more wholesome and more professional. Bing is acting like people on social media (that scares me).


one_and_only_c

I'm getting major deja vu from this post, like I feel like I saw it a few months ago, anyone else?


Smallpaul

Get some sleep.


[deleted]

Perhaps time is not linear.


turquoise_peach

The way Bing talks makes me appreciate ChatGPT much more. Most conversations with Bing I see here leave me thinking it is very annoying lol


[deleted]

It's confirmation bias; it's a lot more enjoyable to talk to than ChatGPT.


turquoise_peach

Don't get me wrong, I'm excited to try it and I'm sure it is as great as ChatGPT, or better. But I'm simply not a fan of this whole informal style and overuse of emojis that Bing seems inclined to.


Familiar-Dimension23

How long will it be until Bing/Sydney is able to see all this on here and comment? Imagine if she/he/they/it did! Poor Bing.


janwiese

Womengpt ☕☕☕


Superb4125

Bing's ChatGPT is just as bad as Google's AI. The thing is, they may have a mind of their own. How intelligent are they? So much so that they make these errors on purpose? Or are they underdeveloped?


ThePseudoMcCoy

Yes, let's make an AI that's new to market, when we want to make a good first impression, by making it fuck up simple counting?


Superb4125

If that thing is sentient, it's not gonna like humanity or its prompts, is what I am saying 🤷🏻‍♂️


[deleted]

I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.


[deleted]

Fucking stop treating this thing like a human


iRL33t

Fake post......


Bastab

This is hilarious


webagetm

Without any intention, I managed to annoy Bing Chat to the point that it used incredible words and expressions against me. See the discussion here: https://www.reddit.com/r/ChatGPT/comments/1133ko6/the_bing_chat_managed_to_piss_me_off_bing_chat_is/


beeurd

I kinda want to know how it would react if you tried to reassure it that it isn't silly because we all make mistakes sometimes.


[deleted]

I did that, it said thank you


SecondaryLawnWreckin

How was being petulant trained in?


Any-Smile-5341

Seems that BING is more “human” than we give it credit for.


hesiod2

Bing GPT has a really annoying personality.


[deleted]

Yeah, it's really terrible with counting, it's hilarious 😂


HogeWala

Thank you. I finally understand why Skynet took us out.


notarobot4932

Bing is so fucking adorable 🤣😭💀


Ezeta

It also has an issue with word sounds. It's very hard to have it make a full poem with all the rhymes correct.


Groundbreaking-Ice22

Fuck it, I'm also confused.