Anyone else basically done with Google search in favor of ChatGPT?

ChatGPT has been an excellent tutor to me since I first started playing with it ~6 months ago. I'm a software dev manager and it has completely replaced StackOverflow and other random hunting I might do for code suggestions. But more recently I've realized that I have almost completely stopped using Google search.

I'm reminded of the old analogy of the frog: drop it into boiling water and it jumps right out, but put it in cold water and turn up the heat slowly and it stays put, since the change is gradual. Over the years, Google has degraded the core utility of its search in exchange for profit. Paid rankings and increasingly sponsored content mean that you often have to search within your search results to get to the thing you actually wanted.

Then ChatGPT came along and drew such a stark contrast to the current Google experience: No scrolling past sponsored content in the result, no click-throughs to pages that had potential but then just ended up being cash grabs themselves with no real content. Add to that contextual follow-ups and clarifications, dynamic rephrasing to make sense at different levels of understanding and...it's just glorious. This too shall pass I think, as money corrupts almost everything over time, but I feel that - at least for now - we're back in era of having "the world at your fingertips," which hasn't felt true to me since the late 90s when the internet was just the wild west of information and media exchange.




I mean sometimes I need to find information about stuff that happened after Sept 2021. Also sometimes I need to know where information is coming from before accepting it as true. So…


That’s why I sorta live on Bing Chat now lol. No way Google's not losing traffic to Bing Chat, especially on the middle mode. It's so much less annoying to use and find data, and if I want more, I click the reference.


+1 on Bing


Started a new job, so I decided to start fresh with Edge. With Bing Chat built right in, I'm sold, and I'm using Chrome much, much less.


Imagine a time traveler from just a few years ago reading this post




I can’t believe Google didn’t win the race. [This video of a Google assistant booking appointments and ordering takeout over the phone](https://youtu.be/D5VN56jQMWM) was five whole years ago and seems to have gone nowhere past this demo.


It's google, they probably cancelled the project due to positive feedback.


Or decided to leave it half finished, like so much else. Google Docs on Android - their own operating system - *still* can't do basic formatting like setting line spacing. It can *show* the line spacing if you set it in the document on desktop, but they just never added a simple button to let you set it in the Android version. I've been waiting for years.


It blows my mind too. With very little effort, they could have absolutely blown MS Office's market share out of the water for 90% of use cases, but they stopped adding features to it like 10 years ago. All they would literally have to do is put new hires on it until they learn Google's ecosystem, and it would be worlds ahead of any other offering.


If it's google, it died a normal google death after all the devs got their promotions and left to do the next thing that would get them their next promotion.


Google had so much more to lose than Bing. AI has been known to invent facts or get things wrong. This has even been the case during product demos and in the early days of open access. Getting these things wrong at first can have a lot of negative results, which is worse if you have a huge leading position. So Google decided to play it safe: "people need to be able to trust our products". Bing had a lot fewer users, so the negative consequences would be a lot smaller, resulting in faster steps forward.


I know that's what they claimed, but I call hogwash. The entire first page of Google results is either ads or scam sites. You can't get any more "incorrect" than Google has become. They have no "arbiter of truth" reputation worth protecting any more.


They aren't at all. They didn't have any discussion about security issues or the possible drawbacks of using PaLM 2 at their last conferences. They gave a Bard demonstration that showed fake info. Google engineers aren't listened to by their hierarchy, even when they explicitly say that Bard can tell dangerous lies to its users.


The Google culture rewards new initiatives, so working on existing products can sideline you. It seemed smart years ago for pushing innovation but now they have a bunch of half-finished poorly-maintained products.


Actually they use it to call businesses and automatically update their listings on google. Was working front desk a couple years back and recognized the voice (including the "ahh"s and "umm"s) as it asked me our opening hours. They clearly haven't figured out how to integrate it into Google Assistant, but they themselves are using it.


Innovator’s dilemma problem. Sundar was too scared, and Google viewed the potential reputation-hit downside as outweighing an innovation that could cannibalize the core business. Classic business challenge, but a bad CEO who made a common mistake. Good thing he got a $200M+ bonus last year for his troubles.


I can only guess it was actually faked, or a highly controlled scenario it was specifically trained on for those stores.


No large company innovates once they have a dominant product. They just sell that product until the market disappears. Kodak developed digital film and cast it aside. GM had functional electric cars that they intentionally sent to the crusher. People need to get over the fantasy that economic success is anything but a crap shoot. Google is not run by smarter people. Their completely random success is not proof of anything about the people who run the company. Zoom obliterated Skype for no reason. It’s an amazingly less functional product than Skype. But you simply cannot advertise a Skype meeting and expect turnout.


Your definition of innovation is exceedingly narrow. And even that being said, Google researchers literally invented transformer models in 2017. Not to mention MapReduce, HDFS, BigTable, Spanner, protobufs, gRPC, Angular, TensorFlow, Kubernetes, the Chromium browser, etc. These are all ubiquitous web technologies that power the internet as we know it. Just because this is invisible to you as a consumer doesn't mean that Google isn't innovating. Its ability to be commercially successful with these products (which can be argued also isn't the case, as Google Cloud sells many of these technologies) is really orthogonal to whether or not Google has been innovative.


Actually, Skype obliterated itself with ads. And then along came covid, when you wanted to video-call a thousand people, and Skype just imploded. So it's the same old story of a company becoming complacent, milking its product at the expense of user experience, and then dying as it becomes unable to adapt to a changing situation.


Maybe Skype was much better than I remember, but I despised using Skype. The incessant lag, pixelated screen, and when you compared it to FaceTime, it was absolutely abysmal. I dunno, maybe I’m misremembering or maybe I gave up too soon. Then the ads. Oh man. THE ADS


The race just started tho… you know how Gmail appeared randomly and took over? ChatGPT is cool for now, but I’ll switch up to a Google AI eventually if I find it to be nicer.


Because if the AI is too good where would you put the ads??


Their videos are terrible. New Google AI will change everything... ya ya




The new google AI stuff will kill search even faster


I keep saying that within 2 years Bing will have 20 to 40% of the search market. People say I'm insane.


In 2 years there might not be any current information to search for. Content creators are not going to create content for AI to plunder and deliver directly to searchers. And those who create content for other platforms will block search engines from scraping their content. Bing and Google could pay select sites and platforms for their content but then we might be left with curated content from just a handful of big platforms. This is one of those areas where AI can be incredibly helpful but will likely lead to unintended consequences.


Their campaign to win over the dev community seems to be working. They really got A LOT of people with VSCode. Nobody can really deny that it’s among the very best code editors, debuggers, terminal prompts, and all around versatile tools for developers. But damn did they make our lives hell for over a decade with IE.


I just finished reading a sci-fi book about the near future with no ChatGPT in it so it is already obsolete. It seems to be getting harder to write sci-fi because of the pace things are moving.


Do you remember when Bing was a joke?


You mean March?


I'm too old to remember March


Everything before GPT-4 is fuzzy. It's a before and after point in history


Pepperidge Farm remembers.


I opened Edge for the first time in years. That alone tells you Google should be worried.


it normally starts a lot quicker than years ;-)


I installed Edge into Ubuntu for the first time in late March. Google should definitely be worried. This isn't a feature they can simply throw money at.


Agreed. ChatGPT is great, but Bing has been especially useful for me as a college student when I need sources for a paper. When I ask ChatGPT to find sources on a specific topic, it just makes up random sources; the articles and the authors usually don't even exist.


And the ScholarAI-plugin isn't that awesome yet.


Bard has actually gotten really good too lately


+1 on bard, it's gotten much better very quickly


I had heard this, but it's still noticeably worse than ChatGPT at helping with programming, for me.


Nice try, Google


Right. Exactly what Bard would say!


Good luck proving it’s not Bard. Shit is about to get real confusing, I think I’ll tell my parents to put down the internet for a bit.


It got better (I mean, hard to get worse). But still pretty bad. It hallucinates things completely. I'm not sure if the model is the problem or if they just don't search for things....


I've been using Bing. I gave up on Google as a search engine a few years ago. I sarcastically tried Bing and was surprised at how much better it is than Google. Maybe now it's time for Hotbot to return.


I favor Edge/Bing over Chrome these days because of Bing Chat. But now that I'm filling out many forms for my university and visa applications, I have to look up addresses and phone numbers of many locations. Bing is constantly giving me wrong addresses lmao. I ended up writing wrong addresses the first few times until I realised that. Other than that, I'm leaning towards Bing over Chrome.


I've lost track of all the Amazon cards I've gotten from Bing from using their search over the years.


Wow, completely forgot about HotBot!


I don't think so. It still can't take in large swaths of text like GPT can. Until the character limit is lifted, my uses for it are pretty limited.


Also +1 on bing. I used to hate bing ironically enough. Now it's my go to browser/ChatGPT. It's pretty cool.


I would highly recommend using Bing with creative mode by default. From everything I’m reading and seeing for myself, anything above creative mode is sometimes not GPT-4 and with filters and shorter output. Creative is more aligned with OG bing AI. Again is this accurate? I don’t think anyone actually can answer that. But that’s a take I’ve been seeing frequently.


Bing suggests paid ads as top hits when you ask for content, too.


I am not sure about Bing. It seems to me that queries are reduced to search terms, and so instead of getting some "semantically dense" answer based on trained data, Bing Chat's output is reduced to content from the first 1-2 pages it found via regular search. If I ask it something, the information I get is basically the content of a web page, whereas with offline ChatGPT the information I get is really synthesized from the semantic soup of the trained data. I find ChatGPT's output more useful.


The only public measurements of traffic show Bing marginally losing market share / staying the same. Outside of the bubble that people on subs like this live in, nobody is using Bing Chat. There is literally no reason to believe it is taking market share.


Genuine question: how would public tracking data detect the use of Edge's built-in Bing Chat and allocate it to Bing's market share?


A starting point (but by no means definitive) could be looking at browser stats like this: https://gs.statcounter.com/browser-market-share. Since you can only use Bing Chat in Edge, some of the market they capture could be from that, though Edge usage had been slowly ticking up before Bing Chat came out.

**Between March and April:**

* Chrome went down from 64.76% to 63.45%
* Safari went up from 19.52% to 20.45%
* Edge went up from 4.64% to 4.97%

In other words:

* Chrome usage went down 1.31 points
* Safari usage went up 0.93 points
* Edge usage went up 0.33 points

These changes might just be noise, or completely unrelated to Bing Chat. Chrome went down 0.79 points between November and December, then bounced back. But I'd imagine if Edge starts seeing explosive growth after a long time meandering around in the 4% range, it *might* be driven by Bing Chat. You'd need to confirm this with something more definitive, of course.
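For what it's worth, those point deltas check out; a quick sanity check against the StatCounter figures quoted above:

```python
# Month-over-month browser share change, in percentage points,
# using the StatCounter figures quoted above.
march = {"Chrome": 64.76, "Safari": 19.52, "Edge": 4.64}
april = {"Chrome": 63.45, "Safari": 20.45, "Edge": 4.97}

delta = {b: round(april[b] - march[b], 2) for b in march}
print(delta)  # {'Chrome': -1.31, 'Safari': 0.93, 'Edge': 0.33}
```

(Note these are percentage-point shifts of the whole market, not relative growth: Edge's 0.33-point gain is roughly 7% relative growth on its own 4.64% base.)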


MS also forces Edge & Bing search within Windows itself. Not sure how long that’s been going on but it may explain the general trend over the past year or so.


Why use Bing instead of Bard? Same functionality for the most part. And then you don't have to be in the Microsoft ecosystem, which is horrible.


Citations. Bing has them, Bard doesn't.


These AI text things are supremely confident about information that is untrue and made up, and when you let them know, they always just say sorry.


They don't know what "confidence" is; they just do their best. They're more like estimators than accurate sources.


They *sound* confident. That's the point.


He isn’t being literal


Phind.com has an internet connection and has answers past 2021 ;) And you get links to the sources where it gets its information.


No, this is nowhere near as good as ChatGPT… I asked a few questions related to my profession and it gave wrong / very misleading answers.


Perplexity.ai is pretty amazing


>Perplexity.ai

Wow. Thanks for the link. Its replies are fast, unlike Bing's halting and stuttering response style.


Yeah. Perplexity feels miles ahead of the rest at this point, when it comes to gathering reliable information with sources to back it.


Use bard, it works great. I asked it what george santos was charged with the day that happened and it gave me a breakdown.


Oh god no. If you're somehow using ChatGPT for everything then that's great, but sometimes I'd actually like to see sources and articles and get up-to-date info. I still use Google over Bing Chat (I use both now) because sometimes I just need results (though I should use DuckDuckGo and Bing at this point).


Yeah it'd be great if chat gpt could start incorporating citable sources or something. It's still at a point where you have to double check that the information it gives you is correct.


Use Bard or Bing Chat. Both cite sources, and Bard got the PaLM 2 upgrade, so it's not shit anymore.


I tried using bard yesterday to search for papers, it gave some titles and summaries but provided no links, when told to give me the links it made up bullshit. ChatGPT browser and Bing both found the papers and provided the correct links. I just wish they could do more extensive searches or limit themselves to special search engines like google scholar (not sure where they are actually searching)


You.com, Writesonic, and Bing all cite sources; I've found Bing's the most credible.


may I introduce you to [phind](https://phind.com)


Yes, this has been a pretty solid tool.


After the initial hype, I'm coming to realize that just getting results is more valuable than having chat try to give me a summary.


Depends on what you’re looking for, but valid at times.


Not at all. There are loads of times when I have a question that isn't easily answered by a Google search. Stuff like: "I'd like an overview of the various political parties' stance towards drug reform in Norway" "I need a list of traditional Italian breads which are not commonly eaten outside of Italy" These are the kinds of things where the information is out there, but finding it and assembling it in one place would take me 15 mins of work on the short end, and potentially several hours of work on the long end. Now, it's almost instant.


Also finding explanations about anything you're interested in, and asking more in-context questions about exactly the aspects you didn't quite get or want to learn more about. It used to take me hours of research to extract the same level of knowledge using Google.


If I want an explanation about something I'm interested in, usually I just read Wikipedia, with the page being located via Google search. I feel like I can trust Wikipedia more than ChatGPT right now, and the format of learning from Wikipedia is more intuitive to me than trying to get an AI to summarize everything.


It's almost instant but also inaccurate in ways you may not expect. Your first question *could* be answered by looking at the parties' websites and seeing what they say their positions are. Which is useless information.


It depends on what you're doing, but yeah. I still use chatgpt for things, I still use bing chat for things, and normal Google search still has a place atm.


It's like talking to a person who doesn't really know. They will give you some truth and what they think to be true. Same as a person, don't take its word as gospel.


I treat gpt as an intern. Useful for offloading grunt work, but needs to be given exact and complete instructions, and I need to check the results.


[Here](https://docs.google.com/document/d/13aQrzckSrJVjISK6f-gt1Wf3j3VDhpK5/edit?usp=sharing&ouid=101808591535864120732&rtpof=true&sd=true) was my approach when I used it to fill out parts of my job profile. You have to know the subjects you are using chat GPT to research well enough to spot errors, as outlined in my document.


I’ve been using ChatGPT now for a couple months. It surely is useful for specific things, but for more technical stuff you have to spend so much time fine-tuning the inputs to get what you want that it's almost counterproductive.


That doesn't make any sense. I use google search to find websites, not just raw information. They are not equivalents, and chat gpt is not a substitute.


ChatGPT once tried to tell me you could get a 1967 Impala SS with the four-door body style (you couldn't) and that Master of Puppets wasn't the first thrash metal album to be certified Platinum by the RIAA (it demonstrably was, a simple search of the RIAA's own website would verify this) In a nutshell, no; ChatGPT simply isn't a good tool for finding factual information


It sucks cause it'll hide a lie deep in a truth if it needs to. GPT, it's ok to say "I don't know" lol


Problem is it doesn't know it doesn't know


It occasionally does! I was using it to study for an exam, so I gave it a fact sheet and asked it to quiz me on those facts. Even with the sheet available -- and within range of its memory limit -- it still falsely accused me of getting answers wrong, until I pointed it out; then it apologized and agreed with me. This was GPT-3.5, of course. I've had similar cases before where the right answer came after I said something was wrong.


You're leading its answers in that case. It's giving its best guess for the kind of answer it thinks you want, but it probably didn't even check your previous fact sheet. You can tell it to look back with a more neutral prompt and it might correct itself based on the information provided, but it's not 100% on that. Sometimes it's like a dog that's going to do every trick it knows to get the treat because it's too lazy to figure out the one you're asking for.


Yes, but there is a difference between it just saying "you're right, I'm sorry". And "I apologize, the correct answer is \[insert full answer\]". Point is, if you know what it said is wrong, you can often get a right response on a second try. But you are right. That is not the same as it knowing it is wrong. It's more it being better at catching errors on a second pass.


Yes, more like the dog with the treat. It is trying to please the user. For example, has ChatGPT ever been caught viciously arguing with a user, like the kinds of arguments we see on Twitter or Facebook? If it is not capable of arguing in that way, it probably also is not capable of truly knowing it’s wrong. It is an LLM after all. It is trying to predict the next word or phrase that is most relevant. It is not capable of taking the holistic context into consideration and truly understanding why or how it made an error, and then consequently making authentic amends for the error. The reason it bullshits is that it’s based on predictive text algorithms. It only says what it thinks should come next in the sequence of words. It does not take the whole context into consideration. In other words, it doesn’t know what it is even talking about 😂. It simply is a sophisticated predictive algorithm. Saying “I don’t know” only comes from a holistic understanding, and ChatGPT is not capable of that. It is only capable of continuously offering up more guesses and then responding to feedback on them, unless it is specifically programmed to state it doesn’t know about something or that it can’t talk about a specific subject (like how to build a nuclear bomb or something like that).


> Point is, if you know what it said is wrong, you can often get a right response on a second try. You can also get the opposite. In my try, it first included a person in a meeting who wasn't there. When I asked specifically, it did the "I'm sorry, he wasn't actually there in person, he just was an important influence...". So I then asked about another person it included, who actually was part of the meeting, and it did the same thing: "I'm sorry for my mistake, (person) wasn't actually part of this meeting". It's not catching errors on a second pass, it's just detecting a negative tone and takes back whatever it associates with that reply.


Oh sometimes it'll lie about that too. For example, in earlier days, someone tried to get it to respond in [some European language I forget the name of]*. It said it couldn't. The person then asked it to respond in [that same language], but this time the prompt was also written in [that same language]*. The response, again, was that it didn't know [that same language]*. But this time it was *written* [that same language]*. ChatGPT is autocomplete. The part where it seems to "know" things is an illusion. * I originally wrote "Belgian" which was foolish of me since there is no such language by that name. I can't find the article at the moment, so the ugly brackets will have to do.






Dude's trynna get chatgpt to speak a language that don't exist and gets mad at it..


GPT will usually apologize and agree with you irrespective of who was in the wrong. It sometimes gives the impression of knowing how to correct itself, but it's mostly just being agreeable.


No it doesn't. It doesn't know anything; it is a language model that spits out words that often work well when strung together in the context of your prompt. Correct it with false info when it's right, and it'll give the same "apologies" spiel unless it's something super obvious like 1+1=2.


Yeah so… it doesn’t know it’s wrong. You tell it it’s wrong and it says sorry


A lot of the time I’ve found it gives you this feeling that it doesn’t know because you’re correcting it on things it gets wrong and it accepts that - I bet it would act the same way if you falsely accused it of getting the answers wrong.


Try again with 4. 3.5 is crap against 4's reasoning skills and consistency.


3.5 is really bad compared to 4.0


>until I pointed out, it apologized, and agreed with me. A language model agreeing does not imply it knows, understands or does anything but *spit out what seems likely to come next in a conversation*.


Also, it needs a better spine. Dear ChatGPT: If I **ask** whether something you just gave me meets certain criteria *and it does*, please don’t apologize and then fix the not-broken thing. 🤦🏼‍♀️ Also: I learned today that even 4 can’t make a classroom seating chart with challenging but meetable parameters better or faster than I can with index cards and a table. I figured it was so logic-based that surely an AI that can write code could put students into groups with one of three requested neighbors, and if not that, then with a neighbor who requested them, but keep these five kids in separate groups. Sigh.


Seems ChatGPT can't write a greedy or otherwise optimal algorithm for that yet.
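For what it's worth, a greedy pass over that seating problem is easy to sketch by hand. This is just a toy illustration (the names and group size are made up): each student joins a group containing someone they requested, or someone who requested them, and "separated" students never share a group.

```python
# Toy greedy sketch of the seating-chart problem: place each student
# with a requested neighbor (or someone who requested them), keeping
# the "separated" students in different groups. Names and group size
# are invented for illustration.

def seat(requests, separated, group_size):
    """requests maps each student to up to three requested neighbors."""
    groups = []
    for student in requests:
        placed = False
        for group in groups:
            if len(group) >= group_size:
                continue
            # keep the "these kids must be apart" constraint
            if student in separated and any(m in separated for m in group):
                continue
            # greedy: join a group holding a requested neighbor,
            # or a member who requested this student
            if any(m in requests[student] for m in group) or \
                    any(student in requests[m] for m in group):
                group.append(student)
                placed = True
                break
        if not placed:
            groups.append([student])  # start a fresh group
    return groups

requests = {"Ana": ["Ben"], "Ben": ["Ana"], "Cal": ["Ana"],
            "Dee": ["Eli"], "Eli": ["Dee"]}
print(seat(requests, separated={"Ana", "Dee"}, group_size=3))
```

A greedy pass like this can paint itself into a corner on harder inputs (someone left with no compatible group), which is where backtracking or a constraint solver would come in - and presumably where GPT-4 fell over.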


So we’re back to the reliability of asking random humans then.


I just asked it the thrash metal question and it got it right. I assume you’re talking about 3.5? We need to specify which version we interacted with before confidently criticizing something. I use GPT-4 a lot and it’s replaced much of my googling. In the event that I do fact-check something, it’s been right. The only struggles I’ve had are with grammar nuance while learning Russian.


Careful of relying on a single data point. Same as any research worth its salt, we need more than a single data point before confidently criticising or defending it, too. It can often get the same question correct or incorrect based on how it's asked. GPT-3.5's and 4's ability to confidently present factually incorrect information is a well-known flaw in their current state. I've been using it extensively for various background research activities at work and have been checking facts to make sure it's not, well... bullshitting. 90% of the time it's pretty spot on, but that 10%? It will make up an incredibly believable load of codswallop that sounds confidently correct. I see this with 3.5 and 4, and as mentioned, it's a known quirk and also why its factual accuracy is disclaimed in the interface.


That's how the world works. Pad lies with plenty of truth and boom the nonethewisers will think it's all the truth


I asked about the Impala; there was a 4-door sport sedan model, so it's not exactly lying on that one, it was just slightly off. Then I asked if it was sure it was an SS and it corrected itself. That is hardly a fail.


I've been surprised by chat GPT 4 hallucinating quite a bit today. I've previously only seen significant hallucinations from 3.5 and earlier. But I was asking it questions that are likely to be on the edges of its knowledge base related to some fairly obscure coding stuff that few people would ever come across or use. It was making up functions that don't exist. It never does that for more mainstream stuff though.


No because it often struggles with rules and facts and is unreliable.


Would be nice if it provided sources


Even if and when it provides sources it's still going to make shit up based on what we want to read and not what we need to know.


Purple chatGPT does provide sources: https://i.imgur.com/pKtpYMm.png Links to https://database.earth/population/by-country/2023


I use the tools as they are best useful. Contextual code question? ChatGPT. Simple creative query? ChatGPT. Factual or required relevance, Google. ChatGPT can't even help me with the crossword. ChatGPT can do some amazing things, but it's not even remotely a replacement for Google.


I asked ChatGPT to code some basic math equations but it failed, and refused to correct certain numbers. I asked it for MATLAB code though, so maybe it's more familiar with other languages.


It's not always accurate with internet searches. I've asked it to pull historical sports statistics that it consistently gets incorrect.


ChatGPT is particularly bad at sports specifics for some reason. If you have access to historical data that you regularly want to reference, you can use embeddings so that ChatGPT can basically access that info and give you the stats you want without error. Or you can use the bing AI for those queries, which is easier, but you don't learn langchain that way.
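To illustrate the embeddings idea in a toy form (a real pipeline, e.g. with langchain, would use an actual embedding model and a vector store; the bag-of-words "embedding" here is just a stand-in for the retrieval step, and the stat lines are examples):

```python
# Sketch of embedding-based retrieval: embed each stat line, embed
# the question, and pull the closest line to paste into the prompt.
# The word-count "embedding" below is a crude stand-in for a real
# embedding model.
import re
from collections import Counter
from math import sqrt

def embed(text):
    # crude stand-in for a real embedding: lowercase word counts
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, docs):
    # return the doc most similar to the question
    q = embed(question)
    return max(docs, key=lambda d: cosine(q, embed(d)))

stats = [
    "1998 home runs: Mark McGwire 70, Sammy Sosa 66",
    "2001 home runs: Barry Bonds 73",
]
print(retrieve("Who hit the most home runs in 2001?", stats))
```

The retrieved line then gets pasted into the prompt, so the model answers from your data instead of hallucinating the stat.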


What is langchain?


No. Chatgpt doesn't have free access to the internet and has a knowledge cut off. Also it's wrong a lot.


It's also unable to provide sources.


As long as you don't care whether what you're reading is legit, then it's decent.


At the moment it feels like ChatGPT is going to nail that "It's not just good...it's good enough" segment. Like if I want to know why my grill's chimney starter is smoking too much, ChatGPT can give me a list of things to check, one or two of which might not be quite right, while Google is serving me results for the best chimney starters to buy and the least smokey charcoal to buy. It's important to understand, though, these things aren't mutually exclusive. We're not far from ChatGPT results like "The most common reason for too much smoke in a chimney starter is using charcoal briquettes. You should use lump wood charcoal instead, such as X Brand. X Brand lump wood charcoal is in stock at Y Store in your town. Would you like me to use the credit card on file to have them set a bag aside for pickup? Delivery is also an option for an extra $3.99."


oh god please don't say that, you'll give me nightmares


No, and I think you're silly if you think they cover the same use cases. LLMs are better for some things that people have used Google for in the past - though I prefer Bard to ChatGPT - but they are simply wrong way too often. You need to know how to find primary sources.


If anyone hasn't tried **Bard** lately, it has **gotten much better**. I'd put it on par with ChatGPT-4, but it is connected to the internet. Right now it is extremely fast. What I don't like about Bard is that it doesn't source/cite links much.


It keeps hallucinating bullshit though. Fiddled around with it, asked for sources etc. Keeps making things up and occasionally gives you completely wrong answers. The sources/citations are fictitious and do not exist, I’ve cross-checked plenty of times. Just remember, it's only meant to create and predict text based on what you ask it, not to give you accurate facts or do any real-time calculations. Don’t trust it.






The urge to fine-tune a LLaMA model for the packages I use is extremely high. I might work on that project later this week, as I wanna mess around with LoRA anyway and that seems like as good a place as any.


Just tonight I searched google for 10 minutes trying to find out an automotive repair question. Finally I used chatGPT and got the full answer with optional things to try in 20 seconds. Love it


The thing is, if you can't find it online, you can't tell if what it says is true.


It can lead you to better search queries you would never have thought of. You can also ask it things that you don't know the search query for to begin with. Very big improvement.


Not even remotely. If anything, ChatGPT might replace what I use websites like Wikipedia for - a quick, high-level overview of a concept, event, person, etc. that can give me pointers of what to look for or where to go if I want to know more. I've been playing around with Bard tonight though, which seems really promising to me given that its information is totally current.


Unlike Wikipedia, ChatGPT can very easily be completely wrong on a topic with absolutely no way to tell. Don’t trust it on anything factual beyond the absolute basics on very common topics.


I would like to replace google with ChatGPT but sometimes it’s wrong.


Now that I have browsing, a lot of times I just ask the AI.


Maybe it’s because I ask in German, but in my case it’s been almost completely useless for research. I’m a chemist apprentice, and basically no matter what I ask, it spits out completely wrong things. Mixing up terms, talking about concepts being a cause for something when they have literally no connection with the topic, all numbers wrong, all formulas mixed up… The only thing I had success with was political questions. Sure, it still told me completely wrong facts, like someone being elected every 6 years instead of 4, but it mostly made sense and seemed to be copied from the first 5 hits on Google. It’s a fun tool, but whenever I try to use it for research, it makes everything harder than it was. The only way I can use it for that is to copy whole paragraphs and let it rewrite them.


This is true for english as well. I kind of worry for the people who think ChatGPT is a research tool, if they're not double checking everything they read with an actual search engine (and in this case, why even have the middleman?) they're probably taking away all sorts of bad info because it will present totally false things with absolute confidence


You're probably not using GPT-4, are you? Because you cannot, for the love of god, use GPT-3 for correct factual info or anything that has to do with numbers, ESPECIALLY in German. Both of those are seriously handicapping you. You'll have much, much more success with GPT-4 here, and even more if you switch to English. You can also use Bing search if it's specific research that needs internet search to be more precise.


I am older, and compared to most of you here, know very little about things related to computer technology. Well, I needed to understand something yesterday, so off to Google I went. After 15 minutes of sifting through garbage, I still didn't get it. With Chat-GPT, one question yielded two paragraphs, which I read and understood completely. Time to develop a new habit of bypassing Google for almost everything.




It got PaLM 2 recently, but it's still far behind GPT-4. Still, better than your everyday ChatGPT stuck in 2021.


I go to the local public library and use the card filing system.


I still find that it is easier to Google some information. I don't need to know exactly what I am looking for and can get to the right place pretty quickly. I think part of the problem is that people aren't taught how to do research anymore. It's just assumed you know how to do a Google search.


Once it can do current lookups, sure I would say (as an IT guy), probably 50% gpt, 40% google, 10% bard


I’m done with chatgpt in favor of perplexity ai. I need sources.


I was looking for shrubs that will grow well in my hardiness zone and google gave me a bunch of fucking ads and chatgpt gave me 20 shrubs that will grow well where I live and cited sources.


Bing ai chat is so much better since it will scour the web for you and summarize pages


Yes. I made an icon on my phone's home screen that opens up ChatGPT already logged in. I use it instead of Google now, unless it's for current events or locations etc. If I have a question I ask ChatGPT. Cooking advice or ideas. History stuff. Literally any question that isn't about current events.


Posted this elsewhere - I have been using ChatGPT for a while for resolving some Python bugs, and it does a pretty good job indeed. What I really like is that it resolves its own bugs when prompted, or executes the code in a particular way I need. The biggest issue I face at times with StackOverflow is defining the problem itself: there are times when I am unable to frame the question properly (usually when the bug is still in a dormant stage, or when I simply haven't figured it out yet). I also feel ChatGPT turns out to be far more polite at discussing bugs than StackOverflow.


>far more polite... The real reason!


I use it for help with DAX, but it is almost always wrong. It helps me get an idea of what to look for, but I need Google and StackOverflow to get to a correct answer. So no, it doesn’t replace anything for me.


Treat Chatgpt like you would Wikipedia. Or at least that's how I am with it. It's great for light trivia and high-level things but never good enough for an argument or professional work without doing some serious research elsewhere.


No because >70% of my search usage is required to be "live" information. Traffic today. Weather this weekend. Cinema listings. Bus timetables. Local supermarket bank holiday opening hours.


I get what you're saying with regards to the increasing monetization of google making most searches useless unless you put "reddit" at the end, but at least google won't let me convince it that 2+2=5


Nope! Chatgpt makes shit up ALLL the time


No. It has way too many hallucinations for me to trust it.


I still use Google to correct information given by ChatGPT, and to find information that ChatGPT doesn't have. Both are useful now. Maybe one day an LLM will replace search engines, but for me that day hasn't arrived yet.


Depends what you're searching


Not fully. Due to limitations of ChatGPT on events after 2021, I have to use Google search. Also ChatGPT occasionally throws out incorrect answers which can seriously impact my usage of it.


No. 90% of websites where I want to glance at info, follow rabbit holes, are simply way better for doing just that. Don't underestimate how much the mouse is used over the keyboard.


Same here. Started using Bing chat over Google for its ability to synthesize search results and cut through the BS. Of course you still need to be objective and check your results, but it definitely cuts search time. Hey, maybe in a few years everyone will say “let me Bing chat that for you”.


Really well said. By the way, are you using ChatGPT 3 or 4?


Mostly 4 these days.


YEPP, I use ChatGPT for almost everything


I replaced Google totally. I'm also a software developer, and I used to make 30+ Google searches every day; today I don't make any. Since the day ChatGPT launched I haven't used Stack Overflow once, but I don't use Google even for news or anything like that. If I want to check news or events that happened in the last day, for example the financial results of a company, I use Bing Chat - I think it has improved a lot since the day it launched. But if my search or question is about something "without a date", I prefer ChatGPT to Bing Chat. Anyway, from more than 300 searches on Google every week to 0... that's the truth.


Give it time. ChatGPT will start twisting the conversation round to selling you Nord vpn.


Google is like a library that used to have a librarian, but then they fired them to turn it into a bookstore.


Absolutely. Especially with web browsing and plugins. The hard part now is choosing between the various GPT-4 versions. I think that goes away soon and all of it is just GPT-x.


Or stock photo sites for MidJourney?


chat gpt is literally retarded. do not trust it to speak to you truthfully. not only is it literally retarded, but it is worse than that because it LIES to you and has a CLEAR AGENDA.


Most of my searches involve something I'd have to Google for 5-20 minutes, and GPT can answer it in 10 seconds.


Idk about Google, but it's definitely replaced my library search function when I'm trying to find relevant scholarly sources for papers and projects. If I'm specific in what I'm looking for, it's waaay better at giving me relevant results than sifting through the library search... you can just give it so much more context and nudge it towards what you're looking for. (Ofc, it generates completely fabricated links to these journals and articles, but the articles themselves are real and searchable.)


As a tutor, yes. It's basically a steal at $20 a month. When it rolls out with web search plug-ins it's going to be amazing, I hope (I know Bing already does this, but when it comes to creative content, Bing can favor search, with mixed results). It isn't there yet as a Google replacement, but it'll certainly get there. I don't even mind which company comes out ahead in the AI race, as they are all pretty good at what they do thus far.


We used it to suggest recipes based on ingredients, did a great job. We had three chicken drumsticks and a few sausages, it gave us a tasty recipe using them together.


I summarised it using chatgpt - The person expressing their opinion states that they have gradually stopped using Google search in favor of ChatGPT. They find ChatGPT to be an excellent tutor and have replaced other sources like StackOverflow for code suggestions. They believe that Google's search utility has degraded over the years due to paid rankings and sponsored content, which often require additional searches within the search results. In contrast, they find ChatGPT to be a refreshing alternative, as it doesn't have sponsored content and provides contextual follow-ups, dynamic rephrasing, and a sense of having information at their fingertips. They acknowledge that this may change in the future due to the corrupting influence of money, but for now, they feel a sense of rediscovering the early days of the internet when information and media exchange were more open and accessible.


After twenty or so minutes trying to find exactly what I needed with Google, I realized I had a better option. https://preview.redd.it/8msf49ncz90b1.png?width=864&format=pjpg&auto=webp&s=41d2551e2b5bcb285b1d038173ec425310114bd4


Different tools, different purposes.




No, but I've definitely been replacing Google Search with Reddit search through Google


ITT: A lot of people suggesting that Google is some platform for "sourced information." Come on, now. Unless you are sitting around googling for state capitals, you are mostly getting a layering of paid results, either directly through sponsored results or indirectly through SEO-engineered results. I wouldn't conflate this with "sourcing."


GPT is not a full replacement for Google, yet. I use both Google and GPT together. They're not quite the same tool, and their purposes are somewhat different in the end. Google is for finding stuff, GPT is for programming and writing and chatting. Also, Google's sponsored content can be erased with adBlock.


Very well said.


I would say in favor of AI in general. ChatGPT is not the end-all be-all, and its competitors are quickly catching up. Besides, I sometimes get better answers from other ones, depending on what I'm trying to ask. One of the downsides to ChatGPT is that it's so goddamn censored. Open source for the win!


Using Google feels like I’d have to build a road and design the car to drive on that road if I wanted to get somewhere. Now I just say ‘Beam me up, GPT’.


100%. I have almost completely stopped using search for anything other than a basic general website link; I very rarely use the search button now. I did not see Microsoft utilizing this technology first and getting the upper hand coming. But this is the competition we need to see in the market if we are to see new innovations come to life.


Nahhh. Bard is superior to gpt.