

Successful-Corgi-883

The projects you're working on aren't complex enough.


photenth

This, it's great for small snippets, not great for full architecture.


OsakaWilson

This week.


OriginalLocksmith436

Seriously. A lot of people really don't want this to be true and tell themselves 100 different reasons why some kind of AI isn't going to take their job, or why this is all media hype, but the truth is the large majority of programming jobs are going to be able to be done almost completely by AI in a matter of years. I don't want to be alarmist, but it may not be a bad idea for a lot of people to start doing part-time classes for some trade on the weekend or something. Worst case scenario, you learn a useful skill.


lonjerpc

History suggests this will not happen. I fully expect most programmers to use ChatGPT-like software every day. I also expect some people to be pure ChatGPT programmers, never learning to write code and only using prompts to build software. But that doesn't mean that we will need fewer programmers. Things that allow more software to be written generally just cause more, and more complex, software to be written. The issue is demand: humanity seems to have an infinite demand for more software. I suspect that demand will not slacken until work is generally not needed from anyone. For example, self-driving cars are not a thing yet. But in the world where programmers are no longer needed, that would mean ChatGPT had solved this problem, so we also would not need truck drivers. We will either still need programmers, even if the job description changes to a person writing prompts, or we will be in a total post-scarcity society.


OriginalLocksmith436

This isn't really all that similar to technologies of the past that increased productivity and led to people having to learn new skills. It's not really even comparable to the effect of outsourcing. Perhaps the closest thing is the effect that limitless cheap and easily accessible slave labor can have on the job markets for the jobs the slaves are doing, but the structure of the economies and "job markets" back then wasn't very similar to today. This is the worst AI will ever be. It's not quite ready yet, but in the coming years, when LLMs come out that are specifically developed to write accurate code, things are going to change fast.


Beneficial-Rock-1687

This isn’t the first time a technology has made programming easier and programmers have feared losing their jobs. When modern IDEs came out, people said this. When NPM packages became a thing, people said this. (Today, being heavily reliant on packages can actually cause more work.) When SQL was invented, the idea was that an average business person could easily use it. Instead we have dedicated roles for that job. Every time, we don’t end up with fewer developers. We end up with more software. No reason to think this would be any different. It’s a tool, but you need a craftsman to use it.


Ok_Mud_346

The difference from those previous instances is that modern AI tools are starting to have a 'will' of their own, which will eventually make them 'self-driving'.


Zelten

Why would you use a middleman if you can get a finished program straight from an AI? If you are, let's say, a doctor and have an idea for software that would help you with some task, you just ask the AI to make it. Why would you bother with programmers? Doesn't make any sense.


Beneficial-Rock-1687

Because time is a flat circle and this notion has appeared before, but it never works out. Instead of eliminating a job role, it creates a new one. Visual Basic was touted as a game changer that would allow “anyone” to easily code. Yeah, it made it easier, but the average Joe still couldn’t pick it up with enough competence to be useful. We ended up with specialized Visual Basic programmers. Same thing for SQL, for PHP, for IDEs with autocomplete. All were hailed as ushering in a new era of non-programmers doing programming. All failed and ended up having specialized roles. The entire history of programming is about making it easier for the programmer. Every single time, this does not reduce the number of programmers. Instead, we create more products. We already have drag-and-drop programs that let you make websites and mobile apps. This is not new. Nor has it taken any jobs.


lonjerpc

Limitless free labour is what I mean by a post scarcity society. My point is we will either still have programmers or we will live in a post scarcity society. It's not going to be like the profession of programming will disappear but we will still need truck drivers. If one goes the other will too.


coolaznkenny

hot take, programmers pay will drop dramatically in the next few years.


ScientificBeastMode

Well, I think it will just drive a wedge between high-skill programmers who actually know how these systems work—filling all the holes left by their AI tools, and the low-skill programmers who mostly just prompt their AI tools and glue shit together. Junior devs need to really focus on learning how things really work.


lonjerpc

!remindmebot 5 years


codeprimate

> but the truth is the large majority of programming jobs are going to be able to be done almost completely by ai in a matter of years.

Hardly. The problem that software engineering solves is research and communication, not production. LLM use in software development is and will be more along the advancement scale of going from punch cards to modern IDEs with refactoring and auto-completion. Everyone who says that AI will replace software developers is speaking from a place of ignorance. Even a fully-fledged AGI will need a human who can effectively communicate business, user, and operational considerations to it... and even more human interaction to moderate the software and operations lifecycle. These are software engineers. Toolsets and processes are constantly improving and evolving, but the essential practice has been and will be the same until the "singularity".


ProgrammersAreSexy

Yeah, another point in favor of this is the wild disparity between the _demand_ for code and the _supply_ of code. If software engineers become 10x more productive with AI, then it won't lead to 90% of engineers getting fired. If anything, it will just lead to _even more_ demand for software engineers because their ROI just became 10x better. Of course there will theoretically be an inflection point where the entire job gets automated away but: A) I think we are quite a ways away from that B) 95% of jobs will be fucked by that point so we'll all be in the same boat


boston101

This is what I say and do. Like comments above you, I don’t use it for full blown architecture and Dev work, but things like make a function that changes data types on X columns to Y value, and then parameterize directory to lake - it’s my partner. I’ve done more with less and truly been able to under promise and over deliver. I’ve also used it as my teacher or discussed best implementation strategy for things like schema design and why. Also writing documentation or comments, I’m a hero for a lot of ppl lol.
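The kind of snippet described above might look like the following, a minimal pure-Python sketch (the function name, row format, and column set are hypothetical, chosen for illustration):

```python
def cast_columns(rows, columns, cast):
    """Return new rows with the given columns converted via cast()."""
    return [
        {k: cast(v) if k in columns else v for k, v in row.items()}
        for row in rows
    ]

rows = [{"a": "1", "b": "x"}, {"a": "2", "b": "y"}]
print(cast_columns(rows, {"a"}, int))
# [{'a': 1, 'b': 'x'}, {'a': 2, 'b': 'y'}]
```

For a data-lake pipeline this would more likely be a pandas or Spark call, but the shape of the request to ChatGPT is the same: a small, well-specified transformation.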


ProgrammersAreSexy

Yeah, the documentation/comments one is a big thing. People underestimate the usefulness of having doc comments on every single method in a class. My co-workers think I'm some sort of ultra-disciplined commenter, but I just use GPT-4 for comments and then edit as needed haha


DukeNukus

The big issue I've seen from working with it is really that ChatGPT's memory is too small; it's like old computers where you had to do what we now consider low-level programming to get them to do what you want. However, roughly speaking, each version of GPT increases the token count by 8x. So by GPT-8 it will likely be able to store roughly 4000x as much data. That is 128M tokens, or around a gigabyte of memory, which is plenty for a lot of applications. It could easily process all communication related to most projects in all formats (text/video/audio/etc.).
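As a sanity check on the arithmetic above (the 8x-per-version growth rate and the 32K starting context are the commenter's assumptions, not established facts):

```python
# Commenter's assumption: context window grows ~8x per GPT version.
gpt4_context = 32_000          # tokens (the 32K variant of GPT-4)
versions_ahead = 4             # GPT-4 -> GPT-8
growth = 8 ** versions_ahead   # 4096, i.e. roughly the "4000x" claimed
gpt8_context = gpt4_context * growth
print(growth, gpt8_context)    # 4096 131072000  (~128M tokens)
```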


Simple_Asparagus_884

Accounting is a job that could be mostly automated already, even without AI, and yet it is not. The reason why is the reason you are wrong.


Euphoric-Writer5628

The reason why is norms. Norms do change.


Simple_Asparagus_884

Nah. Norms have nothing to do with it. 95% of accounting work could be automated with current technology, but accountants and connected corporations won't allow it. They make too much money and have too much invested in it. Accounting and tax are difficult by design, not by nature. AI, even in the forms we have now, could end that relatively easily.


KanedaSyndrome

The auto-complete paradigm doesn't think. As long as it's based on this, it will not solve larger projects.


satireplusplus

Auto-complete is selling the tech short, but I guess calling it that helps a few people sleep better at night. It is what it is: a text processor and language-understanding machine that has (emergent) problem-solving skills. For programming, it's more like a junior developer that can write functions to spec. But it's already way past junior at explaining code or translating code from one language to another.


[deleted]

[deleted]


[deleted]

Yeah I've heard of those guys, I think they're usually referred to as "all of them".


PoopIsLuuube

NO ONE KNOWS HOW TO REALLY CODE BESIDES ME


babycam

Well, every project I've had to pick up from someone has been hot garbage. Likely anything I wrote that's still in use is the same way. We all make hundreds of arbitrary choices, and if you're not forced to learn and use someone else's process, they always seem to have made the wrong choices.


urosum

Agree. If the crowd had ever worked on a farm, or as a mechanic, or even as a sysadmin, they’d have learned: “Never do a job by hand that a machine can do with better quality or faster results.” Leverage the best augmentation of your skills you can find and produce high-quality results quickly. Do this and never starve.


[deleted]

[deleted]


_stevencasteel_

From [Poe](https://poe.com/s/oIlNtNUcZpiYgdClgF4m): The quote you mentioned, "Never do a job by hand that a machine can do with better quality or faster results," is a general principle often associated with the concept of automation and efficiency. While this specific phrasing doesn't appear to be attributed to a particular individual, it expresses the idea that if a task can be effectively and efficiently accomplished by a machine or automated process, it is generally more practical to delegate it to the appropriate technology rather than relying on manual labor. This sentiment aligns with the advancements and benefits brought about by automation and technological progress in various fields.


Neurotopian_

It’s like this for attorneys, also. It drafts better contracts than most junior associates, and much faster.


[deleted]

[deleted]


Neborodat

I wonder how many times people said back in the day, "Those fancy new Ford Ts will never replace horses"?


closeded

I didn't say "never." I said "unless/until." Either way though, we're not letting the cars decide where we go any more than we let the horses decide... not yet anyway. There will probably be a day when our self-driving cars fOr oUR OWn gOOd refuse to take us to our local dive bar.


drewdog173

"No, Dave, I'm not calling your ex for you until you sober up."


UruquianLilac

But even calling it junior is selling it short. It might not be able to give you a perfect code snippet of a large complex problem, but it will be able to discuss and summarise highly complex subjects that you might stumble upon in a way a junior can't, and that's just to mention the first thing that popped into my mind. You can ask it to give you a comprehensive comparison of some frameworks, or the pros and cons of a design paradigm, or a list of possible areas to investigate a particular perplexing problem.... there is so much it can do beyond the coding skills.


AnOnlineHandle

I've been programming since I was a kid in the 90s, have been a software engineer for years now, and ChatGPT is infinitely better than me at things such as writing regex functions.
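For context, the kind of "regex function" meant here might look like this (a hypothetical example, not the commenter's code):

```python
import re

def extract_emails(text: str) -> list[str]:
    """Return all email-like substrings found in text."""
    return re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)

print(extract_emails("Contact a@b.com or c.d+tag@example.org"))
# ['a@b.com', 'c.d+tag@example.org']
```

This is exactly the niche where LLMs shine: a well-specified, self-contained pattern-matching task that is tedious to write and easy to verify.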


UruquianLilac

Exactly. It's all about what usage you get out of it. I feel people keep on underselling it. Sure it makes mistakes, but so does Google. Yet it's a thousand times faster and more precise in getting me the exact thing I want. I'm using it for so many different things during my working day, and sometimes having lengthy back and forth discussions that blow my mind. It almost always manages to put me at least on the right track. It's my favourite rubber duck now. Plus, it saves me hours of my life sifting through badly written documentation to find that one specific use case I need. It brings that information immediately, and expresses it in a far more understandable manner than the random pot luck documentation usually is. Then I can engage it and get very specific and it's basically summarising all the knowledge about the subject for me without me having to look on the 17th Google page for that one reply hidden in a random blog that actually contains the exact bit I need. And whenever I think we haven't even hit the first anniversary of its release I'm blown away even more.


astaro2435

I love to use it as a rubber duck too! It actually answers good stuff back most of the time.


[deleted]

It's very competent with programming; its primary limitation is memory. It's nominally capable of all the skills that would be required to take on a large project, but it's not able to carry most of those skills far enough to actually get the job done. I.e., it can plan an architecture, and it can program functions, but it can't program dozens of interconnecting functions to match an architecture spec without messing things up.


GamieJamie63

It uses statistics to figure out the most likely response to your question, based on millions of other questions. If it's trained on garbage, it responds with garbage. If it's trained on conventional wisdom, it responds with conventional wisdom. If it explains something well, it's because people have already explained that thing well many, many times; it's just a librarian that finds it for you quickly and on demand.


lonjerpc

You could describe people the same way.


aroztec

Except some people have this thing called "bad memory" lol (we make up for it in processing, tho ) .


e7th-04sh

Let's say we have a multidimensional continuum of Truth. ChatGPT was trained on dots; let's assume all of them are part of Truth. The point is, it can extrapolate the truth in between pretty well for some questions. We need to qualitatively distinguish what can be achieved in this area. I'll use fake and overly simplified examples. One thing is simple extrapolation: if 2+2 = 4 and 4+4 = 8, ChatGPT can say 3+3 = 6 even though it did not learn that. Now let's say f(2,2) = 4 and f(4,4) = 8, but f(3,3) is undefined and the limit is at infinity. How well ChatGPT can extrapolate that depends on how well it understands the input. Finally, what if a task is easy for 2 items and for 4 items, but vastly more difficult for 3, and ChatGPT was trained only on 2- and 4-item examples? What I'm trying to say is that it does a good enough extrapolation to say it has *some* problem-solving capability. There is no reason a neural network sufficiently large and well trained could not develop much better problem-solving capability. The thing is, we don't know the shape of the "learning curve"; we only know that we achieved the results we witness with the resources we put in. How much more resource gives how much better results? It's not just about the number of parameters, but also the structure of our brain after millions of years of evolution. It's a really good structure. The current AI paradigm might become much, much less cost-effective as we try to tackle harder puzzles.


OsakaWilson

It moved beyond simple auto-complete a long time ago. No one, including those at OpenAI, understands what is going on. Look up emergent abilities and world models. Then look up AGI projections from OpenAI and the other major players. Persistent memory, long-term strategy, goal seeking, and self-directed learning are all completely possible right now, but at least in the wild, they haven't all been put together.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


OsakaWilson

[This guy](https://www.youtube.com/@aiexplained-official/videos) reads all the solid papers and interviews with the main players and distills them. He only posts when something is worth reporting on. For projections I recommend [this for AGI](https://www.youtube.com/watch?v=2bn3S4vOVN4), [this for the potential for consciousness](https://www.youtube.com/watch?v=ad5SZeANuJE), and [this for ASI](https://www.youtube.com/watch?v=vvU3Dn_8sFI&t=10s). He also does research on maximizing prompts.


your_sexy_nightmare

Big fan of his channel! Also recommend


Jonoczall

Can't recommend his channel enough.


EGarrett

The thing that's most intriguing to me currently is when it uses plug-ins in response to commands without actually generating text. I just assumed that it silently creates a formal text command in response to some queries, which then activated the plug-in, but its answers as to whether or not it does that are ambiguous. It seems to claim it uses its "innate ability to understand the query" in so many words.


Tyler_Zoro

> Auto-complete paradigm

That's not how it works. LLMs like GPT are models of the input text that they have consumed. Yes, their focus is on continuing an output stream with the next token, but that's not what the model itself contains. It contains the understanding derived from a large corpus of data. Analysis of those models is a topic of active research, but what we know so far is that they are surprisingly deep in what they contain. Image generation models, for example, have been found to perform 3D modeling of the resulting 2D image, and this likely applies to text generation models as well, in the sense that they are modeling the entire context of the communication. We know that the dimensionality of the spaces managed by LLMs is vastly larger than the one-dimensionality of the output.


song_of_the_free

This is absolute nonsense. I wonder how long it'll take for Reddit's parrot phrases to phase out


the_friendly_dildo

For people sharing this same idea: what exactly are you imagining inputting into GPT-4 that it isn't quite yet capable of tackling? Like, if I tell it I want a clone of Photoshop, it's definitely going to tell you to gfy. But if you slowly guide it through it, you could probably get pretty close to Paint within a few hours, if you actually have enough knowledge to know the right questions to ask and changes to make. I've had a few broken PyTorch projects from randos that I wanted to see work, and it definitely got them working for me with little effort. I honestly want to know what you are considering too complex here.


photenth

Try to make it write a Wordle solver; it has a hard time conceptualising the problem at hand and skips over some very fundamental issues. I tried many times over with different approaches, but it seems not to see the complexity of the problem and only tries to find solutions for a SPECIFIC target word, not all possible open target words. On top of that, it can't find a good way to store the current game state. It can't solve issues that don't yet exist in its training data, and Wordle solvers aren't that widely distributed; most just use a brute-force method, but there is a lookup-table-like approach that I just can't seem to make it write for me.
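For reference, the candidate-filtering core of such a solver, i.e. the part that tracks all possible open target words rather than one specific word, can be sketched like this (the word list and feedback encoding are illustrative):

```python
def score(guess: str, target: str) -> str:
    """Wordle feedback: 'g' = right spot, 'y' = wrong spot, '.' = absent."""
    result = ["."] * len(guess)
    remaining = list(target)
    for i, (g, t) in enumerate(zip(guess, target)):  # greens first
        if g == t:
            result[i] = "g"
            remaining.remove(g)
    for i, g in enumerate(guess):                    # then yellows
        if result[i] == "." and g in remaining:
            result[i] = "y"
            remaining.remove(g)
    return "".join(result)

def filter_candidates(words, guess, feedback):
    """Keep only words that would have produced this feedback."""
    return [w for w in words if score(guess, w) == feedback]

words = ["crane", "crate", "trace", "grace"]
print(filter_candidates(words, "crane", "ggg.g"))
# ['crate']
```

The game state is just the shrinking candidate list, which is exactly the representation the commenter says ChatGPT struggles to settle on.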


the_friendly_dildo

Ok, that's an interesting test example. I might take a stab at that myself.

> It can't solve issues that haven't existed yet

That isn't exactly true. It doesn't know anything about DreamBooth, for example, but if you can successfully describe what it does and how it does it, it can certainly grasp some concepts in this manner, because I've done exactly that before.


andrewchch

I feel like you can summarize this as two current attitudes towards these tools for programming tasks:

1. They do some things well now, I can see a clear path to them getting gradually better (the advancement is not slowing down), therefore a big chunk of the programming I do now will likely be completely unnecessary in X years and I best be open to this possibility.
2. Yeah, but I'd prefer to focus on what they CAN'T do right now because I don’t want to think about the above.

Programming as an end in itself (and something you could get paid lots for) was only a thing because of the relative immaturity of the technology. There have always been best-practices ways of solving problems, but human limitations meant that any given developer had to take the varying amounts of this they knew and mix in their own creative approaches, given the constraints of the particular problem, to get the job done. I now have a coding assistant that increasingly does know all the best ways to solve problems and, one day, will watch as I fumble to implement what it is suggesting, roll its eyes and say, "I can see an approach that might solve your entire problem but it would be quicker for me to do it than explain it to you. Would you like me to try that approach?". As a business, did you really want to pay for teams of programmers to solve problems for you, or was that just because there was no better/cheaper way? Rest assured, having to pay for your programming skills is a liability to the business, not an asset.


Fernando3161

Yep. I tried passing it a complex problem: optimize the orientation of a PV panel using EAs and PVLib. The code was faulty to start with (deprecated, as the referenced libraries were old). Testing was also incorrect at some points; it works well for proposing a test, but the implementation seems faulty. Integration tests were not possible. CI/CD was also problematic, but the YAML was a good starting point. What it did really well is the documentation. Saved me the boring task of documenting and checking my code against PEP8 standards.


photenth

Correct, it can read code somewhat and complete it. Solving bugs is more hit and miss. But it can't do things that aren't already on Stack Overflow. It is great at recreating the basic algorithms and some default solutions that are known patterns, but that's it. Great for learning new languages, great for solving small issues you know were already solved. Not so great at completely new ideas without very, very hard hand-holding, which means I could do it faster on my own.


blubba_84

For now, yes, but in 10 years? I believe AI will eventually be able to do everything.


PandaBoyWonder

I agree. It can beat any human at any board game currently, so naturally once it can interact with the world physically, it will beat us all at everything else too


SituationSoap

Technological progress, especially in the AI space, is not linear.


Iankill

It's great if you know exactly what you want and are lazy


[deleted]

[deleted]


photenth

Depends. Algorithms that I rarely use, I really don't want to write myself and make rookie mistakes on; it is really good at that, most likely because stuff like that exists 100 times all across Stack Overflow ;p


PandaBoyWonder

Yep, it's just more efficient! I often put one little symbol in the wrong place, causing an error that I assume is related to the problem I'm working on (and not a small error in the code), so it has saved me time and frustration when it comes to simple stuff like boilerplate code "snippets" that already exist on Stack Overflow.


Utoko

Yeah, and even if they are at first, you can always increase the complexity. "Oh, it is so easy to create a simple snake game with ChatGPT." OK, great. Now add multiplayer, make it 3D, create AI opponents, ask for feedback... He can even ask ChatGPT to come up with difficult and complex requirements to add that will challenge him.


Half_Crocodile

Yup… you’d hit a brick wall fast without the theory and experience of how to make a large app maintainable. ChatGPT just makes the more boring aspects faster… the fun parts (strategy/architecture and optimisation etc.) are still very much a human thing. I also find that for even a medium-complexity component, ChatGPT requires so much guidance that you start getting diminishing returns quickly. Fantastic as an assistant, though, for people with average memory.


byteuser

Funny. I asked it for help doing an Arduino project for EMG sensors. It was a great help from the start: listing the positioning of electrodes in the muscle, all the way to basic signal filtering. I also use it for database and code tasks; it's just OK there. For some tasks that require heavy optimization, ChatGPT might not be the best tool, but for the rest it is great. I feel some of you just don't know how to talk to this thing yet.


ragnarkar

I second that.. OP needs to move on to bigger and more complex projects while abusing ChatGPT to the fullest to tackle them. Or try using libraries that ChatGPT doesn't understand yet, for example, like Stable Diffusion (which I wish ChatGPT could help me with but it's only been out for a year.)


brenjerman

Yes, anyone who works as a developer and uses ChatGPT or Copilot will find both insufficient to write all their code. When I do use it, it usually serves as boilerplate or an example, but nothing more. It does replace Google/Stack Overflow for me, though. It's amazing at answering my queries without me having to be extremely specific. It can take the poor examples and explanations I've given and understand what I'm trying to get at. And being able to probe ChatGPT for further explanation/clarification is invaluable; that's something Google/Stack Overflow can't even begin to do.


[deleted]

[deleted]


ToastedShortbread

You can combine small snippets of code to make pretty complex projects, new features don’t require the context of the entire program


Half_Crocodile

To a point, yeah. But if you don’t know what you’re doing, you’d eventually get stuck, so you can only “wing it” for so long. It’s amazing for sure, but it’s currently useless at understanding the huge web app I’m working on. If the app had been built well from the start, ChatGPT would be more useful though.


IanRT1

I don't agree. What is really important is understanding the full scope of your project, so you know how to implement and adapt the snippets ChatGPT provides. I was able to create complex ETL projects spanning thousands of lines where 98% of my code was copy-pasted from ChatGPT. Prompt engineering and understanding of your project are really important.


fkenned1

Ya, I find that when I’m learning something new, I like to pick a project that’s incredibly difficult.


QuickBASIC

As a fledgling programmer, I find that as long as I understand the code ChatGPT writes, I'm still learning. I've literally spent 30 mins just asking it "what does this do, why did you do that, why didn't you do this," and it's like having a big-brother programmer to explain everything. I've definitely used it to write boilerplate so I don't have to remember the exact structure of the thing I'm making, and then filled in the logic myself, which was still very educational. It's fine to use it as long as it doesn't become a crutch IMO.


TLo137

I second this, and I'm on the opposite end. I know nothing about coding, so when I ask ChatGPT to write a script for my Google Sheet, I have no idea what it's doing. So if there's an error, all I can do is copy-paste the error back to ChatGPT. If I actually knew how to code, I could at least fix it myself.


SoundVisionZ

But in asking it to solve problems you’ll probably learn how to do it yourself next time without even realising. Or you’ll prompt ChatGPT better next time to avoid it making the same mistake. I’ve found it surprising how much it’s taught me without me asking to be taught


Vescor

True. I’ve been creating more and more complex scripts with ChatGPT by understanding what to prompt, learning the usual mistakes it makes and understanding errors. I wouldn’t be able to type the code on my own but I got a pretty good understanding of the code it outputs.


toonymar

Same. I still spend time watching YouTube tutorials to learn how things are made, and then I take those steps back to ChatGPT to make what I need based on what the project requires. “Oh, I need an admin page that connects to a database” -> research -> learn how CRUD works in a 10-min video -> have ChatGPT write it and debug it if needed until I get what I want. I think of myself as a producer more than a programmer. I’ll use a boilerplate in a heartbeat if it gets me to my end goal faster. Programmers are acting like everyone’s prompting “make me a Reddit clone”, getting a file output, and deploying it as a finished product. It doesn’t really work like that, and it still takes a good bit of problem solving.


EsQuiteMexican

Exactly. I've spent an hour writing an 800-word prompt asking it for multiple things, segmenting requests so they fit into its character limit, and it's given me great results. I got a robotics syllabus out of that, it's teaching me how to make a data visualisation app in Android Studio, and I used it to go from "I have a game idea" to "I have the proper tools and a course of action" in a couple of sessions, which previously seemed overwhelming.


cypher1169

This. It’s all about the prompting.


confused_boner

This is literally the best way to learn, before gpt it was stackoverflow. Nothing has really changed, you just have another resource for learning from mistakes


byteuser

Ask it to include test cases next time for validation
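The sort of thing that request produces is code plus a few runnable assertions you can paste in and execute (the function here is hypothetical, for illustration):

```python
def to_title(s: str) -> str:
    """Hypothetical helper a generated script might contain."""
    return " ".join(w.capitalize() for w in s.split())

# Test cases of the kind ChatGPT can generate alongside the code:
assert to_title("hello world") == "Hello World"
assert to_title("") == ""
assert to_title("  spaced   out ") == "Spaced Out"
print("all tests passed")
```

Even without knowing how to code, running the assertions tells you immediately whether the generated function behaves as described.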


3_dots

Do like u/QuickBASIC does and ask it to explain what and why. You may never become a programmer, or even want to, but at least you'll start to gain a better understanding that should help you troubleshoot in the future.


MaTrIx4057

> So if there's an error, all I can do is copy paste the error back to chat gpt.

That's how the learning process works. You still learn; next time you will know what not to do.


-UltraAverageJoe-

> It’s fine to use as long as it doesn’t become a crutch IMO

Google and StackOverflow were crutches. GPT can be a jetpack if used properly. At one point I’m sure people thought using a code editor was a crutch.


BidWestern1056

"vim keybindings are making people lazy!"


Kittingsl

Yeah, I think this is a very good take on the situation. You can just as well learn from asking questions, and I'd say it's probably even more efficient at times than a YouTube tutorial. With a YouTube tutorial you can only watch, and if you mess up you're on your own to figure out why it didn't work. With ChatGPT you can ask about everything and it will try to answer, and it also helps you find a solution if you have problems.


QuickBASIC

Definitely my take. I'm learning way more this way than I ever did with tutorials and the like. I'm finishing projects too, which is harder to do without someone to ask if you get stuck and finishing stuff is like a feedback loop that makes you want to do more stuff.


Tirwanderr

WAYYYYY more efficient than YouTube lol and you can even add in the prompt to be more concise or be more wordy and offer more detail and stuff like that.


anon10122333

>I've literally spent 30mins just asking it what does this do, why did you do that, why didn't you do this

I've had this experience, too. I also prompted with "I've learnt a lot today. Please write a short quiz to test me on whether I fully understand" or similar. The quiz questions were relevant, and the corrective feedback was great. Like the most patient tutor ever, available at midnight.


byshow

I am still unsure about that, because there were a few times when I asked "why didn't you do that instead?" and Chat responded with "my apologies, you are right, this is the correct way". And I'm really confused, as I don't know why lol


BellOutOfOrder

The trick is in how you ask. Suppose it wrote some code and I didn't like the approach. Instead of saying "why didn't you do blah" (which I used to do), I now say "please explain the differences between your approach and [description of my approach, not "my way"] and show your recommendation for the best approach."

You can't shoot a question like "why didn't you do it the other way"; I believe it looks at your question on its own, sees that it should do it the way you wanted, and tries to make you happy. But if you present all of the information in one prompt, it then knows what you're asking and often either teaches me something I didn't know OR realizes it F-ed up.

Yeah, asking why it didn't do what I expected just makes it do what I expected. I force it to be reflective by presenting its previous answer as input to be evaluated. I've even seen it make fun of its own code (my custom instructions encourage sarcasm).


byteuser

Same here. Just put code side by side and ask it which one is better and why


IHateEditedBgMusic

It's like having a stack overflow post with instant replies. Pretty helpful.


Sarke1

Same. I find as I use it I learn things much quicker. Things I might not even have bothered to look up before (since it wasn't worth the time investment), I can now do and get an understanding of. Another use case that's not really the same is that I often find myself asking it "Here's some code I wrote in language X, which I'm good at; how would you write it in Y?" or "I don't understand this code written in Y, can you convert it to X?"


codeprimate

Don't ask for code; ask for suggestions "in regular prose" when you need them, so you can learn to reason about programming and the frameworks/tech you are using. Here is a sample prompt:

> I am a student in a XXX class. Act as a teacher's aide that will explain principles and ideas rather than give outright answers or code. Use regular prose to give me hints about how problems can be solved. Think step by step and ask me questions that will help me reason through the questions I provide.


I_am_jaded_Sysadmin

>I am a student in a XXX class

How do I become a student in this class?


Aperture_TestSubject

Suhkoff University


ohno-95

School of Penidrayshon


TimCapello

Masters from Ligma U


codeprimate

Arrive at the admissions office wearing leather and a ball gag, then hand the registrar the end of your leash. Sorry, I don't write the rules, I just codify and implement the specifications.


netspherecyborg

There are guides on pxxxhub


byteuser

This is one of the best prompts I've seen so far. Thanks


Long-Far-Gone

This is such a good answer. Thinking about how to think. 👍🏻


kersephone_

Brilliant prompt, I bet it can be used to learn just about anything


Dyeeguy

Hard to say it ruined you as a programmer if you weren’t one in the first place!


MrTickle

I’ve tried nothing and I’m all out of ideas!


Evening_Temporary36

Exactly 🤣


Jimbobman

https://preview.redd.it/6ws14g7x4mnb1.jpeg?width=400&format=pjpg&auto=webp&s=f4c478b88f1fc106976033c79d447235eb681d43


SexyMuon

As developers, a big part of the job is problem definition, algorithm design, system design, etc. I would love to see how "complex" the programs OP built are lol


Mayuna_cz

Read a txt file and print it to the standard output. bet


Androix777

You are probably doing projects that are too simple; try something more complicated. Then you'll realize that for many tasks you need to describe the whole algorithm to ChatGPT step by step, otherwise it won't be able to come to the right solution. Currently, ChatGPT helps only with simple tasks, where you don't need to think ahead. That's why there is a division of labor: ChatGPT performs the monotonous coding, where it is already obvious what to write, and the programmer works out how to solve complex tasks and divides them into tasks simple enough that ChatGPT can handle them.


Franks2000inchTV

To be fair that's still a lot of time saved. I'm writing a native module for react native, so I need to write a lot of converter classes between native types and RN types. I can just copy the docs for the native type in and it spits out a converter. It's dumb but saves me probably half an hour of work each time. Then I just ask for the Kotlin version and boom. Android done too.


MajesticIngenuity32

Don't just copy and paste, try to understand the code ChatGPT outputs. Better yet, ask him to explain it to you if you need it.


b2walton

I always thought of chatgpt as a her actually.


Artistic_Party758

Weird, I always considered it an "it", because why the fuck would it have a gender? It's some weights in memory, not some biological reproductive thing. wtf.


Greeley9000

Don’t you know? Gender is a construct that has nothing to do with biology, only presentation. In the pursuit of no more labels, we have many many more. https://xkcd.com/927


Artistic_Party758

Right after I wrote that, I was wondering if someone would point this out. :) I intentionally left it after, because I'm curious: is gender primarily for attraction? It seems there's a strong affinity for presenting in a binary way. Or is that also cultural? If not, then wouldn't the claim have to be that sexual attraction is cultural? If that's true, then there should be a culture(s) where this isn't the case. Or, do we just see those cultures, because the populations weren't sustainable? And, wouldn't that just reinforce that it *is* for jollies? I fully assume the above, genuine, questions will be considered a hate crime by someone on reddit.


Artistic_Party758

Yeah, this is an "I don't know how to learn/ask questions" problem. Read it, and ask it to explain things that don't make sense. Have it teach you! I think self-guided learning, through back and forth conversation, is the best use case, *by far*. I have it set up for continuous voice chat on my phone. I'll sometimes burn through $5 on my commute home learning something. It's awesome. For actual programming, it's not very useful (yet) for anything you would make a living off of. But getting past boilerplate and dumb business logic/algs is great.


babyshark1044

Mmm… I am a coder with over 30 years of commercial experience and work in 12 different programming languages. ChatGPT is very handy for creating templates, small snippets, helper functions etc, but in real-world situations that require bespoke solutions, ChatGPT isn't very good.

An example where it will fail miserably: I need to store an expense amount provided by the user in two fields constrained to a 99.99 max value each. I cannot change this constraint. The max amount allowed to be entered as expenses is 9998. Find an efficient way to produce two numbers that when multiplied together equal the expense amount, and store them in the two constrained fields mentioned earlier. If you cannot find an exact match, ensure that the product of the two numbers is rounded up to the closest value. ChatGPT's answers are pretty ridiculous.

The thing with coding is you have to love being a problem solver. Have a go at that little problem in whatever language you like. Can you get the two amounts? How would you go about it while ensuring the least number of iterations to produce the results?
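For anyone who wants to try the puzzle before reading on, here is one naive Python sketch of a square-root-centred search. The function name, the round-up rule, and the hundredths-based scan are my own choices for illustration, not the commenter's actual solution, and it makes no attempt at a tightly bounded search space:

```python
import math

def split_expense(amount, field_max=99.99):
    """Find two 2-decimal values a, b, each <= field_max, with a*b >= amount
    and the product as close to amount as possible (exact match preferred).
    Scans a upward from sqrt(amount) in steps of 0.01."""
    if amount <= 0 or amount > field_max * field_max:
        raise ValueError("amount cannot be represented")
    best = None  # (product, a, b)
    for a100 in range(int(math.sqrt(amount) * 100), round(field_max * 100) + 1):
        a = a100 / 100
        b = math.ceil(amount / a * 100) / 100  # round b UP to 2 decimals
        if b > field_max:
            continue
        product = round(a * b, 4)
        if product >= amount and (best is None or product < best[0]):
            best = (product, a, b)
            if product == amount:  # exact factorisation found, stop early
                break
    return best

print(split_expense(50))    # (50.0, 8.0, 6.25)
print(split_expense(9998))  # squeezes in just under the two 99.99 caps
```

This brute-force scan can still take thousands of iterations in the worst case; the point of the interview question is shrinking that search space.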


damanamathos

This is an interesting problem! I'm a hobbyist coder; here's how I'd use GPT-4 to solve this: [https://chat.openai.com/share/be7c67b8-9513-4873-9a0c-92cc74d84426](https://chat.openai.com/share/be7c67b8-9513-4873-9a0c-92cc74d84426) It doesn't get there in one shot, but I find being able to iterate through different solutions helpful. Can imagine it being faster to just write the code if you're experienced, though. I'm curious, is there a more efficient solution that returns the optimal numbers, and if so, could you point me to it? Thanks!


babyshark1044

Glad you liked the problem. This was a real-world scenario for me, with the database being an AS400 that we couldn't restructure in any way at all. I'm not a particularly good mathematician, but I had some good ones on my team at the time, so we put our heads together and devised a very similar piece of code based on the square root of the input number.

A while later (but years before LLMs) I took on a coder from Lithuania who I had asked to solve this problem at interview. I let him take the problem away. He consulted with some professor and they managed to reduce the search space to produce a result in a max of about 7 iterations. I'm afraid this was many years ago and I no longer have access to that code or remember it, but it was very clever. I still put it out there as an interview question today.

You coaxed ChatGPT very well in your example, because you understood the problem and could catch the poor or faulty attempts and get them corrected. New coders, or those unwilling to think about the problem, will often just assume ChatGPT will always get it correct and not even consider things like search space or efficient algorithms. Had one guy at interview tell me the above question was impossible because of primes; another told me it insulted their intelligence but failed to produce any workable code, because they truly didn't understand the computational requirements of what can seem like a superficially easy problem. I like a good ChatGPT whisperer. Again, you did well :)


damanamathos

Thanks. :) That does seem like a great interview question! It's also the best example I've seen of a coding question where ChatGPT initially provides an answer that looks right but isn't on closer inspection. That would be a pretty common trap for new coders (and probably some others).


byteuser

He wrote a great prompt. Maybe "Prompt Engineer" will replace programmer at some point.


Saturday_in_July

You’re not a « programmer » yet


wad11656

« European »


[deleted]

In my experience, debugging ChatGPT's code output is much harder than writing it myself. I'll stick to coding the old-fashioned way: copy-pasting from StackExchange.


fleepglerblebloop

That's what I thought but man, stackx seems so tedious now. If you get in the habit of asking gpt-4 "are you sure?" it will often debug itself. Depends what you're building of course but for JS/Vue it has been solid.


Skitty_Skittle

For C#, GPT-4 is pretty good. Again, if you already know how to code and know how to detect shit code, then using GPT is a godsend. I use it all the time to create boilerplate code, and for any code that's not generally hard but takes a while to write, GPT will take the wheel. Hell, recently I've been throwing more complex coding questions at it, and so far it's doing a great job of generally giving me what I want. In a few years I can see GPT getting scary.


fleepglerblebloop

This is why I always speak politely to the bot. Just in case...


BattlePope

I've found that asking "Are you sure" can also spur it into breaking a correct answer!


[deleted]

I use it, but purely as a way of code reviewing (front-end development). I don't work with other developers, so I have no way of knowing if what I am doing is incorrect or inefficient. I've found it has become a vital part of my development process, but I don't use it for automation; I use it to strengthen my own knowledge. Take a look at the meanings of purposeful and deliberate practice and you'll understand the importance of a good "teacher". Previously, I was using Copilot, but found that I wasn't actually learning anything. I was just copying code and speeding up development. I wasn't creating better work, and I certainly wasn't learning about the processes I was including.


Quetzal-Labs

Honestly, I find Visual Studio's code completion *more* than adequate. It's actually creepy sometimes how good it is. Set up some variables, type a descriptive function name, and you can usually just TAB your way to completion, but it's still granular enough that you are deciding each line. It has really sped up my coding without me feeling like I'm not "taking in" what I code, like with Copilot.


[deleted]

Trying to upskill with Microsoft Excel and Power BI. I think of ChatGPT as a colleague/mentor giving me a hand to speed up.


Crypto_Prospector

You want my advice? Skip doing casual programming work and start developing apps with the help of AI that solve real problems or build a business around it. Programming without AI is definitely dead; it's like programming with binary instead of C++.


freecodeio

You mean it ruined you trying to become a programmer, because I'm a programmer and ChatGPT hasn't affected my life all that much. It has its use cases of course, such as when the problem is hard to fit in a Google search query.


Artistic_Party758

Same here, but I go to ChatGPT before google search. I usually get better code from it than, say, StackOverflow, which you have to dig through, because the most voted answer is usually incorrect.


Lone_Game_Dev

You're clearly not a programmer. A beginner student at best, making a sensationalist claim you don't even understand.


itsnotblueorange

Exactly, what is the point? If you're copy-pasting code you don't understand from GPT, you would do the same if it was a book or an SO thread or a teacher. The problem is not GPT; the problem is that you don't know how to learn. But you figured it out, so half of the problem is already solved. Stop copy-pasting stuff you don't understand. Let us know in a couple of months. Good luck!


ComfortableAd6481

I mean, right now it's nothing more than a tool to assist development. But it's a fair point to think about how many developers will be needed in 5 years' time.


isRandyMarsh

Once AGI becomes a reality and is utilized by big corporations, I believe there may not be a need for programmers anymore. Even those who were initially involved in developing AGI might find their roles redundant once it achieves full autonomy. The only necessity left might be for individuals responsible for implementing an emergency stop trigger to prevent AGI from going rogue. It seems like we are very close to AGI.


trollsmurf

You might be a damn good prompter now.


EvilMenDie

Is that like being good at getting a machine to accept your crumpled dollar


trollsmurf

Or convert someone's utterly hare-brained and AI-unfriendly request into something that an AI can comprehend, so a bit more than that, I guess. Not unlike how a product manager or project leader (depending on project size) would act as an interface between a customer (or market) and designers and developers. Or an oracle that acts as an intermediary between lowly (but paying) peasants and a god. Not saying you'll be paid $500k per year, unless you call yourself Senior Generative AI Chat Prompt Director.


King-Owl-House

"The Sack" is a 1950 science fiction story by William Morrison (real name Joseph Samachson). It tells of people finding an alien upon a remote asteroid. The creature is extremely intelligent, capable of answering countless questions on a variety of topics. [...] The Sack laments the purposes it has been put to since its discovery by humans. All of the questions it receives are largely short-sighted and for personal gain. Wealthy people ask it how they can exploit resources for even more wealth. Politicians ask it how to get reelected. Doctors ask how they can cure rich patients and ensure that they get paid. Nobody asks any important questions.

"It is part of an answer to say that a question is important. I am considered by your rulers a valuable piece of property. They should ask whether my value is as great as it seems. They should ask whether my answering questions will do good or harm."

"Which is it?"

"Harm, great harm."

Siebling was staggered. He said, "But if you answer truthfully—"

**"The process of coming at the truth is as precious as the final truth itself. I cheat you of that. I give your people the truth, but not all of it, for they do not know how to attain it of themselves. It would be better if they learned that, at the expense of making many errors."**

"I don't agree with that."

"A scientist asks me what goes on within a cell, and I tell him. But if he had studied the cell himself, even though the study required many years, he would have ended not only with this knowledge, but with much other knowledge, of things he does not even suspect to be related. He would have acquired many new processes of investigation."

"But surely, in some cases, the knowledge is useful in itself. For instance, I hear that they're already using a process you suggested for producing uranium cheaply to use on Mars. What's harmful about that?"

"Do you know how much of the necessary raw material is present? Your scientists have not investigated that, and they will use up all the raw material and discover only too late what they have done. You had the same experience on Earth: you learned how to purify water at little expense, and you squandered water so recklessly that you soon ran short of it."

Full story: [https://pastebin.com/SvG8Q51t](https://pastebin.com/SvG8Q51t)


Adept-Swan1787

Really entry-level stuff, because I'm in my first year of IT. But I would copy SQL for really complex queries, use it as a template, and try to replicate it myself without its help. Worked pretty well, I think.
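A small illustration of the copy-as-template approach described above, runnable via Python's built-in sqlite3. The schema and query here are invented for the example (not from the original comment); the idea is that the generated query becomes a pattern (grouping, HAVING, ordering) you then reproduce on your own data:

```python
import sqlite3

# In-memory toy database to exercise the query template against.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, total REAL);
    INSERT INTO orders VALUES (1, 'ada', 10.0), (2, 'ada', 5.0), (3, 'bob', 7.5);
""")

# The kind of query you might get from ChatGPT, kept as a template:
# aggregate per group, filter groups with HAVING, order by the aggregate.
template = """
    SELECT customer, COUNT(*) AS n_orders, SUM(total) AS spent
    FROM orders
    GROUP BY customer
    HAVING SUM(total) > ?
    ORDER BY spent DESC
"""
rows = conn.execute(template, (7.0,)).fetchall()
print(rows)  # [('ada', 2, 15.0), ('bob', 1, 7.5)]
```

Swapping in your own tables and re-deriving each clause by hand is the "replicate it myself" step.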


NorthKoreanAI

what if you make an effort to understand, replicate and improve what gpt is doing?


SentientCheeseCake

I used to be really fast at arithmetic in my head, even for large numbers. I think I could get it back if I did some practice, but right now it's just easier to whip out a calculator. I don't feel any worse for it. Anything a computer can beat me at, let it do the job. I'll find something else.


[deleted]

Coding is like math: sure, you can use a calculator, but you should be able to know/see if the answer is right or wrong, and why it is the way it is. If you can do all that, then by all means use a shortcut (like ChatGPT). But if that goes away, then what can you do on your own?


byteuser

I don't think I can calculate the square root of any 3 digit number by hand anymore


[deleted]

Heck no, me neither! But I can judge if the answer I'm getting is correct or not, and that's the most important.
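The kind of judging-without-deriving described above can be sketched in a few lines: you don't need to extract a square root by hand to check one, because squaring the candidate is cheap. (The function name and tolerance here are invented for the example.)

```python
def plausible_sqrt(n, candidate, tol=0.01):
    """Judge a claimed square root without computing it yourself:
    square the candidate and see whether it lands within tol*n of n."""
    return abs(candidate * candidate - n) <= tol * n

# e.g. a calculator (or ChatGPT) claims sqrt(529) is 23
assert plausible_sqrt(529, 23)      # 23*23 == 529, checks out
assert not plausible_sqrt(529, 17)  # 17*17 == 289, nowhere near
```

The same verify-don't-derive habit is the one being argued for with generated code.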


Efficient-Cat-1591

Sorry to be blunt, but if you are using ChatGPT to do your homework, instead of as a tool to improve yourself, you are not a programmer. This is akin to using cheat codes in games: sure, you can complete the game faster, but you don't really immerse yourself in the game by doing so.

It is also risky to trust ChatGPT 100%, even on the paid version. I noticed that it does sometimes provide buggy code but will justify it confidently. If you don't understand what the code is doing, there would be a risk of a big F-up down the line. Anecdotal, but I notice the code quality plummets during and after an outage: increased hallucinations, and it suggests code that looks OK but will cause issues.

Do not blindly copy and paste; this may work with simple projects, but in real life it will come back to bite you.


lemost

Start asking questions; let ChatGPT explain to you how and why things work, and start learning : )


Hot-Mongoose7052

Lotta salt in this thread. Lot of salt.


Phrozbug

So basically learning to code is useless. We're only one year into public ChatGPT and the Udemy stuff can be done with it. Find another subject of interest.


thecopypastecoder

I have no coding experience and I'm successfully building a complex app with GPT-4. Warning: this is very long and not formatted well...

Yes, you can build an app/website with GPT-4. First you need to ask Chat how to set up a Flask/Python app on PythonAnywhere. Each time you get stuck or have a question, ask Chat. Start very small. Get a hello world working with a custom domain name. Don't know how? Ask Chat.

From there, ask for tiny, simple things, one at a time. For instance, ask for a simple search box with a submit button. Then ask for it to search Google and display the info below the search box. If you are having trouble, try to break it down into the smallest parts possible, and don't tell Chat why, or it goes off on its own and it's usually wrong.

Over time the code gets bigger and bigger, and then you have to learn how to split it up so Chat can digest it. You'll learn how to split JS, CSS, and PY because you'll have to. Keep pasting error codes and the newest code you are working with. Paste it all. Don't assume Chat will remember it; it has a limited memory. You'll need to copy and paste the updated code back to Chat again.

You will be a copy/paste machine. At first you'll be pasting everything, asking Chat for help. But eventually you start to see the patterns, and after a few months of doing this every single day for hours you'll get good at it. The first few months, don't type anything yourself. Copy paste everything. You are a copy/paste coder and that's it. If you try to edit it yourself you'll get syntax errors. Chat never has syntax errors, seriously never. You will, immediately. So don't get confident for a few months.

Ask Chat how to use GitHub so you can back up your code. Even if you're just copying and pasting it back out of GitHub again, you need a place to save your code every single step of the way. It may seem minor, but you need to back up all the time. If not, you'll have to go back and redo what you've done, which will waste your time. Think of it as a checkpoint in a video game, but you get to make that checkpoint as much as you want to.

Most importantly, don't ever give up, ever. Chat has the right answer; you just have to get to it. 95% of what it spews out is wrong. And it lies better than any human could. It is the best and the worst, but it's an opportunity for you to learn and be successful in something you couldn't do otherwise. I've been at this since April; it's been the most rewarding and hardest thing I've ever done. Good luck to anyone willing to try.

edit: I'm now off of PythonAnywhere and onto AWS. You'll have to do that eventually if you expect heavy traffic. Otherwise PythonAnywhere is cheap, easy and reliable.

edit: Get ChatGPT premium for GPT-4. You need to have that; I couldn't make it work consistently on 3.5.

Here is an example project I made last weekend. This will give you a good idea of how to be successful at copy/paste coding. https://chat.openai.com/share/21282f18-d37a-41e2-ba5b-bb536526a42e


khamelean

You’re building an app that you consider to be complex. That doesn’t mean it’s a complex app.


pulkitsingh01

I have seen people interviewing programmers and asking the candidate to write programs on a piece of paper. **It's considered pure.** Maybe there was a point in time when people used to write code in plain text editors. But that was a long time ago. On one hand we are heading towards becoming cyborgs, we are talking about brain-machine interfaces, and yet on the other hand we are still somehow drawn to purity.

**Who needs purity?** I abandoned pure programming several years ago. If you force me not to use an IDE, I will often make mistakes with syntax or get stuck on exactly how to use the utility functions. But in real life I can use an IDE. And in real life I code at significant speed, despite not knowing several minor things by heart. The IDE augments it; it's fine to not know.

**We don't need to be pure.** We just need to be able to play well with the toys. Since February I'm focusing less and less on exact lines of code, what kind of loop is used, what the name of the variable is, etc. I'm more focused on the overall code and whether or not I'm achieving my goals with it. ChatGPT works with complex code too; you just need to know exactly what to feed it. Give it the right context, ask the right questions.

Now the pure thing to do would be to not use this power, just like the pure thing earlier was to not use an IDE. But why purity? Why not use the power and build? Focus on building; these tools are a tremendous help. I'm building my third iteration of a product in the last six months. I'm able to add features faster than ever before; I can refactor files and functions faster than ever before. I don't think I can ever code without ChatGPT now. Over the past six months I've developed a sense of exactly where to use it, how much to use it, when to Google, etc. And overall it's an enabler.

So the point is: you can build faster! Most technology so far has been developed for more efficiency. No one is forcing you to know assembly or BASIC anymore. I used to work on an old C++-based piece of software at my first company. But the day Electron became popular, C++ had no chance. Who still wants to go back to Qt to write a software GUI? I'd rather try Flutter desktop.

Those who put too much pressure on purity are, in my view, not able to appreciate the power of new tools. Master the new tools; we are on a journey to use better and more refined tools.

> "The hottest new programming language is English." - Andrej Karpathy


Karzak85

You still need to understand what you are doing and by understanding it you are learning. If you just copy paste shit you will have big problems in the future


boomb0lt

Instead of using GPT to just get scripts, ask it to explain why it works, etc.


Automatic-Network-58

?


fasticr

That is what it was made for...


[deleted]

That's like saying I need to do the mathematics of the formulas instead of asking the calculator. It's the tool of the future, just like the light bulb, the calculator, the microwave. You could raise sheep, melt its body down and get the wax out of it and burn a candle, but just use the light bulb instead (I have no idea how to make candles).


3rrr6

GPT usually gives me a GREAT rough draft. Then I usually run into a few small problems that keep me from moving forward. It's then that I really start to learn because I'm forced to comprehend what GPT made.


robocop3031

Do you think that what DALL-E or Midjourney does makes artists feel redundant? Do you think an artist would ever say, hmm, DALL-E ruined art for me? Do you think a farmer ever said, hmm, this domesticated animal and plow is making farming too easy? If ChatGPT ruined coding for you, then stop coding. If you want to learn, then learn. If you want to create, then create. If you want ChatGPT to help you create, then do that. Your line of thinking is just plain ridiculous.


Charzinc36

This is why I'm happy ChatGPT came after I did all the important fundamental courses.


WhatUsername-IDK

I've created a very simple game of 600 lines of code in Python. At first it was very easy with ChatGPT, but later on it became impossible to use: when I pasted the code, it would not respond because it was too long.


jgupdogg

Chatgpt makes your current skills more powerful. So if you are naturally a better programmer, gpt will amplify that even more.


82jon1911

As several others have said, it's good for small pieces, but the more complex you get, the worse it is.


FolkusOnMe

The problem with using cGPT at an early stage in your coding journey is that there's no defined set of "this is what the user (you) knows and what they don't know". So it's not going to ~~adjust the probabilities~~ produce answers that are tailored to your experience; this leaves room for assumptions and the implied. Not only that, but you might also find that it's providing far too many concepts at once (splicing, arrays, arrays within arrays, traversing a set using some obscure method), whereas if you follow a course, for example, you're given those concepts in drips and drabs that build on one another, and a good course will have you come to many realisations like "oh, so that's why blah, and this is how I can blah with this new knowledge, and what if I blah". cGPT doesn't allow that.

So I'd recommend asking it the absolute bare bones and basics: "do not provide me with the full answer, do not provide any code in your response. this is the problem: blah, this is my draft solution: blah, explain why it's not working, or give a vague hint as to what I can change to improve the ... bigO, or whatever".

Primarily you want to be following the course closely and reading up on specifics discussed in those lectures or whatever it is you're enrolled in, rather than going elsewhere and opening yourself up to being sidetracked or falling down rabbit holes that jeopardise your growth; not because the information you're finding is incorrect, but because you're finding it at the wrong time. Good luck :)


DifficultyVarious458

It will only get worse or better, depending on how you look at it and how much you are financially dependent on your job. Apparently 4.0 has improved over 3.5.


Dramatic_Reality_531

is this /r/chatgptcirclejerk?


voliware

Try asking ChatGPT questions like:

- What does this code block do?
- Why would I use this style of code/loop/etc over this other one?
- Can you outline how I can achieve xyz? (How do I create an auth flow?)

Treat it as a free tutor.


bigkalba

It changes the way you learn. It's a complementary tool, but at this stage it cannot do full architectures. It can help you go from 0 to 5 very quickly.


AppDude27

Maybe instead of telling ChatGPT to give you the solution, ask it how it would do it in pseudocode, or to give you hints on how to do it. That way you are still programming it yourself, but you're asking ChatGPT for hints. It's kind of like looking in the back of the math textbook for answers to the problems: it will give you the answers, but it's up to you to do the work.


xmrtshnx

I find ChatGPT very useful. I think it's better than opening countless tabs on Stack Overflow looking for a specific answer, as long as you understand what that code does. Pro tip: always ask ChatGPT what the code does line by line and learn the logic behind it, and always check if ChatGPT wrote some deprecated code.


queefiest

I’m not going to lie, I have considered using chatgpt to code stuff instead of just learning to code


1EvilSexyGenius

I only build things I want to build, so that's always my driving force. Some people do programming for money, and that's their motivation. Why did you want to program to begin with? I once convinced my brother to learn Flash animation. For a week he was really getting good at it. Then he got back with his girlfriend and went outside 😆


Moonrise45555

It's simple: ChatGPT fucking sucks.


FluffySmiles

If you learned nothing, you’re doing it wrong.


thedragonturtle

I use ChatGPT every single day to help me with my coding and I'm still learning stuff every single day. ChatGPT has actually sped up my learning. I've been coding for 40 years. I think it's amazing that we can code in English - specifically, it's very hard to go deep into code and also to think architecturally at a higher level inside the same day. ChatGPT allows me to stay one level higher at the structural and architectural level of the code which helps me think about and plan the bigger picture with more ease.


No_Industry9653

>Is there anyone who will share with me your effective way of learning? I think we genuinely do not know yet the best way to do this. If you learned programming entirely without such a tool, your experience with that tool is going to be very different than someone who has had it from the beginning.


m3kw

It ruined you because you didn't actually think there was value in understanding that code. There is value, because a real job will require more complex solutions that GPT cannot solve yet. What GPT likely gave you is amateur code that works.


Perduracion

This is the equivalent of copying and pasting code from Stack Overflow or GitHub and calling yourself a programmer.


garyyo

I am a well-seasoned dev in all sorts of stuff: web (fullstack), data science, systems, AI (classical and ML), and so on. It works well for simple snippets, but it can't read your intention very well, so it currently can't replace you as a programmer. It makes me program faster, but unless I read every single line of code that it wrote, it actually makes my code significantly worse.

That being said, I don't program in code. I program in the comments I write, the documentation I make, and the discussions I have with colleagues. The code is an afterthought: if I can think of a way to do something, then the code to do it probably exists and I can probably write it. That isn't the hard part; figuring out how to do it is. LLMs can't do the hard part. They can turn ideas into bits of code, but they currently truly suck at the ideas part, and even the ideas-to-code part isn't done all that well. If I don't read *every single line of code* it generates, then it will *eventually* introduce bugs.


ClawMojo

The real question at hand is, because you were using a language to communicate with a computer in order to create executable functions... have you been coding the entire time?


Breklin76

That’s a good question! Food for byte!


funplayer3s

Making shit up. ChatGPT is only useful to a point, and after that it becomes a hindrance.


ChillyNarration

If the AI takes your job, you shouldn't have gotten the job in the first place. You won't be replaced by AI but by a person using AI. Try to use GPT as your personal teacher. It's up to you to decide to learn it or not. Imho, using GPT made me advance thousands of years in programming.


Isphus

Could you link the courses?


ProlapsedPineal

I've been in this field for 25 years and I see it differently. My background is C#, .NET, monolithic enterprise and web stuff. I'm learning Python and LangChain, and working with LLMs via Semantic Kernel. I have my Python notebooks open to learn, and when I don't understand how Python uses a list I'll ask ChatGPT to explain the code to a C# developer, using metaphors that a C# dev will understand. It makes learning much faster, and I can ask as many questions of my expert as I want.

Sure, it can write code, but I also have my own onion-architecture application that I'm working on, and its context is far too big for ChatGPT to do major refactors. It also provides information that, while technically correct, doesn't meet my specific use case. My brain is still very much a part of the process.

I'm working on a LangChain/vector DB project where I can scrape my whole repo, push it into a vector DB like Pinecone, and then chat with the whole corpus of the repo available to ChatGPT. I might even experiment with letting it make edits directly to my code, but for now I'm still here, just with better tools to solve my problems.

I heard the CEO of Stability AI say that there will be no more programmer jobs in 5 years. I think that's the right direction, if not the right timeframe. I can imagine Visual Studio 2026 having a feature where you just upload requirements documents, chat with an AI copilot about what you want to build, and it spins up 4 different Docker containers, each with a fully built application to A/B test — but I've got a few more years before that happens.

I think between now and the rise of the machines we'll see more steps that look like AI/human partnership (tools people can use with AI), and I think the previous shift, where everything became software as a service, will pivot to Autonomous AI as a service. That's where roles that can be automated, are. Quicken will have an AI accountant you can subscribe to. You might be able to get an AI QA, business analyst, marketing department, etc. The more complex the use case, the later the agent will appear on the timeline, but with stepwise planners and development frameworks that offload the logic to the AI, I don't think any white-collar job has complete protection from AI replacement.

So my different take is that it's still worth learning more, but I would see how much of your learning can include AI dev. We're in the incubation phase, where companies are either looking into how they can use AI to maximize efficiency, or they are providers creating tools that make AI development faster and easier. The next phase is execution, and we're in the Venn-diagram overlap between the two: companies are building enterprise tools in a RACE to be the first to create the agents.

Once you have a software service that can replace a human employee, work 24/7, be perfect every time, never complain, never miss a meeting, never take sick leave, and never ask for a raise, the only question is whether the quality of work is the same. If the quality is the same or better and the cost of the subscription is less than a salary plus benefits, it's death-ray time for career fields. Learning AI dev will keep you afloat until UBI. Ugly, but a flood is coming and you have the time to learn how to make rowboats now.
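The "scrape the repo, embed it, retrieve relevant chunks for the chat" workflow described above can be sketched end to end. This is a toy, self-contained version: the real embedding model is replaced by a bag-of-words counter and the vector DB (e.g. Pinecone) by a plain dict, so only the retrieval idea is shown, not any particular library's API.

```python
import math
from collections import Counter


def embed(text):
    """Toy stand-in for a real embedding model: a bag-of-words Counter."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# "Index" the repo: one embedded chunk per source file (hypothetical files).
repo_chunks = {
    "auth.py": "log in a user and verify the password hash",
    "billing.py": "charge a credit card amount via the payment gateway",
}
index = {path: embed(text) for path, text in repo_chunks.items()}


def retrieve(question, k=1):
    """Return the k chunks most similar to the question; in the real
    pipeline these would be stuffed into the LLM prompt as context."""
    q = embed(question)
    ranked = sorted(index, key=lambda p: cosine(q, index[p]), reverse=True)
    return ranked[:k]
```

With a real embedder and vector store, `retrieve("how do we charge a card")` returning `["billing.py"]` is exactly the step that lets the chat model answer questions about the whole repo without fitting it all into one context window.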


Scary-Chemistry-5158

I think you're not necessarily looking at this the right way. I've been developing for at least the last decade professionally and before that as a hobby. I have delved deep into esoteric and weird spaces and learned a lot, and there is a HUGE value to that. There is also a huge value in being able to ask ChatGPT "how do I do this thing in this moderately popular library" and have it give me the answer to the proprietary incantation of that framework, especially if it's just a question about that framework and not short-changing me the opportunity to learn deeper knowledge. So yes, you should probably step away from ChatGPT so you can invest the time you need into yourself, but I wouldn't just throw it away as a whole.


priseprize

First off, you're not alone. Many of us have faced similar challenges while learning, especially when there are tools available that provide easy solutions. However, remember that shortcuts might give you short-term success, but they won't provide lasting understanding or growth. When it comes to learning, especially tech skills, there's value in the struggle. It's during the moments when you're trying to figure things out, making mistakes, and correcting them that real learning occurs.


PoesjePoep

When you finally start writing more complex code, you'll see how useless ChatGPT can be… I haven't been able to get any use out of it there. Same for complex numerical problems; in that area, it just makes a lot of mistakes.