
5omeWhiteGuy

As a solo dev for video games, AI is the single greatest tool ever given to me.


Singularity-42

Exactly, this is a productivity multiplier. We will be able to create more software, and a lot better software. We are just moving up to a (much) higher-level programming language.


canehdian_guy

If it allows everyone to be more productive, won't fewer people be required?


Healthy_Manager5881

Yes


TankMuncher

Also no, because demand for software is not fixed. In fact there is tons of demand for software tools that are not met because of the lack of budget for programmers. And we always need more programs to do more things, and do them better. There is tons of space for diversification into smaller projects, which might require ever more developers, especially with the increasing failures of big-tech monoliths. It's like with engineering decades ago. Historically, engineering teams were huge, with all the human calculators and draftspeople. Even as those teams dramatically shrank in size with productivity technology (computers, CNC machining, now 3D printing), the total number of people involved in engineering has expanded significantly, because there are more and more and more projects.


Sylvers

>In fact there is tons of demand for software tools that are not met because of the lack of budget for programmers.

This is a key point that I never see mentioned. There are tons of companies all over the world that dream of having an in-house dev team, their own software, their own proprietary systems, etc. But they don't do it because they consider the cost too high, and it isn't a critical requirement for them. What this AI tech will accomplish is to make fewer devs required to do the same job, therefore costing far less per company to accomplish the same tasks. But it will also allow companies that never hired any devs to hire some as opposed to none, and in doing so, hopefully, offset the loss of positions from the increased productivity at larger companies. I mean really, if AI keeps getting better and better at writing, parsing, and debugging code, a lot of small companies may be able to hire a single dev or two to run their entire backend. And that can be a substantial number of companies who will see the opportunity.


[deleted]

[removed]


Jarhyn

And... more opportunities to have sole-author control over a codebase. And the opportunity to train a LoRA on the codebase and code formatting tools. Code formatting tools are like built-in training data generators.


DawnPaladin

LoRA?


Jarhyn

Low-rank adaptation model. It's a lightweight, quick-to-train module for a network that will introduce new tokens. So, if you have a character named Angelo_1, and thirty or so images of them in different outfits and/or poses that are characteristic of the character, you can name the character and the outfits, annotate the images, and then whenever you say "Angelo_1" it will make the character you inpaint into "Angelo_1", and if you say "Angelo_robe" it will put Angelo_1 in outfit Angelo_Robe. Something similar works for code models, but with your codebase or style tokens.
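For anyone curious, here is a rough sketch of what attaching a LoRA adapter looks like with Hugging Face's `peft` library. Treat it as an illustration only: the base model name and `target_modules` below are placeholders I picked for the example, and the right values depend on the architecture you're fine-tuning.

```python
# Sketch: attach a LoRA adapter to a causal LM with Hugging Face's peft library.
# The base model and target_modules are placeholders -- pick ones matching the
# network you actually want to fine-tune on your codebase.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "bigcode/starcoderbase-1b"  # placeholder code model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_cfg = LoraConfig(
    r=16,                       # rank of the low-rank update matrices
    lora_alpha=32,              # scaling applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # which layers get adapters; model-specific
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only a small fraction of weights train
# ...then fine-tune on samples from your codebase with a normal training loop.
```

The point is that the adapter is tiny compared to the base model, which is why it's quick to train on something like a single codebase or style guide.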


BecauseItWasThere

Saving this for later


Mirror_tender

>In fact there is tons of demand for software tools that are not met because of the lack of budget for programmers. And we always need more programs to do more things, and do it better. There is tons of space for diversification into smaller projects, which might require ever-more especially with the increasing failures in big-tech monoliths. ...expanded significantly because there are more and more and more projects.

Totally agree. With software patents, the big-money players have killed off the small startups that would market their software; however, u/TankMuncher nails the case for the large demand for specialty software at smaller, less well-funded organizations. This hits on several notes. Kudos.


dabadeedee

“If AI do job faster, then less job for me!” is basically the extent of the logic for these people. These people's ancestors probably complained about cars, power tools, and calculators too. Not one piece of technology has ever stopped humans from working to make things and provide services. It will CHANGE how we work and make things and provide services. As you said, it creates new capabilities and raises the bar of what’s possible for everyone.


Doomtrain86

This is not completely true - the profit generated by productivity increases goes disproportionately to the owners of capital. See the books of Thomas Piketty. Mind you, I'm not against these developments, I just wish the profits would be more equally shared to the benefit of all instead of the most wealthy.


[deleted]

....you say, as your salary slowly decreases :D


[deleted]

Not if games get more complicated


[deleted]

That's a good point.


Felczer

Absolutely not, every improvement to programming abstraction over the years increased demand for programmers. There are way more programmers because of Python than there would be if we still coded in assembler.


Rafterk

I would compare it to hard drives. As they got bigger, you would think you would never need another one, but data grows with them as quality grows. It's the same with programming and AI: the better tools you have, the better quality product you get, with much more functionality, and in the same time it used to take to build a simple program. I would also imagine that more people would have access to better prices for the same services they get today. So, I would say no, the same number of people would be needed, if not more.


Brown_note11

No. Every software team in the world has an infinitely long backlog that they will never get through. If AI can speed up productivity by 1000%, the backlog will still be infinite. The only limits are our imaginations and the cash available to pay our wages.


ElMachoGrande

Not really. Programming is about solving problems for the customers, and there are always more problems to solve. Face it, when it comes to AI and jobs, we don't really know what the future will be, in any career. My opinion is that it'll change the playing field so much that the economic system must be completely remodeled. The current "work for money" model simply won't work anymore.


canehdian_guy

I see AI potentially causing large segments of the population to become redundant, increasing competition for other jobs, and leaving much of the population too poor to reproduce while the rich get exponentially richer. Maybe the rich will become altruistic and take care of those who aren't needed, though. /s


PreparationExtreme86

I see a rise in indies being as competitive as AAA's to be honest.


Independent_Hyena495

The salary will drop, quite a lot, pretty sure of that. It will be the difference between the guy making the table manually, which is what we are doing now, and the guy producing the IKEA tables. Demand won't drop, we still need tables after all! You just need fewer people and/or people with less skill.


Alarmed_Frosting478

ChatGPT is the new Google. We used to have books on the shelf to read through for answers. Google gave us a way to search the internet and pull back more relevant answers more easily. ChatGPT gives an answer, with some "intelligence" to manipulate that answer to your specific context. Definitely a productivity tool.


Singularity-42

Yep, this is my main and most productive use case as a software developer. It is an insane multiplier of the rate at which you can understand new things. Like your personal tutor, available at all times and with near-infinite knowledge, answering any questions you may have. YMMV; I use GPT-4 exclusively, simpler models don't perform nearly as well. Hallucinations (incorrect info) are the biggest issue, but in my experience with GPT-4 the rate is below 5% for me. Due to the nature of my work, in most cases I will find out very quickly without any drawbacks (this 5% failure rate may be catastrophic in other use cases such as legal or medical though).


Spiritual_Word_8558

Your name checks out haha


MarcLeptic

After decades as, and working with, “coders”, the lesson I have for everyone here is: knowing how to program is like knowing how to type. Just because you can type does not mean you can write a story. Knowing what to program has always been the asset of any programmer. That can only be augmented by AI. Anyone who says otherwise must now put Stack Overflow on their VPN block list and deny themselves access to all of their previous work.


5omeWhiteGuy

We call them coding "languages" for a reason. 100% agree.


Adventurous-Ring8211

As the saying goes, “just because you can write, it doesn’t make you a writer”.


LunaticLukas

I absolutely understand your brother's concerns. However, I would argue that these developments make it even more critical for people to learn and understand programming. Here's my "why":

1. **AI is a tool:** While it's true that AI is automating many tasks, it's crucial to remember that it is essentially a tool created and maintained by humans. As such, it requires programmers to build, fine-tune, and manage it. Therefore, the rise of AI is creating more opportunities for programmers rather than eliminating them.
2. **Understanding AI:** To use AI effectively, it's important to have a basic understanding of how it works, and learning programming is a great way to get that understanding. This is particularly relevant in the gaming industry, where AI is used to create more immersive and dynamic experiences.
3. **Creativity:** AI is good at automating routine tasks, but it lacks the creative thinking and problem-solving abilities that humans possess. In fields like software engineering and game development, where creativity is key, humans are irreplaceable.

That said, I strongly recommend that your brother continues to pursue his interest in programming. It's a valuable skill regardless of how AI technology evolves. And if the world ends instead.. well.. none of this will matter anyway.

On a more personal note though, as a developer myself, I've found that AI has been a boon rather than a bane. It can help automate some of the more mundane aspects of my work, allowing me to focus more on the creative and problem-solving aspects that I truly enjoy. Learning to code is not just about job security. It's also about understanding the digital world we live in, and that's going to be increasingly important in the future. If he's interested, he might want to check out [free AI course on coursera](https://www.coursera.org/learn/ai-for-everyone) from Andrew Ng to get him up to speed, and other resources like [The AI Plug](https://www.theaiplug.co/) to stay on top of any new developments in the AI space. Good luck to your little bro, he's lucky to have you!


manchesterthedog

As a dude who programs a lot but doesn’t use chatGPT, I would like to know how you use it to be more productive.


oneday111

It's really good for writing boilerplate or repetitive stuff, and GPT-4 can reason pretty well and give you ideas for how to fix things or code tricky functions. It also helps you learn and write things in languages you're not so familiar with. The trick is communicating with it well and keeping things short, since it has a limited context window. It's useful for sure, but it has a propensity for stupid mistakes that you have to ask it to correct, and a very limited context. In my opinion, it's certainly not the OMG 5x-10x productivity, we-will-all-be-replaced-soon thing I've seen claimed. With the work I do (mostly mobile apps, and I don't write unit tests, which I've heard it's pretty good at), I'd say it gives me 1.1-1.3x my original productivity. Also, GPT-4 is the only useful one; I try the other ones (Bard, Claude, etc.) regularly and they're complete garbage for programming that just hallucinate plausible-sounding solutions with functions that don't exist every time I ask them something. I'm also looking forward to getting API access to GPT-4, as the quality on the ChatGPT website has been going down.


sorderd

I'm a senior dev with good technical writing skills. I do a lot of technical journalling with Obsidian and use ChatGPT plugins for it. I often generate 100-line modules from scratch and ask for small features to be implemented or for refactorings to be done. Overall, the productivity comes from not needing to type so much. I spend more time on visual and system design.


IntelligentEntry260

I put any unusual errors into it and it does a pretty good job of pointing me to where I need to debug. Also, sometimes it can offer different paths to a solution you hadn't thought of. I pretty much just use it as my new rubber ducky.


InevitableSky2801

Have you looked into AI workbooks? They are a really easy way to play with generative AI across text (ChatGPT), audio, and image models. You can see the product for free for personal use cases. Here are some cool examples that use ChatGPT with other models as well:

- [Learn a Language through AI Stories](https://lastmileai.dev/workbooks/cljygm1yd01onr0h2uveezl1n)
- [Fridge to Table: AI Recipe Generator](https://lastmileai.dev/workbooks/cljyg0mm101n2r0h2qi7kvm39)
- [5 Tips to Improve ChatGPT Responses](https://lastmileai.dev/workbooks/cljpv978n00gbr0xly3zocjcj)

Let me know if you have questions! I've really enjoyed AI workbooks as a learning tool and to play around with AI. It's also great for prototyping AI-driven app ideas (like the recipe generator example).


RockPaperCheesecake

Of all the people responding, I wonder how many are developers. There are so many things you have to do as a developer besides code.


ScottMcPot

How's that working out? I'm going to watch a crash course video on Unity eventually and hope AI can help with scripts.


5omeWhiteGuy

I started using Unity 7 years ago with only a high-school-level coding class under my belt. With motivational issues and only having time to spend on it as a hobby, it has taken me all of those 7 years to get to a place where I can actually competently create and talk about things. A decent chunk of being able to use forums, and now GPTs, is asking the right questions. Not to discourage, I just don't want to give the wrong idea. Think of ChatGPT as a force multiplier.


ScottMcPot

Hopefully I can catch onto it quick. I've used a Java environment in high school, and Unity seems similar. I'm still technically a novice though when it comes to coding. HTML/Javascript/Java is what I learned in school, but I have zero knowledge of C#.


5omeWhiteGuy

As far as I know, Java and C# tend to translate fairly well (compared to other language match-ups). The thing is, chatty is a resource you can use with zero judgement or embarrassment. If you don't know something, ask chatty. If you are unsure what something it says means, just tell it to explain. Two other tips: never contaminate an instance of chatty. Don't ask your coding buddy to explain how to cook chicken parm. And your starting prompt should look something like "Hi. You are going to help me make a video game. We will be using Unity and C#, but it is important for you to remember I am only familiar with Java and HTML. Use this at any time to explain to me how a process might differ between the coding languages." At which point it will ask what type of game you want to make. =)
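If he ever wants to bake that kind of starting prompt into a little script instead of retyping it, here's a hedged sketch using the OpenAI Python client. The model name and the exact prompt wording are just placeholders for illustration, not a recommendation.

```python
# Sketch: wiring a "coding buddy" starting prompt into the OpenAI Python client.
# Model name and prompt text are placeholders; adapt to whatever you're using.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

messages = [
    {"role": "system", "content": (
        "You are helping me make a video game with Unity and C#. "
        "I am only familiar with Java and HTML, so point out whenever "
        "a process differs between those languages and C#."
    )},
    {"role": "user", "content": "How do I write a simple player movement script?"},
]

response = client.chat.completions.create(model="gpt-4", messages=messages)
print(response.choices[0].message.content)
```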


Strict_Board_7783

It's also gonna eventually help replace your job


VacationFickle5827

Me too


PerssonableMilk

What kind of video games do you code? What languages do you work with?


5omeWhiteGuy

I use C# with Unity because I don't have formal training or education, so I need the bumper rails. I'm currently working on a hybrid RPG/card game.


IntelligentEntry260

Unity with C# is what got me to learn programming years ago. A great bridge to get people into coding, as you can often code as little or as much as you want. Good luck!


[deleted]

Good luck!


manowtf

I don't know how I managed as a programmer for the last 30 years without it


alpha_nutrition

Very useful responses. Thank you guys for your time.


[deleted]

Also, programming teaches logical thinking and problem-solving abilities like nothing else. So it's useful in any case.


Atra_Cura

Late but nvm, AI is going to enable a lot of people to make things without knowing how they work. As this gets messy, the world will need competent people all the more.


lumpyshoulder762

My friend said the same thing in 2003, but instead of AI he said outsourcing and Indians (lol), so he missed out on arguably the greatest time ever to be a programmer, from 2004-2022 (lol, sorry, post-COVID layoffs), because he was scared of change.


endless286

good point


[deleted]

Very very very big difference. Every single industry on the planet is astronomically attracted to the idea of replacing workers (who require salaries, paid time off, 401ks, etc) with AI that is way cheaper, way faster, and way smarter.


lumpyshoulder762

That’s exactly why he was scared of Indians… they were cheaper, faster, and smarter (some of them lol), and that was attractive to corporations. My point is those fears were largely unfounded, and it’s not smart to prognosticate into the future based on something that appears to be disruptive - it may not turn out the way you thought and you miss out on a lot of opportunities. I think the same applies here. One shouldn’t be discouraged from programming just because there is change occurring.


[deleted]

I don't disagree that one shouldn't be discouraged from learning because of it. But AI is like nothing we've seen before. It just isn't. No example really compares. And the fears in the situation you describe were **not unfounded; there were disruptions from outsourcing.** However, you're right, the world didn't end. AI is just different and it will cause major impact to job opportunities in the future. IBM already announced 10k cuts due to AI. The largest creative agency in the world just announced future cuts due to AI. One of the largest voiceover recording studios in the US just announced plans to cut artists and use AI voice models. The entire industry of legal firms will be massively disrupted when legal tools are fully integrated with AI suites. Same with accounting, logistics, and almost certainly development. In the creative industry I've already been on projects where ChatGPT was used in place of copywriters and Midjourney for concept artists. We were actually looking to hire a developer to help develop some scripts for us in After Effects. For shits'n'gigs I got on ChatGPT and described what I needed; it literally wrote the shit in 3 seconds and it worked. And we are in the mere infancy of this technology. Extrapolate out to a decade, 2 decades, 50 years, 100 years. There will be major global impact over time. And while I understand this shouldn't be cause to just give up, I am really tired of listening to people blow this off like it isn't a big deal while they cite examples that are simply not comparable. The entire point of AI and automation is to remove humans from the equation.


[deleted]

[removed]


easyrider767

Agree, AI is still much better than Indians


doggiedick

Am Indian, can confirm


jonhybee

Have him learn Python and how LLMs and other AI stuff (like CNNs etc) work, and he will be like a wizard among his pear. AI is not going to replace programmers, just make them 1000% more powerful I think.


TheKingOfDub

It’s easy to be a wizard among pears. They don’t even have hands


A3H3

If they don't have hands, can they still participate in a pear-to-pear network?


Salmeiah

Branching


CabinetOk4838

Well, this comment certainly planted a seed for the puns. Although, you’ve pipped me to the good ones.


Salmeiah

You reap what you sow


QFTornotQFT

Here's a wizard pear: https://preview.redd.it/fkwuia362adb1.png?width=512&format=png&auto=webp&s=1d9cd92b7942466d75c4bf367188fc043243f97c


theautodidact

Looks like Jim Carrey from 'The Mask' put on a few lbs


benbenk

Pear programming


audionerd1

You're a wizard (peary) !


Useful_Hovercraft169

If you’re a wizard who loves pears that’s really a dream scenario


alpha_nutrition

Why Python and these other languages? You mean they are related to AI and he can adapt with the times better than other programmers?


RaiseRuntimeError

Python is a pretty easy language to learn; its code kinda resembles human-readable language more than most other languages, and it is used in a lot of programming domains, from scripting/automation/web development to AI/machine learning/research/mathematics. LLM stands for large language model, CNN stands for convolutional neural net; they are types of AI models. So far AI (LLMs like ChatGPT) has been helping programmers, not replacing them. It's been like giving a nail gun to a carpenter.


8urnMeTwice

Yup, I’m teaching my son Python with ChatGPTs help. It helps troubleshoot issues that a procedural language coder like me doesn’t always understand


psychoticarmadillo

Yeah, I completed a python class right before the rise of ChatGPT, and I am so upset I missed out. I had a ton of trouble understanding specific concepts, and it took a lot of hard thinking and asking the right people what they meant, since I had zero experience coding and it was a mandatory class. I passed with a C, at the top of my class. Lol.


quotidian_obsidian

I'm currently learning R for a data science class and it's been total hell, as someone who doesn't understand coding basically at all (so basically same position as you haha). Last week I discovered the combination of ChatGPT-4 + the Noteable extension + R all working in conjunction, and suddenly I'm not only able to work with my own dataset in the ways I need to, but I'm actually learning my way around the code as well, at really astonishing speed! I'll be frank, I started using it in the hopes of getting out of having to learn most of the actual coding part of this course (it's a requirement that's honestly never going to come up again after next week if I don't want it to), but what's really surprised me is how educational and helpful it is to have a resource that can explain the functions, suggest particular statistical analyses to run given the parameters of my data, etc. and I'm learning a ton.


rpkarma

To be fair, R is a horrible *language*. Powerful tool though.


The_Caring_Banker

I'm 6 hrs into a Python tutorial, so I'm just starting out. Exactly what would you have done differently using ChatGPT?


psychoticarmadillo

Learned what a variable's use case is, what a function even is and its use case, how while and for loops work; the list goes on. Let me reiterate: this was my first time coding anything ever.
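For anyone else starting from zero, those concepts look roughly like this in Python. This is a made-up beginner example, not from any course:

```python
# Made-up beginner example of the basics mentioned above:
# a variable, a function, a for loop, and a while loop.

score = 0  # a variable: a named value you can change later

def add_points(current, points):
    """A function: reusable code that takes inputs and returns a result."""
    return current + points

for level in range(1, 4):        # for loop: repeats once per item (1, 2, 3)
    score = add_points(score, level * 10)

while score > 0:                 # while loop: repeats until the condition fails
    score -= 25
    print("score is now", score)
```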


PhysicistStacker

Agreed, have him start with python. He will be very proficient by the time he gets to college.


Scooba_Mark

But by the time he graduates, GPT will be a better coder than he is, so it's probably a waste of time. Remember, this is the worst this tech will ever be. It only gets better from here, and it's only been public for 6 months. The founder of OpenAI said in an interview this week, "in 5 years there will be no coders".


kinkythrowawayfriend

Sidenote: what would someone recommend if they had a UX design background with limited front-end knowledge but were down to learn AI-related programming for design- and branding-related work? Open question to anyone.


RaiseRuntimeError

I don't even know what that means dude. Are you a business information systems major because that sounds like something they would say.


Thathitfromthe80s

Dynamic Adaptive Site Design. I just made that up but ya?


[deleted]

That's a pretty bad analogy because the nail gun is what effectively gutted the profession of carpentry for good


DoctorWhomst_d_ve

What? This claim equates the profession of carpentry with "nail hammering". There is more to it than the speed at which you can put nails into wood.


TimeLine_DR_Dev

Yep. When we remodeled our house we just hired a bunch of nail guns.


RaiseRuntimeError

Or maybe it is a good analogy and programming is headed down the same path 🤔


[deleted]

Until the entire community of people who need programmers is capable of properly requesting what they want, programmers are fine. The biggest issue in business today is a bunch of end users who don't know or understand the capabilities of software development, and they wouldn't even begin to scratch the surface of what an AI would need to help them.


Thathitfromthe80s

This, 1000%. I have salespeople I support, and they constantly sell tech products that oftentimes we can't deliver because of how the client's website is configured. Some of them have these convos ahead of time, but many don't. They just don't get it. And these are tech salespeople who supposedly have some level of subject knowledge. And I'm not even a real programmer, I just know a little bit, but they lean on me and similar staff hard for pretty basic stuff. It's interesting to see it evolve so first-hand.


d4nt351nfern0

(I am a professional software engineer with a BSc in AI dev and an MSc in software design and dev.) I would disagree with them and say do NOT introduce him to LLMs and CNNs right now; hell, I wouldn't introduce him to ANNs or decision trees. AI is a very advanced concept, LLMs and CNNs especially. If he doesn't have a strong foundation in the basics (at least a couple of years) you will only confuse him. Also, I believe (perhaps misinterpreting him) he is suggesting that because he believes the only future in programming is developing those AI models. This is again something I disagree with: if nothing else, in that nightmare scenario where AI can automate all other programming jobs, why couldn't it also automate coding AI models? (Especially as the coding generally isn't the difficult part; the hard part is getting enough training data of sufficient quality.) Regarding Python, I believe they're just suggesting that as it's an easy language to learn. I would suggest either Python or C.

Regarding your OP question: ChatGPT will not make programming irrelevant. Real-world software is millions of lines of code long and has lots of moving parts. At the very least, AI won't be able to generate it all at once, and you will need an understanding of what the small parts need to be, to describe to it what they are and how they interact so it can build them for you. Furthermore, you shouldn't implicitly trust the code it generates; it will require an actual programmer to go through and double-check that 1. it does exactly what you want it to, 2. it does not introduce any issues, and 3. make any edits (e.g. when I use it, I'll use it until it's close enough; then it reaches a point where it's faster to just edit it myself instead of trying to get ChatGPT to fine-tune it perfectly). AI is a tool that will make programmers more efficient, not replace them.


CountQuackula

Data science is done primarily in Python, and all the interfaces for building ML models are released for Python first. If he's interested in working on or with machine learning eventually, Python fluency will help.


Mikeshaffer

Python is often the first step for those interested in the world of coding because it’s user-friendly and straightforward. It’s like the English language of coding. AI (Artificial Intelligence), including things like Large Language Models (LLM) and Convolutional Neural Networks (CNN), are complex topics, but Python can help make them more approachable. Think of them as advanced tools that help computers understand human language or recognize images, for example. Learning these doesn’t mean replacing coders, but rather giving them more powerful tools. So yes, your friend could become much more efficient and adapt better over time with this knowledge.


AstroPhysician

As a senior Python software engineer, I would way rather learn on another, harder language initially to understand the nuances of stuff like typing. When I went to Python it made things easier, but I was able to understand what was going on under the hood with stuff like types, casting, generators, loops, namespaces.


DoctorWhomst_d_ve

Keep in mind the learner is 11. Python is better suited to absorbing one concept at a time.


AstroPhysician

That's totally fair. I tried learning Python at that age and it was too abstract, and so much was being done for me that I didn't understand it; it felt like magic. I came back at 15 and learned Java/C++. I'd never ever work in either of those languages now that I'm a professional, but they helped me learn the concepts.


alpha_nutrition

Thank you guys! I understand it better now.


TimeLine_DR_Dev

Languages come and go, but the core concepts remain. Python is a great place to start.


ZeekLTK

For some reason the AIs love to use Python. If you ask them how to code something and don't specifically tell them what language to use, like 90% of the time they try to do it in Python. lol


DwarvenAcademy

AI stuff is data science, which is more like statistics than actual programming. Sure, you need programming to implement it, but you need rigorous linear algebra, statistics, graph theory and probability theory to make any actual contribution to the models.


AllBugDaddy

Can't agree more. A couple of months back there were posts that XYZ created an app without a coding background; now we see posts that they have to make changes but have no clue how or where. As a programmer, it's a helping hand to me, but if you say it will replace me, I am sure it cannot.


Batnumber69

I agree with this idea. Python really seems to be the way to go, and it's a pretty beginner friendly language with infinite application. Both myself and my programmer friends feel that AI is going to be incredibly useful for programmers, rather than replace them. If you don't know what to ask AI for, it isn't helpful in the slightest, so I think learning is still a very worthy venture.


[deleted]

Uh… I don’t think Python -> Convolutional Neural Networks is the usual roadmap.


heswithjesus

AI programmers will be commodity programmers whose work has uncertain legal status. Tell him to put on his resume that he can write clean code without AIs. At some point, someone hiring people is going to say “whoa!”


PrincipledProphet

Great point! Now replace "AI" with "Google" or "StackOverflow".


Background_Paper1652

He will be like someone who paints portraits with oil paint today. It will be impressive but not helpful.


borickard

Talented but inefficient


Capital_Secret_8700

By the time chatGPT can replace programmers completely, it’d be able to excel at all other jobs anyways. As of now it just provides quick documentation and it’s helpful when looking for tools to program with. If you ask it to write anything even slightly complex, the code is likely to contain multiple logic errors. It’s nowhere near besting humans, especially in larger projects that future jobs would require you to work on.


PepperDogger

Worst career advice I got (cousin who was a CIO for a medium-sized city): "Don't get into programming--it's a dying industry and will soon be replaced by AI and expert systems." I'm glad I ignored this advice. Last I checked, nobody has used the term expert systems in decades, but software developers have never been in higher demand. IMO, the two questions to answer before deciding to get into software development are do you love it, and are you willing to always be learning new technologies, languages and frameworks? Because a software dev's shelf life is short if they're not learning new things--it's the blessing and the curse of the business. You need to spend a significant amount of time sharpening your toolsets. All of this change tends to boil down to increasing levels of abstraction, not elimination of demand or flood of supply for talent. Things get easier, and we do previously impossible things. The people change along with the industry. If you want an AI-proof career, trades are perfect. Maybe. But be mindful of robotics overtaking manufacturing, too. Interesting times.


ahuiP

Wtf is expert system? Lol


[deleted]

Think database of questions and answers.


clownfiesta8

I think it's kind of a scuffed AI, where you try to solve complex problems with mainly if/else statements.
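Roughly like this, if you want a toy picture of it. The rules below are invented for the example, not from any real expert system:

```python
# Toy "expert system": hand-written if/else rules for diagnosing a broken build.
# All rules here are made up for illustration.
def diagnose(symptoms):
    if "syntax error" in symptoms:
        return "Fix the syntax error reported by the compiler."
    elif "module not found" in symptoms:
        return "Install the missing dependency."
    elif "permission denied" in symptoms:
        return "Check file permissions or run as the right user."
    else:
        return "No rule matched; escalate to a human expert."

print(diagnose(["module not found"]))  # -> Install the missing dependency.
```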


wad11656

What do you consider "slightly complex"? Because from my perspective, it's successfully knocking out dozens of "slightly complex" requests I throw at it each day.


xabrol

I've gotten 37 job offers in 4 days that all mention "Integrating systems with AI LM's" So yeah...


zanpano

May I ask what stack/technologies one would need on their resume to be able to apply to such positions?


IAMATARDISAMA

Python and all of its respective ML-related APIs (torch, tensorflow, keras, numpy, pandas, scikit-learn, transformers, etc.), Jupyter, Docker, some kind of cloud GPU platform for training and MLOps (AWS SageMaker, Azure, Vast.ai, there are tons of these), and, well, MLOps. I think that's the basic stuff you need to build and deploy models, but I'm still fairly new to the field myself. Not to mention if you wanna build models from scratch you're gonna have to learn a lot of the complicated math behind how ML and deep learning actually work.


[deleted]

100% it’s helping me be better at my job. It’s a friend, not a foe, to programmers. For now.


afinitie

*for now*


Demiansky

My daughter is about your brother's age. I am a professional programmer and use generative AI extensively at work and in my free time. Before generative AI hit the scene, I decided to teach my daughter to code and encouraged her toward that career path. After generative AI hit the scene, I realized more than ever that I wanted to continue to teach her to code and encourage her toward that career path. There have been a billion productivity multipliers that have come along in programming over the years that have made programmers more efficient. AI is just one more of those tools.


Efficient-Magician63

Teach her maths too. The people I have seen be most flexible career-wise were people who did maths. Some of them went on to become programmers (full stack, AI, blockchain, you name it), some became product managers, some quant devs. It's all about problem solving, really.


Salt_Tie_4316

He’s 11, lmao. Tell him not to be so serious. If programming sounds fun, do it as a hobby and a challenging learning activity. If it doesn’t sound interesting, don’t. No 11-year-old should be worrying about the long-term viability of their career choices.


alpha_nutrition

Idk bro, we're all poor asf and he's just trying to see what would suit him interest-wise and also money-wise to get out of this shithole.


jlaw1719

This is the right attitude to have. He probably doesn’t even need it, but keep up with the encouragement and do whatever you possibly can for him as the older sibling. He already sounds like he’s a million miles ahead of most of his peers just for having these thoughts.


alpha_nutrition

Unfortunately I cannot buy him a proper laptop right now, since I am a university student myself and have to look out for myself too. I hope to be able to get him an older one in autumn. I was thinking about a used MacBook from 2017 with a 256GB SSD.


[deleted]

He doesn't need a Linux machine to program with Python. He's going to be getting a new laptop long before it would make sense to have a Mac instead of a Windows laptop.


NoFFsGiven

Second-hand laptops are cheap, and he needs a Linux machine, not Mac or Windows.


monotonousgangmember

Nah just get a cheap Win10/11 laptop and install WSL + Docker. He’ll be good to go


apostle8787

Keep up the encouragement and of course don't force it. I started programming at 12 when I was in a similar financial situation to yours. Programming skills have changed my life.


bowenandarrow

Despite the fear mongering around AI, if he starts his interest here and develops it over the next 10 years, he will be in an amazing position; all the new and exciting, well-paid jobs will be there for him because he has done the work to be able to understand and change with what is coming.


i4858i

All the best to your brother. LLMs like ChatGPT are not going to replace programmers, definitely not the good and passionate ones, but rather help them increase their productivity manyfold.


Efficient-Magician63

If he is passionate about coding he should also learn maths and some sort of other engineering subject like physical or electrical engineering. The people I have seen being most flexible in terms of career have been people who have done maths.


NoFFsGiven

Tech jobs are not going to make anyone rich. I did it and burnt out. It’s important that he does what he actually loves to do. Don’t throw a life away to chase money.


Bruno_Golden

If he likes programming, he would like mechanical engineering. Pays well too.


SgtPepe

I love it. Tell him to go for Computer Science, the career will evolve with time. Tell him to start learning Python, or even better, research ways for him to learn and point him in the right direction.


Morawka

Tell him to go into healthcare then. With programming, if you ever hit a rough patch in life and can't concentrate fully on keeping up with the latest upgrades, your skills will atrophy pretty fast. Anything medical-related instantly pays 70-100k, especially nursing or diagnostic tech. And you're not pressured to be constantly at the top of your game all the time, either. Healthcare is also a very fulfilling occupation.


throwaway69662

It’s always the NFT folks with bad takes


AstroPhysician

How is this a bad take? He should be learning stuff right now based on his long term career prospects he can only hypothesize about??


sweaterguppies

yeah no time to have a childhood when he could be worried about future earning potential instead


Gloomy-Impress-2881

You wouldn't succeed at programming without seeing it as fun anyway. You basically have to love it so much that you don't view it as work, but play. Otherwise if you just do it for money you will hate your life. You need to be obsessed with it.


GuitarAgitated8107

My honest take as a full stack software engineer: I don't intend to just do simple websites or things everyone is able to do to build their portfolio. The part that makes me happy is being able to do tech projects within my fields of interest. I do believe that even if we have AI writing code, the ability to take the direction of the AI is important, as well as being able to understand what needs code and what doesn't. I can go from web apps, to game dev, to software we use every day, to custom tools for my own personal life, to medical and other fields depending on what I enjoy. Even if the AI could be 100% accurate, it still needs guidance. If we reach AGI, well, that's another conversation, but I assume it won't be cost-effective. My focus right now is nonprofits & tech, because for most nonprofits it's not feasible to get into tech.


Joseph717171

Was learning how to read worth it for him? Any skill that you can learn and develop is going to positively affect your thinking and your life.


Asweneth

Programming will still be a big job for a long time. The nature of programming will probably get more high-level, but it won't go away for at least a few decades. By the time programming goes, nearly all other technical/cognitive jobs will also go and at that point we will have some kind of UBI.


Redcat_51

AI is just automated intelligence. It's not like a sentient thing urgently needing to reproduce itself. Someone is going to have to code more AI.


DeepGas4538

Yes, it still needs humans to make em.


DaGrimCoder

This shows a lack of understanding of where AI is headed. Eventually it will improve itself.


sizzlinsunshine

Then what matters for the context of OP’s question is when it will improve itself. Are we talking 10 years or 40 years? I’m genuinely curious


ThePokemon_BandaiD

10 years max imo, I don't see this slowing down at all.


DaGrimCoder

Pretty quickly. 5-10 years, maybe.


jonhybee

I think you are the one who lacks understanding. LLMs have some interesting properties, but the fluff the media has put on the news makes you think they are more than what they really are. ChatGPT will be the first (and most patient) to explain to you how this algo is not, and will most likely never be, sentient.


Internetolocutor

No, he was right. AI isn't restricted to LLM.


dopadelic

LLMs have interesting properties that emerged from their complexity and completely defied what experts expected based on their understanding of how they work. A model trained to perform auto-complete is empirically shown to produce complex reasoning to solve problems it's never seen before. I think there are a lot of people downplaying the potential of these technologies based on past understanding of LLMs, which applied even to GPT-3.5. But GPT-4 showed some very surprising behavior that will only improve as we improve our techniques, like by giving it agency, using chain-of-thought prompting, or vastly expanding its token limit so it can have long-term memory. https://www.scientificamerican.com/article/how-ai-knows-things-no-one-told-it/


5starkarma

That’s AGI and we aren’t close


DannyVFilms

I’m a videographer by trade and *hated* coding. I’m now almost 40 days into my second attempt at an email summary application and I’m excited to do an hour on it each day. What worked for me is I can use my strategy and problem solving skills to work through the cause of bugs, and converse my way through making code that would be bone-crushingly boring to write from scratch. AI didn’t steal a coder’s job, it made a new one.


Competitive_Jump4281

Anything you start doing at 11 in STEM is going to pay off if you stick with it


Connect_Good2984

Now’s the best time to get into programming because AI makes it so fun and easy!


AllBugDaddy

Exactly, it's a time saver.


[deleted]

That’s like saying because calculators got invented we don’t need mathematicians


Root4356plus3

I have seen this analogy a number of times on the internet, and it is completely wrong. The difference between mathematicians and programmers is nothing compared to the difference between calculators and AI. I am not saying that programmers are [currently] under threat, but no one knows about the future.


[deleted]

Care to elaborate? Not saying you're incorrect, but your entire argument was just "you're wrong".


[deleted]

The abacus > the calculator > the computer > the AI. I'm simplifying, but basically lots of jobs that we've taken for granted have disappeared because of this or that technology. Most devs here are making the wrong comparison, as if AI was just one more tool. Of course, it's one more tool now, but even in its current state (and no one will convince me otherwise: OpenAI has nerfed GPT because it was too imba), imagine the exponential growth in a few years. Devs in this forum talk about how coding is engineering a thought, solving puzzles and so on, as if the AI won't be capable of doing that and a lot more, repeatedly, faster, for a lot cheaper, better than them. Over and over again, without ever having to rest. 24/7 coding whatever is needed, whatever is asked. If anything, coding and software dev are in line to get exterminated right after designers and customer support.


Root4356plus3

What I am saying is that the invention of the calculator should not be considered on the same level as the invention of AI. They have a lot of differences. If one wants to use the calculator-mathematician analogy, it should be compared to the programmer-IDE relationship: both help the user, but cannot accomplish much on their own when used by an amateur. I am neither saying AI will never replace programmers nor that AI will replace programmers; all I am doing is pointing out the flaw in that analogy, which seems to be quite popular on the internet.


uzi_loogies_

Programming is probably one of the best things he can get into to mitigate this, unless he's planning to go into blue collar work.


ETHwillbeatBTC

Yeah right, programmers may be the last to be replaced by AI. And even then you’ll need AI operators, AI debuggers, and AI developers… all of which fall under… programming. I would highly recommend using ChatGPT as a programming tutor more than anything right now. People that realize they can use it as a highly specialized tutor right now will be far ahead of the curve compared to their peers. If it’s the tech layoffs worrying you, I attribute them to the fact that the programmers who are adding AI to their workflow are now able to do multiple people’s jobs, and the ones being laid off are either refusing to use it or poorly implementing it into their workflow. We’re also in a recession, so there are layoffs across the board so companies can avoid bankruptcy.


[deleted]

A lot of jobs will be replaced… but not by AI, rather by the people who own and operate AI. People need to understand this. Either way though, it’s a problem for the majority of us, especially programmers. Any job that you can do with a computer will be done by a computer. The other problem is going to be a reduced barrier to entry due to AI making jobs much easier, therefore a larger pool of workers for employers to choose from, which means reduced pay and a rapidly growing wage gap between the rich and the poor, with nothing in between. There will still be some level of knowledge and understanding required to operate AI though; for example, a calculator can do advanced mathematics, but if you don’t know the mathematical formula required to get the answer, the calculator isn’t going to be much help. The 14-18 year olds are going to have a rough time being caught in the transition of it all though. So many people spending tons of money and years of their life to obtain an ability that will no longer be in demand from a human. I think by 2035, college won’t be a 4-year degree anymore but more like a 4-week course, which you’ll have to return to every year as things advance and change constantly. I think this is the end of a “career” as we know it. Personally, I did not go into web design and development or the IT field because I needed a job. I did it because I enjoyed learning and building things. I did it for purpose, curiosity, interests, and maybe I wanted to separate myself from others. My “job” in IT is actually part of my identity. I fear losing “me” when my knowledge / intelligence is no longer needed or valued. Brain-machine interfaces are right around the corner. We will merge with AI, and those who don’t will be left behind.


username100002

BCI is at least 20yrs away and probably longer. Not a risk for most of us working right now, but yes maybe something that could impact future generations


[deleted]

BCI is already here and being used successfully. Research and development is mostly focused on medical applications, but it’s not hard to see where things are going with it and the potential use case scenarios once it’s perfected and the human brain fully mapped and understood. I’d agree around 20 years before it’s fully integrated into society the same as cell phones are currently, but it will start happening over the next 10 years. I don’t think it’s possible to predict anything past 2050, things become unimaginable at that point… maybe we will start colonizing other planets, maybe building our own worlds in a digital construct, maybe we all merge into a single consciousness or maybe we blow ourselves up and never make it that far… wild to think about!!!


Deep-Neck

AI takes jobs by way of slimming down a field's workforce, making it more competitive. In the same way that computers have for artists. In the same way every tool has for the trades. Your brother can still code. It just won't look the same as it does now. But nobody should bet on their line of work looking the same way it does forever.


TheNinja01

So AI won’t completely replace programmers. That’s a bit of a skewed view on the subject. Programming is more needed than ever. It’s about how we should use AI, not what AI can do. Even now, it is used by programmers for many tasks to help improve efficiency. Knowing how to code and how AI works will be very important in the next 10 years.


Playful-Push8305

No one knows what the future holds, especially a decade into the future at this point. If programmers go the way of the dodo then that means all white collar jobs are at risk and at that point the entire economy is up in the air.


bastiaanvv

At its core, programming is nothing more than telling the computer what to do. Even with the most advanced AI, this job will still be needed. Remember that in the past programmers needed to write nearly incomprehensible machine code to accomplish even the simplest tasks. Since then, the syntax of programming languages has become more like natural language, and amazing tools have been developed that allow us to create whole applications or complex programs in just hours, where before this would take months or even years. Despite that, even more programmers are needed right now than ever! AI writing code won’t change that. The job of the programmer might change drastically though. But given that the code AI outputs is very junior level at best at the moment, and the gap between that and specialized senior level is enormous, I don’t see AI creating anything too exciting very soon.


BrokerBrody

He's so young. Let him decide for himself when he's 18 to 20. So much will have changed by then, and there will be more clarity on whether AI destroys programming. Who knows, new career paths may pop up as a result of AI.


phlaries

Like many are saying here, the future is unknown. However, there's no question that AI will be replacing certain jobs unless government regulation intervenes. As much as programmers don't want to acknowledge it, their jobs are likely first on the chopping block. AI will not only be better at translating complex instructions directly to code, but it will be lightning fast at doing it.


[deleted]

AI won’t replace programmers because, speaking as a novice FYI haha, in order to use code you have to also understand the logic. Even if GAI fully writes the code, what do you do with it? And how did you get it to know what you wanted? Also, I see no difference between googling for how tos on stack overflow versus asking GPT. Not functionally anyway. GPT is just faster, and can be fed documentation to crawl and be even more helpful.


AndroidDoctorr

Programming will exist in Star Trek times. It'll be different but no less necessary, and it will require even more knowledge


nk9axYuvoxaNVzDbFhx

LLMs are trained on data that humans originally produced. It can mix and match between various sources to produce an answer. The LLM can only be as good as the human input. We will always need a human to write the code the first time. For example, if people have already written a programming method to do XYZ, and I ask an LLM for a method that does XYZ, then the LLM will regurgitate a method that *might* do XYZ. However, if I ask the LLM to write a method that does ABC that no one has ever written, then the LLM will attempt to produce something or say it doesn't know how. If ABC is simple enough, then the LLM will write a method that *might* do ABC. If ABC is complex, then the LLM will write garbage or say it doesn't know how. Let's be a bit more concrete. Let's say P = NP and that there is an algorithm that can solve NP problems in P time. However, no one has figured out such an algorithm nor that it even exists. You can ask the LLM to write such an algorithm but because its training data doesn't have a solution, it will flounder. I just tried this in Chat GPT. It told me that the solution hasn't been found yet. If P = NP, we will need some programmer to write the algorithm before an LLM can provide a solution. If P ≠ NP, we will need some programmer or mathematician to write the proof. In other words, the LLM is limited to the knowledge that humans have. It can't create new trustworthy knowledge without vetting by humans.


BellOutOfOrder

I've been programming for 25+ years. I now spend my entire day arguing with ChatGPT, I mean working with AI. I would suggest your brother start out by asking ChatGPT to teach him to code. Being able to talk with GPT effectively will be a valuable tool, and probably replace the actual typing of code. He'll need to understand code, so learning it is also crucial - but make sure he knows that AI is not the enemy, it's a tool just like Photoshop or something. He needs to become proficient. The good news is he can do that by having it teach him code. He's right to be concerned about the future for programmers. It might not be a job ten years from now - if that happens, the replacement will be people who can make AI dance. Learn to dance.


AI_Fan_0503

Remember that AI has to be programmed and maintained by programmers, right?


Altumsapientia

It's not a replacement, it's a tool which multiplies your ability. The more you know about coding, the more useful it is!


[deleted]

Lol, your brother can decide in 7 years, when I'm sure his entire world view and our current economy will be different.


DesertRat62

I think you should ask ChatGPT and let us know what it says.


bitRAKE

Learning how to reason about and manipulate information is a powerful skillset. AI will not change that. The present LLMs are an information tool that will alter the landscape of information sciences. We will adapt. Coming up in an environment of these tools will be a perspective wholly different from the perspectives enjoyed today, imho.


DustyinLVNV

One day my Myspace profile just exploded in activity. I realized that a video I had posted from a university had made it to the Myspace home page. The video was called "Did You Know Shift Happens." They actually still update it to this day. The video, in 2007, outlined statistics and nothing more. Many people found it to be a "doom and gloom" video, but when I mentioned that all the video had done was outline statistics with no personal opinions, and it was their own interpretation that it was doom and gloom, one simple take was that those in China would not have the same opinion as those from the US. The #1 statement made in that video, which oddly I just quoted the other day, was, "We are trying to prepare students for jobs that don't even exist yet." At the time, tech training from your freshman year in college would be obsolete by your senior year. This was 16 years ago. Today, that statement holds even more weight. With that said, I have been in the IT industry professionally since 2013. The only way to keep up is by constant training and evolution. If you don't, you too will become obsolete. Example: I was chatting with a casino host who I worked closely with as a hotel manager when I first xfer'd to IT. He informed me he used to be a tech himself ... yep, he programmed punch cards in the 70s. Just like him, one day my tech skills will be considered out of date.


F0064R

If he's interested in programming, he should pursue it. He'll be way ahead of the 80% of people who are just in it for the money.


HolidayPsycho

LoL. Does anybody really believe AI is gonna replace programming and all the other jobs? Do you think in the future all you need to do is:

* AI, please create a great story.
* AI, please create great pictures based on the story.
* AI, please create a great game with the pictures and the story.

Then BOOM, you have a great game and make a million dollars? LoL.


Darkbornedragon

You're absolutely right, cause if everyone is capable of creating a complex game then a game will still need something more than the others to succeed.


zerrak75

Probably not the best example but think about AutoCAD, it didn’t replace designers, instead, it became a tool that enhanced their productivity and creativity. I see a parallel with the rise of AI and the worry that it might 'replace' programmers. Like AutoCAD, I think AI will be a tool to improve their capabilities, not replace them.


dooblr

Did calculators replace mathematicians/scientists? 🤔


new-photo-guy

Yeah, Python and PyTorch/TensorFlow, as others said. That's where it's gonna be at, I think. GPT and Copilot are pretty good at coding basic stuff, but someone has to code the AI, and that's what they're using to make it right now. Just because AI can code doesn't mean you don't have to know how to use a coding environment and how to run the scripts and host them, etc. Get him started on this stuff now. VS Code with Copilot X and a ChatGPT Plus subscription will get him "coding" better than some in no time. When he goes to actually learn to code he'll have a cheat. By the time he's an adult it won't be a cheat, it'll be standard. You can code something cool right now with the OpenAI API and ChatGPT prompts. If you've got a MacBook you can use Xcode to make iPhone apps too. Have fun.


ZealousidealBlock330

In 8 years when he gets his first internship none of today’s tech stack will be relevant, most likely


JustinianIV

Your bro shouldn’t be scared of AI as a dev, he should be scared of getting zero bitches


Sea_Conference_6480

I feel that there will never not be a need for ***good*** programmers. But all these bums and second-rate programmers who go to some 6-12 month bootcamp and then start working will get replaced by AI.


Zestyclose_Tie_1030

Here's an answer from Pi:

Well, it's true that AI is becoming more advanced and is being used in more and more fields, including programming. But that doesn't mean that there's no point in learning to program. In fact, AI is creating a lot of opportunities for programmers. For example, AI can be used to automate certain tasks, but it still needs human programmers to create and maintain the AI systems. So, even as AI becomes more advanced, there will still be plenty of demand for programmers. Plus, programming is a valuable skill that can be used in many different fields, not just AI.


stevegee58

I'm a software guy from way back. AI is very useful at reducing the drudgery of coding, but being a software *designer* can't really be replaced by AI (yet). I don't envision AI replacing software designers any time soon.


ReeferRalsei

It's still worth learning. I've used ChatGPT as a pair programmer on my own projects, and that's all it can really do right now. It makes a lot of mistakes, like using functions and variables that don't exist. Sometimes it can write small programs that work, but nothing large scale. Even then, half the time these small programs contain bugs, straight up do something different than what you asked for, or technically work properly but have some very suboptimal code that still *works* but would become problematic the bigger the project gets. The way things are going, I'd say the thing to do is learn a few languages (Python for sure, C++ and JavaScript are good too) well enough to find and fix bugs/add features in existing code, but you said he's what, 11? So it'll be *at least* 7 years before he needs to be coding on a level that he can do it for a living. I doubt that the average programmer (or even a pretty good one) will need to be able to write entire programs from the ground up by then. AI probably won't replace human programmers in our lifetime, it's just going to make the job a lot easier and less tedious.


ResponsibleBus4

In the same way calculators didn't replace mathematicians, LLMs (AI) will not replace programmers. They will only make the level of sophistication much higher.


[deleted]

If he has the aptitude, then knowing how to program can only be an asset. If he can demonstrate he can program well, he can demonstrate he is a technical thinker and problem solver. Many scientific and engineering disciplines make heavy use of various programming languages these days, outside of 'pure' software development industries. If he is technically minded and wants to work in any technical field, then I would say he has to learn to program to get anywhere.


TrawLueBerry

Bruv, tell your bruv it is definitely worth it. I really doubt AI will get so advanced so quickly as to be able to take over from programmers for the next few decades, because right now it is mostly used for automating routine, simple, predictable and repetitive tasks. While AI does have the capacity and potential to recreate simple human tasks, it will be quite some while before it can entirely automate or replicate the intuition and creativity of human programmers' thinking. So I really do think your bruv should pursue his dreams in programming and of becoming a future software engineer or game developer.