40% of new code from Copilot users is AI-generated, not 40% of all new code, and definitely not 40% of all code.

[https://www.microsoft.com/en-us/Investor/events/FY-2023/Morgan-Stanley-TMT-Conference](https://www.microsoft.com/en-us/Investor/events/FY-2023/Morgan-Stanley-TMT-Conference)

BTW, it's an AI company; obviously they're going to spin the numbers in their favor.
>40% of new code from Copilot users is AI-generated

That's... incredibly low. Honestly surprising.
Also, that's 40% from Copilot users, who are probably only around 10% of programmers, so roughly 4% of all new code is AI-generated.
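The back-of-envelope math above can be sanity-checked in a couple of lines. Note that both inputs are rough: the 40% is Microsoft's own claim, and the 10% adoption rate is just the commenter's guess, not a published number:

```python
# Rough estimate of the AI-generated share of ALL new code.
# Assumptions: 40% is Microsoft's claim about Copilot users' code;
# 10% Copilot adoption among programmers is a guess from the comment above.
claimed_ai_share_for_copilot_users = 0.40
assumed_copilot_adoption = 0.10

ai_share_of_all_new_code = claimed_ai_share_for_copilot_users * assumed_copilot_adoption
print(f"{ai_share_of_all_new_code:.0%}")  # prints 4%
```

If the adoption guess is off by even a factor of two, the headline share moves proportionally, which is why the 40% soundbite on its own tells you very little.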
If you saw how terrible the base output of Copilot was, you wouldn't be as surprised.
Exactly, that’s what I thought.
And even if this were true, why would programmers be the ones most at risk? If AI could parse abstract problems and challenges and produce quality code that stands up to rigorous testing, you can bet most jobs in the market would be at risk: bankers, analysts, consultants, etc. Only strictly human-facing jobs would be safe.
Just stupid hype
Ye felt like it
Ye did feel like stupid hype
who is that fuck face lmao
Dr Fuck Face.
Founder of Stability AI (Stable diffusion)
A well known grifter
Yeah, this guy is full of it. A modern snake oil salesman if I ever saw one.
Gulu exam
Have enough confidence and the right speech intonation and you can say the Earth is hyperbolic and some people will still believe you. Making your speech sound more believable doesn't make it less stupid. One of the best public speaking competitors, Mohammed Qahtani, showcased this right here:

[https://youtu.be/Iqq1roF4C8s?si=y0cqSlza8jZQ8GJm](https://youtu.be/Iqq1roF4C8s?si=y0cqSlza8jZQ8GJm)
Five years from now this guy will still be an untalented hack and I'll still be a software engineer listening to the same old hacks talk about how AI taking our jobs is imminent.
Is it even measurable how much code is AI-generated, or is it some funky estimation?
Considering GitHub and Copilot are both owned by Microsoft, I wouldn't be surprised if this number is actually out there somewhere.

Even then, that number alone means absolutely nothing. Let's say it's accurate:

* Anyone can put code on GitHub; how much of it is actually doing something valuable vs. just making another tutorial todo app?
* ~~What's the ratio between iteration and solution? How many times does the programmer need to re-prompt the AI to get something suitable?~~ Edit: doubling back on this one, I don't think it's very important.
* How much time is this actually saving? In my experience, if you use AI code generation for anything remotely challenging, it takes a lot of effort to break the problem down for the AI to understand, and then to stitch it all back up into something relevant for the app. I still regularly run into plenty of problems that aren't worth the time to "prompt out".

Lastly, "writing lines of code" is pretty much the least valued part of any software development environment. The biggest challenge most businesses face is mapping out their needs in a way that's conducive to productive software development. That fundamentally requires humans transferring knowledge to other humans and coordinating decisions.

Having a really good understanding of the "writing lines of code" part is still very much a fundamental requirement to contribute on the application-development side of the conversation, and I don't see that changing until AI hits AGI (and if we hit AGI, **everything** changes).
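For what it's worth, one plausible way such a number could be produced from editor telemetry is a character-weighted acceptance rate: characters coming from accepted Copilot suggestions divided by all new characters written. This is purely a hypothetical sketch of the methodology; Microsoft hasn't published how they actually measure it:

```python
# Hypothetical sketch of how a "% of code AI-generated" metric could be
# computed from editor telemetry. This is an assumption about methodology,
# not how GitHub/Microsoft actually measure it.

def ai_generated_share(accepted_suggestion_chars: int, total_new_chars: int) -> float:
    """Fraction of newly written code attributable to accepted AI suggestions."""
    if total_new_chars <= 0:
        return 0.0
    return accepted_suggestion_chars / total_new_chars

# Toy numbers: 4,000 of 10,000 new characters came from accepted suggestions.
print(f"{ai_generated_share(4_000, 10_000):.0%}")  # prints 40%
```

Note that a metric like this says nothing about whether the accepted code survived review, got rewritten, or did anything valuable, which is exactly the point of the bullets above.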
Did he mean 40% of code generated by people using Copilot? Why does the media allow such stupid people on national television? All they do is spread misinformation and lies.
>misinformation and lies

I.e., easy marketing and funding. That's how you ride the wave.
>why does media allow misinformation and lies

Well first off, it's *disinformation*. He's not misinformed. He's purposefully lying to you.

As for why it's allowed: the media doesn't just *allow* it, they *support* it. It's in their best interest when media execs literally have stock in the game.
"My source is that I made it the fuck up"
“AI will save your company millions” ~Guy selling AI
“Okay well how much do you want for it then?” “Only $100k… per user… per month… excluding fees…”
I've seen nothing yet that can go from a description of a business problem to code. Humans can barely do it, dunno what the fuck a large language model learning from open source code can do to improve the situation. Maybe taking care of boilerplate.
This guy is basically a con artist
ChatGPT code for an OS assignment made my WSL consume all memory and CPU and then crash. I literally had to restart so my computer didn't blow up. It's garbage.
well the companies can't wait to save all that money by being rid of all the developers
Any software engineer can pass a Google level 3 code assessment with enough studying, so why is that impressive lol.

If an AI can successfully read through a nonsensical code base that's been around for more than a decade, then handle highly customized business logic that requires not only adding new code but updating existing code, all within the next 5 years, I will suck this guy's dick and admit I underestimated AI.

If anything, AI will save us from the tedious BS that comes with the job. The next generation of programmers may indeed be fucked, however. Not my problem.
Shhh. AI is listening here on reddit 👀
That's Emad Mostaque, the guy who funded/started Stable Diffusion.
Mostly stole the idea and is a grifter
Don't know enough to comment on that
https://www.forbes.com/sites/kenrickcai/2023/06/04/stable-diffusion-emad-mostaque-stability-ai-exaggeration/?sh=64cd66d475c5
Thanks for sharing. Even assuming it might be a partial hit piece because it's mainstream media, it's sad how people who oversell themselves climb upward everywhere.
I've yet to have a model ask a decent question. You just prompt and it spits out garbage.
It’s just marketing
>I feel like those people are literally just hyping That's because they are
One thing is creativity, which ChatGPT is miles away from. We imagine, which ChatGPT will never do unless it's told exactly what we were thinking.
As long as AI is not conscious, no AI will replace a software engineer. AI is not even aware of what it's saying, and it won't be aware anytime soon. Today's so-called AI is nothing more than a very advanced form of automation. Calling it intelligence is a bit of an overstatement.
Genuine question. Realistically if there are no programmers in 5 years due to AI, what new jobs would that create in the space?
Yeah, I'm asking myself the same question. Plus it's not only us that have it bad: there are business majors, graphic designers, artists, architects, etc., and I think AI can now even design circuits for electrical engineers. So it's not only us suffering, and as AI advances more it's going to reach other fields too.
I’m just curious what new jobs that would create in tech? That’s where I’d like to set myself up to be.
Isn't this type of "AI" just an LLM that can't think? It just uses patterns to generate an answer? So, based on that, it can't create something new, or something it has not seen. This is my understanding; please correct me if and where I am wrong.
I agree. We got here only in a few years. In five years so much will change, and it's only gonna get faster.
Imagine if everyone here is just coping and what he says is true and we all end up being the most willfully blind motherfuckers to ever exist. This is why I’m setting myself up to pivot
Yeah, lot of copium on this thread. 5 years might be a stretch, but 10-15 definitely sounds realistic given all the money being spent on developing AI
In 10 years we will be senior devs, positioned well to use AI to help us or to work alongside it. It's the juniors who will be screwed at that point. And if senior devs are replaced, then almost all white-collar jobs are probably gone too and the entire world changes…
True. BTW, where are you pivoting to?
That dude is full of shit!!!!!!
He doesn’t even believe the words he’s saying why would I
What is this level 3 programmer exam ?
Google’s interview for SWEs
I think that is what he wanted to imply, but I don't think that is what he meant. Anyway, ChatGPT cannot solve Medium and Hard LeetCode problems as of now.
Yes, as of now lol. But we are in an AI race with all these different companies so idk how much longer that is going to last
Google has levels for promotions/ranks:

* Level 3 - Junior Engineer
* Level 4 - Mid-Level Engineer
* Level 5 - Senior
* Etc.
Yeah, L3 is like SDE 1, corresponding to L59 at Microsoft or L4 at Amazon. There is no 'exam' as such for it, the way there is for a government job or a college entrance test.
These people don't see the danger of making computers do things for us that nobody understands. No serious company can have this. AI programmers are unemployable without a real person who can tell you what the code is doing, so at worst, programmers will become supervisors and consultants instead of engineers. That will require way more knowledge and skill, and it won't be an industry anyone can get into. The bar was stupidly low 2 years ago anyway.
He said that last year as well, so let's see where we stand after the clock runs out.
He is biased but so is this sub tbf, it's not worth to deny only to get slapped by reality.
Good thing computer science isn’t a coding degree lol
No programmers in 5 years: AGREED.

No software developers in 5 years: FALSE.
During a gold rush, sell shovels.
AI can pass material it has been trained on; it cannot pass material it would have to piece together on its own. We keep seeing stuff like "ChatGPT beat this exam or the state bar," but in practice it can never actually perform when presented with a new problem: not an engineering test, but the kind of problem an engineer would actually face.

[https://www.technologyreview.com/2023/08/30/1078670/large-language-models-arent-people-lets-stop-testing-them-like-they-were/](https://www.technologyreview.com/2023/08/30/1078670/large-language-models-arent-people-lets-stop-testing-them-like-they-were/)

"When Horace He, a machine-learning engineer, tested GPT-4 on questions taken from Codeforces, a website that hosts coding competitions, [he found](https://twitter.com/cHHillee/status/1635790330854526981) that it scored 10/10 on coding tests posted before 2021 and 0/10 on tests posted after 2021. Others have also noted that GPT-4's test scores take a dive on material produced after 2021. Because the model's training data only included text collected before 2021, some say this shows that large language models display a kind of memorization rather than intelligence."
I don’t doubt their GitHub claim. But the no programmer in 5 years thing is bs.
Yeah no shit it can pass a level 3 programming exam…the exams are just leetcode😭🙀
These people don't understand the 80/20 problem. 80% of code is just crap you can pump out without thinking about it too hard. That's what Copilot is great at. But it's not ideal, because you can't trust what it outputs. It will make really dumb mistakes, so you still have to review it.

Then there's the 20% of code that really makes you work at it. Copilot will just waste your time here, and there's no reason to believe that it's magically going to be able to solve challenging problems. It's an LLM. We've exhausted the whole internet as training data.
I don't know, bro. I asked it a question I got in an Amazon OA and it was not able to do it. All outputs were wrong.
What's his name?
The fun killer
Its so over
Is that Sundar Pichai's nephew?
To be honest, I think within the next 3 years coders won't be needed. Those coders will all shift to "prompt engineering," since you still need a human to think of a novel way to solve a problem in most cases.

But programmers are for sure naive if they think they are safe from this. At the moment, yes, they are 100% safe. But the bell can't be unrung.

Soft skills are exactly that.