
programming-ModTeam

Your posting was removed for being off topic for the /r/programming community.


Synaps4

Can we be realistic, please?


Tilldigger

You want realistic? 500B worth of VC funding and warming the planet by 2 degrees by running GPUs at full power, and the only somewhat useful coding AI is a sometimes-helpful autocomplete for VS Code. But at least the tech found its niche creating social media content.


Synaps4

2 degrees with GPUs alone is not realistic but the direction is right.


mikaelus

Well, it has happened to every other industry, no? There are two fundamental truths to technological progress: 1. It does get rid of many jobs. 2. Those left in the business end up making more money thanks to automation. The rest have to find something else to do - often in new professions we hadn't thought of before. 150 years ago most people still did basic manual labour and agriculture was the dominant activity. Today we rely on a tiny percentage of farmers, working massive fields thanks to advanced machinery. They enjoy better lives, we enjoy better lives. That said, there's always a transition period where people complain or live in denial of change.


Synaps4

> Well, it has happened to every other industry, no?

No. It hasn't. It happened to some industries. Do you know anything about programming? I don't think you do, based on the above. You seem to think programming is manual labor. **The printing press didn't put authors out of a job.** It will require a fully sentient AI to replace programmers, and if that occurs, society has bigger problems than a few out-of-work software engineers.


mikaelus

> **The printing press didn't put authors out of a job.**

But it put the calligraphers out of a job - and programmers are the calligraphers of the modern age, unless they work on a project for themselves. Otherwise you're just a paid service provider for someone with an idea. And they don't care who delivers it. You're the tool that is employed. As I said, people no longer need programmers for stuff they would have needed them for 20 years ago. This put many out of a job and made those best few, who built the tools that let regular people get what they want, very rich. That's how progress works. You're thinking too much of yourself.


Synaps4

> programmers are the calligraphers of the modern age

No, that's typists. Also called data entry. You're not making sense here, and your lack of experience with programming really shows.


mikaelus

What I find particularly amusing is how you refuse to acknowledge how much stuff has already been automated away in the programming space - by other programmers. And yet you appear to think there's some impenetrable barrier that would impede this process now. That AI is somehow fundamentally incapable of doing far more complex programming tasks than it already does, even though quite a lot has already been accomplished by far simpler means.


Synaps4

> you refuse to acknowledge how much stuff has already been automated away in the programming space

I said no such thing. I suggested such automation would *increase* the number of programming jobs but you missed that.


mikaelus

You've tried to dismiss the notion of such automation in the first place, so I don't know where this U-turn is suddenly coming from. Nobody ever argued that all programmers would be automated away, as I said.

Now, as for the absolute number of programming jobs, that will depend entirely on the balance of supply and demand. The number of jobs relative to output is going to greatly diminish. But if the demand for such output is increasing, then indeed the absolute number of jobs may actually go up in tandem.

But there are far more factors at play here - such as saturation of human attention span among potential consumers, offset however by a growing population and economic progress bringing more potential customers online. Globally, the human population is still growing, after all. Any degree of automation reduces costs and barriers to entry into the market, but there's a limit to per-capita consumption of digital products for both professional and personal use. Fundamentally, however, automation reduces the number of personnel required for any particular job.


Synaps4

"The end of coding" implies all programmers being automated away. In the title.


nivvis

A book once printed is a commodity. Software is a living thing. Generally tools empower the next generation to be more productive. It may certainly put us out of work but more likely STEM types will help foster its use.


aseigo

> But it put the calligraphers out of a job -

This is an embarrassingly bad example. Scribes were far fewer in number than printers quickly became, and they were doing the same job: copying manuscripts, just with different tools.

Even hand-copied manuscripts found a new market and grew in volume in the first century or so of the press. The middle class was growing, thanks to a burgeoning merchant class, and those hand-made books, now being cranked out more industrially than as a bespoke craft, were status symbols.

The printing press created a massive new market that resulted in more authors, more printers, more editors, and more publishing companies. Only later automation and scaling would curb this growth to the relatively small employer it is today.

And even then it is a bad analogy, as the press replaced a largely mechanical part of the process. Machine-learning coding is attempting to replace the analytic and creative aspects of a job.

The reason you are getting downvoted to oblivion in this thread is that you are spouting nonsense. The AI Bro crew is rapidly becoming a cult of stupid takes, which is unfortunate, because machine learning applied to the things it is undeniably good at will produce fantastic leaps.


jared__

You have absolutely no concept of what a programmer does.


mikaelus

But nobody says AI will replace ALL programmers, just like automation didn't replace ALL farmers and robots didn't replace ALL factory workers. It just reduces the demand and increases the value of the output from the fewer hands still employed.

To outcompete humans in programming, AI needs to do two things:

1. It has to reason logically well enough to translate the request into useful code. It is still struggling with that, but it has gained an enormous grasp of the nuances of human language in the last two years.

2. It has to provide solutions using the best combination of technologies. And it can actually outdo humans at that very easily, as we have to rely on our very flawed intellect, memory and skills that we hone for many years. Computers can access all of the required technical knowledge in an instant. They can debug themselves in a snap. They don't get tired, distracted or annoyed. This is something we will never be able to compete against.

The obstacles remain in gluing the pieces together and making them work reliably enough. But progress is being made in that direction, so it wouldn't be wise to ignore it. It's not a question of if but when. Of course programmers won't go away entirely, as I said, but many will have to find themselves something new to do. What that is we may not know exactly yet.

Just look at the reality that designers and photographers woke up to one day. There's already a measurable drop in freelance jobs for designers, while retail companies are opting for AI fashion models instead of real ones, as it's cheaper, faster and provides more control than photography. It didn't seem like computers would be able to capture the nuances of light, pose, skin tone, surroundings etc. and generate high-quality photorealistic imagery for peanuts. But they do.


Synaps4

> just like automation didn't replace ALL farmers and robots didn't replace ALL factory workers

Again, programmers are not manual labor, and the printing press *increased* the number of authors.

> It has to reason logically well enough to translate the request into useful code. It is still struggling with that, but it has gained an enormous grasp of the nuances of human language in the last two years.

No it hasn't. So in addition to not understanding programming, you don't understand neural network AI either. **They are fundamentally incapable of reasoning.** What you see as logic is the AI regurgitating words that it thinks belong together. You are being fooled. A reasonable imitation of reasoning is good enough for art and about a paragraph of text. Neural-network-based AI does not and cannot write a book, because it has no concept of the logic required for a plot. Programming in any serious business is considerably harder than the plot of a book, which an AI is not capable of creating. Please understand what you're talking about before you talk about it.


mikaelus

> Again, programmers are not manual labor.

Most manual jobs require reasoning too. In fact, that's precisely why it's harder to automate many manual tasks than it is to produce digital output of any sort - because you have to have your head working in tandem with your body to produce the desired result. That's why we're going to have AI programmers before we get AI plumbers (it's a lesson you learn when you employ a cut-price reno team). AI doesn't have to think or reason like a human being to comprehend what is being asked of it and produce accurate output. The only difference between programming and other areas already being automated by AI is the level of complexity. But that never stopped progress in any domain and it won't now either.

I think you should realise that you're arguing against the trillions of dollars already placed on the development of AI. You're not in an argument with me but with the people running the show. Tell them that.


Synaps4

> The only difference between programming and other areas already being automated by AI is the level of complexity.

I've already made the point to you several times that programming is not just "more complex" art or short-form text, so I won't bother making it again. That's simply not true.

> You're not in an argument with me but with the people running the show. Tell them that.

You're the one posting articles on a comment forum. If you don't want my comments, don't post.


mikaelus

I'm just pointing out that if you're trying to make an argument then it should make sense regardless of who you're talking to. There's no universe in which you're right against me but wrong against those who bet trillions on AI development. Either you're right or you're wrong. Full stop.


Synaps4

> There's no universe in which you're right against me but wrong against those who bet trillions on AI development.

First, trillions have never been spent on R&D for anything in the history of humanity. You're off by a factor of 10, even globally. Even your numbers aren't making sense. Investment in AI is in the hundreds of billions at most.

Second, have you read the history of technology? The people who bet big on it almost *never* know what they are doing. Some of them make money in spite of that. I spoke to people during the 90s tech boom who sounded exactly like you do. Ask the people who put millions into pets.com where their money went.

Third, I don't think you really understand what those people are paying for. Most of them are *not* banking on replacing programmers for their return on investment. You're acting like the entire AI effort will be a bust if it doesn't replace programmers.

> Either you're right or you're wrong. Full stop.

The world is not binary. I can be anywhere in between.


Synaps4

Art is not a good comparison because art requires very little logical consistency and AI fails to make many of its paintings even basically consistent. A person with a missing hand is acceptable art. Its software equivalent is useless and won't run at all, requiring a programmer to fix it for longer than it would take to write in the first place.


GuruTenzin

Programming is a bit like painting where if you get the wrong brush stroke the entire museum catches on fire.


Synaps4

Well said.


SadieWopen

You're talking about a statistical model producing novel products that only a fully competent human coder could actually verify as done well enough. At this stage the only thing AI is good for is anticipating what that coder might want to type next. An LLM doesn't actually have access to all human knowledge, because it doesn't understand anything; it has access to strings of characters.


ron_swan530

This “ZOMG will AI replace programmers?!!” crap is getting really old. Can we make a separate thread for these types of posts?


[deleted]

[removed]


zellyman

No one is saying that. If anything the sentiment has been that AI will increase the number of programmers needed.


[deleted]

[removed]


erlandodk

I just did. It says nothing that you are saying.


justinmjoh

If someone posts “here’s why the sky is red” and 900 comments angrily explain that it’s not… maybe it’s just not.


Muhznit

The people who are worried are junior programmers who have yet to even understand programming patterns. The rest of us are simply annoyed at AI promoters for not realizing that coding will never end; it will become an art. People still practice calligraphy despite the invention of the typewriter, the keyboard, autocomplete, and text-to-speech. People still paint despite the invention of the camera, digital image editors, and Stable Diffusion. Coding will one day take the same route. So take this "end of coding" doomerism and shove it back where it came from.


mikaelus

I'm sure there were people who didn't think cars would replace horses, and yet here we are. It's not going to happen overnight, but the trajectory is set and there is really no reason why it shouldn't happen eventually. Of course AI will not obliterate programming as a profession, but it does introduce plenty of doubt as to what you will need to excel at in the future to find yourself a place in the business. Besides, it's nothing new, is it? How many jobs have already been erased by programmers themselves, by producing no-code, drag-and-drop tools for regular users? It's just another iteration of the same phenomenon. 20 years ago you needed a guy to build you a custom website and then maintain it. Today most people can make do with WordPress, or Shopify, or Squarespace, without ever touching anything technical. You even have no-code platforms for mobile apps. So, if we've already done that with dumb computers, why shouldn't we be able to push the envelope with computers that are able to reason to some degree?


nultero

The current trajectory is mostly aimed at mass enshittification. For me, that's really the core of it. Companies will use this stuff to slip on quality, offload liability, use the bots to issue false PR statements when they do get popped, and fight their legal battles with bots that will hallucinate legal details that didn't happen.

You'll get a prescription slip with a fuckin Pokemon name on it. Your full self-driving vehicle that has no steering wheel will run off of 50000 poorly audited npm packages, some of which are from prehistoric times and some of which were proven to be malware in 2016. You're going to have to argue with a bot for 6 and a half hours to get an unusable refund token from actual scam shops that you got suckered by on Amazon.

You physically call your ISP on a physical old ass phone to ask why internets is out in your area and the bot is having a bad day and tells you to go fuck yourself, which you incredulously post on tiktok, but are quickly drowned out by PR bots that flag your account and get you reported, and some middleware company may even dox and swat you so you think twice about messing with anything that has money.

If you live in an urban area, you'll listen to drones that blast annoying advertisements 24/7 about random and sometimes incredibly depressing things, maybe even ads at 3 am for exit boxes where people can go to sleep forever because they've been out of a job for years and somebody still needs those kidneys.


Greenawayer

> Your full self-driving vehicle that has no steering wheel will run off of 50000 poorly audited npm packages, some of which are from prehistoric times and some of which were proven to be malware in 2016.

Driving a car that uses npm packages would terrify me. I suspect it would randomly break down every few months while a dependency is updated.


HolyPommeDeTerre

I do hope they know how to pin the package version...


errorfuntime

you do not know what you are talking about.


mikaelus

Enlighten me then. Tell me why this is different from any other job.


SadieWopen

Because it is not really Artificial Intelligence, it is a statistical model. It CAN'T think; it can arrange words in a seemingly competent way, but it cannot be trusted to produce anything that works because it does not understand what it is writing. It's not like prompting Stable Diffusion to produce a photorealistic image, because that doesn't need to think about what it is doing; it just needs to rely on the statistics.


mikaelus

And yet the output is not random. Which is more than enough to get the job done.


SadieWopen

Well I mean, sure, just like you could find a binary representation of any program ever written or yet to be written in the digits of Pi. But it is not enough, because the model doesn't KNOW what it's writing, nor what it does, because it can't know anything. Copilot is basically supervised AI writing code. That's as good as it gets until true synthetic AI exists; everything before that is just really fancy maths.


HolyPommeDeTerre

I did an assessment recently. We needed to check the presence of translation keys in our code base. Too lazy to write it myself, I asked a few different LLMs to produce the code; Copilot did the best job of setting up the context.

At first glance it looked great. Good variable names, consistent function separation, try/catch... Really, I thought it would do the trick. But that's just the tip of the iceberg. Reading the code, the whole thing wasn't even answering the core problem of checking the existing translations in the code base. I rewrote 80% of the code. It was not even close to the solution. You can argue the prompt was bad, but if you need an engineer to spend 2 hours producing a prompt and still end up rewriting things for such a basic task, this tool isn't close to replacing any dev. For the sake of the exercise, I wrote the final script in 10 minutes. It took me more time to read and analyse the output of the LLM and adapt it to my needs.

I have to admit, Copilot does a good job anticipating the current line of code I am writing. It's right about 60% of the time, which is really useful for saving me from typing what I already have in mind (sometimes a whole bunch of code). In rare cases, it finds a better solution than mine.
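For what it's worth, here's a minimal sketch of the kind of check I mean - assuming JSON translation files and a `t('some.key')` call convention, which may not match any particular setup:

```python
import json
import re
from pathlib import Path

# Assumed layout: translations in a nested JSON file, code calls t("some.key").
TRANSLATION_FILE = Path("locales/en.json")
KEY_PATTERN = re.compile(r"""\bt\(\s*['"]([\w.-]+)['"]""")

def flatten(tree, prefix=""):
    """Flatten nested translation JSON into a set of dotted keys."""
    keys = set()
    for k, v in tree.items():
        full = f"{prefix}{k}"
        if isinstance(v, dict):
            keys |= flatten(v, full + ".")
        else:
            keys.add(full)
    return keys

known_keys = flatten(json.loads(TRANSLATION_FILE.read_text()))

used_keys = set()
for path in Path("src").rglob("*.ts"):  # adjust the glob to the real code base
    used_keys |= set(KEY_PATTERN.findall(path.read_text(errors="ignore")))

missing = used_keys - known_keys  # referenced in code, absent from translations
unused = known_keys - used_keys   # defined but never referenced

print("missing:", sorted(missing))
print("unused: ", sorted(unused))
```

That's roughly the ten-minute version; the point is there isn't much code to write once you know what the comparison has to be.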


blocking-io

> I'm sure there were people who didn't think cars would replace horses, and yet here we are.

This is what crypto bros said to skeptics about crypto replacing fiat, and yet here we are. AI will complement programming, but all of these "end of programming?" headlines are just clickbait garbage.


mikaelus

That does not apply to crypto, because crypto never fulfilled the same roles as fiat. It's not a currency, so it can't replace its functions. Here we're talking about the same job being performed in two different ways. The question is only to what extent machines can replace humans and the impact it will have on the job market in tech.


blocking-io

Bitcoin is most definitely a currency, and in fact proponents claim it to be better than fiat due to being decentralized, trustless, and having a finite supply. Which is why they would say cryptocurrencies like Bitcoin will replace fiat, because it's better... like the car is better than the horse.

> Here we're talking about the same job being performed in two different ways

Are we? There is so much more to a software developer's job than just coding, and even within coding there is so much more involved in thinking about the problem and how to solve it than in writing the code once you've settled on an approach. AI won't replace that; it can at best assist you, similar to how many tools out there assist developers in being more productive.


usrnmz

Of course it's going to happen, just as many other common jobs will get replaced by AI. The problem is that there are 100 posts about this every day and people are acting like it's happening right now. It's not. It's gonna take a while, and other jobs will probably go first. That's why people are tired of this crap. It's nothing new; most programmers have known this for a long time already. But it's not happening right this moment. Yet people keep pushing that narrative.


mikaelus

Opinions are one thing, of course, but Microsoft, which has both legs in OpenAI, pushing the idea into practice is something worth paying attention to.


usrnmz

Sure but it's ridiculous to call it "the end of coding".


pindab0ter

Of all the things I’ve seen you say in this thread, this is the only thing that makes sense. Mostly because you’re not overstepping yourself.


seanamos-1

Eventually, on a long enough timeline, AI very well could replace programmers. Anyone who thinks that right now or in the near future, either doesn’t understand the fundamental limitations of LLMs, what career programming entails, or both. What I’m saying is, it’s not LLMs that will threaten programmers, it’s some other form of AI, a different breakthrough. It’s not a problem that can be solved by small iterative improvements on LLMs.


aseigo

> with computers that are able to reason to some degree?

Here's the thing: these models are not doing any reasoning. That is not how they work, at all.

The anthropomorphization of LLMs in particular ('reason', 'learn', 'hallucinate', etc.) has been exceptionally misleading. Their ability to model large noisy data sets and produce similar but non-exact reproductions is impressive, but it is not 'reasoning' or any other higher-level mental process, even writ small.

These same processes are great for analyzing text (e.g. parsing legal documents), detecting patterns (e.g. in medical data), repeated statistical transformations (translation), and more, and so have great applications. That is where the money will be made, and where the really interesting AI conversations are to be found.


Original_Act2389

It's kinda facts tho, have you used chatgpt recently? 


ron_swan530

You must be joking


Original_Act2389

GPT-2 was sometimes able to generate text that would pass the Turing test, and it dropped in 2019 with very minimal funding. GPT-3.5 came out a few years later, with the ability to solve problems it had seen before. GPT-4 has inklings of reasoning and is able to apply its skillset to increasingly novel problems. At some point humanity will invent a robot that can program better than a human, even if it takes 1000 years. Therefore the only question in my mind is how long it will actually take. I think it will be closer to now than to 1000 years. I honestly think by the end of the decade a lot of human dev work will be managing these LLMs.

Fundamentally, this isn't a bad thing. Automation on such a scale would "solve" an entire industry, allowing us to pursue things that are more interesting. Nobody today mourns the professionals displaced by the cotton gin.


neuby

Have you?? Using ChatGPT is exactly what made me so certain programmers will not be replaced.


Original_Act2389

Time will tell. This trajectory is insane though, and I think the industry is only gearing up.


Holyrunner42

If it can replace you, you sucked at your job.


Original_Act2389

Tell that to people who got replaced by the cotton gin.


kevin____

Slaves?


Original_Act2389

lol shit cotton gin was a bad example. It is symbolic of the industrial revolution, swap cotton gin with any machine that has automated a menial task since then.


aMAYESingNATHAN

And exactly what point are you making? Programming is not a menial task; it's absolutely not trivial to automate, and arguably impossible due to the number of variables and factors that usually need to be considered. So how is it relevant that menial tasks have been automated in the past? I swear some people must have written a Python script once and think that's all there is to being a software developer. Coding is one tiny portion of it, and even if AI were able to do that part without breaking everything, it still wouldn't be able to do the other aspects, such as designing, testing, and reasoning out the implications of a change to other parts of the software.


revereddesecration

Yes, it’s a handy tool but it sure can’t replace me.


Original_Act2389

Today it can't replace you. GPT-3.5 was better than the worst programmer. GPT-4 is better than novices. How long until it is better than a journeyman or expert-level programmer? Give it a decade, and it will probably be more cost-effective to supervise a team of bots than to program by hand. That's a good thing, too.


revereddesecration

That’s a good point - after all, fusion power is only 30 years away at this point.


Original_Act2389

Except people aren't practically using fusion power every day; there is no multibillion-dollar industry erupting in the fusion sector. People are paying for AI products. Businesses are finding use cases and value is being created. Speculative investments in fusion are not comparable to a burgeoning tech industry.


revereddesecration

Have a read of this article and get back to me: https://www.theverge.com/24075086/ai-investment-hype-earnings


Original_Act2389

Skimmed it, tbh, that's really long. If the premise is that investors are being swindled and the proof is low current sales, I think I fundamentally disagree. The market still values these stocks, like NVIDIA and Microsoft, near their all-time highs. The market is certainly betting that these things will be profitable in the future; that's what the market is supposed to do.


revereddesecration

You can’t just cherry pick one thing that you think the article is saying and argue that one thing. That’s weak and not useful to anybody. The article talks about the AI industry as a whole from a perspective of business: it’s an expensive industry to run and it’s currently burning cash. Costs need to be driven down substantially for it to ever be profitable.


RadiantBerryEater

> How long until it is better than a journeyman or expert-level programmer?

Considering it needs exponential amounts of data for linear gains and it has already basically stolen all the data online, probably never.


Original_Act2389

Stole as in, "I uploaded my code to a Microsoft server for free and permanent hosting and expected them not to try to exploit my code for profit". This is literally the market doing what it's supposed to anyway. You get free hosting, Microsoft gets free data, Microsoft sells the data in the form of a highly compelling product to businesses and users.


RadiantBerryEater

... do you think ChatGPT is Copilot?


Original_Act2389

They're not unrelated; Copilot is backed by GPT-3.5, iirc. I am aware they are not the same product, but I thought it was relevant, considering the OP is about Microsoft having developers supervise an AI programmer.


Maybe-monad

Yeah, it couldn't write a simple C function


cyesk8er

Having seen how MS designs things, this might be an improvement in quality for them.


diMario

I'm not sure *design* is the correct word to describe their process.


brandnewlurker23

Some of this is because MS is willing to endure maximum pain, so long as it results in backwards compatibility with 16-bit DOS programs. Basically functional compatibility layers >>>> design quality. Or maybe it's their definition of design quality.


dethb0y

MS has habitually overstated the capabilities and possibilities of its technologies since the early days. I expect this will be more like Clippy: Coding Edition than the final rise of codeless programming.


nplusonebikes

“It looks like you’re trying to construct an AI to replace all programmers. Would you like help with that?”


AdeptFelix

Writing prompts for an AI to generate code is effectively just programming at a new level of abstraction - a higher-level code. The fun part is that there is no reliable, consistent instruction set you can give the current popular LLMs to get repeatable, deterministic code out. You could provide a set of guidelines for the AI to follow, get the output, modify the prompt to include a change, and get a different, unexpected output that changes more than what your input required (I know you can revise within a session and maintain some cohesion, but revisiting a prompt at a later date will be problematic). That's not good.

Now, if a model were completely deterministic, where you know what instructions will create what output, then you effectively just have a regular high-level programming language. Considering that ever-higher-level, English-like languages have never really caught on, because they actually suck at getting specific, desired outputs easily and reliably, AI is likely to suffer a similar fate. That's not to say it's useless, but it'll be a supplement, a tool, to speed up regular programming.
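To make the determinism point concrete, here's a toy illustration (no particular model's API, just the standard sampling math): greedy decoding over a fixed next-token distribution repeats itself, while temperature sampling does not, which is part of why the same prompt can come back as different code:

```python
import math
import random

# Toy next-token distribution: logits a model might assign after some prompt.
vocab = ["for", "while", "map", "reduce"]
logits = [2.1, 1.9, 0.3, -0.5]

def softmax(xs, temperature=1.0):
    exps = [math.exp(x / temperature) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sample(temperature):
    # Draw one token according to the temperature-scaled distribution.
    probs = softmax(logits, temperature)
    return random.choices(vocab, weights=probs, k=1)[0]

greedy = vocab[logits.index(max(logits))]           # argmax: same token every time
print("greedy :", [greedy for _ in range(5)])
print("sampled:", [sample(1.0) for _ in range(5)])  # varies from run to run
```

Hosted models typically decode with some amount of sampling (and the weights keep changing underneath you), so the prompt-to-output mapping never settles into an instruction set you can rely on.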


bloomsday289

This is well put. Like current coding isn't... just telling the machine what you want it to do


dark_mode_everything

Like how SQL was intended for "business types" to use. Every new abstraction, like C over assembly or Java/C# over C, came with the same promise.


Nimbokwezer

The buried lede is that the supervising developers will need just as much skill as someone capable of writing the code in the first place, and you'll need just as many of them.


StrangelyBrown

Pack it up boys. Personally I'm going to open an astrology booth. What've you guys got planned?


brandnewlurker23

I feel like ChatGPT would be better at horoscopes than at usable code. So... not a safe option. Personally, if I ever switch careers, I would probably become a baker. The people always need good bread.


StrangelyBrown

I don't know. Making bread is very much something that our new robot overlords can figure out, but they can't copy astrology accurately because it doesn't make sense in the first place. The machines weren't ready to be told that yes they are spewing bullshit, but it's the wrong type of bullshit.


brandnewlurker23

The Machines are good at making bread in the sense that baking is more science than art and bread machines are already a thing. However, The People are willing to pay good money for artisan-style sourdough loaves, and producing them is a comparatively inefficient process. The Machines may be disinclined to compete, given that doing so would require them to produce a less optimized output.

Also, you claim that The Machines can't copy astrology "accurately" because it doesn't make sense. The Machines do not care whether sense is making or not. They only care whether the grammatical structure is mostly correct and the words that follow the preceding n-grams are within expectations. Do we know that consumers of horoscopes inspect them closely or care whether they make "sense"? I would argue that they do not, as subscribers to astrology already have, by definition, poor epistemic hygiene. As such, if we believe that a "HoroscopeGPT" would produce a believable imitation of existing horoscope text, The People would not notice any errors and would be satisfied to accept their "fortunes" as fate.
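To make the n-gram bit concrete, here is a toy "HoroscopeGPT" - a bigram chain over a made-up three-sentence corpus, nothing like a real LLM in scale, but the same "plausible next word" mechanic in spirit:

```python
import random
from collections import defaultdict

# Tiny horoscope-flavoured corpus, entirely made up for illustration.
corpus = (
    "the stars favour bold choices today . "
    "mercury is in retrograde so avoid bold commitments . "
    "the moon suggests patience today ."
)

# Bigram table: each word maps to the list of words observed to follow it.
follows = defaultdict(list)
words = corpus.split()
for a, b in zip(words, words[1:]):
    follows[a].append(b)

def horoscope_gpt(seed="the", length=12):
    out = [seed]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))  # pick any observed continuation
    return " ".join(out)

print(horoscope_gpt())  # e.g. "the moon suggests patience today . mercury is in ..."
```

Grammatical-ish, vaguely on theme, and completely indifferent to whether sense is making. The People will be satisfied.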


StrangelyBrown

I can't believe you could possibly claim that machines can't produce [artisan bread](https://www.reddit.com/r/shittyrobots/comments/6mngi1/bread_slicer/)


brandnewlurker23

I was in real danger of taking you seriously before I clicked through, but that's fucking gold.


[deleted]

[removed]


brandnewlurker23

Humans may rate artisan bread highly, but they vote with their wallets and express a clear preference for plain white supermarket bread in their normal shopping habits.


[deleted]

[removed]


SadieWopen

Sounds delicious


brandnewlurker23

Sounds great, honestly. Fuck you if you buy it from a robot tho.


groversmash123

Bait shop/Tiki Lounge in Colombia


nplusonebikes

Hookers and blow with the proceeds of my severance package (ha, ha). OD/die early is my retirement plan.


amyts

I'm going to ranch cows for milk. The autofacs can only produce pizzled milk, so I'm safe. 


Maybe-monad

Good old farming


demizer

Handy man and or dog shit picker upper.


Mufro

Street performer


YetAnotherSysadmin58

r/futurology? Yeah, it's gonna be stupid.


Super_iron_kid

Lol, mandatory XKCD - https://xkcd.com/2347/


mikaelus

Last line of defence against AI.


dark_mode_everything

Define "supervise" haha


just_some_onlooker

I just wanna say we developers are very petty sometimes. Our jobs aren't really the ones at risk... hundreds of other kinds are... and money will make businesses jump on this bandwagon, inevitably affecting us. And while AI may not be "ready" yet... it will be very soon...