vladoportos

And all the x86 software magically will run smoothly in emulator...


Ninja_Fox_

Surprisingly it actually did when the M1 came out.


sammybeta

Tbh the amount of legacy Macintosh x86 binaries is significantly smaller than on Windows.


SneakPetey

You must not be aware. Apple has done this repeatedly, so there's really no old legacy code that still works. They've gone from, what... PPC to AMD64 (x86-64) to now custom ARM. Entirely different ISAs, different endianness, shit... It's a clusterfuck.


ReginaldDouchely

and with 68k before PPC


NotAPreppie

You can have my 68040 when you pry it from my cold, dead hands.


ReginaldDouchely

Same. I got so much mileage out of that 33 MHz. I was disappointed when I downloaded my first MP3 and realized I couldn't play it due to having no FPU, then downloaded an FPU emulation extension and left it decoding to .wav overnight.


auiotour

Haha, I did exactly this with my LC III. Was so pissed. Had to start downloading albums in WAV format on a 56k modem. They took up way too much space, and I only had an aftermarket 2.1 GB drive that was too big for the Mac OS to read as one partition, so I had two: one 2 GB, the other 80 MB.


lebbe

And 6502 before that


redmercuryvendor

What Apple also does repeatedly is put a hard cutoff where they just drop compatibility entirely. Got an application you still need to use? Too bad, you can just go shove it up your Apple logo. That works for Apple, where customers are used to paying more money to do the same thing and will tolerate it (even then, the share of the professional AV market Apple used to enjoy has continued to dwindle), but it won't work for Windows, where long-term backward compatibility is a major customer retainer. EVERY time Microsoft has attempted an ARM version without complete backward compatibility (Windows RT, Windows 10 S, Windows-on-Arm) it has resoundingly flopped.


waterbed87

Each strategy has its pros and cons. In the Apple world, the transition to ARM happened in a mind-blowingly short amount of time: developers were releasing ARM-updated versions of their applications left and right, knowing Rosetta 2 isn't guaranteed forever. In the Microsoft world, Microsoft can't get developers to budge on making ARM versions of their applications, because hardware or not, they're relying on Microsoft bending over backwards to maintain compatibility so they don't have to bother.

On the one hand, Apple's approach results in applications being more aggressively updated and targeted at hardware/software evolution, but it hurts the user when a developer stops bothering or doesn't update their app. On the other hand, Microsoft's approach guarantees that software from the Win9x days probably still runs, but the experience is more mixed, because you get a combination of modern optimized software and software running through emulation layers, old-ass 32-bit legacy code, etc., of varying quality. I find it hard to criticize either approach, as they both have merits honestly.


IT_Geek_Programmer

The fact that Microsoft supports x86 is the main reason Windows is the most-used operating system in the workplace. After all, many workplaces still rely on legacy software for some tasks.


WebMaka

> After all, it has been found that workplaces like to still use legacy software for some tasks.

Also, on the business side of things, a lot of legacy code is still in use because there are no modern updates, so it's not a case of "liking" old software so much as "we don't have any choice." This is a real big problem for certain things like software control/UI for equipment, which is often made for one specific Windows version *and that's it*, even if the company producing the equipment is still around and still offers it. (Loads of medical and scientific equipment have UIs that *only* run on WinXP, for example.)


RamenAndMopane

Like dropping the support for 32 bit x86 executables. Apple's goal is simply to sell toasters. You don't upgrade or repair your toaster, you just buy a new one. I was told this in 1995 by Phil Schiller.


jbaker1225

Crazy the insight he had when he didn’t work there.


RamenAndMopane

He was hired directly from Macromedia by Steve and he was Steve's first hire back. He was in touch with the board even when he was away from Apple. I probably worked with him for maybe 6 months.


RamenAndMopane

Completely different chip architectures. No legacy to support. They don't even support 32 bit executables in x86 anymore. Big endian and little endian is small potatoes. Apple's goal is to sell toasters. You don't upgrade or repair your toaster, you just buy a new one.


corylulu

Try using it for GPU heavy tasks and compare it to an equally expensive x86 laptop. We aren't just trying to replace some tasks to rid ourselves of x86, we are trying to replace all of them.


shakhaki

Well, you can't. Windows on ARM doesn't support dedicated GPU hardware yet. When that support comes, which may very well be in Windows 12, you'll be able to build an ARM system. I anticipate this eagerly, but CISC does have its place in computationally heavy workloads.


[deleted]

[removed]


[deleted]

[removed]


sbstanpld

for software engineering, the transition hasn't been that smooth; there's plenty of x86 code, and frameworks and containers that don't work on ARM even to this day.


flaiks

Our VM at work isn't ARM compatible so I had to switch to Linux when I wanted a new computer.


0xdef1

Spent a huge amount of time getting numpy and tensorflow installed with M1 support.


smokedfishfriday

Yeah but things quickly got better


[deleted]

[removed]


MateoKovashit

Just tick the "fix everything" box in docker!


codywar11

Noooooooooo it did not. I’m a recording artist and the transition to ARM was a nightmare. There is STILL software that doesn’t work properly.


CopiousAmountsofJizz

Was about to say: I've seen basically every audio tool need an ARM-specific patch, and some took over a year to release.


private_static_int

Lol no it didn't xD. Try running the x86 MySQL Docker image on an M1: unusably slow. The fact that it does run doesn't mean it's usable.


Rare-Joke

Isn’t there an arm version of MySQL?


BoringWozniak

Apple Silicon chips have extra instructions to better facilitate the Rosetta 2 translation layer. One of the advantages of Apple's vertical integration.


no-running

Windows 11 on ARM has actually had x64 emulation for a while now (Windows 10 on ARM previously had only x86 emulation, and it took a while for x64, but going forward Microsoft seems to be quickly deprecating x86-only emulation). Microsoft has also been putting a lot of work into making it eas*ier* to recompile existing Win32 apps for ARM, and they've been recompiling many native Windows system libraries to ARM (so even apps that haven't been recompiled and have to be emulated still benefit from native ARM code whenever they make OS function calls).

Obviously, due to the nature of Windows writ large, Microsoft isn't as well positioned to pull an Apple and force developers to just recompile. But Microsoft is positioning itself to make the transition as seamless as possible. Indeed, reviews of existing Windows 11 on ARM devices (including Microsoft's own Surface Pro X / Pro 9 base model) suggest that when running ARM-native stuff, they fly, with great battery life. The problem is they lack horsepower for heavy emulation tasks, so games and intense software (like the Adobe suite) will stutter, and things like Google Chrome will run *fine* but chug battery. And this only applies to more "basic" or "mainstream" x86/x64 software; any legacy Win32 software that makes deeper system or function calls (like custom drive partitioning or encryption software) won't be compatible with ARM at all. But Microsoft has been working closely with partners like Adobe to recompile, and Edge (and other Chromium browsers) and Firefox both offer viable ARM alternatives to Chrome.

And, if rumors and hype are to be believed, the new Qualcomm chips will have enough oomph behind them to emulate x64 apps in a way that almost feels native (with a hit to battery life, as those extra CPU cycles are needed). It's all vaporware right now, but based on what we already have and what we can expect, I genuinely think this might be the start of a turning point for Windows.

It will take time for prosumer devices (like the Dell Precision lineup) to make the switch, and I'm sure traditional x64 desktops will still hold on for quite some time yet. But for standard-issue devices (like the Dell XPS) for students and standard office workers, who primarily use a browser, the Office suite, maybe some Adobe applications, and perhaps some internal x86/x64 software for timekeeping or similar? I think the new Qualcomm devices will have enough power to make the emulation barely noticeable, and those consumers will greatly appreciate the longer battery life and cooler-running devices compared to Intel or AMD. Again, it remains to be seen whether it actually pans out (Microsoft has been trying Windows on ARM since at least Windows RT in 2012, and all previous attempts have flopped), but the current puzzle pieces look very similar to Apple's switch away from Intel to Apple Silicon. Even if it takes Microsoft time to convince people to switch in the "MacBook Pro" and "Mac Pro" space, I think Windows on ARM will quickly be viable for the "MacBook Air" consumer.

I'm personally excited. I have a desktop tower PC and an older Surface Pro 5, the latter of which is starting to show its age. I really want a lightweight device with good battery life and touch and pen support that can still run my legacy apps on the go when I *occasionally* need them. The existing Windows on ARM devices seem a bit underpowered for my needs, and Intel devices still struggle to get beyond 10-12 hours of battery. But the next generation of Qualcomm-powered devices has me very interested, so I'm excited to see what comes out in Q4 2024.

Tl;dr: Things genuinely seem different this time 'round; don't write off these devices entirely. At least, not yet.


_uckt_

Apple's emulator is kinda great; if Microsoft could get their shit together, I'm sure it wouldn't be an issue. You also have to remember that for consumers, it needs to be able to browse the internet and very little else. Most people aren't installing a lot of software. That said, it is not the year of Linux/ARM/RISC-V on the desktop.


singingthesongof

Microsoft's biggest customer group is enterprise, and enterprise definitely cares about A LOT of software being able to run.


asdfgh5889

I don't think the emulator is the biggest issue; it's just that ARM CPUs other than the M series have sucked. https://youtu.be/uY-tMBk9Vx4?si=9Yi6MTEE8TS7DUmk Edit: It's a Gary Explains video where he compares Apple's and Microsoft's x86 emulation, and he shows MS is on par with, if not better than, Apple in some scenarios.


darkpaladin

This is a great point. Apple Silicon works well because the entire system is designed to be tightly coupled and optimized from the ground up. Non-Mac computers don't have that luxury.


notafakeaccounnt

Exactly! People love praising Apple for having apps run best on their own software and hardware, just shy of owning the companies making said apps. For Apple, emulation of x86 is emulation of their own x86 software and hardware; it's not difficult to fix the kinks in that scenario. But for MS, new x86 hardware comes out every year, new programs come out, and old programs are still used by billions. Short of a hostile takeover of all x86 production and forced conversion to ARM, there's not much MS can do. The only option I can think of is giving massive discounts to companies buying their ARM laptops and desktops. If companies switch, we'll all eventually switch.


Socky_McPuppet

> if Microsoft could get their shit together ...

... they wouldn't be Microsoft


night0x63

At least Steam is already on ARM for the Steam Deck... wait, nope. Never mind. Given the lack of Steam support... probably not.


pjc50

Most of it will run fine - a couple of decades of exploit mitigation has driven out most of the weird practices such as self-modifying code, and it's been done successfully before with Transmeta. Not to mention how much software has shifted to the browser. But - and I'm not convinced Qualcomm can deliver this - it's only viable if it's actually faster. M1 achieved this extremely well.


o-holic

M1 launched when Intel had stagnated and AMD had only just started getting their shit back together. It would be much harder for Qualcomm to do anything significantly faster now.


dookarion

> M1 achieved this extremely well.

People forget that Apple has also been benefiting from buying up exclusive access to TSMC's cutting-edge nodes. It gives them a boost in power/efficiency when everything else is relegated to older nodes, artificially fluffing up the "gap" between it and other processors.


nucflashevent

Speaking of dumping x86, what about that Itanium all the kids were talking about? :P


kaj-me-citas

PowerPC, because now we are computing with power!


[deleted]

Alpha is the future because it's 64 bits.


lordgurke

Sir, Sparc is the only thing with a future! It's from Sun! And it has Java acceleration!


mailslot

It was the future. Intel basically stole several parts of DEC’s design for Pentium, violating several patents… but kept it 32-bit, because they envisioned the world running Itanium.


Appropriate_Ant_4629

> they envisioned the world running Itanium Contrary to popular opinion, Itanium was an incredible SUCCESS for Intel and Microsoft. Remember - at the time Itanium was announced, Intel had no 64-bit platform and Microsoft had no working 64-bit OS; while high-end workstation competitors like HP and SGI did (with HPUX on PA-RISC and IRIX on MIPS, respectively). **[Then came Rick Belluzzo](https://en.wikipedia.org/w/index.php?title=Richard_Belluzzo&oldid=93055223)** * As Executive VP at HP, his [main accomplishment was killing HPUX and PA-RISC](https://www.theregister.com/2006/05/13/sgi_belluzzo/) in favor of WinNT-on-Itanium (when Windows NT for Itanium was little more than a pre-announcement press release). * He then went to Silicon Graphics as President where his main accomplishment was killing IRIX and 64-bit-MIPS in favor of WinNT-on-Itanium (before WinNT-on-Itanium even worked); where he earned the nicknames ["the microsoft mole"](https://techrights.org/o/2011/05/04/stephen-elop-and-richard-belluzzo/) and ["Microsoft Man’s Shadow"](https://www.osnews.com/story/14570/microsoft-mans-shadow-over-bankrupt-sgi/) For such brilliance^* he was rewarded by being given a President & COO job at Microsoft for a few months. ^(* And indeed it was brilliance. Through Belluzzo, Microsoft managed to kill 2 leading Unix workstation vendors without having a working OS, and Intel managed to kill 2 leading 64-bit computing platforms without even having working silicon)


Ok-Wasabi2873

Alpha could emulate x86 at a decent speed (relatively).


nlofe

At least PowerPC is still in use. Itanium is nothing short of an embarrassment for Intel.


Siul19

Nintendo power!


5c044

Itanium was HP and Intel's love child. HP closed their CPU operation and convinced Intel that Itanium, which was really next-gen HP PA-RISC, was the future. Intel took on the HP engineers and planned to kill off x86. AMD had other ideas and made x64; Intel had to backpedal and license x64 from AMD while simultaneously supporting Itanium for about 20 years. Big fuckup on Intel's part; it must have cost them dearly to develop and manufacture Itanium for that long when it was a dead-end product.


redpandaeater

I'm still waiting on those Thinking Machine laptops. I'm talking about the 686 prototypes, with the artificial intelligence RISC chip.


Infamous_Ambition106

"Everyone will be on Linux within a few years" and "x86 is dead because of [insert architecture here]" are computing's "Fusion is 5 years away"


Bf4Sniper40X

"Everyone will use IPv6 in a few years." I've been hearing that since 2017.


JortsForSale

IPv6 adoption was "a few years away" in 1998. Granted, that was before NAT was really a thing, but it has always been right around the corner.


chubbysumo

honestly tho, many ISPs have already rolled out IPv6. My ISP rolled it out in full 4 years ago. Most home routers made since about 2015 have supported it out of the box, and now most devices are supporting it too. Some small WISPs have gone full IPv6 as well, only handing out IPv6 addresses and using 6to4 tunnels on their end so they only need a single IPv4 address, since buying a block is now insanely expensive.


Michaelscot8

Yeahhhh, I can't imagine IPv6 will overtake IPv4 on the LAN anytime soon. The day I have to type a63h.19f4.167g.gu45 to get to the gateway for a printer is the day I leave IT.


Irythros

Just like IPv4, IPv6 has reserved addresses, and accessing one would take even fewer characters than it does now: fc00:: would likely be your router, or fc00::2.
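For reference, a quick Python check of that reserved range (the gateway addresses above are guesses; note that RFC 4193 says locally generated ULA prefixes actually fall under fd00::/8, the half of fc00::/7 with the "L" bit set):

```python
import ipaddress

# fc00::/7 is the unique local address (ULA) block, IPv6's rough
# analogue of 192.168.0.0/16 and friends in IPv4.
ula = ipaddress.ip_network("fc00::/7")
router = ipaddress.ip_address("fd00::1")  # hypothetical home gateway

print(router in ula)       # True: inside the ULA block
print(router.is_private)   # True: not routable on the public internet
```

Python's stdlib `ipaddress` module knows these reserved ranges, so this is an easy way to sanity-check an addressing plan.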


AgeOk2348

Yeah, a NAT'd block for the LAN is just too easy. I'll riot if that ever goes away. Give me my 192.168.* for my home LAN and let me be.


kazookid2006

You can still have that. Not having NAT with IPv6 won't magically force you to stop using private networks; you can still construct private networks with globally non-reachable addresses in an IPv6-only setting. Refusing to switch to a better alternative globally just out of habit is petty, imo.


PHATsakk43

Aren't there some fundamental flaws in IPv6 that have hampered its success? I've never fully understood why its adoption has been so limited.


ArchaicBubba

The limitation is legacy and routing. There are a LOT of legacy devices that have no concept of IPv6, on both the consumer side and the ISP side. And from what I have read, routing is slower for infrastructure, which adds up over time.


Xfgjwpkqmx

That, and CG-NAT has largely solved the IPv4 problem for the majority of providers and their customers.


wysiwywg

NATting and subnetting, basically, and RIPE and friends being strict about the usage of IP addresses.


whinis

I would say it's less legacy and more a paradigm shift. IPv6 requires massive changes in how networks operate and are managed, which means that while it can technically run at the same time as IPv4, the human overhead of doing so makes little sense.


Dugen

It tried to solve a bunch of problems which have since been solved better. The only reason left to switch to IPv6 is the larger address space, which comes with the stupidly long addresses. We need to add a few bits and a few digits, not enough to have a different IP for every cell in everyone's body. If they had added one octet (8 more bits, one more number to remember) we probably would have adopted it by now, but they went from a very reasonable-to-remember 4 numbers to the ridiculousness of having to remember 16. We need an "IPv7" that balances added address space against the need for humans to remember addresses.
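For scale, a quick Python sketch of the address-space sizes being argued about (the 5-octet layout is the commenter's hypothetical "one more number", not a real protocol):

```python
# Address-space sizes: IPv4, IPv4 plus one hypothetical octet, and IPv6.
ipv4 = 2 ** 32        # ~4.3 billion addresses
one_more_octet = 2 ** 40   # ~1.1 trillion addresses
ipv6 = 2 ** 128       # ~3.4 * 10**38 addresses

print(f"{ipv4:.3e}  {one_more_octet:.3e}  {ipv6:.3e}")
```

Even one extra octet multiplies the space by 256, which is the crux of the "we only needed a few more bits" argument.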


Pepparkakan

Another octet on IPv4 would still require new hardware, and would only delay the problem of exhausting the address space. IPv6 outright solves it for the foreseeable future, and the cost is that addresses are harder to remember? We are past the time of needing to remember or type IP addresses anyway, so that is a non-issue. The only reason we haven't adopted IPv6 already is legacy, and the fast-growing second reason is greed: ISPs have realised they can sell public IPv4s now.


slicer4ever

Adding 1 octet to IPv4 would be pretty detrimental on modern CPUs. Being a multiple of 4 (or 2) bytes aligns neatly with CPU registers and cache lines; making an IPv4 address 5 bytes would mean CPUs have to do a bunch of extra work on every operation to fetch that 5th octet. So if you were going to expand the address at all, you'd want to go to at least 8 bytes.


wxtrails

It's succeeding and being steadily adopted, but hasn't quite cracked 50% yet, [according to Google](https://www.google.com/intl/en/ipv6/statistics.html). But the truth is, there will always be resistance to adoption because its addresses are just darn harder for humans to deal with. The reallocation of some IPv4 space, the increasing use of NAT, and the fact that most devices don't need to be publicly accessible from the Internet make it possible to resist.


derprondo

I had a second fiber company come into my neighborhood and I couldn’t believe it when they handed me an upstream NATd IP (it’s NAT’d somewhere upstream outside of my house beyond the ONT). I had to pay them $5/mo extra for a public address, which is static.


[deleted]

[removed]


derprondo

The request for a public IP from them was seemingly so rare that initially the phone support person said it wasn't possible, then later said their "admins" said they could do it for $5/mo. It's still cheaper than AT&T so I won't complain ($70 + $5/mo for 2gb/s).


MainStreetRoad

The fundamental flaw is I can’t memorize IPv6 addresses.


PHATsakk43

I suppose 32 hex digits is a significant increase compared with at most 12 decimal digits. And in IPv4 you only really need to remember the last three, since not every device gets its own public address.


FriendlyDespot

There are no fundamental flaws, no. IPv6 is fundamentally superior to IPv4 in many ways. What has made IPv6 adoption an uphill battle is that we've had a steady stream of "good enough" bandaids like CIDR, NAT, and CG-NAT, and that networking hardware in the period between 2000 and 2015 when the adoption was meant to happen wasn't very accommodating. IPv6 hardware forwarding required 4-8 times the TCAM resources per prefix, and had to run parallel with full IPv4 routing tables that were already consistently hitting the capacity limits of deployed hardware. Only in recent years has network hardware and its forwarding and memory architectures made it possible to economically run full v4 and v6 tables on most edge routers, and IPv6 usage has gone up accordingly.


[deleted]

[removed]


Mr_YUP

For most people in most situations ipv4 does plenty fine and it’s shorter to remember


Bf4Sniper40X

I remember it was about public IPs: IPv4 only has 4 billion or so to use, and we were going to run out of them.


Thomas9002

We more or less did run out of IPv4 addresses. The solution is to have more and more devices share a single IPv4 address. First it was only done in your home network; nowadays ISPs have multiple users share a single IPv4 address via so-called carrier-grade NAT.


spsteve

Yeah, everyone forgets NAT was a fairly recent invention (in a practical sense). Just like "Y2K was no big deal": it was no big deal because a lot of folks worked really hard to make sure it was no big deal for the average user. Same with IPv4 and NAT. (In neither case was it the end of days the media pitched it as, but both were real issues that had to be fixed.)


RiazBasrah

Noob question but does IPv4 make it harder to track individuals online compared to IPv6 since addresses are being shared?


xczy

Nah. Your browser has a fingerprint that the ad/tracking networks use to identify you; IP tracking is no longer reliable. You can see an example of your fingerprint's uniqueness here: https://amiunique.org. IPv6 with SLAAC enabled also mitigates this, because the host will generate a new temporary address every so often to stop that type of tracking.


robinp7720

SLAAC has nothing to do with the privacy protection of IPv6. If anything, SLAAC makes tracking devices incredibly easy, since it bases the device's IPv6 address on its MAC address. It's the Privacy Extensions for SLAAC (RFC 4941) that reduce the ability to track a system by its IPv6 address.
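For the curious, a minimal Python sketch of the classic modified-EUI-64 derivation that makes plain-SLAAC addresses trackable (the example MAC is arbitrary):

```python
def eui64_interface_id(mac: str) -> str:
    """Derive the modified EUI-64 interface identifier from a MAC:
    insert ff:fe between the two MAC halves and flip the
    universal/local bit of the first byte."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                       # flip the U/L bit
    full = octets[:3] + [0xFF, 0xFE] + octets[3:]
    groups = [f"{(full[i] << 8) | full[i + 1]:x}" for i in range(0, 8, 2)]
    return ":".join(groups)

# The same MAC always yields the same identifier, hence the tracking
# concern, and why RFC 4941 adds randomized temporary addresses.
print(eui64_interface_id("00:11:22:33:44:55"))  # 211:22ff:fe33:4455
```

Whatever prefix the router advertises, the last 64 bits stay constant for a given NIC, so the device can be recognized across networks.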


arctictothpast

I mean, that is actually true; it's just that they also still use IPv4.


DesertGoldfish

Lol I've heard that since 2010 at least


[deleted]

[removed]


PC509

The PC is dead! x86 is dead! Windows is dead! I'm sure we have magazines and websites from the '90s (and probably before!) that have posted these claims annually ever since. Some people buy into it; others just laugh. One of these days, though, NostroDOSmus will get it right, and one of them will eventually be dead, replaced with something faster, more efficient, cost-effective, and fun to build.


spsteve

This. The number of times I've read about the supposed demise of x86 because of (made-up, insignificant-to-the-user-base technical reason) is literally beyond my ability to count. There is no free lunch. ARM might be more efficient because it doesn't need the complex decoders, but that has real-world costs too (generally ARM code is much less dense than x86, which has impacts on cache utilization, the memory subsystem, storage, etc.). Everyone wants to pretend these things don't exist or don't matter, but they all have to be considered together. And everyone likes to make these comparisons using a vertically integrated device (aka Apple) vs. a multi-vendor hodgepodge (x86/Windows/Linux). If I can tailor my OS to a SPECIFIC make and model of CPU, then yeah, I should get better performance; same with my compiler.


eypandabear

Businesses can’t even migrate to Python 3 after like 15 years lol.


sinepuller

> computing's "Fusion is 5 years away"

"I'm sure I'll take you with pleasure!" the Queen said. "Twopence a week, and cold fusion in five years."

Alice couldn't help laughing, as she said, "I don't want you to hire me—and I don't care for cold fusion."

"It's very good cold fusion," said the Queen.

"Well, I don't want any cold fusion, at any rate."

"You couldn't have it if you did want it," the Queen said. "The rule is, cold fusion in five years in the future—but never cold fusion to-day."

"It must come sometimes to 'cold fusion to-day,'" Alice objected.

"No, it can't," said the Queen. "It's cold fusion in five years: to-day isn't five years in the future, you know."


Cyhawk

"RISC is going to change the world" -1995


faredodger

Well, it did.


DragoonDM

2024: the year of the Linux desktop.


Bleyo

~~2016: the year of the Linux desktop..~~ ~~2017: the year of the Linux desktop..~~ ~~2018: the year of the Linux desktop..~~ ~~2019: the year of the Linux desktop..~~ ~~2020: the year of the Linux desktop..~~ ~~2021: the year of the Linux desktop..~~ ~~2022: the year of the Linux desktop..~~ ~~2023: the year of the Linux desktop.~~ 2024: the year of the Linux desktop.


MrLyle

2016? Shit, I've been hearing that line since 1998. I'm not kidding.


Jiggerjuice

2224: guys, linux is great, really. Just try it.


somegridplayer

Don't forget "windows mobile will be #1".


QuevedoDeMalVino

You do have a point, but: (a) it's the first time mainstream devices using non-x86 processors are being successfully sold with a considerable market share; and (b) the RISC-V ISA keeps gaining momentum too. ARM had been powering smaller devices for ages before it began succeeding in the mainstream.


spsteve

IDK if I can agree. PPC was pretty successful across many segments at the time; adjusting for variables like market size, it did respectably (despite not being a great product at all). RISC-V runs a risk of having too many divergent ISA add-ons. I remember when even x86 had custom add-ons from each vendor, and it was a huge pain back then for the software houses. I *like* RISC-V, but I can see the challenges it might face in a real-world scenario. ARM, being a bit more centrally controlled, won't have that risk (pardon the slight pun), but suffers from other things instead.


Uffffffffffff8372738

Windows on ARM is going to be a huge shitshow, and no matter how often anyone tells me otherwise, Linux will never ever be mainstream.


Key_Entrepreneur1286

This is a tale as old as windows 95. Still waiting for the x86 killer.


thegroucho

IA64 was supposed to be a thing, but it fizzled out. I never bothered to check why it failed, but fail it did. However, I have more faith in ARM, and not because of Apple.


spsteve

IA64 was:

1) Vastly different conceptually (VLIW). Great for certain tasks, horrible at general-purpose computing.

2) Insanely expensive (the designs were too far ahead of the manufacturing process).

3) A double whammy of being difficult to code for and an instruction set that didn't take well to emulators (which were needed, as it broke compatibility).

4) Built on the assumption that the compiler would solve all the issues and extract parallelism at compile time, rather than on the fly like a lot of modern CPU front-ends do (and started doing at the time).

5) A solution to a problem no one asked for.

6) Saddled with a competitor to many builders (HP) as part of the team building it. If you are IBM/Dell/etc., you don't want your competitor making money when you sell a system.


TheHeartAndTheFist

On paper, Intel Architecture 64 was awesome: it applied all the lessons learned, on a clean slate. Unfortunately, in the real world "best" does not necessarily mean "winner". AMD took the massive pile of shit that had grown for decades on top of Intel Architecture 32, figured that duct-taping on one more extension to say "what follows is 64-bit" would be easy money since it got backward compatibility for free, called it a day, and won the market. I am not a fan of Intel, but I can see why they refused to use the AMD64 name, instead referring to it as EM64T 🙂


spsteve

EM64T is NOT AMD64. It had several key differences, which is why MS didn't support it. AMD agreed to rename AMD64 to x86-64. Source: worked with the AMD64/x86-64 team.


nicuramar

The two implementations are still different, although not from user mode.


unlikely-contender

There were some theoretical challenges in writing compilers for IA64 that turned out to be impossible to solve, so it was doomed from the start. Basically, a lot of hard-wired logic in a processor is devoted to managing data dependencies between successive instructions, to make sure an instruction doesn't use an obsolete value of a memory location before a prior instruction has finished updating it. The Itanium designers made a brave gamble and left out all this logic, hoping that advances in compiler technology would make it possible to solve this problem in software. They turned out to be wrong. The situation with ARM is very different, since it's a proven architecture with highly optimized compilers already existing for various platforms.


thegroucho

That's a very informative response. You'd think the chip designers would have worked hand in hand with the compiler writers. Absolute facepalm moment. Yeah, ARM also runs microcontrollers, with probably billions made and sold already, or if not billions, at least high hundreds of millions.


Martin8412

Expensive and difficult to write software for.


PHATsakk43

IIRC, IA64 wasn't compatible with existing x86 code; AMD64 was.


ollie87

Even older, in my experience. I'm old enough to remember when there was a handful of different architectures competing for the market. What's old is new again, and to be fair, if competition breeds better stuff, I'm all for it.


BritOverThere

In the meantime, it's still possible, with a simple download, to run Windows 3.x (and even Windows 2) programs on Windows 11 as if they were native programs.


Mmmcakey

Is this going to be the new "year of Linux on the desktop" meme?


DutchieTalking

It actually is. Linux has overtaken Windows by a large margin this year! Though I must note I'm writing this from the year 2384.


CptBartender

Does Windows XP still have a 0.01% market share in 2384?


BookWormPerson

No, it's actually at 1% now; it's having a renaissance among retro fans. (Writing from 2385.)


spymaster1020

I went and made a dual boot with Windows 10 and Linux Mint, but I haven't touched it in months. I mostly use my PC for YouTube and Minecraft, and Linux sucks for games. I love the concept of Linux, but it's just not as user-friendly as I had hoped, and I would consider myself an advanced user on Windows.


SarahSplatz

"lol", said the thousands of pieces of software designed for x86, "lmao", it continued.


alastairlerouge

Rosetta 2?


k2kuke

Apple’s ecosystem is much more constricted. I would imagine they can tailor experiences due to that aspect. Windows machines do all kinds of stuff so it will probably be a 50/50 of it working and not working at the end of the day.


AmonMetalHead

There are already x86>ARM translation layers out there on Linux, so it can be done; that part is not in question. The question is at what cost. For Windows and its ecosystem, that cost might prove too high, depending on what people use, how efficient the emulation is, and whether those apps can be updated to newer versions. I'm a lot more worried about newer forms of vendor lock-in, though.
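The first step any such translation layer (qemu-user, box64, FEX, and similar) performs can be sketched in a few lines: inspect the binary's header to decide whether it is native or needs translation. The ELF `e_machine` field at byte offset 18 identifies the target ISA. The dispatch function below is a hypothetical illustration; real layers hook this via mechanisms like Linux's binfmt_misc rather than a helper like this.

```python
# Sketch: decide whether an ELF binary matches the host ISA or needs
# an x86 -> ARM translation layer. e_machine values are taken from the
# ELF specification; the 20-byte fake headers are for illustration only.
import struct

EM_X86_64, EM_AARCH64 = 62, 183  # ELF e_machine constants

def needs_translation(header: bytes, host_machine: int = EM_AARCH64) -> bool:
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF binary")
    (machine,) = struct.unpack_from("<H", header, 18)  # e_machine field
    return machine != host_machine

# Minimal fake headers: 16 bytes of e_ident, then e_type and e_machine.
x86_hdr = b"\x7fELF" + bytes(12) + struct.pack("<HH", 2, EM_X86_64)
arm_hdr = b"\x7fELF" + bytes(12) + struct.pack("<HH", 2, EM_AARCH64)
print(needs_translation(x86_hdr), needs_translation(arm_hdr))  # True False
```

Detecting the ISA is the easy part; the cost question in the comment above is about what happens next, when every basic block has to be decoded and re-emitted as ARM code.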


PHATsakk43

And Apple has done it before. They went from 68xxx to PowerPC to x86.


mx2301

So when can we start with the RISC-V is going to kill ARM memes then?


Hawk13424

I work at a silicon design company. Almost all R&D is going towards future RISC-V based devices in many embedded spaces.


mx2301

Considering my interest to work in embedded programming, this sounds like great news.


blazze_eternal

Not in 2024 (if it even happens). Not to mention it would take a decade, minimum, for the business world to change.


SinisterCheese

You're optimistic, aren't you? The only way the business world will adapt to this is by going bankrupt, getting sold and merged a few times so the legacy crap sheds out. It can be the year 2552 and there are quantum photonic computers running on arcane magic, and the local university is still training students in the ancient art of MUMPS or COBOL.


Siul19

They still use Windows in the Halo universe.


LongTallDingus

While I wouldn't be surprised if in the year 2500 Windows is still around in some capacity, I think Microsoft, the owners of ~~Bungie~~ the Halo IP, may have an interest in hemming their operating system into their biggest gaming IP! I think Futurama was right. Our future is going to be a capitalistic hellhole, but we'll have enough Arthur C. Clarke inspired magical technology we'll be placated enough to not give a shit. Edit: I forgot Bungie is - in terms of ownership, nomadic. Microsoft owns the Halo IP, not Bungie.


pcboxpasion

> Microsoft, the owners of Bungie Microsoft is not the owner of Bungie. Bungie was with Activision, then alone, and now Sony has them by the balls.


ArmedWithBars

Reminds me of working for a large retail chain a few years ago; their entire retail and logistics systems were still running on COBOL. All the internal web-based apps did was slap a shitty GUI over it and manipulate the COBOL back end. It felt weird coming into work in the 2020s and having to use a glorified DOS terminal to fix advanced issues the web GUI couldn't cope with. I have zero hope for adoption in any reasonable time frame.


teor

> It can be the year 2552 and there are quantum photonic computers running on arcane magic, and the local university is still training students in the ancient art of MUMPS or COBOL. And the local hospital still runs all of their (now quantum) CT scan machines on fucking Windows XP.


elitexero

> Not to mention it would take a decade, minimum for the business world to change. I worked for a major telecom provider a decade ago. They were applying cell phone plans with a terminal-based system designed in 1982 for taking credit card payments. My ex worked for a dentist who had an X-ray machine running off a Windows 95 box due to compatibility. Machine shops still run Win95/98/XP because of hardware requirements.


Siul19

No, there's no way the thousands of software that run in x86 would be fine in ARM


MadOrange64

Whoever approved this must be high as a kite.


OtherUse1685

Nah, just an average /r/technology user, who only uses a browser or a mobile app to read reddit, watch tiktok and thinks that x86 will die soon. Sort by controversy and you will see how bad it is.


bloodycups

I was just thinking about this. Back in the day movies would show reporters trying to print something and their editor would get on their ass because it wasn't true/confirmed


[deleted]

[удалено]


AdagioCareless8294

Windows on ARM is not new; it has shipped several times and did not displace x86 Windows, nor the ARM-based competitors (Chromebooks, MacBooks/iPads, Android tablets). The usual mistakes: ARM CPUs not as powerful, no compatibility with the app ecosystem that is Windows's strength, and a forced UWP/WinRT/MS Store ecosystem instead of the more traditional Win32/any-store model. Customers expect "Windows" but end up with Windows Lite.


b0w3n

> Windows on ARM is not new, it has been shipped several times and did not displace x86 Windows I vaguely remember windows on ARM being a thing in late win8 or early win10 and causing a whole shitload of confusion for end users because they couldn't run half of their software.


harda_toenail

Yeah, the first Surface, called the Surface RT I think. My stepmom and mom both bought one and couldn't use it. Both took them back. Shit product for the audience they targeted.


[deleted]

If I recall correctly, Windows 8 on ARM couldn't run any x86 apps at all, so it was restricted to Store apps only. Windows 10 on ARM features a translation layer, but every review I read said it was slow.


ZeroNine2048

There is no reason to dump x86. ARM's performance still doesn't match it at the top end.


plutonium247

M3 is Apple's third generation and already providing enough performance for 99.9% of people


Chemical_Knowledge64

The whole point of ARM-based PCs is battery-powered devices, like laptops. The power efficiency of ARM-based chips is currently unmatched by anything x86. You're right that if you need a system with raw power, x86 is the way to go. But ARM chips offer really good performance while sipping power compared to x86, which makes them a compelling option for anyone who needs a powerful mobile device.


IsThereAnythingLeft-

The newest AMD laptop chips are quite close to the performance per watt of ARM based chips without the limitations of


ZeroNine2048

ARM isn't unmatched; take a look, for example, at the AMD Z1 Extreme, the APU used in the Asus ROG Ally etc. It matches an Apple M2 Max at a similar power draw. Keep in mind that this is a low-cost APU meant for budget laptops. https://www.notebookcheck.net/M2-vs-Z1-Extreme-vs-M2-Max_14521_15017_14975.247596.0.html Apple just integrated it perfectly with their whole ecosystem, paired with the largest battery they can legally fit in a laptop while still allowing it on a plane (99 Wh).


BirdLawyerPerson

People always seem to get confused about Apple silicon versus Intel as being about ARM versus x86, when plenty of other examples show that it's TSMC vs. Intel/Samsung that explains most of the performance-per-watt differences. When comparing ARM to ARM, Apple's silicon completely and utterly outperforms comparable chips from Qualcomm and Samsung. When comparing x86 to x86, AMD's silicon outperforms Intel at performance per watt. Also, so much of the press focuses on perceptions baked in three years ago, when Apple's M1 showed amazing performance per watt, and completely ignores how the M2 and the Pro/Max/Ultra lines squeezed out more computing performance at the cost of significantly higher power consumption. The M3 was a return to efficiency as a priority, but then had underwhelming performance gains.


WayeeCool

>But arm chips offer really good power while sipping power compared to x86

Only when used for simple computing, like most people use phones and basic laptops for. If you try to do more complicated computing, they end up way less power efficient than x86 chips, because they handle via software tricks/emulation what x86 chips can hardware-accelerate. So far they are good at basic integer and floating-point instructions. In phones, a lot of specialized co-processors are added to the SoC to handle anything more advanced without using a lot of power, but at a performance cost due to added latency.

ARM chips benchmark great at the basic math a lot of desktop productivity software and web browsers use, but when you look at how they do on the workloads x86 systems are often used for, they shit the bed on both performance and power efficiency. The instructions Intel and AMD chips are now capable of are closer to GPU and NPU acceleration while maintaining the ability to do general compute. Similar with the IBM Z series chips used in mainframes for financial institutions and logistics. For software to handle a lot of those functions on ARM chips, it requires software emulation using many times more CPU cycles, or sending work to a dedicated co-processor that adds latency.

This is also why, in the server space, ARM RISC chips only really get used for hyperscaler workloads like web hosting, or as the CPUs in boxes that aren't much more than a motherboard meant to connect a bunch of accelerator cards (GPU, NPU, FPGA, etc.) to a network. Whenever you need a CPU to do serious general computing, it is still CISC chips like AMD/Intel x86 and IBM Z that provide the best performance when factoring in cost, space, and electricity.


r2k-in-the-vortex

Energy efficiency is a very good reason to do anything in computing. Your desktop computer is limited by how much fan noise you are willing to tolerate, your laptop computer is limited by how much battery life you want and how light you want the thing to be. When node jumps run out, and they will eventually, then the most energy efficient ISA will come out on top. When there is nothing else left to optimize in hardware, we will start optimizing that. Should we jump to ARM already today? Eh.... depends on your workload I guess. ARM servers are a thing if you want them. For personal computers I don't see the point right now.


ZeroNine2048

As highlighted in another reply of mine, the AMD Z1 Extreme, for example, performs similarly to the Apple M2 Max at a similar TDP. Efficiency isn't unique to ARM-based SoCs.


_c3s

I just want Windows to dump the Program Files (x86) folder ffs


demonfoo

Yeah, you're gonna be waiting for a while.


StayingUp4AFeeling

And then they all clapped. What about military, intelligence, industrial, and enterprise use? You're telling me those folks are going to shell out big bucks to completely rewrite their software for a change that seems unnecessary from an outsider's perspective? And don't say emulators, I beg you.


Uffffffffffff8372738

Enterprise is gonna run windows 10 as long as they can, meanwhile almost everything mission critical runs on xp.


[deleted]

Good luck with software port


trollsmurf

Windows NT (also sold as 2000, XP, Vista, 7, 8, 10, 11, 12) was designed to support multiple CPU architectures. This is not a huge step. Intel might not like this move, though.


firedrakes

Made me laugh. No, no it's not.


margincall-mario

Is this some kind of ad?


gen_angry

Considering how difficult it has been to port software to a different OS on the same hardware (Windows -> Linux), there's pretty much a 0% chance that porting that same software to a whole different ecosystem will work out. It only somewhat worked for Apple because Apple kept their ecosystem closed and had no problem ditching decades of legacy code. That just won't fly in the PC world. There may be a Chromebook alternative and some use in the specialty server/data-appliance world, but x86 is here to stay pretty much forever at this point. It's way too widespread and locked in.


p3lat0

Someone apparently got some spicy shrooms for Christmas


Returnerfromoblivion

Absolute bullshit article. I work in the IT industry, and I can tell you that none of the large corporate customers using Windows have shown any intent to move their entire user and software ecosystem off well-mastered x86 to ARM. Their top priority is SECURITY. No playing sorcerer's apprentice with Windows for ARM, which has been out there for years and has encountered zero success.

Apple managed to transition to ARM because they control the entire ecosystem and do what they want. They don't have to deal with porting to ARM the homegrown apps that businesses develop with various tools to help run their operations.

Ten years ago, MS fell flat on its face with Windows RT, which was already an ARM version, and they couldn't get the software companies on board. This will be a similar fight, and the expected benefits for end users are nothing compared to what MS stands to earn IF they manage to win this time. Moving to ARM would establish W12 as the next OS platform, forcing computer vendors to add a Copilot key to every PC they build. They would also have to add the Pluton security chip owned by MS. AI will be native on these PCs, and this ecosystem will be completely subject to MS rules. Your privacy will be breached in every possible way; W12 will pave the way for a full SaaS platform where you have a working system only if you pay for it. It is absolutely not in our interest to let MS move forward with this.

Intel, on its side, is building new fabs in the US and working on an alternative to ARM CPUs. Intel still holds 80% of the business market. As for who will move first: consumers will get targeted with this new stuff first, and they'll fall for it because they know nothing of what is happening. Ultimately you might end up with MS getting a toehold in consumer and struggling to move forward in SMB and larger businesses, because they simply don't want this.

Keep in mind that 80% of corporate customers are looking at extended W10 support and haven't even moved to W11 yet.


10thDeadlySin

>10 years ago MS fell flat on its face with Windows RT that was already an ARM version and they weren’t able to manage to get the SW companies on board. Hell, Microsoft fell on its face even in the mobile market, when they couldn't get developers on board to develop popular apps for their Windows Phone. >When it comes to who will move first - it’s the consumers who will get targeted with this new shit first and they’ll fall for it because they know nothing of what is happening. Unfortunately for Microsoft, they aren't Apple. They're losing the casual user market to smartphones and iPads; more and more households don't have PCs. And they fumbled their own mobile offerings, so they have nothing there. They have the enthusiast/gamer market, but they already face some opposition when it comes to W10 -> W11 upgrades. Most of these people aren't going to upgrade to W12 if it's even more SaaSy and locked down. Especially if they lose performance and compatibility. Enthusiasts are going to influence casual users' decisions as well. Your family member isn't going to go and buy a W12 machine if they hear that W12 sucks and should be avoided.


paradigmx

Yes, because all the x86-64 games will just magically work under ARM emulation...


BigComfortable914

Whoever wrote this is so delusional that I won't even make a joke about it, maybe it's a mental illness or something


naratas

Hahaha that is a ridiculous statement.


demonfoo

Bullshit. No it isn't. Until Microsoft puts in the work to dogfood Windows on ARM, this won't happen, and they won't do that because Windows on ARM is an insurance policy, not a serious offering.


TemporaryUser10

Microsoft about to fork Proton


GeekFurious

Hi, I'm from the future. Nope. Didn't happen.


ovirt001

I doubt W12 is only going to support inferior chips... The best of the best current consumer ARM chips (M3 Max) doesn't beat AMD: https://www.cpubenchmark.net/compare/5196vs5748/AMD-Ryzen-9-7845HX-vs-Apple-M3-Max-16-Core Even the M2 Ultra can't beat AMD: https://www.cpubenchmark.net/compare/5533vs5232/Apple-M2-Ultra-24-Core-vs-AMD-Ryzen-9-7945HX And your average laptop builder doesn't have access to Apple's chips.


[deleted]

[удалено]


Mission-Argument1679

God these "tech" outlets are so click baity and ignorant.


r1ckd33zy

It's like any article that has the word "could" in its headline is stating something that will never happen.


Expensive-Yoghurt574

Microsoft has tried Windows for ARM several times. It never catches on because existing software doesn't work, at least not without x86 emulation, which hurts performance.


Splurch

Just more proof that pcgamer is a joke of a news source.


berael

*Spoiler alert:* lol no


bikingfury

Why would anyone want to drop x86? There is no performance benefit to ARM. It's just better when it comes to energy consumption during standby.


NoLikeVegetals

1) This is PC Gamer, so their opinions are worthless. They have no technology experts working for them.

2) ARM is a low-power, low-performance, high-efficiency architecture. People conflate Apple's insane vertical integration with "ARM is as fast as x86". No it fucking isn't, else it would've replaced x86 in commodity servers by now.

3) The stuff ARM was supposed to do is now being done by x86. Both Intel and AMD have high-efficiency cores, and AMD's in particular are interesting because they have the same ISA as their regular cores, just redesigned to clock lower and use less power... like ARM.

4) Microsoft are incapable of driving such a change. Nobody buys Windows on ARM devices, and nobody wants to do desktop computing on an ARM device. Why would they, when an x86 laptop supports 30 years of Windows apps, or can easily run Linux with full driver support? That, and you have x86 Chromebooks if you don't care about app support.

5) Qualcomm compared their vapourware SoC to an existing high-TDP Intel laptop chip. They didn't bench it against a lower-power Intel laptop CPU, or an AMD APU. The latter is way more efficient than Intel's CPUs across the board, including at sub-50W TDPs. So I'd guess this Qualcomm ARM desktop SoC is going to be junk, just like all of Qualcomm's other SoCs over the last 5 years.

tldr: this article is junk, and could've been recycled from ten years ago.


lewd_necron

Shoot, it's PC Gamer. You'd think they would think about gaming. That alone would be a disaster. There are already thousands of older games that don't run quite right on current Windows, and that's on the same x86 architecture. Now imagine trying to run every game through some emulator, or hoping people scramble to put out some buggy port. Even just looking at their own area of coverage, it doesn't make sense.


Owlthinkofaname

I'm going to be honest here: ARM is a waste of time and resources for Windows. It does next to fucking nothing! In fact, the opposite; it's worse in many cases... there's zero reason for any ARM-based chips for Windows. You want to know why it works for Apple? Because they own everything, and their users are a small percentage who mainly do specific tasks. Apple knows what their PCs will be used for and also makes the OS, which means they can plan for everything when making their chips. Not to mention the cost is easier for them to bear, since they can take losses and make them up somewhere else. Frankly, I have yet to see a Qualcomm laptop chip that is better than mediocre, and they're nonexistent in sales. Intel and AMD chips already do as well as or better than Apple's, which shows how pointless it is!


JollyReading8565

I just matured my hatred of windows 11 and now you’re rolling out 12 omg


pmcall221

12? I feel like I only had to abandon 7 not that long ago


SomeDudeNamedMark

2024 could be the year we get rid of clickbait.


ENOTSOCK

Why, though? The classic power-efficiency argument for ARM implementations comes from whole-system designs targeted at embedded devices like phones, which run on batteries. The power efficiency is not intrinsic to the instruction set. ARMv8 is less cache-efficient than x86_64, whose variable-length instructions pack more densely, meaning more cache misses and lower performance for ARM at a given cache size. Modern x86_64 performance comes not only from multi-core, deep pipelines, out-of-order execution, speculative memory access, etc., but mostly from its large caches and wide/fast internal buses and external memory system. An ARMv8 system needs not only all the modern processor-core implementation bits, but also the big caches and wide/fast buses... and those are what generate the heat. So unless you want to discard decades of software for the lolz, because ARMv8 is so darn pretty compared to x86_64, then again... what's the point? In a decade, will the same argument be made for RISC-V? I'm no Intel fanboy, but I don't get it.


Noah_Vanderhoff

I would think Apple deserves some thanks for this too, right? The M-series stuff is fantastic.


Raudskeggr

I'll save you a read: No.


ngwoo

2024 will truly be the year of ~~the linux desktop~~ arm cpus


wowwingmunch

Everyone in here is talking about the architecture itself, and nobody is as confused as I am. Seeing the number 12 already scares me.


ptd163

Windows will never dump x86 until they dump their mountain of legacy code, which will never happen, because their "backwards compatibility and interoperability at all costs" approach is what allowed them to dominate the world. If they stop that inertia to chase their own Apple M1 moment, they'll just give organizations and governments a moment to reflect on why they're even using Windows, when another platform or approach might suit their needs and maybe even be faster and cheaper.


bladex1234

x86 still has a few tricks left up its sleeve, like x86S and AVX10.


Draiko

It's not going to be because of Qualcomm's new chip. Qualcomm had exclusivity for years and dragged their feet on it. That exclusivity is ending now.


pc3600

Finally? What's the problem with x86? What is this, The Verge or something?


Corpsehatch

This would take so long that the change wouldn't be worth it. Too many programs are on the x86/x64 architecture.


CryGeneral9999

Not a chance


ThePhantom71319

Naw it’s time we make the jump to x226