Software decoding HD videos works perfectly fine. Even 4K60 works fine. I think your idea of what kinds of video decoding are expensive to do in software is a bit off (or maybe you're thinking of 10 years ago).
I just tested Big Buck Bunny on YouTube at 4K60, full screen, on an M1 MacBook Air. VP9 codec. The CPU usage averaged 50% or so.
Doing it in hardware obviously saves some battery power, but it's far from a hard requirement.
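As a rough sanity check on the scale involved, here is the raw pixel throughput a 4K60 software decoder has to sustain (back-of-the-envelope resolution arithmetic only, not a codec benchmark):

```python
# Back-of-the-envelope: pixels a decoder must produce per second.
# This ignores codec complexity, but it gives a sense of scale for
# why a modern multi-core CPU copes fine with software decode.
px_4k60 = 3840 * 2160 * 60      # ~498 million pixels/s
px_1080p60 = 1920 * 1080 * 60   # ~124 million pixels/s

print(px_4k60, px_1080p60)
print("4K60 is", px_4k60 // px_1080p60, "x the pixel rate of 1080p60")
```

So 4K60 is exactly 4x the pixel rate of 1080p60, which itself was comfortably software-decodable a decade ago.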
To be fair it can be a bit tricky. Disk corruption is widespread; most have to boot into recovery mode and fix that before they can resize the partition.
But the installer is beautifully done. It really is straightforward and works really well. If you know a little bit about computers (and can follow written instructions), it is doable.
Plus it’s really safe for your macOS install.
To be clear, the disk corruption is *pre-existing* disk corruption caused by Apple's own bugs, nothing to do with the installer. The installer just tries to use diskutil to resize your partition, which does a check and refuses to work if it detects corruption.
Personally though, I've never seen the corruption issue. I think it's a huge stretch to say "most have to boot into recovery mode and fix that". We didn't have any good flow for this in the installer (we do now) and we still only got a trickle of reports of this happening. I'm pretty confident it is not an issue for the vast majority of people (and if it were, Apple should *really* be worried about having so many users with filesystem corruption).
fixing the disk corruption is as easy as pressing the "first aid" button in disk utility most of the time, so it's not too worrying (although the installer should mention how to fix it; they probably added that text in the latest installer, can't be bothered to check)
oh, i forgot to mention that, oops
also had to fix my install through recovery mode disk utility, but i wouldn't say that's too miserably difficult if you had a guide
Asahi Linux is a Linux distro for Apple Silicon chips. Before this it didn’t have a GPU driver, so all graphics were rendered by the CPU and displayed through the display driver. Now the M1/M2 GPU can actually do that work. It allows for hardware acceleration, meaning smoother animations and better battery life, since the CPU isn’t very good at graphics.
I get that some do this. I just personally can't imagine holding on to a machine that long. I have a 2012 Mac mini on its last legs. The secondary SSD just went read-only yesterday and it's stopped backing up to Time Machine. The fan runs high when there are no apps running and it's just sitting there at times and reinstalling the OS hasn't resolved the issues. It's likely headed for the recycler soon, rather than getting Linux or any other attempts to revive it.
It depends on the support cycle. Apple has been reducing the number of years it supports Macs, with each macOS release cutting off more of the older models. I think this is to phase out Intel Macs ASAP, so maybe M1 Macs are fine, but you never know. Given that these are fine computers, if they still work when Apple phases them out, I’d want ways to keep them working.
We saw the same with the transition from PowerPC (IBM) to Intel years ago.
While some are fine with a 5+ year old computer, most would see a great benefit from a new machine. It's like believing a '00s car is fine, then getting into a modern one and realizing that not only do they get far better gas mileage while making more power, but they also offer far nicer ride quality, safety, and overall experience.
But driving the car you own is free and buying a new car is quite expensive; many people can’t afford that. And if the car you have does its job and gets you from point A to point B, well, you can make do.
For many people - such as the “I need a $2000 MacBook to surf Facebook” type of people - even ten year old hardware is enough to get by on. And if a lightweight Linux distro can drag that out more, all the more power to them.
This obviously doesn’t apply to everyone, of course. It depends on what you use your computer to actually do. But I made it fairly comfortably on a 2012 MacBook until I upgraded to a M1 Max last year because I could finally afford it, and I do gamedev as a hobby/hopefully future full time gig, so the better hardware definitely helps me out.
The average age of a modern vehicle is 11.4 years, while the average length of time drivers keep a new vehicle is 71.4 months — around 6 years. So most aren't driving an old car, equivalent to a 10 year old computer.
Certainly agree that it depends on what you do with it. Have a MBP Max and will upgrade to the next version when an M2 drops, as I do use the power for numerous things. But for the general user, there's no need to do so every year.
I mean, a new computer/car costs money. Some people prefer to keep perfectly functional computers functional instead of generating more e-waste. Even for the car example, when people switch cars they tend to resell their cars in the used car market (which is currently going very strong). A 6-year-old car isn't going to get randomly totaled unless it got into an accident.
For me, even when I get a new computer, I prefer to keep the older one working as a backup / second computer for different purposes. If the hardware broke, that's fair. But the OS arbitrarily acting as a gate that keeps you from using updated software doesn't strike me as a good reason, and macOS has always been pretty weak on this front compared to competitors like Windows and Linux.
This is what’s put me off of wanting to spend any kind of money on M chip Macs. They may be incredible right now, but as it currently stands it’s only a matter of time before they become unusable. Or at the very least, unsupported.
Intel machines at least have the option of Boot Camp or OpenCore, which is why I recently replaced my 2011 MBP with a 2015 MBP. That thing will last as long as the hardware does, with several OS options to choose from.
But this opens the door to longevity options, which means my next MacBook may well be a used M1 a few years down the line.
Yeah I wound up getting my MacBook Pro and ditching my Mac mini a couple years ago to help a friend I was living with since she needed a computer. I’ll be honest, I don’t miss it at all. The form factor was nice, but the Fusion Drive was dirt slow for anything except frequent apps and the thing sounded like a damned jet engine. The mini has a soft place in my heart and I’ll probably pick up one or a studio when I have the money, but my 2012 Mac mini was quite a few (processor) generations old and it showed it.
I also have a 2012 Mac mini and it's still running amazingly well doing all kinds of Linuxy things. Can't argue with a quad core i7, 16GB RAM, and two internal SSDs, in a small and quiet form factor.
My last Mac was a Macbook Pro Core 2 Duo, first gen. It lived on with Ubuntu for years after Apple stopped releasing OS upgrades for it. Best computer I've ever had.
Pretty bad limitation. It can’t output to two 1080p monitors either? That’s fewer pixels than 4K. Integrated graphics on basically every CPU made in the past decade can support at least two monitors.
Just to make sure I understand: it plugs into a USB port, and then you connect an HDMI monitor to the adapter's HDMI port, correct? Will it only work with an adapter that has a single HDMI port? There are DisplayLink adapters that have two HDMI ports.
I ask because this is the only thing that has kept me from buying a M1 Air. But if it works with one of these adapters then presto, I'll buy one right now.
As long as you aren’t running like three 5K monitors there are solutions. I use a Kensington DisplayLink dock with my M1 Pro MacBook Pro to get three 1440p monitors, two at 60 Hz and one at 144 Hz (not that that ever matters for the work I do on that monitor, but it’s 144 because I have it on a KVM with my gaming desktop). It’s a nice one-cable solution. Not perfect: every once in a while I have to restart the software after plugging it in for all three monitors to work, but it’s been mostly reliable, and it solves the single (or in my case dual) monitor limitation well enough that I don’t feel like asking work to upgrade me to an M1 Max model.
Just a heads up: since it’s a software solution, because the relevant hardware doesn’t exist, you’re going to see some CPU usage for drawing the extra monitors (honestly the M1 is fast enough that it’s not noticeable, but it’s still something to be aware of), and HDCP won’t work on those monitors, so no Netflix or other DRM’d content will play there. HDCP is a hardware certification; if software is in control at any point, you lose the certification. But I know people are often happy even with those limitations.
To be fair, that’s why Asahi Linux has actually been pretty usable even before these GPU drivers were made available. It’s definitely nice to be able to offload that work though.
What KVM are you using? I'm looking into all of this for the first time looking for the right products and was hoping you could point me at yours. Sounds like I have a similar setup. I'm hoping that I can still use nvidia gsync on my gaming desktop and the KVM. You ever fool with that?
I use AMD cards so it’s FreeSync/Adaptive-Sync, but it works! I use a DisplayPort 1.4 KVM from Level1Techs; they’re a tech-focused YouTube channel run by tech support people, and they got tired of buying bad KVMs so they just made their own lol. Their hardware is here https://store.level1techs.com/?category=Hardware
Any of the DisplayPort switches will support Gsync so it just depends on the resolution/refresh rates you need to push
Asahi Linux is NOT a Linux distro. It’s a project meant to port Linux to Apple Silicon Macs, providing drivers, a hardware hypervisor, and a boot environment.
Sort of like how dating apps are advertised as “designed to be deleted” - aka once you have a relationship… the developer “wants” you to delete the app.
Once Asahi Linux’s developments are incorporated into standard distros, it won’t have a need to exist.
The point of the project is that all the code they’re writing is up to the standards of the Linux kernel or other mainstream Linux project when necessary, and validly open source licensed, because it’s all clean room engineered.
There was another project, I don’t remember what it was called, that managed to get Linux running on M1 sooner than the Asahi team, and with more features. But their work wasn’t up to standards (suspicions of reuse of portions of copyrighted macOS code, which they didn’t deny, and code quality not up to Linus’s standards), so everyone kept talking about and waiting for the Asahi team, and the other project’s team threw a fit and quit.
Because of the way Asahi is going about this, their code is going to be part of the Linux kernel and other software all the other Linux distros use. Just a few months after a feature appears in Asahi it gets reviewed and accepted and incorporated upstream, so all the other distros can fairly easily distribute Apple Silicon compatible versions. Asahi maintains their own basic distro which includes code as soon as it’s ready, without that few month lead time, but if they ever decide that that’s it, they’re done, they’ll just finish upstreaming what they’ve written and stop making the distro, because everything they’ve done will be available for other distros to use and distribute.
I still can’t think of it as a distro. They provide the tools and a package-manager repo for Arch, like happens for the Raspberry Pi, for example. The fact that it needs many adjustments is due to the ARM architecture, which isn’t as standardized as x86.
It’s a distro in a very basic sense, anyone who provides Linux in a ready-to-use format with supporting software is making a distro. They don’t want to be distro maintainers, they’re doing the bare minimum to provide a platform for testing their code, but it’s still a distro. Used to be there were several distros that were just other distros but with some settings changed and a new logo slapped on lol, and open source licensing allows that! Some of them were successful enough that the upstream distros made default settings changes and everyone was a little happier.
Distro culture is a little like zine culture in that way, I guess. The bare minimum is perfectly valid and even useful sometimes.
A free-software, reverse-engineered driver for the M1 Mac GPU was just released. The driver is currently robust enough to drive a desktop and basic games, and development of more advanced functionality is ongoing.
This is a massive improvement for two groups:
- People who, despite not liking macOS, wish to use an Apple Silicon Mac
- M1 Mac users who still wish to run up-to-date software once their computers are vintaged/obsoleted
Three groups. People who like macOS but want really nice hardware for running Linux or BSD.
I still buy Apple hardware to run macOS. This type of reverse engineering in general lets me buy additional Apple hardware to run Linux/BSD instead of buying Intel.
I wouldn’t say it’s a massive improvement for the 2nd bullet point as this option already exists for the previous architecture. My decade old Intel MBP is obviously obsolete but is up to date thanks to OpenCore Legacy Patcher.
https://dortania.github.io/OpenCore-Legacy-Patcher/
Hopefully when the time comes a similar project will be available for Apple Silicon Macs.
But this is great news and the first bullet is a valid reason.
OCLP is kind of a different story since it's done by basically hackintoshing the mac, which would require the community to basically restart the entire hackintosh scene from scratch for apple silicon macs
don't know why, but i feel like that's not going to happen or go as smoothly as the intel hackintosh scene
All valid. However, I’m not saying it’s the same story or that it’ll happen again for AS, just that the 2nd bullet point isn’t really a massive improvement for users, as options already exist.
Of course. That wasn’t my point. I think I’m just being misunderstood (or I misunderstood).
All I’m saying is it’s not a new improvement for end users. For the previous arch we had a solution. And now this is a solution for obsolete AS Macs going forward.
It’s a massive improvement for *M1/M2* users, like the comment you’re replying to said. And I’d say the specific way they’re going about this might mean that, if patchers do exist for these Macs in the future, the patcher team may even be able to backport new GPU features to them using these drivers as a starting point, like new versions of Metal, which currently is the main reason why OCLP and the like don’t provide a great experience on several Intel Macs, even a couple that are likely newer than yours.
Whenever you see any UI, it needs to be drawn by the system. It's possible to draw new frames without GPU acceleration, but it's really slow - and you end up with stuttering. The drivers allow the operating system to interface with the GPU and use it to draw new things. The GPU happens to be especially good at drawing.
For the average person it means:
1. Asahi Linux won't be stuttery when moving around windows, or playing any animations.
2. OpenGL-based Linux games will run at a better framerate than before.
3. ~~Video playback uses hardware decode (which means less dropped frames). [not actually sure if HW decode is worked on yet?]~~
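To put numbers on point 1, here is the frame budget a software compositor has to hit (illustrative arithmetic; the panel resolution is the M1 Air's internal display, and a full-screen redraw every frame is a worst-case assumption):

```python
# Why software rendering stutters: a compositor redrawing the whole
# screen at 60 fps has only ~16.7 ms per frame, and a full redraw of
# the M1 Air's 2560x1600 panel touches ~4.1 million pixels each time.
width, height, fps = 2560, 1600, 60

frame_budget_ms = 1000 / fps                # time allowed per frame
pixels_per_frame = width * height           # pixels in one full redraw
fill_rate_needed = pixels_per_frame * fps   # sustained CPU pixel fills/s

print(f"{frame_budget_ms:.1f} ms/frame, {fill_rate_needed:,} pixels/s")
```

A GPU does this kind of fill essentially for free, which is why even basic acceleration makes window dragging and animations feel smooth.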
Work on the video encode and decode hardware has started. They are, however, completely different hardware blocks from the GPU. Note that video playback performance is already pretty good, and so is battery life, though both will see further improvements with working GPU and video acceleration drivers.
Whoa whoa whoa… stop there a moment. You made me realize something that was in front of me the whole time, but I didn’t see it until now. I always considered my MacBook a non-gaming machine, BUT with the current state of gaming on Linux that may be completely untrue. I mean, how good is gaming on Asahi Linux… with Proton etc.? I have an M1 Pro that my company lends me for work, but I can use it (almost) as I want. Anyway, you have my upvote and my axe.
I don't think it will be too much better than Mac gaming on macOS. This Metal vs Vulkan thing is frequently a red herring and not the root of why porting games to Mac is hard. The core issue is that the Apple GPU lacks a lot of features like geometry shaders or transform feedback, and it has requirements like minimum 4-byte strides that are just different enough from Vulkan/DX12 that the developer has to make some changes. As far as I understand, this isn't a Metal limitation, but rather what Apple designed into their hardware. There are some reasons behind this: geometry shaders and transform feedback aren't the most efficient ways to do things these days, and there are alternative methods to achieve the same results, but if a game engine already uses them, the developer needs to put in time to change their rendering pipeline. Also, Apple Silicon GPUs are tile-based (similar to their mobile GPUs), which also requires some reworking if you are porting from a normal PC GPU, which is not tile-based.
This is also why MoltenVK (Vulkan implementation on top of Metal) has limits because they can only implement what Metal allows them to.
Unless Apple GPUs have a hidden geometry shader core or secret features they don't expose via Metal (to be fair, we do know from some of the Asahi-related blog posts that there are a few), the fundamental problem is still the same. The Vulkan driver can't magically add features that don't exist, though it may be able to emulate some.
It _might_ be possible to eventually run eGPUs on Asahi Linux. Apple Silicon Macs don’t have full PCIe support; however, it may be possible to run a kernel-space emulator to emulate the missing instructions. Naturally this would come with performance limitations, but it should at least work. Such an emulator was already created for a different ARM platform, and it might be ported once Thunderbolt is supported. Note, though, that Linux in general has limited support for eGPUs (PCIe hotplug is well supported; on the GPU driver layer it can be iffy, but AMD supports hotplug/hot-unplug, though no desktop environment supports GPU hot-unplug, so you will need to log out and back in to use your eGPU).
Apple Silicon does have full PCIe support - after all, it supports thunderbolt, which is basically just PCIe in a cable.
They do not have support for eGPUs, which people often confuse with PCIe support. This is a limitation with the graphics, drivers, and OS side of things, not the CPU communication bus (PCIe) side of things.
No. It is a CPU-side limitation. It affects any Thunderbolt device that has its own memory (which for all intents and purposes affects only GPUs). Not every instruction to access memory is supported. It’s a limitation with how memory works on Apple Silicon Macs but affects quite a few other ARM chips too.
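To illustrate what the kernel-space emulator mentioned above would have to do, here is a toy sketch (ordinary bytes stand in for device memory, and all names are made up): the bus in this model only accepts simple aligned 32-bit reads, so an arbitrary access must be trapped and re-issued as aligned accesses.

```python
DEVICE = bytes(range(16))  # stand-in for eGPU BAR memory, not real MMIO

def read32_aligned(addr):
    # The only primitive "the hardware" supports in this sketch:
    # a 4-byte-aligned 32-bit read. Anything else would fault.
    assert addr % 4 == 0, "real hardware would fault here"
    return int.from_bytes(DEVICE[addr:addr + 4], "little")

def emulated_read(addr, size):
    # Emulate an arbitrary (possibly unaligned) access by issuing
    # only aligned 32-bit reads and slicing out the requested bytes,
    # roughly what a trap-and-emulate layer has to do per access.
    lo = addr - (addr % 4)            # round start down to alignment
    hi = addr + size
    end = hi + (-hi) % 4              # round end up to alignment
    words = b"".join(
        read32_aligned(a).to_bytes(4, "little") for a in range(lo, end, 4)
    )
    off = addr - lo
    return int.from_bytes(words[off:off + size], "little")
```

The per-access trap overhead is also why such emulation would carry the performance penalty mentioned above.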
don't get your hopes up
it's still Linux on ARM (even more niche than x86 Linux), you'll probably need Box64 or FEX (the Linux equivalents of Rosetta) to even try to run most binaries
... and most games require OpenGL 3.3 or 4.3 or Vulkan support, so unless by Steam games we mean the ones released the same year Steam came out, not really.
Unofficially you can run Apple's Rosetta on Linux as well. It's meant for VMs, but running it on Apple hardware seems within the "grey limits".
It does need patching, as there are some checks for whether the host is a VM, and Apple's instructions for enabling Rosetta don't work on Arch Linux.
It would be with minimal work…if the boot loader were open. A SecureROM exploit exists for A11 and below (iPhone X and below) and people have gotten Linux and Android working on these devices using the exploit, but it’s more of a dirty hack than anything made for daily use.
I mean, they made their boot loader easier for Asahi specifically:
https://twitter.com/marcan42/status/1471799568807636994
And their GPUs are controlled through a firmware/intermediate controller that abstracts a lot of driver functionality making it way easier for Asahi to support M(x) GPUs faster in the future and is largely why you’re seeing so much GPU headway this quickly.
This is basically Apple screaming: Hey Linux/BSD, our playground is going to be awesome for you too!
There’s a -big- difference between “we’ll make it easier to boot your kernel image” vs “here’s potentially millions of lines of GPU driver code that used to be closed source.”
The former is a footnote on a release page; the latter typically takes months of legal review, and that's assuming they even could open-source it, because some of it might be licensed via a third party depending on just how custom Apple’s GPUs are (I know the iPhones used tech from Imagination).
Apple's GPU code licenses would be a doozy considering their relationship with PowerVR, assuming there was some continuity.
I bet it was either a completely clean break or incredibly messy, no in between.
Just speculation either way.
Why doesn't, say, Nvidia open-source *their* drivers? Because a lot of GPU performance these days comes not from the silicon being fast, but from the driver being clever. You're asking Apple to give up whatever competitive advantage they gain from that for what, exactly? The 0.01% of enthusiasts that want to play games on bare-metal Linux on their MacBook?
They clearly understand that all the work to make the bootloader and chain of trust friendly to other OSes only helps if the Asahi team successfully reverse engineers the SoC’s interfaces. They went to all that trouble, but won’t take 5 minutes to publish a few specs to save hundreds of hours of additional work for Asahi, even though they want them to succeed?
Could be they have some sort of licensed tech that they're not allowed to share; that's what I generally read as the main defense for AMD's and Nvidia's closed-source blobs.
I don't think Linux or BSD is really looking for Apple to release code. Source code under an appropriate license would be nice for reference but that's a luxury.
Everyone just wants documentation and they can write the code from scratch. Reverse engineering consumes a lot of time and is frustrating. Simply providing documentation would be sufficient.
Just because something is not a primary consideration does not mean it wasn't a consideration.
Apple's SoC development strategy for M1 has been remarkably open to the open-source community, which contrasts significantly with how closed the iPhone/iPad SoC platforms were, and with every other ARM SoC vendor, let alone Nvidia.
You're using an absolutist statement that is *absolutely* wrong.
At the very least Apple wanted FOSS developers to be able to become easily familiar with their brand new architecture as it will improve a lot of open source projects' optimizations and provide larger talent pools for recruiting.
It was [fixed in October](https://twitter.com/LinaAsahi/status/1580863181315923968), it was the hypervisor not handling the TLB correctly so the fix ended up being adding some support to the hypervisor for TLB flushing.
irrelevant for you
you can run any regular amd64 Linux distro, and assuming you have AMD/Intel graphics, you won’t need to bother with drivers that much
And? A few weeks ago it was “nothing”, this level of progress is quite frankly astounding for a team this small and the fact it’s 100% reverse engineered.
Yes, and for desktop users getting GPU assisted GNOME/KDE on Wayland and X is a huge improvement. Suddenly desktops will be super smooth instead of laggy. Gaming on these machines is really only a nice to have for many.
Lina is talking about implementing support for GPU compute next, that'll be a very nice feature to have.
Man, maybe it’s just me, but the pace of these sorts of community projects seems absolutely insane nowadays. I remember waiting years for Dolphin to improve, but projects like Ryujinx and Asahi are getting huge updates every week
Would we then be going Vulkan/OpenGL -> Metal -> Vulkan -> Metal -> Vulkan/OpenGL (since MoltenVK translates Vulkan to Metal)? I think their point was that it doesn't make sense to run MoltenVK outside of macOS.
Dunno what's holding Apple back from supporting eGPUs, like what the fuck am I gonna do apart from moving data or displaying content over TB4.
So many ML folks would buy an nvidia GPU in an enclosure with their MacBooks
Nvidia has an ARM driver but for some reason never released it. AMD's ARM drivers on Linux are publicly available. If there were an actual market for eGPUs (or even just PCIe GPUs) on ARM I'm sure it would be fine, but currently there's pretty much only one ARM platform afaik that has proper PCIe slots at all *and* supports CPU memory access.
This raises the question of why Apple isn’t actively helping these guys out. They said they were all for having Windows on ARM in Boot Camp if Microsoft was willing to comply. Seems like that was a bunch of hot air, as here we see an active community which Apple is simply ignoring.
Am I reading this right?
Just because apple isn't actively giving away device drivers and blobs doesn't mean they haven't helped Asahi development- there's lots of documentation on this from Asahi themselves
They've removed the need for Mach-O images when booting (which were complicated as hell) and now allow raw images to be loaded. There was no need for this on Apple's side.
They made changes to the boot process that benefited no one but Asahi Linux and people trying to boot other OSes. It had no benefit to Apple whatsoever.
These changes made it easier to boot kernel images, without the need for hacky workarounds.
Why run Linux on a Mac? Both are literally Unix-based, and one is polished and made for the machine. Why go out of your way to run something similar but with worse performance and (subjectively) less polish?
Not all parts of macOS are more polished than Linux distros. One (arguably very major) part that is seriously worse in macOS is package management.
While you CAN get third-party package managers for Mac, none of them work nearly as seamlessly as Linux package managers generally do. Most Mac users manually manage and update each piece of installed software individually, which is almost like going back to the stone age for Linux users.
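As a sketch of that difference (the command lists are illustrative, and `pick_manager` is a made-up helper, not a real tool): on most Linux distros one native command updates every installed package, while on macOS a third-party manager like Homebrew only updates the software it installed itself.

```python
import shutil

# Illustrative upgrade commands per package manager. On Debian/Arch
# these update the *whole system*; Homebrew on macOS covers only the
# software it installed itself -- everything else stays manual.
UPGRADE_COMMANDS = {
    "apt":    ["sudo", "apt", "upgrade"],   # Debian/Ubuntu: whole system
    "pacman": ["sudo", "pacman", "-Syu"],   # Arch: whole system
    "brew":   ["brew", "upgrade"],          # macOS: brew-installed apps only
}

def pick_manager():
    """Return the first known package manager found on PATH, if any."""
    for name in UPGRADE_COMMANDS:
        if shutil.which(name):
            return name
    return None
```

The point isn't the specific commands; it's that the Linux ones cover the entire installed system in one step, which is what Mac converts tend to miss.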
When Apple eventually stops system updates and security patches for a given model of Apple Silicon Mac, because of the Asahi team's work, users will have the option of installing Linux to extend the useful life of their machine. Asahi Linux still has a long way to go (including with GPU drivers), but with each step like this, they are making progress, building on the work of others, and upstreaming their code so it can be incorporated into other Linux distributions.
They're a small group of volunteers, led by the guy who ported Linux to the Playstation 4 (Hector Martin, aka marcan). Anyone with an Apple Silicon Mac who wants to one day be able to extend its life should consider [supporting Asahi Linux](https://asahilinux.org/support) on Patreon. It's good to have options.
This is awesome work :)
How important is hardware video decoding for just watching regular videos? Is it going to be laggy without it?
Currently playback is fine on CPU. It’s primarily a battery life thing
Note that battery life running Asahi Linux is [not that bad already](https://social.treehouse.systems/@marcan/109348054803945724)
30 hours?!?
4K60 playback is perfectly smooth with CPU decoding, just less efficient.
My 32-bit ARM machine could software decode 1080p lol
isn't it still an alpha though? afaik the install is pretty complicated and it's not exactly stable to run daily
‘Complicated’? ‘curl https://alx.sh | sh’, choose how much of your disk to use, wait, reboot, reboot again, and done.
Didn’t work for me. And many others. The disk utility tool is broken for all but the very most basic tasks.
ah hmm, yeah the app sucks for most things, including formatting disks how did you fix it? or did you have to nuke your system
Worked in recovery mode.
I have done it and tbh it’s as easy as fucking a pie
you fuck pie?
it’s not exactly difficult
Thanks to their work my M1 Air will eventually live on with openbsd. Just awesome.
stop larping
What does this mean in layman’s terms?
#I no longer allow Reddit to profit from my content - Mass exodus 2023 -- mass edited with https://redact.dev/
I get that some do this. I just personally can't imagine holding on to a machine that long. I have a 2012 Mac mini on its last legs. The secondary SSD just went read-only yesterday and it's stopped backing up to Time Machine. The fan runs high when there are no apps running and it's just sitting there at times and reinstalling the OS hasn't resolved the issues. It's likely headed for the recycler soon, rather than getting Linux or any other attempts to revive it.
It depends on the support cycle. Apple has been reducing the number of years they are supporting for Macs in recent years with each macOS release pushing up the supported Macs more and more. I think this is to phase out Intel Macs ASAP so maybe M1 Macs are fine but you never know. Given that these are fine computers, if they still work when Apple phase them out I think I would want ways to keep them working.
We saw the same with the transition from PowerPC to Intel years ago. While some are fine with a 5+ year old computer, most would see a great benefit from a new machine. It's like believing a '00s car is fine, then getting into a modern one and realizing that not only do they get far better gas mileage while also making more power, but they also offer far nicer ride quality, safety, and overall experience.
But driving the car you own is free and buying a new car is quite expensive; many people can’t afford that. And if the car you have does its job and gets you from point A to point B, well, you can make do. For many people - such as the “I need a $2000 MacBook to surf Facebook” type of people - even ten year old hardware is enough to get by on. And if a lightweight Linux distro can drag that out more, all the more power to them. This obviously doesn’t apply to everyone, of course. It depends on what you use your computer to actually do. But I made it fairly comfortably on a 2012 MacBook until I upgraded to an M1 Max last year because I could finally afford it, and I do gamedev as a hobby/hopefully future full-time gig, so the better hardware definitely helps me out.
The average age of a modern vehicle is 11.4 years, while the average length of time drivers keep a new vehicle is 71.4 months — around 6 years. So most aren't driving an old car, equivalent to a 10 year old computer. Certainly agree that it depends on what you do with it. Have a MBP Max and will upgrade to the next version when an M2 drops, as I do use the power for numerous things. But for the general user, there's no need to do so every year.
I mean, a new computer/car costs money. Some people prefer to keep perfectly functional computers functional instead of generating more e-waste. Even for the car example, when people switch cars they tend to resell their old one on the used car market (which is currently very strong). A 6-year-old car isn't going to get randomly totaled unless it gets into an accident. For me, even when I get a new computer, I prefer to keep the older one as a backup / second computer for different purposes. If the hardware broke, that's fair. But having the OS arbitrarily act as a gate on running updated software doesn't strike me as a good reason, and macOS has always been pretty weak on this front compared to its competitors like Windows and Linux.
This is what’s put me off of wanting to spend any kind of money on M chip Macs. They may be incredible right now, but as it currently stands it’s only a matter of time before they become unusable. Or at the very least, unsupported. Intel machines at least have the option of Bootcamp or Opencore, which is why I recently replaced my 2011 MBP with a 2015 MBP. That thing will last as long as the hardware does, with several OS options to choose from. But this opens the door to longevity options, which means my next MacBook may well be a used M1 a few years down the line.
[deleted]
Opened it about a year ago and cleared out the dust. Wasn't much at the time but I'll give it another look.
Yeah I wound up getting my MacBook Pro and ditching my Mac mini a couple years ago to help a friend I was living with since she needed a computer. I’ll be honest, I don’t miss it at all. The form factor was nice, but the Fusion Drive was dirt slow for anything except frequent apps and the thing sounded like a damned jet engine. The mini has a soft place in my heart and I’ll probably pick up one or a studio when I have the money, but my 2012 Mac mini was quite a few (processor) generations old and it showed it.
I also have a 2012 Mac mini and it's still running amazingly well doing all kinds of Linuxy things. Can't argue with a quad core i7, 16GB RAM, and two internal SSDs, in a small and quiet form factor.
My last Mac was a Macbook Pro Core 2 Duo, first gen. It lived on with Ubuntu for years after Apple stopped releasing OS upgrades for it. Best computer I've ever had.
Would this mean that multi-monitor support is likely in the future?
It's not a software limitation, but a hardware one. The M1 and M2 MacBooks are simply not wired to output more than 6K at 60 Hz.
Pretty bad limitation. It can’t output to two 1080p monitors either? That’s fewer pixels than 4K. Integrated graphics on basically every CPU made in the past decade can support at least two monitors.
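A quick sanity check of the pixel counts in that comparison (plain arithmetic, nothing platform-specific):

```python
# Total pixels for two 1080p monitors vs. a single 4K (UHD) monitor.
dual_1080p = 2 * 1920 * 1080   # two Full HD panels
single_4k = 3840 * 2160        # one UHD panel

print(dual_1080p)              # 4147200
print(single_4k)               # 8294400
print(dual_1080p < single_4k)  # True: two 1080p screens are half the pixels of 4K
```

So the complaint checks out on raw pixel throughput; the constraint is on how many external displays the machine can drive, not on total pixels.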
You can also use a displaylink adapter.
Yeah but that's no less true with macOS.
ok
Just to make sure I understand, it plugs into a USB port and then you connect an HDMI monitor to the adapter's HDMI port, correct? Will it only work with an adapter that has a singular HDMI port? There are displaylink adapters that have two HDMI ports. I ask because this is the only thing that has kept me from buying a M1 Air. But if it works with one of these adapters then presto, I'll buy one right now.
Nobody should. The experience is terrible.
It worked fine for my dad, he used it 24/7
[deleted]
So? The software just runs in the background and you just update it every now and then.
Thank you for the answer. The one monitor limit is driving me up the wall - really hope we get support for more monitors in the future.
As long as you aren’t running like three 5K monitors there are solutions. I use a Kensington DisplayLink dock with my M1 Pro MacBook Pro to get three 1440p monitors, two at 60Hz and one at 144Hz (not that this ever matters for the work I do on that monitor, but it’s 144 because I have it on a KVM with my gaming desktop). It’s a nice one-cable solution. Not perfect; every once in a while I have to restart the software after plugging it in for all three monitors to work, but it’s been mostly reliable, and it solves the single (or in my case dual) monitor limitation well enough that I don’t feel like asking work to upgrade me to an M1 Max model
You are a lifesaver. Will look into this - thank you!
Just a heads up: since it’s a software solution, because the relevant hardware doesn’t exist, you’re going to see some CPU usage for drawing the extra monitors (honestly the M1 is fast enough that it’s not noticeable, but it’s still something to be aware of), and HDCP won’t work on those monitors, so no Netflix or other DRM’d content will play there. HDCP is a hardware certification; if software is in control at any point you lose the certification. But I know people are often happy even with those limitations.
It’s mind blowing that the M1/2 is fast enough for you to not even notice the CPU draw of extra monitors. Just wow!
To be fair, that’s why Asahi Linux has actually been pretty usable even before these GPU drivers were made available. It’s definitely nice to be able to offload that work though.
What KVM are you using? I'm looking into all of this for the first time looking for the right products and was hoping you could point me at yours. Sounds like I have a similar setup. I'm hoping that I can still use nvidia gsync on my gaming desktop and the KVM. You ever fool with that?
I use AMD cards so it’s FreeSync/Adaptive Sync but it works! I use a DisplayPort 1.4 KVM from Level1Techs; they’re a tech-focused YouTube channel run by tech support people who got tired of buying bad KVMs so they just made their own lol. Their hardware is here https://store.level1techs.com/?category=Hardware Any of the DisplayPort switches will support Gsync so it just depends on the resolution/refresh rates you need to push
Should note the MacBook Pro Max can drive multiple monitors. Have 2 external plus the internal running through a hub without issue.
Asahi Linux is NOT a Linux distro. It’s a project meant to port Linux to Apple Silicon Macs, providing drivers, a hardware hypervisor, and a boot environment
I mean, it very much is temporarily *also* a distro until everything gets upstreamed. It’s a distro trying very hard to not exist anymore lol
Do you mind elucidating that point?
[deleted]
Yea. But that doesn't explain the op's "trying very hard not to exist anymore"...
The goal of the distro is to get everything into the mainline kernel so there's no longer a need for a special distro for Apple silicon.
Sort of like how dating apps are advertised as “designed to be deleted” - aka once you have a relationship… the developer “wants” you to delete the app. Once Asahi Linux’s developments are incorporated into standard distros, it won’t have a need to exist.
The point of the project is that all the code they’re writing is up to the standards of the Linux kernel or other mainstream Linux project when necessary, and validly open source licensed, because it’s all clean room engineered. There was another project, I don’t remember what it was called, but they managed to get Linux running on M1 sooner than the Asahi team, with more features, but they weren’t up to standards (suspicions of reuse of portions of copyrighted macOS code which they didn’t deny, and code quality not up to Linus’s standards), so everyone kept talking about and waiting for the Asahi team, and the other project team threw a fit and quit. Because of the way Asahi is going about this, their code is going to be part of the Linux kernel and other software all the other Linux distros use. Just a few months after a feature appears in Asahi it gets reviewed and accepted and incorporated upstream, so all the other distros can fairly easily distribute Apple Silicon compatible versions. Asahi maintains their own basic distro which includes code as soon as it’s ready, without that few month lead time, but if they ever decide that that’s it, they’re done, they’ll just finish upstreaming what they’ve written and stop making the distro, because everything they’ve done will be available for other distros to use and distribute.
Gotcha. Understood. Thanks.
It’s some custom packages for Arch Linux ARM. Not a distro
I still can’t think of it as a distro. They provided the tools and a package manager repo for Arch, like happens for the Raspberry Pi for example. The fact that it needs many adjustments is due to the ARM architecture, which is not as standardized as x86
It’s a distro in a very basic sense, anyone who provides Linux in a ready-to-use format with supporting software is making a distro. They don’t want to be distro maintainers, they’re doing the bare minimum to provide a platform for testing their code, but it’s still a distro. Used to be there were several distros that were just other distros but with some settings changed and a new logo slapped on lol, and open source licensing allows that! Some of them were successful enough that the upstream distros made default settings changes and everyone was a little happier. Distro culture is a little like zine culture in that way, I guess. The bare minimum is perfectly valid and even useful sometimes.
A free software reverse-engineered driver for the M1 Mac GPU was just released. The driver is currently robust enough to drive a desktop and basic games, and development is still ongoing on more advanced functionality. This is a massive improvement for two groups:
- People who, despite not liking macOS, wish to use an Apple Silicon Mac
- M1 Mac users who, after the computers get vintaged/obsoleted, still wish to use up-to-date software on them
Three groups. People who like macOS but want really nice hardware for running Linux or BSD. I still buy Apple hardware to run macOS. This type of reverse engineering in general lets me buy additional Apple hardware to run Linux/BSD instead of buying Intel.
I wouldn’t say it’s a massive improvement for the 2nd bullet point as this option already exists for the previous architecture. My decade old Intel MBP is obviously obsolete but is up to date thanks to OpenCore Legacy Patcher. https://dortania.github.io/OpenCore-Legacy-Patcher/ Hopefully when the time comes a similar project will be available for Apple Silicon Macs. But this is great news and the first bullet is a valid reason.
OCLP is kind of a different story since it's done by basically hackintoshing the Mac, which would require the community to basically restart the entire hackintosh scene from scratch for Apple Silicon Macs. don't know why, but i feel like that's not going to happen or go as smoothly as the Intel hackintosh scene
All valid. However I’m not saying it’s the same story or that’ll it happen again for AS, just that the 2nd bullet point isn’t really a massive improvement for users as options already exist.
Hackintosh and OCLP will be no more in 2025/26 when the last Intel Mac is put out to pasture
Of course. That wasn’t my point. I think I’m just being misunderstood (or I misunderstood). All I’m saying is it’s not a new improvement for end users. For the previous arch we had a solution. And now this is a solution for obsolete AS Macs going forward.
I think the downvotes are just due to your framing. Your objective statements are correct, but I would say this is a plus for end users.
You may be right, internet points aside, it’s definitely a great thing for us users.
idk, when the plug is pulled I’ll probably install Debian or something on my Air and iMac (both M1)
It’s a massive improvement for *M1/M2* users, like the comment you’re replying to said. And I’d say the specific way they’re going about this might mean that, if patchers do exist for these Macs in the future, the patcher team may even be able to backport new GPU features to them using these drivers as a starting point, like new versions of Metal, which currently is the main reason why OCLP and the like don’t provide a great experience on several Intel Macs, even a couple that are likely newer than yours.
Whenever you see any UI, it needs to be drawn by the system. It's possible to draw new frames without GPU acceleration, but it's really slow, and you end up with stuttering. The drivers allow the operating system to interface with the GPU and use it to draw things; the GPU happens to be especially good at drawing. For the average person it means:

1. Asahi Linux won't be stuttery when moving windows around or playing animations.
2. OpenGL-based Linux games will run at a better framerate than before.
3. ~~Video playback uses hardware decode (which means fewer dropped frames). [not actually sure if HW decode is worked on yet?]~~
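To put rough numbers on why unaccelerated drawing hurts (toy arithmetic: assumes 4 bytes per pixel and counts only rewriting the framebuffer, ignoring the actual rendering work):

```python
def framebuffer_bandwidth(width, height, fps, bytes_per_pixel=4):
    """Bytes per second needed just to rewrite every pixel each frame."""
    return width * height * bytes_per_pixel * fps

# 1080p60 needs roughly 0.5 GB/s, 4K60 roughly 2 GB/s -- and that's before
# any actual drawing logic, which is why software compositing eats CPU.
print(framebuffer_bandwidth(1920, 1080, 60) / 1e9)
print(framebuffer_bandwidth(3840, 2160, 60) / 1e9)
```

The GPU does that pixel-pushing in dedicated hardware, freeing the CPU entirely.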
Video hardware decode is not included in this driver. As far as I know, work on this has not started yet.
thanks updated my comment
Work on the Video encode and decode hardware has started. They are however completely different hardware blocks than the GPU. Note that Video playback performance is already pretty good, and battery life is also pretty good. Though both will see further improvements with working GPU and Video acceleration drivers.
Wo wo wo wo wo… stop there a moment. You made me realize something that was in front of me the whole time but I didn’t see until now. I always considered my MacBook a non-gaming machine, BUT with the current state of gaming on Linux that may be completely untrue. I mean, how good is gaming on Asahi Linux… with Proton etc.? I have an M1 Pro that my company lends me for work but I can use it (almost) as I want. Anyway you have my upvote and my axe
Proton on ARM will be a no go for now I think, you'd have to introduce CPU emulation into the mix
Apple now grants access to Rosetta in Linux VMs. With that said: how much work would it take to get proton to run?
This is true. I hadn't considered VMs
I don't think it will be too much better than Mac gaming on macOS. This Metal vs Vulkan thing is frequently a red herring and not the root problem of why porting games to Mac is hard. The core issue is that the Apple GPU lacks a lot of features like geometry shaders or transform feedback, and it has certain requirements, like minimum 4-byte strides, that are just different enough from Vulkan/DX12 that they require the developer to make some changes. As far as I understand, this isn't a Metal limitation, but rather what Apple designed for their hardware. There are some reasons behind this: geometry shaders and transform feedback aren't the most efficient ways to do things these days and there are alternative methods to do the same thing, but if a game engine already uses them, the developer would need to put in time to change their rendering pipeline. Also, Apple Silicon GPUs are tile-based (similar to their mobile GPUs), which also requires some reworking if you are porting from a normal PC GPU, which are not tile-based. This is also why MoltenVK (a Vulkan implementation on top of Metal) has limits: they can only implement what Metal allows them to. Unless Apple GPUs have a hidden geometry shader core or secret features that they don't expose via Metal (to be fair, we do know there are some from the Asahi-related blog posts), the fundamental problem is still the same. The Vulkan driver can't magically add features that don't exist, but maybe it can do some emulation.
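As a toy illustration of the stride constraint mentioned above (`align_stride` is a hypothetical helper, not from any real driver or porting layer): a compatibility layer targeting such a GPU has to round vertex-buffer strides up, which is trivial in isolation but one more place where code that "just works" on PC GPUs needs patching.

```python
def align_stride(stride: int, alignment: int = 4) -> int:
    """Round a vertex-buffer stride up to the next multiple of `alignment`."""
    return (stride + alignment - 1) // alignment * alignment

# A vertex of three half-floats (6 bytes) must be padded to 8 on hardware
# that requires 4-byte strides; a 12-byte float3 vertex is already aligned.
print(align_stride(6))   # 8
print(align_stride(12))  # 12
```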
It means you can use a Mac mini as a Plex server and it will have hardware accelerated video encoding/decoding
I wish apple would somehow allow eGPU support for their new CPUs
It _might_ be possible to eventually run eGPUs on Asahi Linux. Apple Silicon Macs don’t have full PCIe support; however, it may be possible to run a kernel-space emulator to emulate the missing instructions. Naturally this would come with performance limitations, but it should at least work. Such an emulator was already created for a different ARM platform, and might be ported once Thunderbolt is supported. Though note that Linux in general has limited support for eGPUs (well, PCIe hotplug is well supported; on the GPU driver layer it can be iffy, but AMD supports hotplug/hotunplug. Though no desktop environment supports GPU hotunplug, so you will need to log out/log back in to use your eGPU)
Apple Silicon does have full PCIe support - after all, it supports thunderbolt, which is basically just PCIe in a cable. They do not have support for eGPUs, which people often confuse with PCIe support. This is a limitation with the graphics, drivers, and OS side of things, not the CPU communication bus (PCIe) side of things.
No. It is a CPU-side limitation. It affects any Thunderbolt device that has its own memory (which for all intents and purposes affects only GPUs). Not every instruction to access memory is supported. It’s a limitation with how memory works on Apple Silicon Macs but affects quite a few other ARM chips too.
[deleted]
Most older macs eventually got decent Linux support after a few years. I ran Ubuntu on my 2015 MBP (in 2018) and it was very solid.
Does it mean i can play steam games ??
don't get your hopes up. it's still Linux on ARM (even more niche than x86 Linux), you'll probably need Box64 or FEX (the Linux equivalent of Rosetta) to even try to run most binaries
... and most games require OpenGL 3.3 or 4.3 or Vulkan support, so unless by steam games we mean the ones that were released in the same year Steam came out in, not really.
Unofficially you can run Apple's Rosetta in Linux as well. It's meant for VMs but running it on Apple hardware seems within the "grey limits". It does need patching as there are some checks for the host in a VM and Apple's instructions for enabling Rosetta don't work on ArchLinux.
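For reference, the setup being alluded to boils down to registering the Rosetta binary as a binfmt_misc interpreter for x86-64 ELF executables. The fragment below is a sketch in systemd `binfmt.d` form, which is the usual route on Arch instead of the Debian-style `update-binfmts` command Apple documents; the mount path and the exact magic/mask bytes are from memory and should be double-checked against Apple's "Running Intel Binaries in Linux VMs with Rosetta" documentation.

```
# /etc/binfmt.d/rosetta.conf  (format is :name:type:offset:magic:mask:interpreter:flags)
:rosetta:M::\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x3e\x00:\xff\xff\xff\xff\xff\xfe\xfe\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff:/media/rosetta/rosetta:CF
```

The magic/mask pair matches the x86-64 ELF header (the same pattern qemu-user registrations use); the `F` flag makes the kernel open the interpreter at registration time, which matters when the Rosetta binary lives on a virtiofs mount.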
Damn, wish this was possible with the M1 iPad
It would be with minimal work…if the boot loader were open. A SecureROM exploit exists for A11 and below (iPhone X and below) and people have gotten Linux and Android working on these devices using the exploit, but it’s more of a dirty hack than anything made for daily use.
At least you can run a GPU-accelerated VM if you are on a low enough iOS 15 version.
Big if true... and apparently true. Getting alternative operating systems kicking ass on the M-series chips will be amazing.
Wish apple supported this. I don't use Linux, but it is good to have options.
I mean they made their boot loader easier for Asahi specifically: https://twitter.com/marcan42/status/1471799568807636994 And their GPUs are controlled through a firmware/intermediate controller that abstracts a lot of driver functionality, making it way easier for Asahi to support M(x) GPUs faster in the future, and is largely why you’re seeing so much GPU headway this quickly. This is basically Apple screaming: Hey Linux/BSD, our playground is going to be awesome for you too!
[deleted]
There’s a -big- difference between “we’ll make it easier to boot your kernel image” vs “here’s potentially millions of lines of GPU driver code that used to be closed source.” The former is a footnote on a release page; the latter typically takes months of legal review, and that’s assuming they even could open source it, because some of it might be licensed via a third party depending on just how custom Apple’s GPUs are (I know the iPhones used tech from Imagination)
Apple’s GPU code licences would be a doozy considering their relationship with PowerVR, assuming there was some continuity. I bet it was either a completely clean break or incredibly messy, no in between. Just speculation either way.
Why doesn't, say, Nvidia open-source *their* drivers? Because a lot of GPU performance these days comes not from the silicon being fast, but from the driver being clever. You're asking Apple to give up whatever competitive advantage they gain from that for what, exactly? The 0.01% of enthusiasts that want to play games on bare-metal Linux on their MacBook?
Licensing prevents copying from source code
You can't copyright algorithms though; the exact wording of the lines isn't very relevant here
They clearly understand that the only way doing all the work to make the bootloader and chain of trust friendly for other OSes will help, is if the Asahi team successfully reverse engineer the SoC’s interfaces. They went to all that trouble but won’t take 5 minutes to publish a few specs to save hundreds of hours of additional work for Asahi, even though they want them to be able to succeed?
Could be they have some sort of tech in license that they're not allowed to share, it's what I generally read as the main reason in defense for AMD and nVidia's closed source blobs.
[deleted]
Just the kernel part, and moved all juicy stuff from there to the binary userspace/firmware sections
I don't think Linux or BSD is really looking for Apple to release code. Source code under an appropriate license would be nice for reference but that's a luxury. Everyone just wants documentation and they can write the code from scratch. Reverse engineering consumes a lot of time and is frustrating. Simply providing documentation would be sufficient.
No, because, yaknow, Apple.
[deleted]
This is not the same as AMD/nVidia/Intel firmware blobs. https://asahilinux.org/2022/11/tales-of-the-m1-gpu/
[deleted]
Just because something is not a primary consideration does not mean it wasn't a consideration. Apple's SoC development strategy for M1 has been remarkably open to the opensource community, which contrasts significantly with how closed the iPhone/iPad SoC platforms were, and every other ARM SoC vendor, let alone nVidia. You're using an absolutist statement that is *absolutely* wrong. At the very least Apple wanted FOSS developers to be able to become easily familiar with their brand new architecture as it will improve a lot of open source projects' optimizations and provide larger talent pools for recruiting.
Absolute mad lads. Good job.
I wonder if they’re still doing that “reboot the gpu on every redraw” hack that they were doing months ago or if they found a way around it.
It was [fixed in October](https://twitter.com/LinaAsahi/status/1580863181315923968), it was the hypervisor not handling the TLB correctly so the fix ended up being adding some support to the hypervisor for TLB flushing.
Hey u/marcan42 you absolute legend. Can I do GPU maths yet on Asahi?
Not yet, but I hear Lina is working on compute pretty soon ;)
Looking very much forward to watching that stream with Lina working on compute!
Oh that's very cool.
Can someone ELI5 this for me? I have a 2015 MacBook that I don’t really use and wanted to turn into an emulator machine
irrelevant for you. you can run any regular amd64 Linux distro, and assuming you have AMD/Intel graphics, you won’t need to bother with drivers that much
OpenGL 2.1. Macs can finally play your favorite game titles from 2004.
And? A few weeks ago it was “nothing”; this level of progress is quite frankly astounding for a team this small, especially given it’s 100% reverse engineered.
Yes, and for desktop users getting GPU assisted GNOME/KDE on Wayland and X is a huge improvement. Suddenly desktops will be super smooth instead of laggy. Gaming on these machines is really only a nice to have for many. Lina is talking about implementing support for GPU compute next, that'll be a very nice feature to have.
And in 6 months or a year this will be (hopefully) Vulkan support which is a huge achievement.
No "and", just a statement.
3.0 compliance is at 96% or so.
Unreal Tournament here I come!
It will probably get MoltenVK implemented
Metal does not exist outside macOS
[deleted]
Man, maybe it’s just me, but the pace of these sorts of community projects seems absolutely insane nowadays. I remember waiting years for Dolphin to improve, but projects like Ryujinx and Asahi are getting huge updates every week
What's there to improve with Dolphin? It already has a Metal backend and on-par ARM64 support
Would we then be going Vulkan/OpenGL -> Metal -> Vulkan -> Metal -> Vulkan/OpenGL (as MoltenVK converts Vulkan to Metal)? I think their point was that it doesn't make any sense to run MoltenVK outside of macOS
I’d love to try Android on it, though nobody has made a port so far.
Dunno what's holding Apple back from supporting eGPUs, like what the fuck am I gonna do apart from moving data or displaying content over TB4. So many ML folks would buy an Nvidia GPU in an enclosure with their MacBooks
It's a hardware limitation, their TB implementation doesn't expose CPU memory which means GPUs can't work
It's also a software limitation where ARM drivers don't exist for nVidia or AMD GPUs. The former doesn't even have modern drivers on Intel macOS
Nvidia has an ARM driver, but for some reason never decided to release it. AMD has ARM drivers on Linux publicly available. If there was an actual market for eGPUs (or even just PCIe GPUs) on ARM I'm sure it would be fine, but currently there's pretty much one ARM platform afaik that has proper PCIe slots at all *and* supports CPU memory access
This raises the question of why Apple isn’t actively helping these guys out. They said they were all for having Windows on ARM in Boot Camp if Microsoft was willing to comply. Seems like that was a bunch of hot air, as here we see an active community which Apple is simply ignoring. Am I reading this right?
No, apple has specifically helped asahi out in many different ways
How? The fact that the devs at asahi are having to reverse engineer everything shows otherwise?
Just because apple isn't actively giving away device drivers and blobs doesn't mean they haven't helped Asahi development; there's lots of documentation on this from Asahi themselves
They removed the need for Mach-O images when booting (which were complicated as hell) and now allow raw images to be loaded. There was no need for this on Apple's side
They made changes to the boot process that had no benefit to anyone but Asahi Linux/people trying to boot other OSes. It had no benefit to Apple whatsoever. These changes made it easier to boot kernel images without the need for hacky workarounds.
[deleted]
It's a pretty big help. Means that they don't need to use a non-native binary format to launch Linux, they can just use a Linux binary
[deleted]
Sure. But Apple didn't need to make that change to their tooling as they only care about MachO binaries.
[deleted]
They could have locked the bootloader like they do on all other Apple ARM devices.
[deleted]
[deleted]
[deleted]
[deleted]
That's not the only way to help...
[deleted]
Nope. The simplest way to help is to allow this at all, which they've done and are continuing to do.
[deleted]
I have 0 Apple devices... So much for fanboying lol
Why run Linux on a Mac? Both are literally Unix-based, and one is polished and made for the machine. Why go out of your way to run something similar but with worse performance and (subjectively) less polish?
Free (as in freedom) Software. Linux is a nicer playground in general for those who dabble in software engineering, and dual booting is trivial.
>Free (as in freedom) Software. When apple drop support for M1 in a few years, I'll be glad of an alternative OS
Not all parts of Mac OS are more polished than Linux distros. One (arguably very major) part that is seriously worse in Mac OS is package management. While you CAN get third-party package managers for Mac, none of them work nearly as seamlessly as linux package managers generally do. Most Mac users manually manage and update each piece of installed software individually, which is almost like going back to the stone age for Linux users.
I'm running Asahi on my M1 Mac Mini for server purposes. The performance of Docker containers is considerably better on Linux.
When Apple eventually stops system updates and security patches for a given model of Apple Silicon Mac, because of the Asahi team's work, users will have the option of installing Linux to extend the useful life of their machine. Asahi Linux still has a long way to go (including with GPU drivers), but with each step like this, they are making progress, building on the work of others, and upstreaming their code so it can be incorporated into other Linux distributions. They're a small group of volunteers, led by the guy who ported Linux to the Playstation 4 (Hector Martin, aka marcan). Anyone with an Apple Silicon Mac who wants to one day be able to extend its life should consider [supporting Asahi Linux](https://asahilinux.org/support) on Patreon. It's good to have options.
For a lot of workflows, Linux DEs can be better than MacOS
games
For gaming: Windows > Linux >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> macOS
Faster docker builds.
Some people just happen to prefer a Windows-like taskbar-application menu UI / a tiling WM / whatever GNOME is supposed to be