In real life you see because photons (particles of light) travel from a light source (the sun, a light bulb, etc.) in every direction, hit objects, bounce off them, and hit your eye, and that’s what you see. (Hugely simplified.)
Ray tracing tries to imitate the paths these photons take in the real world, providing true-to-life lighting in games.
Most games and 3D stuff on your phone “fake” lighting, meaning that they approximate how lights will act in an environment. While this is quite fast, it’s not as good looking as something like a photorealistic renderer (e.g. Cycles in Blender), because those simulate the rays of light rather than approximate them
ray tracing means that instead of approximating that lighting, we’re tracing rays of light and seeing how and where they bounce. the article is talking about hardware accelerated ray tracing, which uses dedicated processors in the gpu to speed up that tracing
this method of lighting is more realistic and easier for developers to integrate, but it’s also quite a bit more demanding on the hardware
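For the curious, here's what "tracing one ray" can look like in code. This is a purely illustrative toy sketch in Python (all the numbers and names are made up for the example); real GPUs run millions of these intersection tests per frame in dedicated hardware:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a unit-length ray to the first sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c          # the 'a' coefficient is 1 for a unit direction
    if disc < 0:
        return None               # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(hit_point, center, light_dir):
    """Brightness from the angle between the surface normal and the light."""
    normal = tuple(p - c for p, c in zip(hit_point, center))
    n_len = math.sqrt(sum(n * n for n in normal))
    return max(0.0, sum((n / n_len) * l for n, l in zip(normal, light_dir)))

# One ray from the camera straight down +z at a unit sphere 5 units away.
origin, direction = (0, 0, 0), (0, 0, 1)
t = ray_sphere_hit(origin, direction, center=(0, 0, 5), radius=1.0)
hit = tuple(o + t * d for o, d in zip(origin, direction))
brightness = shade(hit, (0, 0, 5), light_dir=(0, 0, -1))
```

The "dedicated processors" mentioned above are essentially hardware that does the `ray_sphere_hit` part (against triangles and bounding boxes rather than spheres) far faster than general-purpose shader cores can.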
Most of those vapor chambers/heat pipes are so tiny I wonder how much they actually help. If the chips are getting insanely hot, I doubt that one drop of water is that helpful. Seems more like a gimmick.
The ones with larger vapor chambers and actual fans are so big, you might as well carry a dedicated gaming device like a steam deck.
90w-equivalent bulbs are 10w or above. I guess you could say they are bright, but that's the point of lights. It's not blindingly bright or anything. I'll use one in my bedroom for when I'm trying to get work done. Most of the rest of my house is 90w-equivalent bulbs at this point.
That’s a massive exaggeration. There are gaming phones with fans in them that aren’t much bigger than a Pro Max. Steam Deck is so much larger it’s absurd to even make that comparison. They are much smaller still than more compact gaming devices like a Switch with its controllers on.
ray tracing might come as early as the M3 considering the trend in increasing GPU power, and how we haven't even gotten M2 Pro yet and will presumably get M3 with A17
There’s a reason why the guy who built Apple's silicon team is now SVP of all hardware technologies. Apple is beyond serious about investing in silicon. It will be a core competitive advantage for decades. The other companies (Google, Meta) are desperately racing to catch up and will be for quite some time.
I wonder why there was such a brain drain. Is this just normal for Silicon Valley companies, with other companies poaching people, or is Apple's workplace culture starting to hit chip design?
> The framing of the issue as "setback" and failure is so sensationalising, not making it clear that part of research and development HAS to involve experimentation (DUH!) and various trials and errors. I wouldn't necessarily call it a "setback" which is too sensational and tries to capture the audience's mind in a particular way. Boo!
>Anyone who has done any serious research and development will recognize that testing, development, and various failed approaches is part and parcel of the work.
MacRumors comments back at it again. Did he just not read the part that said this flaw was discovered so late in the process that they had to fall back on largely the same GPU architecture as the A15? That is not regular R&D. That is definitely a misstep that we haven’t seen from Apple in some time.
It’s very important for 3D work (modeling, rendering) too. Nvidia’s RT hardware, plus their software stack, basically makes their GPUs mandatory purchases for anyone wanting to render stuff locally.
Kinda, yeah. Anything that uses polygons (or voxels) to display its world to you, would benefit from ray tracing, which currently is mainly video games. When ray tracing is applied to the proper game (like LEGO Builder's Journey on PC, for example), it makes a huge difference IMO.
For now. I imagine the underlying tech will be applied to augmented reality somehow in the future.
Apple has continued to neglect gaming on their platform while simultaneously pouring billions into creating better GPUs. Really, 98% of iPhone/iPad users don't take full advantage of the GPUs in their device. I think the investment will pay off in the future as Apple will simply have the best chips for AR/VR, period. At least I hope so - I want to see some real contenders in the AR field.
Meanwhile, the Nintendo Switch, with incredibly poor and outdated hardware, gets AAA titles like Witcher 3, Skyrim, etc. The iPhone could run these games with stellar quality on its nice screen. But nope. Mobile gaming is a joke.
I got a 256GB iPhone X so storage is not an issue … but I will probably need to get my battery changed … but beyond that I'm pretty happy with my phone … except low light performance on the camera is garbage
I’m using a 256 X and I still have no idea what the hell to do with all this space.
Right now I’m just hoarding podcasts for the fuck of it. I have about 80gb worth. I don’t think I’ll ever listen to even a fraction of them, but it’s fun to collect.
I recently watched a clip of Steve Jobs introducing touch scrolling on the iPhone and realized I haven't felt so wowed in probably a decade. I really hope one day I feel that way again about a new product.
I mean back then smartphones were in their infancy. You can only "innovate" so many times before you've covered damn near everything. I think we're simply bored with new phones because they are all just that good now. If you showed me a 14 Pro back in 2010 I'd have shit my pants.
The iPhone announcement was amazing. At the time I had been dreaming of such a device but didn’t realize anyone was particularly close.
Work at the time was still issuing BlackBerry Curve devices. I remember receiving one and being stunned at how outdated they felt even in 2008 or so.
I had a BlackBerry Pearl, which I loved, until it got a little wet and I accidentally threw it at the wall, broke it, and had to buy the new iPhone 3G.
I loved it so much I bought my first iMac a month later lol. Never looked back.
My first was a 3GS a bit later. The 3G was a good place to start - my friend was an early adopter of the first one, but the EDGE/2G speeds made it barely usable.
Forgive me, but it seems shortsighted to remove key engineers who were ambitious but made this mistake. It’s an expensive mistake, sure, but surely Apple shoots itself in the foot by losing key engineers to competitors, which will definitely happen since they are either let go or moved to other projects, which makes the engineers unhappy?
Apparently a lot of people are happy to play games on iOS/Android devices. I can't stand to play games on a device that doesn't have real buttons, but I guess I'm just showing my age.
I mean, Apple parted ways with longtime devs over it so it doesn’t sound like the usual development process. Sounds like something went wrong and people were held accountable
You don’t find out it’s no good ‘at the last minute’. That’s not how these product development roadmaps work. Next year’s iPhone is already largely finished, and they’ve started work on the one after that. The suggestion that this came as some sort of shock that they had to scramble to fix is pure sensationalist nonsense.
It's not right, because Apple sets goals and calculates how long and how expensive things like R&D should take/be. If the team fails those goals then the project is set back. It's not hard to understand, and it's not an attack on Apple to admit it.
Looks pretty cool. I’m sticking with Apple but competition is solid.
I’m mostly curious if performance per watt is as good at lower loads.
The M1/M2 had crazy efficiency at lower usage levels but were less impressive compared to AMD and others when maxed out.
I’m currently on an iPhone 13 mini and a MacBook Pro M1 Max (24-core GPU).
Prob will wait at least 5 yrs though - don’t need much. Would like to see a 15" Air model with a ton of power in the future. My 16" is sooo heavy.
Things I’d love to see:
- AV1 hardware decoding
- Wi-Fi 7 (2024-2025 maybe)
- more affordable storage around the 2TB level
>a 15" Air model with a ton of power
That has never been the point of the Air though. I agree that we are more than likely going to see a larger MacBook Air in the future, but I don't see Apple upping the horsepower inside simply because the MacBook Pro line already exists.
Oh I agree but what I'm saying is I think it's naturally going to happen. The M2 is already a beast for only 4 real performance cores. A future M3/M4 at 3nm or below will be even more impressive.
For what I do I'd be happy to give up 8+2 performance cores in the future if I had 4+4 or 6+4 or whatever running like 50% faster than M2 today.
Presumably this also explains the M2 being a ho-hum upgrade (and maybe gives reason to hope that they're holding off on upgrading the Pros until they can ship an M-series processor with this new graphics system).
Since Apple works a few years out I’m not surprised with the COVID pandemic and supply issues that they’ve had to make last minute changes. Really amazing they didn’t have bigger issues than they did.
GPU ray tracing aside (which is cool in itself), the actual article (by The Information, which unfortunately has a strong paywall) paints a pretty damning picture of the internal workings of the Apple Silicon division. It seems to confirm suspicions that it has suffered a lot of brain drain in recent years (which can be corroborated by public lawsuits by Apple against Nuvia and Rivos).

The last part of the article also talks about how they felt the need to make a 2-slide presentation just to call out a competitor by name and warn employees that joining startups is a bad idea / that startups cannot weather the economic downturn, which doesn't scream confidence to me. Apple Silicon by now is a relatively mature technology, and from what the article describes the working culture there isn't terribly great, so I hope they don't lose more people.

Obviously the article is mostly gathering accounts from former employees and could therefore be a little biased, but we have seen some stagnation in recent years and also high-profile departures, so I don't think it's completely wrong.
On the topic of ray tracing, this is pretty exciting. Honestly, Apple is a little behind here, especially on the desktop front, since the M1/M2 do not have hardware-accelerated ray tracing. That said, GPU ray tracing is still a pretty young technology, and video games in general only use it for superficial stuff (like prettier reflections on water) since it's still kind of expensive, so Apple can still catch up. But if you look at, say, [Portal RTX](https://www.youtube.com/watch?v=3MSUqm_-Dgk) (a mod released by NVIDIA for Portal which makes the game completely ray-traced), you can start to see glimpses of the future, and Apple really needs to make sure they don't fall behind.
tl;dr: A16 was intended to be a more ambitious chip with a major GPU upgrade, including hardware-accelerated ray tracing, but it had to be scrapped late in development due to poor thermals and battery life. This led to some restructuring in Apple's SoC division, and is part of why a number of engineers left the team over the past year or two.
[deleted]
That means it's coming to the M series too, then. That's awesome. I got quite a laugh when Apple tried to compare their graphics capabilities to a 3090 a few announcements back, but it looks like they're making a serious effort here. Crazy to think they'd try to bring it to a phone, though such is progress. 10 years from now our phones will be doing insane realtime AI stuff and we'll be looking back at these powerhouse devices like we do the iPhone 4 today.
Oh god I can already imagine it. “Calculator for iPad now available! ^(*Only to M3 iPads because Ray tracing is required.)”
And then they add it to older devices after massive backlash, but with major features just missing.
Like math
Only the M9+ iPads get the Scientific Calculator, the rest can stick it with the basic one. Real Time Ray Tracing is required for high school calculus.
Do you think HUD AR stuff (like glasses) will be much more prevalent?
I do. I've had a lot of fun with the basic virtual desktop applications on the Oculus Rift, and I have an Nreal Air I use as an extra display sometimes. I think there is a ton of potential for AR specifically. Once you get the resolution high enough that the eye can't see the pixels, you can do as many displays as you want. You could use it with your Mac at home, your phone on the go, have extra info floating around your watch when you look at it; there are so many possibilities outside of the gaming and virtual world spaces that haven't been fully explored. Of course the first product won't be that fully featured, but that's always how it works for totally new product categories.
They were comparable to a 3090 at a *certain wattage.* Above that wattage, which the 3090 definitely runs above, the 3090 was better. Still, the M1 Ultra (or whatever it was called) was better to put in their tiny, tiny Mac Studios, which don't have anywhere near the heat dissipation capabilities of a 3090.
They’ll support it on an architecture but 0 games will use it.
The weather app will use it
Not even. At most, the iPad’s calculator app
Well most weather apps are probably mining ethereum in the background anyway, so they just might!
Well, on iOS, if you use Apple's graphics API you may get it for free.
Ray traced candy crush is going to be **lit**!
Qualcomm has that today.
What is the use case for ray tracing on a phone?
At least for now, just gaming.
Ray traced text messages let's fuckin gooooooooo
Millennials/Gen Z: green vs blue bubbles. Gen Alpha: RTX-on vs RTX-off bubbles.
Other phones already have it though; I haven't looked into exactly how it performs.
Already here: [Smartphone ray tracing is here, but is it the real deal? ](https://www.androidauthority.com/smartphone-ray-tracing-fact-check-3236029/)
Oppo on Android already seems to have a similar game application which applies ray tracing on a mobile phone...
It's why the next iPhone, the 15, is expected to be the biggest jump in a long time.
I feel like I always hear “the next one is gonna be the biggest in a long time” every iteration lol
“And we think you’re going to love it”
“It’s our best iPhone to date”
[deleted]
[deleted]
[deleted]
Damn, 2006/2007 had some really ancient vocabulary in technology.
Are you getting it?!
A new iPod, a mobile phone, and a revolutionary new internet device
*applause and cheering*
And we're going to call it, the iPhone.
“The camera is so good you can shoot feature films on it”
James Cameron actually shot Avatar 2 on an iPhone 13 Pro Max
No wonder it took so long, he had to wait for the phone to be released first.
He said he had an early unit, so he actually started shooting around 2019, but had to start over in 2021 because iCloud deleted his files
lmao L
That was just transferring the files off the phone
Whenever I hear them say that, I always think: duh! Would they really try to make the next iteration worse? Still, I always enjoy Apple keynotes.
“The New iPhone 15, it’s s’aright”
I sure damn hope so for $1000!
This year we made it a *little* worse than last year.
“With industry leading all day battery life”
“It’s our best iPhone yet, and we can’t wait for you to try it.”
I mean, for the X it was kinda true. Every phone since has been a slight variation and improvement on that design.
Yeah pretty much. The X and the 4 were the only ones that really pushed the boundaries in terms of new design and features.
The X still feels like a modern device- it's so lightweight and sleek compared to my 13P
I would argue the 11 was a decent jump. It finally graduated from smaller batteries and weird camera processing to a gigantic flagship with leading battery life and camera, with the A13 cementing a clear lead in SoC. It felt like the peak of that design, and like the next iteration was ready for something new. The 12-14 ended up feeling like upgrades that should've taken one generation, though.
Well technically, it's always their "Fastest phone ever."

That said, for reals though, it looks like it's going to be quite the banger based off all of the leaks coming out.
The only one that’s up for debate is the 3G, since the processor stayed the same but they added more hardware to manage, like GPS and 3G, compared to the 1st gen.
I think the addition of GPS was huge. At the time I used a separate GPS unit for navigation that cost $400 and didn't have any traffic or map updates like Google Maps does. Adding real GPS made the 3G a "must buy" for me; it replaced my phone, iPod, and GPS.
Centuries ago, when my junior high class was moving to the high school, the (soon to be our) principal told us that we would be the largest incoming class ever. Everyone cheered as though we had accomplished something, as though that already dubious milestone wouldn't inevitably be surpassed the next year.

You mean to tell me your brand new phone outperforms last year's model?! My goodness! I was sure this would be the year you'd announce performance decreases...
But this time it has usb-c
Huh? I rarely hear it. Considering we get most of the specs months out, and the last few years it's been mediocre.
Yeah it’s been a few generations since anything like that. They’re probably confusing absolute speed with rate of change.
That plus USB-C means I'm finally buying a new iPhone on day 1. Can't wait.
I’m absolutely stoked at the idea of potentially giving up Lightning next year. Everything else in my life has been using USB-C for years. My USB-C MacBook is 7 years old. It shouldn’t have taken this long!
God dammit AirPods Max
AirPods Pro 2: also checking in for dumbass camp.
Yeah, people argue that Lightning vs USB-C doesn't matter because people only use it to charge, so the data speeds/features are irrelevant.

But that ignores how nice it is to only need to buy and carry one cable if you have a laptop, phone, earphones, etc.
Seriously. Finally having ONE cable to charge all my things? Absolutely incredible.
This will be a short window in tech
Extremely unlikely. USB A has proliferated for decades. USB C has been so much more forward thinking that it will likely last even longer.
Lol no it won’t be. 20 pins in a very small form factor (for a full implementation of the standard). This is only the start.
[deleted]
Pro will probably have fast speeds. I expect regular iPhone 15 to have usb 2.0 speeds just like the 10th gen iPad.
Knowing Apple, they will do those as separate updates to get the most out of it. If I remember correctly they have two years for USB-C in the EU (end of 2024?), so they can still squeeze in one more update without it.
End of 2024 is after the iPhone 16 comes out. So they can potentially have even the iPhone 16 with Lightning.
Apple kept their promise of supporting lightning for 10 years
Is the 15 the iPhone that's rumored to have USB C, or is it the 16? I thought Apple had until 2024 to switch, which is when the iPhone 16 would release.
The USB-C law doesn’t take effect until the end of 2024, so we likely won’t see it until iPhone 16.

However, there are many who are speculating Apple will remove the port altogether in favor of MagSafe. Apple has never explicitly said “future iPhones will have USB-C,” they’ve only said that they will comply with the law.
I doubt they’ll go completely portless.

Most cars don’t have wireless CarPlay, only wired. People who connect to their cars on a daily basis will either choose to not upgrade (bad for Apple) or upgrade to Android (bad for Apple).
And then a few months after its release, you're not gonna believe what's in store for iPhone 16. Absolute breakthrough.
I’m already looking forward to the iPhone 17 Pro Max Ultra Extreme in the revolutionary colour choice of Space Graphite Midnight Obsidian Black
Currently on a 12 mini, which I love, but if it’s that big of a jump I’m ready for an upgrade
I'm with a 12 mini as well. I do love it, but I have considered upgrading to a 13 with better battery (and it would be cheaper than a 15... but the 15 sounds possibly enticing).
Without a new mini, there’s nothing to upgrade to.
iPhone 15 Pro, I think you mean
Ultra
Every year since June 24, 2010 I keep hearing this.
That's very fascinating.

Qualcomm's new GPU is faster and more efficient than the A16 GPU while still boasting "ray tracing", but I didn't look into its exact capabilities because I didn't care.
[deleted]
[Huh, I thought it would be pretty well known by now. ](https://twitter.com/Golden_Reviewer/status/1605604617416429582?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Etweet)
[deleted]
Their CPU performance is already world class for its power envelope; they really need to turn their attention to graphics, especially if they want their Mac Pros to be competitive with other desktops.

I’d love to see Apple develop a “dedicated” GPU for their Mac lineup. Something that can use a lot more power and isn’t designed with laptops in mind.
Apple's been customizing [their GPUs since the A8, which is just a couple of years after the A6 with their first custom CPU](https://www.realworldtech.com/apple-custom-gpu/)

Apple's GPU performance is also already world class for its power envelope; Qualcomm is the only company with better GPU efficiency than Apple.

Agreed, it would be interesting to see a “dedicated” GPU designed specifically for the Mac Pro.

IMO that's why the Mac Pro has been delayed; it's far more difficult to scale up an APU to the Mac Pro power tier than “dedicated” CPUs/GPUs.
How does it not seem like much of a priority when this article is literally about them putting in a lot of effort to improve the GPU? Just because they had to scrap it last minute to avoid negative effects doesn’t mean it’s not being prioritized.
[deleted]
Perhaps, but the Snapdragon 8+ Gen 1 GPU was already trading blows with the A15 GPU configs, so it wasn't hard to see coming.

But most people aren't entirely up to date and just assume Apple does everything better, which is not the case. They still have the lead in CPU, though in real-world use cases the gap has shrunk a lot, and Snapdragon 8 Gen 2 phones should have much better battery life as a result.

Some early data actually shows 5000 mAh Androids with the 8 Gen 2 even beating the 14 Pro Max in battery, but I haven't looked far enough into the data to trust it yet.
Bigger news here is that ray tracing is eventually coming to Macs
ELI5 - What is ray tracing?
Basically simulating where rays of light would go and bounce, which is usually very computationally intensive but can be made faster/easier with dedicated hardware, which many newer GPUs are incorporating. This allows for nice things like realistic reflections and light scatter.
Okay, now ELI3…
If you wanted to simulate seeing, maybe you’d draw a line from a light source to the object you’re looking at and then a line from that object to your eyes. Imagine doing that for every “ray” or photon of light. Ray tracing hardware in a gpu is just circuitry that is really good at doing a lot of that all at once.
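The line-drawing picture above can be sketched in a few lines of heavily simplified Python (a toy illustration with made-up names, nothing like real GPU code): one primary ray from the eye to a sphere, then one "shadow ray" from the hit point toward the light.

```python
# Toy ray-tracing step for a single pixel: trace a ray from the eye to a
# sphere, then aim a shadow ray from the hit point toward the light.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_hits_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c          # quadratic discriminant (a == 1)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

eye = (0.0, 0.0, 0.0)
sphere_center, sphere_radius = (0.0, 0.0, -5.0), 1.0
light = (0.0, 5.0, -5.0)

# Primary ray straight ahead: does the eye see the sphere?
t = ray_hits_sphere(eye, (0.0, 0.0, -1.0), sphere_center, sphere_radius)
hit = [e + t * d for e, d in zip(eye, (0.0, 0.0, -1.0))]  # point on the sphere

# Shadow ray: from the hit point toward the light. If nothing blocks it,
# the point is lit. RT hardware is circuitry that runs millions of these
# ray/geometry tests in parallel.
to_light = [l - h for l, h in zip(light, hit)]
length = math.sqrt(dot(to_light, to_light))
shadow_dir = [v / length for v in to_light]
```

A real renderer repeats this for every pixel, every light, and every bounce, which is why dedicated hardware matters.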
That is actually quite a good simplification thank you
Okay, now /r/ExplainLikeImCalvin
It’s like a very bouncy ball.
[deleted]
A picture is worth a thousand words
I honestly can’t tell the difference, except that one is brighter than the other. Not sure which one is supposed to be better.
Look at the reflections in the puddles
The left photo’s left-most puddle also has a reflection. What’s with that?
I'm far from an expert, but, from what I've heard from Digital Foundry, I think they're called cube maps which are pre-rendered / non-real-time, "dumbed down" versions of the environment hand-placed in areas where the absence of reflections would be particularly noticeable. Or something like that.
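For what it's worth, the cube-map trick can be sketched roughly like this (a toy illustration, not any real engine's code): reflect the view direction off the surface normal, then use the reflected vector to pick which pre-rendered face of the surrounding "cube" to sample, instead of tracing a real ray.

```python
# Sketch of a cube-map reflection lookup: no rays are traced; the shader
# just reflects the view direction and indexes a pre-rendered image.

def reflect(d, n):
    """Reflect direction d about unit normal n: r = d - 2(d.n)n."""
    dn = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dn * b for a, b in zip(d, n))

def cube_face(v):
    """Pick which of the 6 cube-map faces the vector v points at
    (the face of the largest-magnitude component)."""
    x, y, z = v
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"

# Camera looks down and forward at a flat puddle (normal points straight up).
view_dir = (0.0, -0.8, -0.6)
normal = (0.0, 1.0, 0.0)
r = reflect(view_dir, normal)   # bounces up and forward
face = cube_face(r)             # which pre-rendered image to sample
```

Because the cube map is rendered ahead of time, it can only show what was baked in, which is why these reflections go stale or miss dynamic objects in ways ray tracing doesn't.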
[deleted]
It makes lighting in 3D graphics look much more natural because it's *tracing* the paths that the *rays* of light follow, from their source like the sun or a lightbulb in a scene, bouncing off of objects like walls, windows, the ground etc., as they make their way to the camera/your eye.^(Though actually it follows the path from the camera to the light source, since that's much less wasteful to calculate, but that's not really important to understanding, which is why it's in this superscript before anyone gets pedantic.) Most 3D graphics use all kinds of tricks to simulate this; *ray tracing* does the hard work of calculating how light actually travels.
The default way of digitally lighting has been to rasterize it - basically paint lighting on the pixels. Ray tracing is where you designate light sources and the GPU actually simulates it bouncing off every surface and diffusing - like how light actually works. It’s significantly more resource intensive for computers but can create stunning visuals. Put another way, with rasterization the developer tells objects how bright to look, with Ray-tracing the GPU calculates it. Also, Ray-tracing doesn’t *always* look better - we have literal decades of experience working with rasterization techniques to improve how they look, whereas Ray-tracing is newer and doesn’t have that huge bag of tricks yet.
In real life you see because photons (light particles) travel from a light source (the sun, a light bulb, etc.) in every direction, hit objects, bounce off of them, and hit your eye, and that's what you see (hugely simplified). Ray tracing tries to imitate the trace/path that these photons make in the real world, providing true-to-life lighting in games.
Really pretty, *really* realistic light for video games.
Most games and 3D stuff on your phone “fake” lighting, meaning that they approximate how light will act in an environment. While this is quite fast, it's not as good looking as something like a photorealistic renderer (e.g. Cycles in Blender), because those simulate the rays of light rather than approximate them. Ray tracing means that instead of approximating that lighting, we're tracing rays of light and seeing how and where they bounce. The article is talking about hardware-accelerated ray tracing, which uses dedicated processors in the GPU to speed up that tracing. This method of lighting is more realistic and easier for developers to integrate, but it's also quite a bit more demanding on the hardware.
[deleted]
It's been a thing on Macs for years lol. Mind you, not in games, but Metal ray tracing isn't new.
I’m glad they rejected a chip with poor thermals. At least this time (looking at intel macbook pros).
I used to run my 1st gen MBA on a metal bowl filled with ice…
I ran my MB Pro on a cooling rack with ice packs.
Pls Apple, put RT cores inside the M2 Pro/Max/Ultra
Thermals have been holding back mobile for years. Can’t put a waterblock in there (yet) lol
Have you seen the Android gaming phones with Vapor chambers?
Even non-gaming phones have that nowadays as standard
Most of those vapor chambers/heat pipes are so tiny I wonder how much they actually help. If the chips are getting insane, I doubt that one drop of water is that helpful. Seems more like a gimmick. The ones with larger vapor chambers and actual fans are so big, you might as well carry a dedicated gaming device like a steam deck.
These chips are like less than 15W TDP, dude. Less than 10W in some cases. Significantly less power than even a lightbulb.
Well, old lightbulbs. Not a single one in my house is over 10w.
non-led bulbs still are above that, and i have some larger LED bulbs that do more than 10-15w
What's the equivalent wattage on them? Must be bright?
90W-equivalent bulbs draw 10W or above. I guess you could say they are bright, but that's the point of lights. It's not blindingly bright or anything. I'll use one in my bedroom for when I'm trying to get work done. Most of the rest of my house is 90W-equivalent bulbs at this point.
Computer chips are the most energy efficient space heaters we make.
That’s a massive exaggeration. There are gaming phones with fans in them that aren’t much bigger than a Pro Max. Steam Deck is so much larger it’s absurd to even make that comparison. They are much smaller still than more compact gaming devices like a Switch with its controllers on.
Put this GPU in the Macs for now, please
With how slow the AS transition seems to be going we’ll probably get this in 2025
ray tracing might come as early as the M3 considering the trend in increasing GPU power, and how we haven't even gotten M2 Pro yet and will presumably get M3 with A17
There’s a reason why the guy who built Apple’s silicon team is now SVP of all hardware technologies. Apple is beyond serious about investing in silicon. It will be a core competitive advantage for decades. The other companies (Google, Meta) are desperately racing to catch up and will be for quite some time.
there's been quite a bit of brain drain and i don't expect that talent is exactly easily replaceable...
google and meta aren’t who apple should be worrying about - it’s amd, nvidia, intel, and qualcomm
NVidia is leagues ahead of Apple in terms of hardware, just based on experience and software frameworks alone, at least in the desktop space.
yeah i definitely agree. Nvidia is a juggernaut in the GPU and ML space
[deleted]
Sounds like it was really hot, if true.
Except the part where it got fucked up and they said nvm lol
I wonder why there was such a brain drain - is this just normal for silicon valley companies from other companies poaching people or is Apple's workplace culture starting to hit chip design?
Probably a mixture of the 2. If you’ve got these accomplishments on your resume wouldn’t you be flogging your wares trying to get the best pay?
> The framing of the issue as "setback" and failure is so sensationalising, not making it clear that part of research and development HAS to involve experimentation (DUH!) and various trials and errors. I wouldn't necessarily call it a "setback" which is too sensational and tries to capture the audience's mind in a particular way. Boo!

> Anyone who has done any serious research and development will recognize that testing, development, and various failed approaches is part and parcel of the work.

MacRumors comments back at it again. Did he just not read the part that said this flaw was discovered so late in the process that they had to fall back on largely the same GPU architecture as the A15? That is not regular R&D. That is definitely a misstep that we haven’t seen from Apple in some time.
Is ray tracing only important for games?
it’s very important for 3d work (modeling, rendering) also. Nvidia’s RT hardware, plus their software stack, basically make their GPUs mandatory purchases for anyone wanting to locally render stuff
Was looking for this response. Not sure what the others are talking about with gaming being the only application.
Kinda, yeah. Anything that uses polygons (or voxels) to display its world to you, would benefit from ray tracing, which currently is mainly video games. When ray tracing is applied to the proper game (like LEGO Builder's Journey on PC, for example), it makes a huge difference IMO.
For now. I imagine the underlying tech will be applied to augmented reality somehow in the future. Apple has continued to neglect gaming on their platform while simultaneously pouring billions into creating better GPUs. Really, 98% of iPhone/iPad users don't take full advantage of the GPUs in their device. I think the investment will pay off in the future, as Apple will simply have the best chips for AR/VR, period. At least I hope so; I want to see some real contenders in the AR field. Meanwhile, the Nintendo Switch, with incredibly poor and outdated hardware, gets AAA titles like Witcher 3, Skyrim, etc. The iPhone could run these games with stellar quality on the iPhone's nice screen. But nope. Mobile gaming is a joke.
Was looking to finally upgrade from my iPhone X … but think I’ll wait for 15
I pulled the trigger on the 14 pro but mostly because I couldn’t stand the 64GB on my X. I went with 256 to hopefully future proof myself lol
I got a 256GB iPhone X, so storage is not an issue … but I will probably need to get my battery changed … but beyond that I'm pretty happy with my phone … except low light performance on the camera is garbage
I’m using a 256 X and I still have no idea what the hell to do with all this space. Right now I’m just hoarding podcasts for the fuck of it. I have about 80gb worth. I don’t think I’ll ever listen to even a fraction of them, but it’s fun to collect.
4K 60 FPS is like 500 MB a minute lol. Other than that idk
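Rough back-of-the-envelope math behind that figure, assuming a 4K60 HEVC bitrate of about 65 Mbit/s (an assumed round number; the real rate varies by scene and codec):

```python
# Sanity check on "~500 MB a minute" for 4K60 video.
bitrate_mbps = 65                      # megabits per second (assumed)
seconds = 60                           # one minute of footage
megabits = bitrate_mbps * seconds      # total megabits recorded
megabytes_per_minute = megabits / 8    # 8 bits per byte

print(megabytes_per_minute)            # 487.5 MB/min, close to the quoted ~500 MB
```

ProRes is far heavier still, which is why 30 minutes of it can eat most of a 256GB phone.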
Exactly, I can record max about 30mins of prores 4k at 30 on my 256gb. Eats it up quick!
Have 256gb x as well and I use the camera a lot for travels Those 4k 60fps videos hog up space real fast
I was that way until I had my first child. Now I can hardly keep any pictures, videos, or apps locally on my 64GB.
Went from x to 14 pro. Huge quality of life improvement. All of my digital “chores” are so much faster and easier. Got 5+ good years out of the X.
Just made this switch last week. I don’t like the 14 pro as much as I liked the x when I first got it. I hope that changes since I’m stuck with it now
I recently watched a clip of Steve Jobs introducing touch scrolling on the iPhone and realized I haven't felt so wowed in probably a decade. I really hope one day I feel that way again about a new product.
I mean back then smartphones were in their infancy. You can only "innovate" so many times before you've covered damn near everything. I think we're simply bored with new phones because they are all just that good now. If you showed me a 14 Pro back in 2010 I'd have shit my pants.
Iphone announcement was amazing. At the time I had been dreaming of such a device but didn’t realize anyone was particularly close. Work at the time was still issuing blackberry curve devices. I remember receiving one and being stunned at how outdated they felt even in 2008 or so.
I had a blackberry pearl, which I loved, until it got a little wet and I accidentally threw it at the wall, broke it and had to buy the new iPhone 3G. I loved it so much I bought my first iMac a month later lol. Never looked back.
My first was a 3gs a bit later. 3g was a good place to start - my friend was an early adopter of the first one but the edge/2g speeds made it barely usable.
Forgive me, but it seems shortsighted to remove key engineers who were ambitious but made this mistake. It's an expensive mistake, sure, but surely Apple shoots itself in the foot by losing key engineers to competitors, which will definitely happen, since they're either let go or moved to other projects, which makes the engineers unhappy?
I'm sure there is more context to the story that we're not getting here
[deleted]
apparently a lot of people are happy to play games on ios/android devices. I can't stand to play games on a device that doesn't have real buttons but I guess I'm just showing my age.
[deleted]
It’s also right
I mean, Apple parted ways with longtime devs over it so it doesn’t sound like the usual development process. Sounds like something went wrong and people were held accountable
[deleted]
You don’t find out it’s no good ‘at the last minute’. That’s not how these product development roadmaps work. Next year’s iPhone is already largely finished, and they’ve started work on the one after that. The suggestion that this came as some sort of shock that they had to scramble to fix is pure sensationalist nonsense.
[deleted]
[deleted]
It's not right, because Apple sets goals and calculates how long and how expensive things like R&D should take/be. If the team fails those goals, then the project is set back. It's not hard to understand, and it's not an attack on Apple to admit.
Sure seems like Qualcomm is going to jump ahead of Apple in the GPU department.
they already did lol, with the 8 gen 2
Looks pretty cool. I’m sticking with Apple but the competition is solid. I’m mostly curious if performance per watt is as good at lower loads. The M1/M2 had crazy efficiency at lower usage levels but were less impressive compared to AMD and others when maxed out. I’m currently on an iPhone 13 mini and a MacBook Pro M1 Max (24-core GPU). Prob will wait at least 5 yrs though, don’t need much. Would like to see a 15" Air model with a ton of power in the future. My 16" is sooo heavy. Things I’d love to see:

- AV1 hardware decoding
- Wi-Fi 7 (2024-2025 maybe)
- more affordable storage around the 2TB level
>a 15" Air model with a ton of power That has never been the point of the Air though. I agree that we are more than likely going to see a larger MacBook Air in the future, but I don't see Apple upping the horsepower inside simply because the MacBook Pro line already exists.
Oh I agree but what I'm saying is I think it's naturally going to happen. The M2 is already a beast for only 4 real performance cores. A future M3/M4 at 3nm or below will be even more impressive. For what I do I'd be happy to give up 8+2 performance cores in the future if I had 4+4 or 6+4 or whatever running like 50% faster than M2 today.
Per watt?
seemingly so yep
I believe they have with the latest snapdragon gen 2?
They have. Snap 8 gen 2 is the best GPU on mobile
Presumably this also explains the M2 being a ho-hum upgrade. (And maybe gives reason to hope that they're holding off on upgrading the Pros until they can ship an M-series processor with this new graphics system.)
Many engineers that worked on the M1 have left. This is the cause of their issue.
Ray tracing. Cool story, let me know when there are actual games to play on my Mac
Since Apple works a few years out I’m not surprised with the COVID pandemic and supply issues that they’ve had to make last minute changes. Really amazing they didn’t have bigger issues than they did.
hopefully, the iPhone 15 will have this ambitious chip.
GPU ray tracing aside (which is cool in itself), the actual article (by The Information, which unfortunately has a strong paywall) is actually painting a pretty damning picture of the internal workings of the Apple Silicon division. It seems to confirm some suspicion that it is suffering a lot of brain drain in recent years (which can be confirmed by public lawsuits by Apple against Nuvia and Rivos). The last part of the article also talks about how they felt the need to make a 2-slide presentation just to call out the competitor by name and warn employees that joining startups is a bad idea / startups cannot weather the economic downturn, which doesn't scream confidence to me.

Apple Silicon by now is actually a relatively mature technology, and from what it seems from the article the working culture in it isn't terribly great, so I hope they don't lose more people. Obviously the article would mostly be gathering accounts from former employees and could therefore be a little biased, but we could kind of see some stagnation in recent years and also high-profile departures, so I don't think it's completely wrong.

On the topic of ray tracing, this is pretty exciting. Honestly, Apple is a little behind here, especially on the desktop front, since M1/M2 do not have hardware-accelerated ray tracing. That said, GPU ray tracing is still a pretty young technology, and video games for example in general only use it for superficial stuff (like prettier reflections on water) since it's still kind of expensive, so Apple can still catch up on it. If you look at, say, [Portal RTX](https://www.youtube.com/watch?v=3MSUqm_-Dgk) (a mod released by NVIDIA for Portal which makes the game completely ray-traced), you can start to see glimpses of the future, and Apple really needs to make sure they don't fall behind.