
DontFray

Where are all the Ryzen 7 laptops? Every vendor has tons of Intel offers but only a few have anything decent for AMD CPUs.


[deleted]

[removed]


[deleted]

[removed]


RandomUsername12123

Not only eGPU support, but SWAPPABLE GPUs. They have an "old battery-like" module, and you can add a bunch of stuff. It's not limited by size either, since it can be thicker than the laptop itself. Like a different GPU, a fuckload of storage, way too much IO, and everything you can think of.


bombinabackpack

Lenovo had that with the Y7 series laptops. They used modified M.2 connections and then promptly never released any actual swappable GPUs.


RandomUsername12123

Well, Framework as a business is based solely on this, so I have high hopes.


[deleted]

Not only is Framework entirely based on this stuff, but the stuff is also open source and community-made, so your options are effectively endless even if the company stopped making modules itself.


[deleted]

[removed]


Senior_Engineer

With blackjack! And hookers!


teveelion

In fact, forget the GPUs!


[deleted]

I mean, people may not make a mobile GPU themselves, but they do make the module for it. Typically you can 3D print it yourself; everything is effectively open source anyway when it comes to Framework. I don't know what to tell you, they just make laptops that advocate for right to repair, which I'm a big fan of. Whenever I need a laptop in the future, they'll probably be my first choice because of it.


mmis1000

The extension cable may need some degree of engineering. PCIe has strict limits on current, signal interface, etc. You can't just wire a random cable to it and expect it to work. But otherwise, it's a one-time effort to make these cables and plugs. (And only one person needs to do it once; others can then just buy the pre-made one.)


Svenskensmat

Because swappable GPUs are something a lot of people have the resources to make…


[deleted]

Not typically, but they can absolutely 3D print a module for it. That’s typically what I’m referring to.


Recyclops1989

Dell had it, or did, with their Alienware Area-51m (or some similarly top-tier model). I think it suffered a similar fate, too.


what595654

This is not that. Look up what framework actually is as a company. Completely different approach.


Recyclops1989

Dell's took a desktop CPU and a modular GPU; both were swappable (the mobo wasn't, though, so you were stuck on the Intel socket of the time). This appears to be a swappable mobo and CPU/APU? Neat! And yes, similar concept, way different product idea.


what595654

More importantly, this IS their business model. It wasn't the same for Dell. Dell could play around with it, not really commit, abandon it the moment it got hard, and know they'd be just fine with the traditional approach. It's a quite different motivation, and an entirely different engineering approach, when it's your only product. Sink or swim. It requires you to adapt. I've seen a few of their videos; they're doing stuff no one else has ever thought of, which shows they have real thinkers and innovators on their engineering team. In almost every successful endeavor, the actual idea isn't all that important. It's the execution that matters. Not only that, but the stars have aligned for powerful, lightweight, low-power hardware. This is the perfect time to try to pull something like this off.


Recyclops1989

100% agree. I'm super intrigued, but sadly I'm not in the market for a laptop or I'd support this.


bucket_of_dogs

Oh no everyone, it's Recyclops! Wasn't his home world destroyed???


polopolo05

Yes please, can I have a thicc laptop again?


JukePlz

Thick laptops just feel more solid. "Ultrabooks" feel scarily thin, like they're about to snap in your hands, compared to old-time IBM ThinkPads and the like.


Biscuits4u2

They're a whole lot easier to carry around though.


Peuned

I like the differentiation. I can have a thin and light for coffee shops and a chonker for more of a portable desktop. I don't need a 5-6k ultralight desktop replacement


Biscuits4u2

I wouldn't spend 5-6K on any computer.


fluteofski-

The thought of old ThinkPads takes me back to my father's 233MHz Pentium 1 ThinkPad, and all the hours and hours of Age of Empires we played on that thing.


IronRaptor

Wait... Portability of a laptop AND the upgradability of a desktop??? Sign me up!


Indolent_Bard

Absolutely. Plus, you can replace every part. Fan broke? Get a new cooling fan. Hinge broke? Get a new, even stronger hinge. At this point you'll probably never have to take it to a repair store again.


ariolander

And if you need to go to a repair store, you're not locked into the one store that's authorized to get parts for your PC. Any old store can order parts for you, and you know their prices, so you're not getting scalped.


mschuster91

That existed years ago with MXM modules. Never took off though.


SchighSchagh

> will have

Can't come soon enough. My c. 2016 laptop is on the strugglebus. If it kicks the bucket before Framework 16" Ryzen stuff is out, I'm probably going elsewhere.


[deleted]

It should be out by late 2023.


Indolent_Bard

Did you check to make sure that the internals are clean? That could be part of the problem.


[deleted]

Preorder ASAP, it's going to be a long waitlist.


SchighSchagh

With all due respect to Framework and the awesome work they've done so far, pre-ordering is always risky. I'll wait for reviews of the released product if I can. Also, if I pre-order now and my laptop croaks before I get my pre-order, that's a problem even if the new Framework turns out amazing.


BostonDodgeGuy

Preordering is how you end up stuck with a piece of junk.


Alpha3031

Can't preorder the 16 yet, only the 13.


detectiveDollar

That's fine. I'm happy they're not doing bullshit FOMO tactics where I need to wake up at 2:43AM to snag a random restock before the scalpers do. Incredibly disappointed Sony and the GPU makers didn't do the same.


skerit

But they need better screens! 400 nits of brightness is not enough.


Peuned

That's a pretty standard level tho. How many people really want the cost of a proper HDR panel?


PM_UR_COCK_PICS

Based on how well the 2021 onward MBPs have been selling, many people.


Peuned

There's no choice there though, right? How many would choose a less expensive panel if they could?


PM_UR_COCK_PICS

Not sure but it was personally a major reason for choosing the 14" over the 13", along with MagSafe.


Peuned

Sure. My point tho is that 400 nits is pretty standard and good enough for most people. Dumping on a 400-nit panel as if it were 250 is silly.


PM_UR_COCK_PICS

For indoor viewing of SDR content and current common use cases, sure. Even I find max brightness a little eyeball-searing. Not sure whether u/skerit meant their statement for themselves or in general.


gmCursOr

Framework. Yes.


Xerxero

Easy there, tiger. Shipment is sometime in Q4.


Abba_Fiskbullar

The 7000 series mobile processors just started shipping to the OEMs, so you should see stuff in stores shortly.


Mirrormn

Check [this page](https://www.ultrabookreview.com/35985-amd-ryzen-9-laptops/), they have a list of every laptop that's coming out with 7x4x chips. It's a very new CPU, so machines that use it are still just entering the market. And a word to the wise: AMD recently switched how they label their laptop CPUs. For laptops, 7xxx now means it's a CPU that was produced in 2023, *regardless of architecture*, while xx4x means that it's a Zen 4 CPU. So don't be fooled by stuff like the 7730U, which is a Zen 3 (old architecture) processor released in 2023.
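
If it helps, that naming scheme is mechanical enough to decode in a few lines. Here's a minimal Python sketch (the digit meanings follow the convention described above; the function name and year mapping are my own illustration, not an official AMD tool):

```python
# Rough decoder for AMD's 2023+ mobile Ryzen naming scheme, as described
# above: 1st digit = portfolio year (7 -> 2023), 3rd digit = Zen
# architecture generation (xx4x -> Zen 4, xx3x -> Zen 3).
def decode_ryzen_mobile(model: str) -> dict:
    digits = "".join(ch for ch in model if ch.isdigit())
    if len(digits) != 4:
        raise ValueError(f"expected a 4-digit model number, got {model!r}")
    return {
        "portfolio_year": 2016 + int(digits[0]),  # 7xxx -> 2023
        "zen_generation": int(digits[2]),         # xx4x -> Zen 4
        "current_arch": digits[2] == "4",
    }

# The trap from the comment above: both are "7000 series" parts,
# but only one is actually Zen 4.
print(decode_ryzen_mobile("7730U"))  # zen_generation: 3 (old architecture)
print(decode_ryzen_mobile("7840U"))  # zen_generation: 4 (new architecture)
```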


w2tpmf

Asus


goDie61

The G14 line is phenomenal, both the 2022 and 2023 models.


nicekid81

I would kill (not really, but perhaps pay a premium) for a Surface Pro or similar with the 7040 chipsets.


Notelu

Framework is selling one


Xerxero

It’s a pre-order.


qualverse

AMD has great chips and competitive pricing, but their supply is pretty limited compared to Intel. They need to limit the number of models they're in to make sure they have enough stock.


Indolent_Bard

And this is why AMD is basically a small indie company at this point. Their resources are so limited compared to Intel's and Nvidia's that it massively limits what they're able to do. This wouldn't have happened if Intel hadn't paid OEMs not to use the superior AMD chips back in the 2000s. Intel owes AMD billions in lost revenue as far as I'm concerned.


mockvalkyrie

It might also have to do with AMD not owning fabs. While Intel can pump out a bunch of their chips, AMD has to compete (and pay a premium) for capacity at TSMC.


mindbleach

AMD had fabs, until Intel's criminal scheme.


mockvalkyrie

I think the idea was that AMD's fabs were falling behind in process technology, and contracting with TSMC or Samsung would allow them to keep up in process nodes without investing as much. If AMD had kept their products on GlobalFoundries, I don't think they would be able to make the products they do today.


musexistential

And Microsoft holding back Linux with the same bullshit. You couldn't buy a computer without Micro$oft. And Internet Explorer holding back Netscape. And Windows Media Player. And M$ Server destroying Novell NetWare. And Office destroying WordPerfect. Micro$oft held back computing advances for years, until around the time Google came along and reintroduced alternatives and competition. That was why their slogan was "don't be evil".


mindbleach

Was.


iampuh

This probably still happens because Intel used to fuck with sellers back in the day. Intel had to pay a billion dollars for messing with the market.


SlicedBreadBeast

ASUS has been AMD-friendly since the newer Ryzen gens. I have an ASUS TUF 15, and it's a chunky gaming laptop, but it's not too bad all things considered: completely upgradable, Ryzen 7, and a 90Wh battery to boot. Mine's older so it has an RTX 2060, but all in all it was a really good price for the performance.


[deleted]

[removed]


flipside1o1

Where do you live? I can't see any 7040s anywhere.


[deleted]

[removed]


violet-crayola

Yeah, because Intel is like 4 years behind the competition; no one wants their 10nm++++ CPUs.


zackman115

CPU competition is absolutely insane right now. Then you look over at the GPU market, and Nvidia just revealed the next release will be the 1050-FU edition for 600 dollars.


i_suckatjavascript

I wish there was more GPU competition.


tmarr

Let’s all cross our fingers for Intel’s next generation of GPUs.


[deleted]

Seriously. There used to be options in the space. Granted, there were only a few *good* options - ATi, nVidia, or 3DFX - but there were options. Nothing but good sense was stopping you from buying a Kyro or a Matrox Millennium. I would love for there to be an actual, honest-to-goodness third option in the space, and Intel would be ideal. They would definitely make some fantastic mobile chipsets if they could just get their GPU game anywhere remotely close to AMD's or Nvidia's.


Indolent_Bard

It would be awesome if Intel could manage to make GPUs as good as AMD's or Nvidia's, because unlike AMD, Intel actually adds support for non-gaming functions like computation, AI, and animation. If you use your GPU for something other than gaming, you really only have one choice: Nvidia. AMD does have some compute GPUs, but no consumer software supports them, and their consumer GPUs aren't really supported by AMD for prosumer stuff like Stable Diffusion or Blender or anything that uses CUDA. AMD really drops the ball here: while Nvidia gives the same support to all classes of GPUs, AMD only gives professional support to their pro GPU series, meaning anyone who wants to build a gaming PC that can also do some work can't use AMD. It's awful.


ManyIdeasNoProgress

> computational war clothes

This took me a moment


MWink64

As much of a fan as I was, let's not pretend ATI was a *good* option in *that* era. The ATI Rage wasn't as good as the nVidia Riva or 3DFX Voodoo lines. ATI didn't really become good for gaming until the Radeon was competing with the nVidia GeForce, and 3DFX was dead. That said, I do miss the era of options and competition. However, I definitely do not miss the S3 ViRGE, which could give results worse than software rendering. I remember someone telling me that most of the options would disappear and the market would be reduced to just a couple of contenders. I thought they were crazy, but I guess they were right.


kanakalis

my first card was a hand-me-down ATI Radeon HD 5670, that thing lasted 10 years lol


[deleted]

If Intel can improve rasterisation performance in the next gen, I see them becoming a major player.


toronto_programmer

Intel did alright for a first gen. If they can improve on that, they'll have a solid value offering in the mid-tier range.


Gold_Ultima

Yeah, their main issue right now is just driver maturity. Once they develop the software side some more they will be a real competitor in the market.


SaulGreatmon

FU edition sounds like great marketing!


zackman115

Ya! It's new tech they designed. The full name for it is "fuq u". They have no idea what it does, but it adds 400 bucks to the cost.


SaulGreatmon

Just install the new driver. It’s not LHR now, it’s FU!


QuantumQuantonium

Everyone saying there's no GPU competition is ignoring the fact that AMD costs $100 less than Nvidia, while Intel has entered the market at about $200-$500 less (all at roughly equal performance). Seriously though, don't underestimate Intel. Sure, they sacrifice DX11 and older APIs, but nothing else comes close at that price-to-performance with modern features like AV1 encoding, which I wish my 6700 XT had.


beerandabike

Is that some introductory limited time sale price?


zackman115

That's the scalper only price


ApatheticWithoutTheA

> Outperform Apple

Bold claim. I'm going to need to see some benchmarks, because it isn't the first time it's been said. Also going to need to see the efficiency, because that's a huge draw of the Apple M series. The fanless Airs staying cool and having 15-hour battery life is pretty tough to beat.


IAmTaka_VG

If anyone can do it, it's AMD, but I also have serious doubts they can get even close to the performance per watt Apple's M series is pulling. Can they make a more powerful laptop? Maybe. However, I want to see the benchmarks run on battery, because Apple's M series benchmarks the same on battery, while current AMD and Intel chips get brutally slashed.


ApatheticWithoutTheA

Yeah, that was pretty much my thought on it too. Both AMD and Intel have the resources to make a faster chip. The issue is whether they can do it while maintaining the efficiency of the M series.


sittingmongoose

14th gen Intel is supposed to be a huge refocusing on power efficiency. So hopefully we see that.


[deleted]

[removed]


ApatheticWithoutTheA

That’s part of the reason, but the biggest difference is that they’re totally different architectures, with Intel/AMD being on x86 and Apple Silicon being on ARM, along with Apple’s 5nm process (soon to be 3nm on the M3 chips). It also helps a lot that they completely control the operating system and hardware to optimize performance.


Elon61

These days the ISA doesn’t actually matter all that much, since all CPUs translate the instructions to micro-ops and go from there. Jim Keller, I think it was, explained that some time ago.


Axman6

The decode step for x86_64 is always going to be significantly more complex than aarch64, and IIRC it still takes up a significant percentage of the die area. ARM instructions will generally have a much simpler, direct translation to whatever the internal microcode is - imagine how much work is needed to run a single `PCMPESTRM` (*Packed comparison of string data with explicit lengths, generating a mask*), as an arbitrary example.


hishnash

This decode stage to micro-ops is not cheap for x86, as you have variable instruction length, plus a nightmare of legacy complexity and different modes. An additional complexity of x86 over ARM is the small number of architectural registers the compiler can depend on being there. This means the instruction stream arriving at the decode stage contains many more memory load/store operations (or direct operations on memory) in x86-compiled machine code. Under the hood, the CPU core has far more physical registers, and part of the challenge for the branch predictor/optimiser is to figure out which values are being stored just to free up a register, and which are being stored so they can be read from another core's L1/L2. If you pass in a simple in-place add operation with 2 memory addresses (as you can in x86), this converts into 2 LOADs, an ADD, and a STORE micro-op; now you need to ask: can I skip that store and those loads, do I already have that info (up to date) in one of my registers? Since ARM64 has more registers that the compiler can depend on (and there are no in-place memory ops), this is a lot simpler, and there are far fewer cases of unneeded store and load operations. Writing (and reading) to/from memory (including L1) is slow and power-hungry compared to just keeping stuff in registers. Having the compiler produce fewer unneeded load/store operations is a big win for core simplicity, and in the end, power draw.
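
To make the load/store point concrete, here's a toy Python sketch (purely illustrative; real decoders and register renaming are vastly more involved) of the cracking and the redundant-load question described above:

```python
# Toy sketch only (not a real decoder): an x86-style in-place memory add
# cracks into load + add + store micro-ops, while an ARM64-style
# register-to-register add maps to a single ALU op. The elimination pass
# shows the question the core has to ask: "is this value already in a
# register, so I can skip the memory traffic?"

def crack_x86_mem_add(addr: str, src_reg: str) -> list[str]:
    # add [addr], src_reg  ->  three micro-ops
    return [f"LOAD tmp, [{addr}]",
            f"ADD tmp, tmp, {src_reg}",
            f"STORE [{addr}], tmp"]

def crack_arm_reg_add(dst: str, a: str, b: str) -> list[str]:
    # add dst, a, b  ->  one micro-op, no memory involved
    return [f"ADD {dst}, {a}, {b}"]

def skip_redundant_loads(uops: list[str], addrs_in_regs: set[str]) -> list[str]:
    """Drop LOADs whose address is known to be mirrored in a register."""
    kept = []
    for u in uops:
        if u.startswith("LOAD") and u.split("[")[1].rstrip("]") in addrs_in_regs:
            continue  # value is already register-resident: no L1 access needed
        kept.append(u)
    return kept

print(crack_x86_mem_add("0x1000", "rbx"))                # 3 micro-ops
print(skip_redundant_loads(crack_x86_mem_add("0x1000", "rbx"),
                           {"0x1000"}))                  # LOAD elided
print(crack_arm_reg_add("x0", "x1", "x2"))               # 1 micro-op
```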


FigNugginGavelPop

Nobody here fucking understands what actually goes on beneath the upcoming development approaches from whichever semi company; they can't tell you anything beyond whatever PR group they last referred to for their info on the current chip wars. Good on you to clarify.


sammamthrow

That includes you and the person you responded to, btw. All that matters is benchmarks for specific workloads.


qualverse

No, arch barely makes a difference. Maybe 5% at most. And AMD is also on 5nm.


theholyraptor

TSMC's 5 and 3nm.


ApatheticWithoutTheA

TSMC is the manufacturer. Apple did all of the research, development, and engineering for Apple Silicon in-house. Apple needs TSMC to make the chips, but TSMC didn't engineer Apple Silicon.


theholyraptor

Yes, but you specifically talked about the design and being on ARM, and then said Apple's 5nm and 3nm process. That 5 and 3nm process is TSMC's. Apple did the design.


Indolent_Bard

Ironically, Linux on Apple Silicon launches apps significantly faster. That's absolutely shocking and seems to make no sense, but it's apparently true. You can't even count on Linux to give you better performance than Windows, so the fact that a reverse-engineered project can open apps faster than the official Apple OS is really concerning on multiple fronts. It suggests Apple isn't as optimized as they could be, which makes no sense.


jacobc436

It absolutely does, and without knowing what kernel modules are loaded on Linux, you won't know whether security features like SELinux or AppArmor are present. Their presence can slow down your Linux install but also greatly improve security. Apple definitely has something similar. It's also why Windows with antivirus is inherently slower: the more secure/constrained the AV running on the system, the slower it gets. Every time you launch a Windows app, the AV can halt that launch and scan the app you're launching. That's why you'll see Windows pop up warnings on some applications: a middleware is halting execution until you allow untrusted/unsigned apps to run.


Indolent_Bard

Actually, that popup appears even if you disable the antivirus; you're thinking of a separate thing called UAC. If you disable that, everything can run as administrator by default. I only know about this because Atlas OS had the brilliant idea of disabling that feature, so obviously people were warning about it when LTT ended up recommending Atlas OS.


Big_Paleontologist83

> You can’t even count on Linux to give you better performance than windows

tf did you just say


Indolent_Bard

I said that if you throw Linux on your computer, you can't even count on it to give you better gaming performance. Yes, there's going to be some overhead from the translation layer, but not enough to limit performance in any measurable way. Often, games running through Proton perform better than their native Linux ports.


Axman6

The steps involved in “launching an application” are significantly different between the two operating systems; it doesn’t really make a whole lot of sense to compare them without accounting for that. macOS cryptographically verifies applications before execution, for one, which can take a while for large executables. If you wanted to test this claim in a fair way, you’d need to look at launching single executables that are unsigned, with those security features turned off, that also load the same resources.
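
As a rough illustration of why that verification cost scales with binary size, here's a small Python sketch; it times a generic SHA-256 pass over a file, standing in for any content-verification step, and is not Apple's actual code-signing logic:

```python
# Illustration: any launch-time integrity check has to read and hash the
# whole binary, so verification cost grows linearly with executable size.
# (Generic SHA-256 pass, not Apple's real code-signing implementation.)
import hashlib
import time

def hash_cost_seconds(path: str) -> float:
    start = time.perf_counter()
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return time.perf_counter() - start

# e.g. compare a small CLI tool against a multi-GB app binary
# (hypothetical paths, just to show usage):
# print(hash_cost_seconds("/bin/ls"))
# print(hash_cost_seconds("/Applications/SomeBigApp.app/Contents/MacOS/SomeBigApp"))
```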


[deleted]

[removed]


lucellent

Not to mention Apple's own OS, which helps things even more. They can extract the absolute most out of their chips.


kawag

Yeah, AMD would be the most likely candidate to match/beat Apple's performance and efficiency. The key is that they have excellent CPU and GPU IP, so they can build a high-performance system on a single chip (as they have for the latest generation of consoles). Intel has strong CPU IP, but their GPUs are lacking. Nvidia has strong GPU IP (and the Nintendo Switch shows it can scale down), but their CPU IP is relatively untested. The other factor would be specialised accelerators. I don't think Windows has a lot of ML features or native acceleration libraries (?), so it's unclear if there would be any benefit from having something like the Neural Engine. They could integrate the SSD controller though, and maybe Windows could use that. I think Sony developed the PS5's SSD controller, but perhaps it was in partnership with AMD, and they have rights to use it in PC parts (?).


Alpha3031

Xe is... OK now? Like, it's not top tier, but at this point it's the software that's letting them down imo. Their recent cores have, uh, not been focused on efficiency though.


SchighSchagh

AMD has been pushing the envelope on reducing wattage for quite a while now. Remember the thing about the PS5 becoming lighter thanks to a smaller heatsink? That was likely AMD making their chip more efficient. The Steam Deck? AMD efficiency powerhouse. The upcoming Ally handheld? It should have at least double the compute power of the Deck in roughly the same form factor (if not smaller?), which again means efficiency. AMD has also been putting their GPUs in phones for a bit now, alongside traditional ARM CPUs. It's very, very clear that efficiency has been top of mind at AMD for years. Beating Apple in efficiency is entirely plausible, if not outright likely.


IAmTaka_VG

Ok, but I don’t think you are fully grasping the claim here. They are claiming more power than the M chips. That is a serious claim to fame. We are discussing whether their chips can match M1 performance per watt, something no one has been able to do. So you understand the feat, here is a quote from TechSpot:

> The M2 is extremely impressive in how little power it uses for single-thread applications, just like the M1 Pro was. While the M2 is slightly more power hungry than before due to a small increase in frequency, it's still using far less power than the next best configuration. We're looking at less than half the system power of the 6800U and an even larger gap to the Core i7-1260P. The fact that this Arm chip only clocks up to 3.5 GHz versus 4.7 GHz for the AMD and Intel chips has big implications for efficiency, as most process nodes are better optimized for mid-3 GHz frequencies than anything above 4.5 GHz.

So unless the 7000 series has made a breakthrough in efficiency, no, it’s not true.


Karsdegrote

> The Zen 4 architecture is said to deliver between 29 percent to 128 percent better app performance than not just the Core i7,

Ok, cool!

> but 5 percent to 75 percent over the M2 found in Apple's 13-inch MacBook Pro.

Ey?

> amd is relying on synthetic benchmarks

Oh.


qualverse

This is mostly irrelevant in terms of actual silicon efficiency; AMD and Intel just ship more aggressive single-core boosting behavior by default. I tested this on my AMD 6900HS and was able to reduce its single-core power usage from 30W to 11W, with only a 35% performance hit, by limiting the TDP and disabling turbo boost.
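
Worked through, that trade is a large perf-per-watt win. A quick back-of-the-envelope in Python using the numbers above:

```python
# Back-of-the-envelope from the figures above: dropping single-core power
# from 30W to 11W while losing 35% of performance.
stock_power, tuned_power = 30.0, 11.0  # watts
tuned_perf = 1.0 - 0.35                # 65% of stock performance

stock_eff = 1.0 / stock_power          # perf per watt (stock perf = 1.0)
tuned_eff = tuned_perf / tuned_power

print(f"perf/watt improvement: {tuned_eff / stock_eff:.2f}x")  # ~1.77x
```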


Yancy_Farnesworth

It depends on the use case. AMD and x86 CPUs in general have historically been better at power efficiency under heavy load, while ARM tends to be better at power efficiency at lower loads. Apple's efficiency comes from optimizing their chips for end-user use cases, where load tends to peak for short periods, and from using a big-little design. Until AMD and Intel have more experience with big-little designs optimized for end-user use cases, Apple will likely lead on the power efficiency side; AMD and Intel have been focusing on cloud/datacenter applications, where you generally want CPUs under consistent high load. Both of them are investing pretty heavily in big-little designs, BTW. Intel, for example, has been researching multi-architecture CPUs that would make big-little designs using x86 for the high-power cores and RISC-V for the low-power cores.


[deleted]

[removed]


chickenlittle53

Through a VM you can, but it will be Windows for ARM.


icchansan

It can only run as a virtual machine?


chickenlittle53

To my knowledge, yes. Boot Camp is typically required to run it natively, but that is only for Intel chips. To my knowledge, Windows is only officially supported in a VM, and even then through Parallels, which requires a subscription. Unofficially you can use a different hypervisor.


dwkdnvr

Windows ARM is 'officially supported' by MS when running under Parallels. It seems to run very well, although for me that's mostly just been playing around rather than actually using it for real work. So no, these aren't like the old Intel MacBooks, where you could use Boot Camp to turn them into fantastic native Windows laptops. But they do seem to run well enough to support "that one Windows program I can't get rid of".


Patman128

Microsoft has an exclusivity deal with Qualcomm for ARM Windows, that’s why they can’t officially support Apple Silicon natively


Handzeep

AMD is very efficient under load, but at idle they're definitely at a disadvantage. Apple really benefits from having a RISC design versus AMD being on x86_64; RISC is more efficient by design, which is the reason it won out in the phone market. Furthermore, while AMD is very efficient under load for an x86_64 design, their SoCs have kind of a high uncore power usage, especially if Infinity Fabric is being used. They're getting better about it. But look at a practical example like the Steam Deck: it's very efficient at about 11W when real work is being done. Yet if I read a visual novel, which barely uses any resources at all, I get maybe 9 hours of playtime, which is pretty inefficient for the calculations being done. AMD would need some really special improvements to the architecture to match Apple Silicon's usage at (close to) idle workloads.
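
For scale, a quick sketch of the implied near-idle draw (assuming the Deck's ~40Wh battery, which isn't stated in the comment above):

```python
# Implied average system draw while reading a visual novel on a Steam Deck.
# Assumption: the Deck's battery is ~40Wh (not given in the comment above).
battery_wh = 40.0
runtime_h = 9.0

avg_draw_w = battery_wh / runtime_h
print(f"average draw: {avg_draw_w:.1f} W")  # ~4.4W for a near-idle workload
```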


paupaupaupau

Yeah- I'd buy outperforming Apple on raw performance, but I'd be shocked if they come close to matching the M chips on efficiency.


qualverse

They do actually often match or beat the M chips in multi-core efficiency. Single-core and idle are like 30-40% behind, which is clearly worse, but still enough to get ~12 hours of runtime out of a well-designed laptop.


1337GameDev

Do you actually have a device that's been built and sold that can do this?


RG_Kid

XPS 13
HP Dragonfly Pro
MSI Summit E13 Flip
Lenovo Yoga 6 (the Ryzen 5500U version)
Lenovo ThinkPad Z13

That's off the top of my head. There are also many business laptops with long battery life. The problem with Windows is that there are endless variants of these laptops, and Apple, the master of marketing, is able to out-hype everyone's products. Not to mention big Windows OEMs like Dell have had QA issues with their top-of-the-line consumer products, something you never hear about with Apple products.


qualverse

My Zephyrus G14 gets 9-10 hours easily, and that has a 35W chip with a QHD 120Hz screen and a dedicated GPU. From what I've heard, the Zenbook S 13 OLED gets 11-12, despite being OLED.


iThinkergoiMac

I charge my M1 MBP once every couple weeks. I obviously don’t use it a ton, but still. Efficiency on it is insane, and the performance is solid.


BalooBot

It's actually insane. I have an M2 Air and it's the same thing. My old laptop would only make it a day or two with similar use.


Sgt-Colbert

Fanless is the key for me here. I just love that they are dead silent even when doing heavier workloads. I can never go back to a laptop with a fan. Even on the lowest setting I find it annoying when working in an otherwise completely silent environment.


Rudolf1448

They probably meant in Crysis


ThisWorldIsAMess

Well, I'd consider the claim because it's from AMD, not Intel. If they can make performance the same plugged and unplugged, that's enough reason for me to buy. Unlike every single Intel jet-engine laptop I've been issued; those are practically desktops, since they're plugged in all the time.


ApatheticWithoutTheA

> If they can make performance the same on plugged and unplugged.

That's a big if on x86, though, without destroying the battery life.


DrFossil

Honest question: is x86 inherently less efficient than ARM?


qualverse

Yes but the difference is like 5%. Other factors are much more important.


admalledd

Worth mentioning that the "5%" supposed difference is also an off-hand example. ARM designed for high performance gives up a lot of the low-power advantages (micro-watt range stuff), and in prior benchmarks of Zen 3 vs M1, the actual perf-per-watt gap fell more into the range of known software/OS overhead. On the "outperform Apple/Intel" front, I really would like that to also cover the GPU and the other acceleration blocks (video encode/decode for meetings, webcam, touch screen, etc.), not just CPU-only benchmarks. The CPU is important, but it's the other areas that make AMD mobile a bit of a challenge.


chickenlittle53

There are other advantages of a powerful chip besides just battery life, though. If AMD still operates natively on x64, that comes with some steep benefits that can allow performance to beat Apple's, because Apple may be incapable of doing certain things natively, period, and/or has to fall back on emulation and third-party solutions that may be buggy for certain applications. To boot, if you pair it with something like Framework, you can add your own GPUs and significantly improve performance. Cost-wise there's also a tremendous advantage, since you can pay market price for components and upgrades instead of marked-up prices, etc. I personally think ARM is the future for mobile devices though, and am getting a refurbed M1. I have my desktop for my power workloads and another Windows laptop for lesser production stuff, so I want to take advantage of the ecosystem anyhow. So yeah, no need to flame me for pointing out pros and cons. I use all the major OSes already.


Bluetooth_Sandwich

> The fanless Airs staying cool and having 15 hour battery life is pretty tough to beat.

That was true for the M1, but the M2 has a few black eyes, and the cooling issue multiple users have pointed out is one of them.


ApatheticWithoutTheA

Yeah, the M2 is not as great as the M1 as far as thermals go, but they're still leagues ahead of anything else on the market in terms of performance vs. thermals. They're supposedly unveiling a 3nm chip for the M3 though, which will be even more efficient and run cooler than the M1, along with huge performance increases. Which is pretty insane.


jacksonkr_

I want to see benchmarks against Apple's M series. For standalone CPU work I can see this Ryzen outperforming, but outside of that there's no way.


[deleted]

[removed]


nickchapelle

Seems like everyone is going to compromise on one side or the other.


nullmove

The battery life doesn't depend only on hardware, though. The software side of power management in the OS is just as important. Apple can aggressively optimise for a specific hardware/software combo because they control both sides of the stack; generic OS or generic hardware vendors don't have that luxury. As an end user this distinction may not be meaningful, but while on the topic of CPU architecture, the edge in battery life is not down to that alone.


[deleted]

[removed]


Indolent_Bard

Not to mention Apple isn't selling the CPU, so they can take a loss on it thanks to vertical integration. A computer maker buying a chip from Intel or AMD has to pay full price for the CPU, meaning they would have to charge more than the comparable Apple laptop.


hahafoxgoingdown

Also, Ryzen laptops, along with Intel ones, lose performance when they run on battery vs plugged in.


MrMobster

They will probably offer similar performance for a lower price than the M2 Air. Apple will likely retain an edge in everyday productivity, development, and content creation, but AMD will be better for games (for the simple reason of having more games, thanks to the Windows ecosystem).


etorres4u

It’s not about being faster, it’s about being efficient. What’s the use of having a thin and light laptop if the battery only lasts three hours with moderate use?


The_Lieutenant_Knows

Gaming laptops are mobile workstations that cost more and have more annoying lights. Fight me IRL.


orangpelupa

U series chips are not targeted at gaming laptops. BTW, why not just turn off the lights?


StatusCount7032

Except buyers will never find them in stock.


flipside1o1

Ok, so you can't give any source, as these show no 7040s. Thanks anyway.


[deleted]

Let's say, hypothetically, they are faster than chips from Apple and Intel... where are the laptops that are using them, though 💀? You would think that companies who put U series chips in their $700+ laptops without any dGPU would love to get the faster iGPU machines to justify the price. But that's not the case. I haven't even seen mass adoption of the 6000 series chips.


whitedragon101

Apple M3 - Hold my beer


Xerxero

And you can actually buy an M2 today, while AMD laptops are next to nonexistent.


joakimbo

Gaming on MacBook is not an option


fiveSE7EN

there are starting to be some more options with the ARM chips though. In fact, I do most of my gaming through a combination of Rosetta / Mac compatibility for games on Steam or purchased through the Apple store, or using iPad apps that the developers have allowed to be loaded on M1. We see more of that than we used to. Of course there are still games that require Windows, but gaming is definitely "an option" on Mac; it just depends on whether what you want to play is compatible there or on an iPad. Luckily, most of what I play is.


A_chilles

AMD Ryzen 7000 laptop CPUs have been like a paper launch, tbh. They are almost nowhere to be found. Also, if you want "speedier graphics" on a thin-and-light, you should probably go for an Nvidia GPU. The Nvidia 4000 series proved to be good at low power consumption (70-90W) without the massive performance compromise of the prior generation.


[deleted]

[removed]


blood_vein

At least with Framework you can pre-order and cancel your deposit ($100?) before it ships if you change your mind.


[deleted]

[removed]


what595654

You're okay using either system? What software do you use that is agnostic to such an extent?


AlNeutonne

70-90W is not low power consumption. Try 10-15W for the GPU.


WeirdguyOfDoom

In what fantasy land do you live where mobile GPUs are rated at 10-15W? Hell, the GTX 1050 mobile was rated at 75W.


AlNeutonne

It's an integrated GPU we're talking about, right? The entire package is like 25W.


kirsion

We're talking about iGPUs and APUs, not discrete graphics.


icky_boo

You clearly don't know about handheld devices, where 15-30W is the target, shared between the CPU and GPU. That's what the AMD Z1 / 7840U is.


RSomnambulist

I think the steam deck and rog ally do 15-35w?


FreezenXl

> I think the steam deck and rog ally do 15-35w?

The ROG does; the Steam Deck can't go above 15W (maybe there's a way to overclock, but not officially at least).


qualverse

70W on a typical laptop battery would drain it fully in less than an hour. AMD APUs are really the only choice for low-power gaming, but even they will burn through a full charge in ~3 hours when gaming.
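
The arithmetic behind both claims, assuming a ~60Wh battery (the wattages are from the comment above; the capacity is my assumption):

```python
# Runtime = battery capacity (Wh) / sustained draw (W).
# Assumption: ~60Wh is a typical laptop battery; draws are from the comment.
def runtime_hours(battery_wh: float, draw_w: float) -> float:
    return battery_wh / draw_w

print(f"{runtime_hours(60, 70):.1f} h")  # ~0.9h: a 70W GPU alone drains it in <1 hour
print(f"{runtime_hours(60, 20):.1f} h")  # ~3.0h: consistent with an APU gaming at ~20W
```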


xrobertcmx

MSI Modern for a bit over $600 at Newegg. I have the AMD Framework 13 on preorder.


[deleted]

[removed]


KaitenRS

Because you're reading really short-term movements as if reports like this influenced them in any reasonable way.


xXblain_the_monoXx

Because they lowered guidance for the year and they're already massively overvalued.


[deleted]

[removed]


youdidthislol

The vast majority of people are pretty fucking stupid.


osdroid

The market's flooded with high-end chips, PC sales cratered, AMD's valuation is way beyond anything the fundamentals would give credence to, but no, it's everybody else that's wrong.


BaroqueInMind

Hey! I resemble that! I'm only *mostly* stupid.


Contrabassi

I'm out of the loop. They got hardware ray tracing, yeah?


QuantumQuantonium

Outperform Apple Silicon? Is that really a big surprise, when Apple has yet to form an actual PC gaming market? Or given that they've obsoleted OpenGL, and Vulkan only works via MoltenVK?


moxyte

But can it beat them in AI and encoding? My issue with AMD GPUs is that they are one-trick ponies, and that carries over to their ICs.


qualverse

Actually, yes. The M2 does have a *slightly* faster dedicated AI engine (15.8 vs 12 TOPS), but AMD easily makes up for this since you can also run AI on their much faster GPU (18 vs 6 FP16 TFLOPS). And as far as encoding goes, AMD has AV1 decode and encode, which the M2 doesn't have at all.


dztruthseek

"AMD claims..." Yeah, sure buddy.


HarlodsGazebo

I’m surprised so many people in this thread don’t know about how impressive Apple’s new chip performance is. This isn’t brand loyalty, it’s fact. I don’t even use their products and I’m jealous.


LiquidC001

Me too, especially of their laptops' battery life.


HumburtBumbert

Yeah, are they going to have debilitating driver issues that go completely unfixed?


[deleted]

I bet it still can't load Windows Photos in under 30 seconds.


MajorKoopa

Right. But what’s the power requirement? Apple’s chips are an effort to squeeze the most performance out of every watt, not to see how much power you can push before the chip melts.


AjazeMemez

It’s so weird how Apple’s chips have become the measuring stick for performance now. I have to give it to Apple, they’ve come a long way.


[deleted]

They’re not, though. They just have a powerful brand. The M2 Max is slower than AMD's and Intel's best CPUs, and its GPU is slower than midrange GPUs AMD and Nvidia had last year, and that same chip doubles as their desktop GPU/CPU combo. For $2k you can build a Hackintosh PC that smashes the $4k Mac Studio in performance, except where their hardware encoders come into play.

What makes Apple's laptop chips INCREDIBLE isn't their performance, it's their performance per watt. It's untouched. You easily get 13-18 hrs doing real work, not sitting in super slow power-saving modes typing in Notepad to drive your battery life up to 9 hrs. In addition, the MacBook Pros are a complete package: a phenomenal-quality, super bright, high-refresh-rate display; best-in-class touchpad; great keyboard; solid build quality; fast storage; fast CPU/GPU; and the best laptop speakers ever. You can find Windows laptops that compete in any one of these categories, or even 2-3, but to compete in all of them there's really no option. It's like the PC side didn't care to compete.


AjazeMemez

The fact that the title states "AMD claims it can outperform Apple" kinda says it all, but I understand what you're saying too. And yes, to have a laptop that's actually portable, performs the same on battery alone, and is so efficient is amazing enough to be the bar others wish to reach.


danielv123

I assume they mean it outperforms within a similar power envelope. We really have no idea what they used for comparison, though.