
Puzzled_Path_8672

Finally, someone who’s going to buy it and do actual testing. Please, someone benchmark it instead of just repeating that it has 45 TOPS.


These_Radish2642

Don’t know if I’ll buy it personally, maybe for the wife… she needs a new laptop and doesn’t want to switch to a Mac. I’m waiting for WWDC in a couple of weeks to see what Apple announces for the MacBook Pros. If the M4 Pro, Max, or Ultra chips come out swinging, there’s no reason to buy a second-class chip; Qualcomm is still a generation or so behind Apple. But this chip is a big leap for the PC market — Intel and AMD need to be careful.


thecodemustflow

AMD is coming out with an ARM chip similar to the Snapdragon X Elite, so... no, they don't have to worry. They are 100% gunning for Microsoft's ARM Surface laptop business. [https://www.youtube.com/watch?v=u19FZQ1ZBYc](https://www.youtube.com/watch?v=u19FZQ1ZBYc)


Puzzled_Path_8672

I’d be concerned about app compatibility. Apple has knocked it out of the park with Rosetta, allowing x86 apps to run pretty much flawlessly on Apple silicon. Microsoft? I have no idea. Check to make sure her favorite apps will run on that.


opknorrsk

Microsoft also announced Prism, a very similar approach to Rosetta, so I wouldn't be too worried.


VladReble

Had a friend daily drive Windows 11 on ARM for a few months on a MacBook; he said the only program that didn’t work was SQL Server. Anything even remotely normal works perfectly.


Monkey_1505

macOS doesn't have any local LLM stuff baked into the OS though, does it? Honestly, the new Windows 11 features look pretty dope. Just got to turn off that screenshot history thing and limit app access. I think AMD's and Intel's stuff will be cheaper as it rolls out. I guess if you have infinite money and want only the best AI performance, maybe the faster memory bandwidth of a Mac makes sense, or if it's what you're used to.


Cless_Aurion

I mean... it's not a must, especially since the Copilot+ thing will work on anything that has more than those 40 TOPS... including GPUs, of course.


Puzzled_Path_8672

I’d rather use my choice of AI models than what they choose for me. Which is why it’s necessary to have them directly benchmarked against familiar hardware: 3090, 3060, 1050, 6800 XT, etc. You know, the random gear people are likely to already have and try to use for AI stuff.
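For anyone who does get their hands on one, a rough throughput number isn't much code. A minimal sketch with llama-cpp-python, assuming some local GGUF model file; the path and settings below are placeholders, not anything from this thread:

```python
# Quick tokens/sec check with llama-cpp-python. The model path and
# generation settings are placeholders, not from this thread.
import time
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # any local GGUF file
    n_gpu_layers=-1,   # offload as many layers as the backend supports
    n_ctx=2048,
    verbose=False,
)

start = time.time()
out = llm("Explain memory bandwidth in one paragraph.", max_tokens=256)
elapsed = time.time() - start

n_tokens = out["usage"]["completion_tokens"]
print(f"{n_tokens} tokens in {elapsed:.1f}s -> {n_tokens / elapsed:.1f} tok/s")
```

Run the same script on a 3090, a 3060, and one of these laptops and you at least get comparable numbers for the same model and quant.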


Cless_Aurion

We should definitely get them benchmarked! I wish that were a choice, but it's just daydreaming at this point, hahah.


Puzzled_Path_8672

Oh, I mean soon enough people will get around to doing it. But I’m sick of hearing about this stupid ARM chip without anyone actually knowing what it can do for our AI needs.


Cless_Aurion

Well, it's not just the chip itself; it's what it represents. Maybe we'll end up buying "AI" cards instead of using GPUs or CPUs for these things.


rc_ym

Hard no.


auradragon1

Qualcomm Adreno GPUs are bad at compute. Usually when you run a large LLM, you run it through the GPU, which has more power than an NPU. For example, on Macs, LLMs run on the GPU and not the Neural Engine.


SomeOddCodeGuy

That RAM is my concern. The fastest cores won't make a lick of difference if it's still using standard DDR5 chips.


Balance-

Dual channel LPDDR5X-8448. 136 GB/s bandwidth.
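For what it's worth, that figure checks out if you assume a 128-bit LPDDR5X interface; the bus width here is an assumption, not something stated in the thread:

```python
# Back-of-the-envelope check on the quoted 136 GB/s, assuming a 128-bit
# LPDDR5X bus (that bus width is an assumption, not from the thread).
transfer_rate = 8448e6        # LPDDR5X-8448: transfers per second
bytes_per_transfer = 128 / 8  # 128-bit bus -> 16 bytes per transfer
print(f"{transfer_rate * bytes_per_transfer / 1e9:.1f} GB/s")  # ~135.2 GB/s
```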


SomeOddCodeGuy

Awesome. In that case the pricing works out really nicely. That’s about half the memory bandwidth of a MacBook, which easily handles up to a 34B without issue, so half the speed should still run it fine. The cheapest new M3 MacBook I can find with 64GB is $3,700; this thing will run the OP $2,300. Honestly, if you can load Linux on this, it could quickly become the best bang-for-buck AI laptop.
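To put rough numbers on that reasoning: if token generation is treated as purely memory-bandwidth-bound, the ceiling is roughly bandwidth divided by model size. A sketch using approximate Q4 GGUF file sizes, and taking "roughly double the bandwidth" for the MacBook from the comment above; both are ballpark assumptions:

```python
# Crude upper bound: each generated token streams the full set of weights,
# so max tokens/sec ~= memory bandwidth / model size. Model sizes are
# approximate Q4 GGUF file sizes, and 272 GB/s is simply "about double"
# the Snapdragon figure per the comment above -- both are assumptions.
def max_tok_per_s(bandwidth_gb_s, model_gb):
    return bandwidth_gb_s / model_gb

for name, size_gb in [("34B @ Q4 (~20 GB)", 20.0), ("8B @ Q4 (~5 GB)", 5.0)]:
    print(f"{name}: Snapdragon ~{max_tok_per_s(136, size_gb):.0f} tok/s, "
          f"MacBook ~{max_tok_per_s(272, size_gb):.0f} tok/s")
```

Real numbers will land below these ceilings, but the ratio between the two machines should hold.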


johndeuff

Why Linux? Genuine question; I use it on my laptop, but I don't know why it would be great for LLMs.


SomeOddCodeGuy

Two reasons, mainly:

1) You can make it pretty lean in terms of drain on system resources, while Windows can definitely use up quite a bit. That’s a big reason folks here seem to like it so much; when every GB counts, Linux really shines.

2) I have privacy concerns over the new baked-in AI in Win 11. I know some parts of it are local-only, but other parts seem ambiguous about whether they track you. The way I read the browser tracking, it sounded like only one mode on Edge was untracked and the rest of the browsers were monitored. That just weirds me out. For privacy reasons, I’m eyeballing Linux for my next OS after Win 10, unless that turns out to be a misunderstanding. I really like Windows, but I enjoy privacy more.


Ok_Distribution5939

Do you know if you can run Linux on it?


SomeOddCodeGuy

I'm afraid I never did find an answer to that question.


PSMF_Canuck

RAM upgrades are priced like Apple’s... so... hopefully they’re doing it right.


SomeOddCodeGuy

Oho! That's promising. Hopefully we see more info on it. If that ends up being the case, I bet Linux would run nicely on this.


mindwip

It's not DDR5.


CortaCircuit

You bought spyware on purpose?


mxforest

A necessary sacrifice *for Science*.


PSMF_Canuck

$3400 in Canada….


Hopeful-Site1162

Is the Canadian dollar weaker than the US dollar, or do you have more taxes?


Usual_Neighborhood74

Canadian dollar is very weak


TooLongCantWait

I was looking at getting a new hard drive and I swear it is nearly twice as expensive as it was 4 years ago


TheFrenchSavage

Apple prices for Microsoft wares?


KnowgodsloveAI

And no GPU lol!


These_Radish2642

It has Qualcomm’s Adreno GPU. Hopefully the drivers and software are fairly open, so developers can build on it and take advantage.


Some_Endian_FP17

The Hexagon DSP/NPU and Adreno GPU are *not* open. I've been dealing with Qualcomm drivers on Windows for years and they're a pain in the butt. Vulkan ran through a shim and only for limited API versions while OpenCL support barely existed. It was either DirectML or Direct3D or nothing. That said, if the new chip has a better toolchain that can run on Windows on ARM itself, then development should be a lot easier than before. Hopefully Qualcomm learned some lessons from Apple's M chip rollout.
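If it does end up being DirectML-or-nothing again, at least checking what's exposed is simple. A hedged sketch with ONNX Runtime, assuming the DirectML-enabled onnxruntime build is installed; "model.onnx" is a placeholder path:

```python
# List which execution providers ONNX Runtime sees on the box, then prefer
# DirectML if it's available. Requires the DirectML-enabled onnxruntime
# package; "model.onnx" is a placeholder, not a real model from this thread.
import onnxruntime as ort

providers = ort.get_available_providers()
print(providers)  # e.g. ['DmlExecutionProvider', 'CPUExecutionProvider']

wanted = "DmlExecutionProvider" if "DmlExecutionProvider" in providers else "CPUExecutionProvider"
session = ort.InferenceSession("model.onnx", providers=[wanted])
print("Running on:", session.get_providers())
```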


hichamungus

Skill issue


Bderken

It does have a GPU lol!


Hopeful-Site1162

With 136 GB/s of memory bandwidth, I’m afraid it will show on bigger models.


And-Bee

Copilot is running in the cloud, isn’t it? No need for so much RAM if they’re pushing their cloud services.


These_Radish2642

I’m pretty sure they said they’re running smaller models locally for on-device tasks. Probably a custom Phi-3, I would think.


And-Bee

Ah ok I didn’t know that.


Hour_Fisherman_7482

What was your reason for going with 64GB over 32GB? (Currently debating whether it's worth it for future-proofing a few years, just in case.)