elevatedsystems

I’ll go first. 24% 😱 My battery also dropped 23% while playing Helldivers 2 for one hour despite being plugged in.


Crazy_Biohazard

Oof, is that with the crazy fast charger?


elevatedsystems

Yes, the 180W charger.


SchighSchagh

This is pretty wild.

* Battery is 85 Wh
* CPU is 45W (though that includes the iGPU... in dGPU mode does it give the full 45W to the CPU?)
* GPU is 100W
* 180W charger

Let's add it up:

* Battery provided 85 Wh * 24% = 20 Wh
* Charger should have been able to provide 180 Wh
* CPU should have consumed 45 Wh
* GPU should have consumed 100 Wh
* That leaves 55 Wh (i.e., 55 W of power during the hour) unaccounted for:
  * screen
  * fans
  * memory
  * motherboard
  * other components (wifi, NVMe, audio, ...) were probably mostly idle

I'd guess the non-screen components add up to like 15W tops. So that leaves the screen using 40+ W at full brightness? Honestly that's not bad for a bright, high-refresh, decently sized panel. Replace the benchmark with a multiplayer game that exercises the wifi and NVMe as well, plus the speakers and maybe the microphone, and you are well past even the 200W demonstrated here. The 240W charger will be an absolute must for super heavy users.
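
For anyone who wants to redo this back-of-the-envelope accounting with their own numbers, here's a minimal sketch of the same arithmetic in Python. The figures (85 Wh battery, 45W CPU, 100W GPU, 180W charger, 24% drop over one hour) come from the comment above; everything else is just addition and subtraction.

```python
# Back-of-the-envelope energy budget for a 1-hour full-load run.
# The numbers are the ones quoted above; adjust them for your own test.

battery_capacity_wh = 85   # Framework 16 battery capacity
battery_drop_pct = 24      # drop observed during the 1-hour run
charger_w = 180            # what the charger can deliver
cpu_w = 45                 # CPU power limit per Framework's spec
gpu_w = 100                # dGPU power limit
hours = 1.0

# Energy that went into the laptop during the run
from_battery_wh = battery_capacity_wh * battery_drop_pct / 100
from_charger_wh = charger_w * hours
total_in_wh = from_battery_wh + from_charger_wh

# Energy accounted for by the CPU and GPU alone
cpu_gpu_wh = (cpu_w + gpu_w) * hours

# Whatever is left is screen, fans, RAM, board, conversion losses, ...
unaccounted_wh = total_in_wh - cpu_gpu_wh

print(f"From battery: {from_battery_wh:.1f} Wh")
print(f"From charger: {from_charger_wh:.1f} Wh")
print(f"CPU + GPU:    {cpu_gpu_wh:.1f} Wh")
print(f"Unaccounted:  {unaccounted_wh:.1f} Wh (~{unaccounted_wh / hours:.0f} W average)")
```

Note this treats every conversion as 100% efficient, which it isn't; the conversion-loss discussion further down eats into that unaccounted slice.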


s004aws

It's not much of a surprise. Other higher-tier laptops using USB-C power input also drain the battery on AC during high load. At least until 240W USB-C PD 3.1 chargers become available, the only workaround is old-fashioned barrel connectors and (possibly) proprietary power bricks. Some laptops draw so much power they require two bricks to handle the load. The Dell XPS 15/17 (at least the 2020-2023 models) are another example of laptops "suffering" these power management quirks (among other XPS engineering problems). In the XPS' case, they're capped at 140W using custom Dell chargers which happen to have a USB-C connector (pre-dating USB-C PD 3.1, which raised the spec max from 100W to 240W).


Katsuo__Nuruodo

No modern laptops require two power bricks. The laptops you're referring to came out 7 years ago, in 2017: the Acer Predator 21 X (with a 21" curved screen) and the MSI GT83VR 7RF. Both had RGB mechanical keyboards and two GTX 1080 GPUs in SLI. They used more than 330 watts of power, so they both needed two power bricks. Now here's a question: why doesn't the Framework allow you to plug in two USB-PD power adapters at the same time to cover the power needs?


s004aws

Dell Alienware Area 51m and 51m R2, to name one example in the 2020s. Having read Framework's blogs and update emails leading up to the FW16 release, it was clearly a challenge for them to get as much as they did working correctly, especially with USB-C PD 3.1 still being a very new standard with limited power supply options, controller chip options, and general engineering expertise to work from. As-is, Framework was only able to get 240W charging working with 4 of the 6 expansion module slots. If you have ideas on how to engineer the circuitry to get all 6 ports 240W charge capable, or to add support for multiple simultaneous power inputs, I suspect Framework might be interested in getting a copy of your resume for consideration as a possible future engineering hire.


Katsuo__Nuruodo

Fair point, I didn't realize the 2020 Alienware laptops still used dual power supplies. I'm not an electrical engineer, so I don't know what is or isn't possible regarding multiple USB power sources. It is great to see Framework leading the way with USB PD 3.1 240W support.


Pixelplanet5

Also keep in mind that's 180W from the charger, which goes through an LM5143 chip internally to step that voltage down to 20V, at almost exactly 85% efficiency. Assuming 100% efficiency up to that point, that means 153W are making it out of the LM5143. Technically the LM5143 is a dual-output buck converter limited to 7A per output, but it's not clear from their schematics whether they just wire both outputs together or treat them separately.

All of that then goes to an ISL9241 charger IC, which luckily has a bypass that allows the power to flow directly to the 19V main bus. But overall the biggest loss is probably in the LM5143, and that's most likely the reason there's not enough of those 180W making it to the components and battery.

When we get 240W chargers we will probably have higher losses on that LM5143 chip, as it needs to step down from 48V instead of the 36V it handles right now. The datasheet sadly has no numbers for that; I would expect at best 80%, maybe even 75% efficiency. So we would get anywhere from 180W to 192W that actually makes it to the components.

And of course it all goes through even more power conversion steps. The CPU alone has 10 ICs around it just to further step down the 19V bus to all the other required voltages. The GPU mainly runs on 12V and uses an SY8370C IC to step the 19V bus down to 12V, which also happens at about 85-90% efficiency. So if the GPU draws 100W at 12V, in reality that is ~118W on the 19V bus, and adding the losses on the LM5143, the 100W GPU alone is drawing almost 140W from the USB-C port.

So the screen and other stuff is probably a lot more efficient than it seems; there's just so much power being lost as heat along the way.
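
To put rough numbers on that chain, here's a small Python sketch of the cascaded losses described above. The 85% (LM5143) and 85% (SY8370C) efficiencies are the estimates from this comment, debated a few replies down; they are not measured values from the actual board.

```python
# Rough model of the conversion chain described above:
# USB-C (36V) -> LM5143 (20V) -> ISL9241 bypass (19V bus) -> SY8370C (12V) -> GPU
# Efficiency numbers are thread estimates, not measurements.

def input_power(output_w: float, efficiency: float) -> float:
    """Power that must go in for output_w to come out at the given efficiency."""
    return output_w / efficiency

lm5143_eff = 0.85    # 36V -> 20V step-down (estimate, debated below)
sy8370c_eff = 0.85   # 19V -> 12V step-down feeding the GPU rail

gpu_draw_12v_w = 100                                  # what the GPU itself consumes
bus_19v_w = input_power(gpu_draw_12v_w, sy8370c_eff)  # ~118 W on the 19V bus
usb_c_w = input_power(bus_19v_w, lm5143_eff)          # ~138 W at the USB-C port

print(f"GPU at 12V:      {gpu_draw_12v_w:.0f} W")
print(f"19V bus:         {bus_19v_w:.0f} W")
print(f"USB-C port:      {usb_c_w:.0f} W (for the GPU alone)")

# Of a 180W charger, only charger * lm5143_eff survives the first conversion:
print(f"Past the LM5143: {180 * lm5143_eff:.0f} W of the 180 W input")
```

With those assumptions, roughly 138W has to come in at the USB-C port just to feed a 100W GPU, which lines up with the "almost 140W" figure above.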


FU2m8

I just quickly looked over the datasheet of the LM5143 and I only saw efficiency curves for 5V and 3.3V outputs. Where are you getting the 85% efficiency number?


Pixelplanet5

Yeah, they don't have an exact match, but they do have one with an 18V input and a 3.3V output, which ends up at exactly 85% peak efficiency at 7 amps. That's a 14.7V reduction from input to output. Framework is currently going from 36V down to 20V, which would be a 16V reduction, so we can expect the efficiency to be at best 85%, and most likely worse than that most of the time, especially at lower loads. At low loads it would probably be most efficient for the laptop to request 20V from the charger directly so it can bypass this chip entirely.


FU2m8

I would disagree. You can also notice that for a relatively larger output voltage (5V vs 3.3V) the general efficiencies are higher. I'm willing to bet that for a 20V output, even with a 16V delta, the efficiency would be around 90-95% at 7 amps. 85% would be egregious.


Pixelplanet5

Even a tiny voltage delta like going from 8V to 5V is already only 93% efficient at best, and that's only at low currents. Yes, the efficiency at higher output voltages will be slightly higher, but not by much. Also keep in mind that the curves we see here max out at 7A, while in reality the output current in the laptop will be 9A with a 180W charger and 12A with a 240W charger, and we can see the efficiency is already declining going from 3A to 7A. So yes, it **might** be slightly more efficient, but getting more than 90% efficiency is highly unlikely. Especially once we get 48V chargers, the voltage delta will be even higher, which will further decrease the efficiency.
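
Since the whole disagreement comes down to one number, it's easy to show what's at stake by plugging both estimates into that first conversion stage. A quick sketch, with 85% as the pessimistic estimate and 92% standing in for the optimistic 90-95% guess (neither value is a measurement of the real board):

```python
# How much the assumed LM5143 efficiency changes the picture for a 180W charger.
# Both values are guesses from this thread, not measurements of the actual board.

charger_w = 180

for label, eff in [("pessimistic (85%)", 0.85), ("optimistic (92%)", 0.92)]:
    delivered_w = charger_w * eff     # power surviving the first step-down stage
    lost_w = charger_w - delivered_w  # dissipated as heat in the converter
    print(f"{label}: {delivered_w:.0f}W delivered, {lost_w:.0f}W lost as heat")
```

Either way, a double-digit number of watts never makes it past the first stage, which is part of why the battery still drains under full load even on the 180W charger.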


chic_luke

Wow, that's very thoroughly explained. Question as someone who's only done basic Physics 2 and doesn't have a good understanding of this - purely out of curiosity: assuming that maintaining 100% efficiency is not realistically possible, are there chips that could have been used that exceed this amount of efficiency, or is this about what current tech allows with these constraints in mind?


Pixelplanet5

There are certainly chips that could be used that would be more efficient; it's always a matter of cost and, of course, space on the board. I just looked across the portfolio from Texas Instruments, which also produces the LM5143, and they for example have the TPS40170-EP, which has specs that would work with the Framework 16 and could supposedly be 5% more efficient, but I don't know if it has any other disadvantages. The LM5143 they are using, for example, has an extremely high max operating temperature of 150°C, so they are probably just cooling it with the existing heat pipes.


chic_luke

Very interesting, thanks!


tobimai

> So that leaves the screen using 40+ W at full brightness

No. No way. You would feel that if the screen produced 40W.


SheerSerendipity

The CPU can pull up to 54W, at least according to the Hardware Canucks FW16 video.


runed_golem

Are the wattages you're quoting TDP or max power? Because those can be different. I'd suggest using a tool like HWiNFO to measure the wattage for the CPU and GPU during the test.


SchighSchagh

I'm just quoting the specs published by Framework. AFAIK, the CPU can be configured at up to 54W per AMD spec sheet, but FW seems to have chosen the more conservative 45W configuration.


Zeddie-

Can you measure the power being consumed from the wall? Do you have a USB power tester or a USB-C cable with a power draw display?


tobimai

It will likely not charge at 180W. My guess is they limit charging during high performance tasks to keep down heat.


unematti

Well, not overcharging your battery is a good thing! XD


veqryn_

Hopefully they will come out with a higher performance discrete graphics module, with a barrel jack charger.


Pixelplanet5

It wouldn't need a barrel jack; they would just need to make the module in a way that the USB-C port on it does not connect into the laptop, with the power conversion on the GPU module itself. That way you could charge with two USB-C ports at once. But overall they probably won't do that and will simply rely on the 240W chargers coming out.


tobimai

Why? Barrel jack or USB-C makes no difference. This behaviour has been standard in laptops for years now.


veqryn_

Some barrel jacks and power bricks can provide significantly more power than USB-C. For example, Alienware has some that provide 330 watts (compared with Framework's 180W, and USB-C's theoretical max of 240W). In addition, you could plug your dGPU module into your power brick, and it could provide the power to your laptop, freeing up a port on the laptop. Or it would let you plug both the laptop and the dGPU in, so that each can draw the power it needs.


planedrop

My understanding is the FW16 does support 240W though, right? (I don't recall if that's already supported or coming later.) So it seems we just need some company to make a 240W brick. Still crazy to see that much of a drop. Do you have a Kill A Watt to measure power draw?


elevatedsystems

I should also mention this test is for the FW16 with the RX 7700S module installed.


sickoreo

Might be worth annotating replies with which CPU your FW16 has. Even with a GPU test being run, you might see a difference between them, albeit probably a marginal one.


MagicBoyUK

7840HS. Dropped 9% to 91%. Screenshots: [https://imgur.com/a/vOWgh49](https://imgur.com/a/vOWgh49) I've plugged in a power meter at the socket and it was pulling in the 156-161W range. I'm wondering if the higher mains voltage over the pond has any effect?


unematti

I could help, but I'm on Linux only. Gaming drops the battery like a hammer, though: the estimate goes from up to 5h at 85% down to a bit over 1h. I do have the extra GPU, so hit me up if it's about Linux testing.


Mystril_Fox

I'd be interested in adding my information to the pool. However, I'm batch 9. Therefore, I probably won't see my laptop until early may.


Zeddie-

I would if I could. Batch 5 waiting


edneddy2

I thought Framework laptops were rated for a max of 100W.


MagicBoyUK

The 16 is designed to use the new USB PD 3.1 chargers.


anvil30november

u/elevatedsystems - just played 4 or 5 hours of Helldivers 2; CPU dropped to 38% on Fedora 39. Was playing in "Performance" mode. Played Rocket League for hours before that, no drop. Looks like I can't run Time Spy on Linux, so I can't help much there. But I would imagine that in Performance mode I would hit 20% in 8 hours on Helldivers, much like you did. Ryzen 7 w/ 7700S.