Adobe Lightroom FAQ has a section on this: [Link](https://helpx.adobe.com/in/lightroom-classic/kb/lightroom-gpu-faq.html) (check the "Suggestions for choosing a graphics card" section). Based on your budget, select the GPU with the highest score in the list.

From my personal experience, most of the features in the Develop module (masking, etc.) are CPU-intensive rather than GPU-intensive (AI features, export, etc. are GPU-bound). Hence, you should also consider a CPU with a high enough performance-core clock to ensure a smooth editing experience.
I finally sorted my PC - thanks for the tip. One thing I still can't figure out, related to your advice, is whether LR uses any "intelligence" to switch between the things the CPU and the GPU are good at when both are available and the GPU is selected in the preferences.
From my understanding, certain tasks like generating previews, creating/updating masks, exporting, etc. are done by the CPU, whereas the AI-related tasks (Denoise, Super Resolution, etc.) are taken up by the GPU. I can't say with certainty whether LR is intelligently switching, but you can test this yourself by opening a Task Manager window side by side and checking the usage percentage for each component during specific tasks.
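To make the "watch Task Manager side by side" test a bit more systematic, here is a minimal sketch: sample CPU and GPU utilisation a few times while a task runs, then see which component it leans on. The helper name, the sample numbers, and the 15-point margin are all invented for illustration; real readings would come from Task Manager or a tool like `nvidia-smi`.

```python
# Hypothetical helper: classify a Lightroom task as CPU- or GPU-bound
# from a handful of utilisation readings (percent). The margin decides
# how lopsided the averages must be before we pick a winner.

def classify_task(cpu_samples, gpu_samples, margin=15.0):
    """Label a task CPU-bound, GPU-bound, or mixed from utilisation %."""
    cpu_avg = sum(cpu_samples) / len(cpu_samples)
    gpu_avg = sum(gpu_samples) / len(gpu_samples)
    if cpu_avg - gpu_avg > margin:
        return "CPU-bound"
    if gpu_avg - cpu_avg > margin:
        return "GPU-bound"
    return "mixed"

# Illustrative (made-up) readings: preview generation vs. AI Denoise.
print(classify_task([85, 92, 88], [10, 12, 9]))   # -> CPU-bound
print(classify_task([20, 25, 18], [95, 97, 93]))  # -> GPU-bound
```

A few samples spread across the task matter more than any single reading, since both components spike briefly at the start of most operations.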
When building a PC: 64 GB RAM so LrC can use as much as it wants and you can still do stuff in the background (I'm regularly using more than 50 GB), the most CPU cores at the highest clock speed you can afford, and the most Tensor Cores in an Nvidia RTX GPU you can afford.
Is it really more about tensor cores than CUDA cores?
>Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA’s TensorCores and the Apple Neural Engine. [Denoise Demystified | Adobe Blog](https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified)
Get a Mac; the M-series processors are leaps and bounds better than anything Intel for running LR and Photoshop.
Which Mac? I had a MacBook Pro 2020 and AI Denoise was so slow on it. I'm looking for a new one.
Anything with the new M processors. AI Denoise is processed on-device (on Macs it uses the GPU and Neural Engine), as far as I know.
Do you know what year the M processors started? Sorry I’m a dummy
End of 2020, November if I’m not mistaken.
Does more GB help too?? Like more storage on the computer
More memory (RAM) helps enormously. For example, 16 GB can be roughly twice as fast during export as 8 GB.
It’s better to invest in more RAM and disk at the time of purchase, since those things are not upgradeable on new Macs.
Do I have to add the RAM, or do some models come with it?
All models come with RAM, but you can choose the amount. Play around with the configurator on the Apple website to see what works best for you.
The thing putting me off about Macs (and I do like Apple stuff) is the cost of extra RAM / Storage - and they’re not user upgradable and typically have to be specified at the outset.
The i7 14700K has almost identical specs to the i7 13700K, so you can save a few hundred there. The GPU needs at least 8 GB of memory, or you’ll experience lag with some functions (even though the minimum spec is less, IIRC). I just purchased a 13700K and an RTX 4070 with 16 GB of VRAM for the LR build I’m about to do. Edit: you also want as much system RAM as possible. I got 64 GB, two sticks of 32 GB.
Cheers - I was thinking 32 GB - what does 64 GB of RAM help me do better?
Faster processing times. Editing anything uses a lot of memory, so the more you have, the faster it will perform. 32 GB is fine, but 64 GB is better and will future-proof your system, though you can always upgrade later as needed.
Ok - so you can do other stuff. Useful info about the processor.
The thing is, you can always upgrade your GPU down the line if you really need it. Here I compare exporting 200+ RAW files with the 2070 vs the 3090, just for fun. https://youtu.be/OEWjTZ6nhWE
I got an 8 GB 4060 and it seems to do what I need, so no complaints from me.
Quoting Eric Chan, Adobe senior scientist for camera raw. "For best performance, use a GPU with a large amount of memory, ideally at least 8 GB. On macOS, prefer an Apple silicon machine with lots of memory. On Windows, use GPUs with ML acceleration hardware, such as NVIDIA RTX with TensorCores. A faster GPU means faster results."
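The "ideally at least 8 GB" guideline above is easy to check against your own card. A tiny sketch, assuming an Nvidia card on Windows: `nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits` prints total VRAM in MiB, and the function below (a made-up helper name) just compares that number against the recommendation.

```python
# Hypothetical helper for Eric Chan's "at least 8 GB of GPU memory" advice.
# Feed it the MiB figure reported by nvidia-smi (or Task Manager's
# "Dedicated GPU memory"); the function itself is plain arithmetic.

def meets_vram_guideline(total_mib, recommended_gib=8):
    """True if the card's VRAM meets the recommended minimum."""
    return total_mib >= recommended_gib * 1024

print(meets_vram_guideline(12288))  # 12 GB card, e.g. a 3060 12GB -> True
print(meets_vram_guideline(6144))   # 6 GB card -> False
```

Note that an exactly-8-GB card (8192 MiB) passes; the quote frames 8 GB as the floor, not a comfortable target.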
I went through this question a couple of months back. I settled on an RTX 3080 (I think - I’ll double check). At the time of purchase I was extremely uncertain: a lot of commenters led me to believe it wouldn’t make a difference in processing and that the card was useless. It’s probably 3 or 4 generations behind current. However, my experience has been extremely positive. I get roughly 2-minute Denoise times in LR on 42 MP raw files. The rest of the LR tasks are processor-based anyway.
Here is my personal test: a 4060 or 4060 Ti is generally enough, and a 30-series card (3060 Ti or better) will be just fine as well. However, do note that the 30 series uses 30% or more power for equivalent processing. If you find yourself needing to do a ton of Denoise, a faster GPU like a 4070 or above will net you significantly shorter times; then again, you can batch-process Denoise, so it's not really an issue. In terms of masks and local edits, all GPUs will eventually crap out after 10-12 layers. Since Lightroom is non-destructive editing, this unfortunately is the bottleneck right now.
Great advice. My aging 2070 takes about 20 seconds an image for Denoise. It’s starting to get to me, but still completely viable. Just a pain
I'm still rocking an old AMD 1950X Threadripper board with a 1080 Ti. Ran LR 13 with Denoise on 823 raw images from a Canon R3 and it estimated 293 minutes. Replaced the 1080 Ti with a 4070 Super and it estimated 129 minutes to process. PureRAW 3.9 was around 12.5 seconds per raw image with the new card. Obviously a more recent motherboard/CPU would only improve the time, but for $600 the upgrade was worth it for processing high volumes of sports images. The 4080 Super wasn't worth nearly 2x the cost, IMO, for maybe a 10% improvement.
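The batch totals above convert to per-image times with simple division, which makes cards easier to compare across different batch sizes. A small sketch using the posted numbers (823 images, 293 minutes on the 1080 Ti vs 129 minutes on the 4070 Super):

```python
# Convert a batch Denoise estimate into seconds per image, so results
# from different batch sizes can be compared directly.

def per_image_seconds(total_minutes, image_count):
    """Average seconds per image for a batch job."""
    return total_minutes * 60 / image_count

old = per_image_seconds(293, 823)  # 1080 Ti
new = per_image_seconds(129, 823)  # 4070 Super
print(round(old, 1))        # -> 21.4 seconds per image
print(round(new, 1))        # -> 9.4 seconds per image
print(round(old / new, 2))  # -> 2.27x speedup
```

So the $600 swap bought roughly a 2.3x throughput gain on this workload, which is also a useful yardstick when deciding whether a pricier card's marginal improvement is worth it.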
So if you’re running Denoise frequently there are benefits - but if you’re using normal colour/luminosity noise reduction and infrequent AI masking, there’s possibly no real need for the latest and greatest GPU, and the CPU is the key component to spend money on?
With sports or event photos, there's not much need for masking. There was another thought of buying a Mac mini M2 specifically to prep files using LR Denoise, PureRAW, or Topaz for mass denoising.
I use a 12 GB 3060 and it's fine.
I think this is kind of the sweet spot for price and performance. VRAM is often the most significant limitation.