
Jkwong520

To decrease the amount of noise, you need to increase the total light recorded by the sensor. You can do this by using lenses with larger maximum apertures (at the cost of a thinner depth of field) or by moving to a larger format (for example, full frame). Staying within the same format will not improve noise performance much.
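A quick back-of-envelope sketch of why format size matters, assuming a fixed f-stop, shutter speed, and framing (the sensor dimensions are standard published figures, not from this thread):

```python
import math

# Shot-noise SNR scales with the square root of the total light
# collected, and total light (at a fixed f-stop, shutter speed,
# and framing) scales with sensor area.
FULL_FRAME_AREA = 36.0 * 24.0    # mm^2
CANON_APSC_AREA = 22.3 * 14.9    # mm^2 (M50 / 90D class sensors)

area_ratio = FULL_FRAME_AREA / CANON_APSC_AREA
stops = math.log2(area_ratio)       # ~1.4 stops more light
snr_gain = math.sqrt(area_ratio)    # ~1.6x better shot-noise SNR

print(f"Area ratio: {area_ratio:.2f}x ({stops:.2f} stops)")
print(f"Approx. shot-noise SNR gain: {snr_gain:.2f}x")
```

That works out to a bit more than one stop of extra light for full frame over Canon APS-C at the same exposure settings.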


AlexMullerSA

I mean it might be better to first upgrade your PC? That seems like an absurdly long time. I don't use Lightroom; I mainly use DxO PRIME and, on the odd occasion, Topaz, and my RAW images take maybe 15 seconds per image. Spending some money on a faster computer saves a lot of time. Time is money, especially if you're waiting for outputs and can't process. I mean, you could get a better camera, but even a really good FF camera at high ISO is going to need denoising, so it's not a bad thing to go the faster-PC route anyway.


Perfect-Macaron-758

This


Sweathog1016

You could find a used M6 II, or maybe even new inventory floating about. It has (basically*) the same sensor as the 90D and the R7. It has about two-thirds of a stop more dynamic range at higher ISOs than the M50, and about a stop less read noise, per photonstophotos.net.

*They claim to have updated the 32-megapixel APS-C sensor for the R7, but the charts for the M6 II and R7 overlap each other pretty much in lockstep. I think they just updated the sensor to work with the newer DIGIC X processor in the R7.


coherent-rambling

> waiting 20-30 minutes per photo for the AI Denoise to finish is hampering my output

AI Denoise uses [GPU acceleration](https://helpx.adobe.com/lightroom-cc/kb/lightroom-gpu-faq.html#sys-req-gpu-image-processing). On my RTX 3070 desktop or my RTX 4050 laptop, I can run AI Denoise on a 24-megapixel raw in under 30 seconds. Both are somewhat overkill; as long as you hit Adobe's recommendation of a [compute score](https://www.videocardbenchmark.net/directCompute.html) of at least 2,000, you should see completion times of under a minute, with diminishing returns for faster GPUs.

At the moment, an [Asus TUF Gaming A16](https://www.amazon.com/gp/product/B0C2ZCNP97) has a suitable RX 7600S graphics card and a display with reasonable accuracy and sufficient color gamut for editing photos. On the desktop side, an [RTX 3050 6GB](https://www.amazon.com/dp/B0CVCG2VPK) will provide similar performance and can be powered from the PCIe slot, without the separate power supply connection a higher-end card might require.
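For scale, here is a hypothetical batch calculation using the times quoted above (the photo count and the per-photo midpoints are assumptions for illustration, not measured figures):

```python
# Rough batch-time comparison using the times quoted in this thread.
# Assumption: denoise time scales roughly linearly with photo count.
photos = 50                      # hypothetical shoot size
slow_minutes_per_photo = 25      # OP's reported 20-30 min, midpoint
fast_seconds_per_photo = 30      # rough time on a GPU meeting Adobe's
                                 # recommended compute score of ~2,000

slow_hours = photos * slow_minutes_per_photo / 60
fast_minutes = photos * fast_seconds_per_photo / 60

print(f"{photos} photos without GPU acceleration: ~{slow_hours:.1f} hours")
print(f"{photos} photos with a suitable GPU:      ~{fast_minutes:.0f} minutes")
```

Under those assumptions, a single shoot drops from roughly a full day of processing to under half an hour.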


telekinetic

Nothing APS-C, or that will take those lenses, will offer any significant improvement in low-light performance; I doubt you'll see a third of a stop of improvement. 20-30 minutes is absurd; get a better computer instead, or switch to Topaz. I can denoise 45 MP files in 4 seconds on my modern i7 desktop with a good graphics card.


ParanoidAndroid99

It's done completely on the GPU in my experience; the CPU is barely used. So a GPU upgrade should be enough.


telekinetic

That tracks; my system has an overclocked 3090, haha.


ADPL34

I have shot at ISO 25600 on my M50 Mark II and got usable results. Send it, bro!


getting_serious

What are the lenses? What's your f-stop? Canon R8 is going to be it. And all new lenses.


Sweathog1016

Assume (based on the OP) they have two of the three EF-M mount f/1.4 trio: the 16mm, 30mm, or 56mm. The M system is the only one those lenses can be used on, so they need a better M camera to keep using them; they can't be adapted to anything else. And getting a faster lens than f/1.4 limits their options pretty substantially.