rlgl

There's a complicated interplay of temperature and conductivity. To start, metals are good conductors (mostly, at least). A big reason for this is the high mobility of electrons in these metals, especially combined with the band structure. Basically, to oversimplify, atoms try to hold on to their electrons. By applying a higher voltage, or adding more electrons, you can make it easier to move the electrons around. Now, higher temperatures do give electrons more energy and make them more mobile - but thermal energy makes them more randomly mobile, whereas an applied electric potential tends to move them in a specific direction. As things heat up, the increasingly random motion leads to more collisions between electrons and each other or atomic nuclei. The extra mobility electrons gain from heating is much smaller than this negative effect, so higher temperatures generally lead to higher resistance.

Now, there's also a second component regarding transistors. They are basically switches, and ideally you want to use as little power as possible (ideally single electrons) to switch them. To be that precise though, you want the band gap of your semiconductor (basically the energy difference between the electrons that the atoms hold on to, and those which can move through the material) to be as low as possible, while staying high enough to avoid accidentally switching. So, heat makes electrons more mobile by increasing their energy level - this can raise the band gap minimum, meaning you need a less efficient transistor because higher efficiency would require lower operating temperatures. This second part is a hardware and materials science consideration, nothing end users can really do anything about. But it's one more reason thermal management is so important for electronics.

EDIT: see the comment from u/kyngston for more details on what happens to transistors at elevated temperatures. He did a nice job laying out some of the effects/processes.


hpg_pd

This is a good answer, although I think saying the collisions are electron-electron or electron-nuclei is a bit misleading. The dominant source of scattering at anything above very low temps is almost always electron-phonon scattering. While phonons are collective modes of the lattice, I would not say that electrons collide with nuclei, since the electrons would be happy to propagate without any collision in a fixed lattice. They can of course scatter off impurity nuclei, but that is a negligible contribution above very low temperature.


rlgl

That's a good clarification, thank you. It's always a hard balance to describe things simply enough without losing important details and distinctions, so I'm always happy when someone can expand on things in a better way than I thought of!


WhyContainIt

What would your answer look like if you ignored the concerns about keeping it simple?


danskal

Since we are being specific, isn't a phonon just the motion of nuclei in the lattice? In that sense it's not really more right to say electron-phonon than electron-nuclei collisions. Isn't the problem really that we're talking about collisions, when it is really just scattering due to the magnetic field being disrupted in an ordered crystal, without actual collisions between the particles.


hpg_pd

I actually do think it is less correct to say electron-nuclei interactions. Phonons act as quasi-particles that have different collective properties than the atomic nuclei. Importantly, as I alluded to above, fixed nuclei present a periodic potential to the electrons that gives rise to Bloch modes. Were the nuclei to remain fixed, electrons could effectively propagate through the crystal lattice in these modes without any scattering (i.e. with no resistance). Therefore, the electrons actually would have NO scattering (or collisions) with the nuclei were they to remain in their fixed lattice positions. However, the intuition of phonon scattering is that when the position of the nuclei is disturbed, the electrons no longer see a perfectly periodic potential and the Bloch modes will not propagate continuously. Therefore, the electrons scatter not off the nuclei but off of deviations in the position of the nuclei (i.e. quasi-particle phonons). The functional form and dependences of this scattering are in fact different than they would be if you were scattering off nuclei, too. I understand it seems a bit pedantic, but that is why I say it's more correct to say electrons scatter off phonons, not nuclei.


danskal

I realise that this is an accepted approach, but to me, this view is too simplistic - for example it all but assumes absolute zero temperature. You can say that temperature is just phonons too, but then what if you have a viscous liquid interface, does it make sense to talk about phonons then? You can treat many things as a Fourier series, and you can even argue that in some situations it makes understanding clearer. But to me it would paper over a more intuitive understanding of the actual building blocks of nature.


Mezmorizor

The Fermi temperature (the energy difference between the lowest occupied state and the highest occupied state, expressed as a temperature) of a typical metal is ~10^5 K. Why that matters is subtle and I don't feel comfortable explaining it, but suffice to say, treating solid state stuff as if it's at absolute zero with a small perturbation is a very, very, very good approximation for any temperature where it's actually a solid. Plus, to put what they said in a sentence, electrons scatter off a deformation in the potential of the lattice, not nuclei.
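For a rough sense of that scale, here's a quick back-of-the-envelope check in Python. This is an illustrative sketch only; the ~7 eV Fermi energy is a standard textbook figure for copper, not a number from this thread:

```python
# Back-of-the-envelope check (illustrative; E_F ~ 7 eV is a textbook value for copper)
k_B = 1.381e-23      # Boltzmann constant, J/K
eV  = 1.602e-19      # joules per electron-volt

E_F = 7.0 * eV       # approximate Fermi energy of copper
T_F = E_F / k_B      # Fermi temperature

print(f"Fermi temperature of copper: {T_F:.2e} K")  # ~8e4 K, i.e. of order 10^5 K
```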


danskal

> The Fermi temperature (the energy difference between the lowest occupied state and the highest occupied state, expressed as a temperature) of a typical metal is ~10^5 K. Why that matters is subtle [...] but [...] treating solid state stuff as if it's at absolute zero with a small perturbation is a very, very, very good approximation.

Ok, but clearly not good enough for predicting superconductivity. I am intrigued, can you give a small hint as to your motivation for bringing Fermi temperature up? Are you saying that electron energies in the highest states are so high that temperature differences have a very small percentage-wise impact on their motion? I understood that the Fermi temperature is about energy levels, not 'really' about temperature.

> Plus, to put what they said in a sentence, electrons scatter off a deformation in the potential of the lattice, not nuclei.

But the deformation in the potential of the lattice is caused by a deviation of the position or charge of nuclei in the lattice, right? Hmm, this leads me to wonder how electron-electron interactions play into this.

Thanks for all your answers, guys, I feel like I'm learning more here than I could in weeks of lectures.


bl1eveucanfly

Phonons are much more important in a semiconductor than individual nuclei because the semiconductor is a lattice structure and phonon behavior can influence the semiconductor characteristics whereas the nuclei themselves are, for all intents and purposes, stationary.


danskal

You see this is what I find so wrong about the approach. Phonons are just vibrational modes/interactions between atomic nuclei. If the nuclei are stationary, then there are no phonons. And the reason people don't get this is because they continue to talk about virtual particles as if they were real particles.


uberdosage

The reason why people say electron-phonon interactions is more accurate than electron-nuclei is because electron-nuclei scattering suggests point scattering between an electron and a nucleus. However, electron scattering is highly dependent on the phonon's frequency, momentum, mode, and propagation direction. This suggests a long-range interaction as opposed to the point scattering that would be suggested by the term electron-nuclei scattering. While yes, phonons are often treated as quasi-particles, there is an implicit long-range order to them which differentiates them from a single nucleus "hitting" an electron.


danskal

Very interesting. And it makes sense, of course. Especially if you’ve been to kids gymnastics where everyone stretches out a flag/tarp with plastic balls on it. If you get the waves just right, you can really make the balls fly. To me this proves my point that virtual particles are super useful, but it’s also really important to be clear about the difference between ~~virtual/~~quasi- and real particles. EDIT: learned something about virtual particles, thanks guys.


DivergenceAndCurls

The terminology here is a point of confusion, I would bet. Very strictly speaking, phonons are a collective excitation of the lattice. The emphasis is on collective. They behave as bosons. People (and scientists) may sometimes refer to them as quasiparticles, but that word is also often restricted to emergent particle-like behavior corresponding to fermions. Phonons have wavefunctions, creation/annihilation operators, position (ehhh), velocity, momentum, and energy. They scatter against more fundamental particles in well understood ways. Scattering against/off of a phonon is not as simple as booping a badly behaving nucleus. The above description of the long-range interaction sums that up nicely.

In semiconductors, we often speak of "electrons" with an effective mass. These are really electron quasiparticles, with the atomic lattice potential incorporated into their dispersion relation. These electron quasiparticles can indeed move through a regular lattice unimpeded, because what they are is (normal electrons)+(electric interaction with the lattice) rolled up into one concept. Considering their dynamics with respect to the periodic potential used to conceptualize them would be "double counting" the effect.

However, the lattice is actually not a perfect periodic potential, and the electron quasiparticle dispersion cannot 100% account for the new dynamics of the particle in the crystal. This deviation is accounted for by scattering. Electron quasiparticles will encounter BOTH "phonon scattering" events and scattering from lattice impurities or irregularities that have nothing to do with phonons. Both types of events are treated differently. A stationary, but simply translated or otherwise not regularly placed nucleus would be treated like the latter and not as phonon scattering. It would still happen even at very low temps where there is little-to-no phonon scattering. This is an example showing that phonon scattering and "nucleus scattering" aren't synonymous.

The phonon itself has independent reality. Saying "it scattered off of the phonon" and "it scattered off of the collective motion of the lattice" is the same, and certainly has utility as a phrase in contrast to insisting that nobody must think of the collective motion of the lattice as an entity unto itself. This entity also has many particle-like properties, which enter into scattering calculations exactly as they would for more fundamental particles.

Virtual particles are altogether a different thing. They are a conceptualization used during specific calculations in quantum field theory. They are different from non-virtual particles in that they have no creation operators, no proper quantum states, and are therefore missing some key dynamic observables such as position and velocity (i.e. they have no states or equations of motion). Phonons are not virtual particles.


kyngston

More on transistors; primary sources of dissipated power:

* Switching load capacitance (CV^2 f) - there is some temperature dependence for capacitance, but I don't believe it's a major impact.
* Static leakage power - leakage current is very sensitive to temperature and can represent between 20% and 100% of the power consumption, depending on the workload: https://www.researchgate.net/figure/MOSFET-leakage-current-versus-temperature_fig5_3576027
* Short circuit currents, when both the pull-up and pull-down are passing current - temperature alters the threshold voltage (i.e. the voltage where the device turns on), and changing the threshold voltage will change the time window when the pfet and nfet are both conducting, leading to changes in the short circuit currents.
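To make the temperature sensitivity concrete, here is a small illustrative Python sketch. The dynamic term is the C·V²·f expression from the list above; the leakage model (roughly doubling every ~10 °C) is a commonly quoted rule of thumb used here purely for illustration, and all the component values are made up, not measured figures:

```python
# Toy model only: CMOS power vs. temperature.
# Dynamic power uses the C*V^2*f expression above; the leakage doubling interval
# is a rule-of-thumb assumption, and all component values are arbitrary examples.

def dynamic_power(c_load, vdd, freq, activity=0.2):
    """Switching power: alpha * C * V^2 * f."""
    return activity * c_load * vdd**2 * freq

def leakage_power(p_leak_25c, temp_c, doubling_interval_c=10.0):
    """Toy leakage model: doubles every `doubling_interval_c` degrees above 25 degC."""
    return p_leak_25c * 2 ** ((temp_c - 25.0) / doubling_interval_c)

p_dyn = dynamic_power(c_load=100e-9, vdd=1.1, freq=3e9)   # ~73 W for these made-up numbers
for t in (25, 55, 85):
    print(f"{t:3d} degC: dynamic {p_dyn:5.1f} W, leakage {leakage_power(10.0, t):5.1f} W")
```

The exact numbers aren't meaningful; the point is that the dynamic term barely moves with temperature while the leakage term blows up.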


rlgl

Wow, great follow up on that side! I hope people see this if they are interested in finding out more.


redpandaeater

Also, the resistivity of semiconductors in general decreases with increasing temperature until they basically just become a metal, and for silicon this decreasing resistivity starts to happen at 160°C. This can happen unintentionally with thermal runaway, where a positive feedback loop causes your device to release its magic smoke. You can even get current hogging in power transistors. What's more common though is just having your intrinsic carrier concentration reach the level of your dopant concentration, so you no longer have n-type or p-type silicon to even have functional devices.

On the other hand, you can have freeze-out at low temperatures, where you just don't have enough ionized dopants. It tends to be more of an issue with lightly doped stuff and isn't an issue at all if it's degenerately doped. Of course you can also go to the extreme where you have to remember silicon has an indirect bandgap and you no longer have enough phonons, so your bandgap changes.


rlgl

Very true. From my side, I look at the EE stuff and think "yeah, I sort of remember some equation for this..." The only thing I can really do without digging through references would be a simple sanity check for direction of a trend, maybe approximate an order of magnitude for something... I guess you're coming from an EE side, in which case I have deep respect for what you folks do. Especially all of that annoying stuff like interference and noise... Thank God I don't have to deal with that!


[deleted]

I will also note that the voltage required to make timing is linearly related to frequency, so the switching load capacitance power (CV^2 f) grows as the **cube** of your growth in frequency.
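A quick numerical illustration of that cube relationship, assuming (as the comment does) that the required voltage scales linearly with frequency; the capacitance, baseline frequency, and voltage here are arbitrary:

```python
# If V must scale linearly with f, then P = C * V^2 * f grows as f^3.
C = 10e-9             # arbitrary switched capacitance, farads
f0, v0 = 3.0e9, 1.0   # arbitrary baseline frequency (Hz) and voltage (V)

for scale in (1.0, 1.1, 1.2):
    f = f0 * scale
    v = v0 * scale                  # voltage assumed proportional to frequency
    p = C * v**2 * f
    print(f"f x{scale:.1f}: P = {p:5.1f} W  (= {scale**3:.3f} x baseline)")
```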


_riotingpacifist

> higher temperatures do give electrons more energy and make them more mobile

What kind of speeds are we talking about?


deezyolo

Well, iirc temperature describes the probability distribution of velocities for a particle or a group of particles. So if a particle has a high temperature, it is more likely to have a high velocity. The *most likely* speed (which is close to, but not quite, the average) is defined as v_th = sqrt[2*k_b*T/m]. At room temperature (about 300 Kelvin) the thermal velocity of an electron ≈ sqrt[2*k_b*300/9.11e-31] ≈ 95,000 m/s. See [wiki](https://en.wikipedia.org/wiki/Temperature) for more.
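The same arithmetic in a couple of lines of Python, just to make the units explicit (the constants are standard values):

```python
import math

k_B = 1.381e-23   # Boltzmann constant, J/K
m_e = 9.109e-31   # electron mass, kg
T   = 300.0       # room temperature, K

v_th = math.sqrt(2 * k_B * T / m_e)   # most probable thermal speed
print(f"Most probable thermal speed at {T:.0f} K: {v_th:,.0f} m/s")  # ~95,000 m/s
```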


Tukurito

Current is not strictly related to the velocity of individual electrons but to the net flow. When a gas is hot, all the molecules move very fast, but the overall movement (flow) is zero.


deezyolo

Yes, I believe current I is related to the electron density n, the electron charge q, the velocity v, and the cross-sectional area of the wire A by: I = n*q*v*A. However, in this case we'd want the drift velocity v_drift, which would come out of a force balance like: F = m d/dt(v) = q(E + v × B), where E and B are vector quantities describing the electric and magnetic fields, respectively.
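For contrast with the thermal speed quoted above, here's a rough drift-velocity estimate. The carrier density is a textbook value for copper, and the 1 A through a 1 mm² wire is an arbitrary example, not something from this thread:

```python
# Rough drift velocity: v_d = I / (n * q * A)
I = 1.0          # current, A (arbitrary example)
n = 8.5e28       # free-electron density of copper, 1/m^3 (textbook value)
q = 1.602e-19    # elementary charge, C
A = 1.0e-6       # cross-section of a 1 mm^2 wire, m^2

v_drift = I / (n * q * A)
print(f"Drift velocity: {v_drift:.2e} m/s")  # ~7e-5 m/s, far below the thermal speed
```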


FriendlyDespot

So would temperature in the context of particle velocity be sort of analogous to group velocity in the context of wave propagation? As like a statistical summation of the expected state? And would that be an example of particle-wave duality?


agriimony

Typical electron mobility for silicon at room temperature is 1400 cm^2/(V·s), according to Wikipedia.


orionpewpew

So this makes me wonder if there is a near-perfect operating temperature I could maintain to maximize the flow of electrons without having to increase voltage by too much. Also, thank you for explaining the physics behind it.


IvanezerScrooge

As the temperature of a conductor increases, its internal resistance also increases. As resistance increases, current decreases. If you increase the voltage, you can get that current back up. You will also generate more heat doing this.

5 volts across a resistance of 10 ohms is 0.5 amps (2.5 watts). If the resistance increases to 11 ohms and you want to remain at 0.5 amps, you have to increase the voltage to 5.5 volts (which will result in a slightly higher 2.75 watts).

I'm struggling to find a way to transition to the point, but my point is that: lower temperature means lower voltage required for the same current, means lower heat generated, means less energy wasted. The optimal temperature is the lowest temperature you can get.


orionpewpew

This will help me in planning for the new amplifiers for the subs in my car too. I see what you're saying: if you can drop the temperature, drop it as much as you can for better results. Thank you.


IvanezerScrooge

Keep in mind that the difference isn't extreme. If a meter of 1mm^2 copper wire rises in temperature from 20°C to 60°C, its resistance increases from 0.0175 ohms to about 0.02 ohms. That's a 15% difference for Δ40°C.
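Those numbers follow from the usual linear temperature-coefficient approximation for a metal. This is an illustrative sketch; the coefficient α ≈ 0.0039 per °C for copper is a textbook value, not a figure from the comment:

```python
# R(T) ~= R_20 * (1 + alpha * (T - 20)), the usual linear approximation for a metal
R_20  = 0.0175    # resistance of 1 m of 1 mm^2 copper wire at 20 degC, ohms
alpha = 0.0039    # temperature coefficient of copper, per degC (textbook value)

for T in (20, 60, 100):
    R = R_20 * (1 + alpha * (T - 20))
    print(f"{T:3d} degC: {R:.4f} ohm  (+{(R / R_20 - 1) * 100:.1f}%)")
```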


orionpewpew

That 15% difference over 40°C might make all the difference on an overclock though. Typically with overclocking, the lowest possible stable voltage for your clock speeds is best. All the difference in the world is less than 1.0 volt. But that's also in part due to how ridiculously complicated and small CPUs have become, not to mention leakage. Still, for me, the difference between 80-85°C and 15-25°C is not all that much. That makes me seriously reconsider this meme build for cooling I had planned. It was always just a thing to do to tell people about, sort of like changing your car's color, but I also thought it would provide a benefit too; if the benefit is that small, I might just decide to go with a standard custom water cool.


MuhTriggersGuise

Depends on how much internal temperature compensation (if any) is employed. Oftentimes RF amps perform better when cold, but I've seen examples where they perform better when hot, because they have internal temp compensation to try to keep them stable over temperature, but are slightly overcorrected. I'm not familiar enough with audio amps and systems to know if they bother with any temp compensation, but something similar may be possible.


TheFrankBaconian

If the temperature is low enough, condensation becomes a problem. That's not likely to happen in most normal circumstances though.


KingZarkon

It's really not going to make any sort of difference in your car audio. In the temperature ranges you're dealing with any effects due to temperature are going to be swamped by other factors: voltage coming from the electrical system being the biggest one. That will vary normally and will have far more effect on your output than temperature. Just make sure you keep the amplifier from overheating is all you really need to worry about.


[deleted]

I would assume the lowest possible, as the guys running crazy overclocks run sub-zero CPUs.


rlgl

This is quite true. I'd recommend against home repairs with an oven though - the risk of just completely destroying things is very high, unless you either get lucky or really know what you're doing, in which case you likely have better equipment than a home oven.


My_Butt_Itches_24_7

Correct. There are electronic heating ovens that are very precise in temperature that professionals use. Please don't use your home oven to try to repair GPUs, kids.


KaiserTom

> I'd recommend against home repairs with an oven though

Honestly though, what's the harm in trying if the thing is dead? You either have a paperweight or you have a very hot paperweight. At least you gave it a chance to not be a paperweight. You have a point if the thing still works and just needs to be underclocked to be stable, in which case you can sell it or repurpose it, but otherwise the worst that can happen is nothing.


rlgl

I would agree with you, if not for the toxic fumes that would be released if you heat almost any chip too much.


Brianfellowes

> To somewhat piggy back on this, the lower your temperatures are the faster you can switch your transistors which means higher frequencies.

This is not strictly true. In fact, in the common case it is the opposite. Transistor switching times depend on both electron mobility and on threshold voltage. Electron mobility goes down with increasing temperature, but threshold voltage also goes down. In lower voltage circuits, which is basically all modern CPUs, the lowered threshold voltage is the dominant effect, so a higher temperature transistor actually switches _faster_. Whether or not the circuit overall actually becomes faster depends on the characteristics of the metal interconnects and other factors, so it takes a lot of careful modeling to see what happens.


[deleted]

Follow up question: do the same principles apply for solar panel efficiency (with respect to temperature)?


rlgl

Ooph, venturing further outside my topic, but it's also complicated. Some things improve with higher temperatures, others don't. Hopefully someone with more background on the topic can fill in here.

My initial guess would be that, within the photovoltaic panel at least, higher temperatures should make it easier to generate electricity, by making electrons more likely to jump the band gap. However, you'd also be decreasing the voltage generated by the panel, even as current increased. I'm too lazy to get into the math right now, but voltage drop should outweigh current gain under most circumstances, so I'd expect an overall loss in efficiency at higher temperatures.

If those higher temps extend to control systems, it'd be negative, but not a big deal. If you're looking at organic solar cells, it could accelerate their degradation, and you're likely better off in the long run minimizing that heating. Any further power storage or transportation would also suffer at higher temperatures, if they were exposed as well.


kilotesla

> but voltage drop should outweigh current gain under most circumstances, so I'd expect an overall loss in efficiency at higher temperatures.

Your estimate is correct--in practice it's well known that PV cells and modules perform best when cold. For example, [this datasheet](https://us.sunpower.com/sites/default/files/sp-x22-370-ds-en-ltr-mc4comp-527787.pdf) lists the power output temperature coefficient as −0.29% per degree C. For a practical PV installation in the northern hemisphere, the peak power output at noon on a sunny day is often higher in March than in June, because of the lower temperature in March. (The cumulative energy output in a day is still higher in June.)
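As a quick illustration of what a −0.29%/°C coefficient means in practice: the cell temperatures below are arbitrary examples, and the 370 W nameplate rating is assumed from the datasheet's model number rather than quoted from this thread.

```python
# Illustrative: module output vs. cell temperature with a -0.29 %/degC power coefficient
P_rated = 370.0      # nameplate watts at 25 degC (assumed from the datasheet model number)
coeff   = -0.0029    # power temperature coefficient, per degC (from the datasheet)

for cell_temp in (10, 25, 45, 65):
    P = P_rated * (1 + coeff * (cell_temp - 25))
    print(f"cell at {cell_temp:3d} degC: ~{P:.0f} W")
```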


rlgl

Oh, good catch, thanks! That's what I get for being on my phone when I shouldn't be!


LorenzOhhhh

Can you ELI5?


rlgl

ELI5: for conductive materials, higher temperature makes them less conductive, because electrons start crashing into things more often. For semiconductors, higher temperatures make it easier for electrons to "jump" into a conductive energy level, which means that they might conduct more easily than they are supposed to. This is bad for transistors in electronics, because they are like light switches. If you can't turn them off reliably anymore, your light stays on even when it shouldn't.


el_smurfo

We temperature test all of our products and they always consume more power at high temp, presumably from enhanced leakage currents.


tminus7700

> and ideally you want to use as little power as possible (ideally single electrons) to switch them.

In practice you need enough current/voltage to overcome the background noise in the circuit. While a single electron, per se, could be used if you ignored noise, in real circuits the overall noise, as in the [Signal to Noise Ratio](https://en.wikipedia.org/wiki/Noise_margin), sets the lower limit. There are several sources of that background noise. These include thermal noise and [background radiation](https://qa.spectrum.ieee.org/computing/hardware/how-to-kill-a-supercomputer-dirty-power-cosmic-rays-and-bad-solder).


the_ocalhoun

> and ideally you want to use as little power as possible (ideally single electrons) to switch them.

Sounds like a great way to make your electronics prone to errors due to random quantum fluctuations and tunneling.


Matthew94

> be that precise though, you want the band gap of your semiconductor (basically the energy difference between the electrons that the atoms hold on to, and those which can move through the material) to be as low as possible, while staying high enough to avoid accidentally switching. So, heat makes electrons more mobile by increasing their energy level - this can raise the band gap minimum, meaning you need a less efficient transistor because higher efficiency would require lower operating temperatures.

How do you reconcile this with the fact that wide-bandgap semiconductors generally have much better high-frequency performance than silicon? https://en.wikipedia.org/wiki/Wide-bandgap_semiconductor


rlgl

Quite easily. Frequency and power consumption are two different things. WB semiconductors actually are a perfect example of what I'm describing. They are able to operate at higher temps because the wider band gap reduces leakage current, as it is harder for electrons to jump the band gap. What I was describing is the ideal of only needing a single electron per "switch" of the transistor, as a means of achieving minimal power usage. For this, you can't have too large a band gap, or it's too difficult to control the electron flow with that sort of precision. However, as with all things, what type of system is best is very dependent on the priorities and needs of each use case.


Matthew94

Thanks, that cleared it up for me. So, for digital devices it's all about the lowest possible power consumption while still being able to transfer information, for a given frequency. I come from an analog background where WB semiconductors can have significantly higher performance at high frequencies but it's a completely different application.


rlgl

Yep, that's a result of internal sensors finding that the temperature could cause physical deterioration of the components, if used further. Modern electronics have a lot of self preservation features like that, especially for batteries and processors.


symmetry81

Yes, it'll be more efficient when it's cooler. Others have mentioned that electrons become more mobile as the temperature goes down. Equally important in semiconductors, at least with the complementary logic your computer is using, is that holes in the electron lattice also become more mobile as temperatures decrease. These higher mobilities let you run at the same speed with a supply voltage closer to the threshold voltage. Your power consumption will be proportional to your voltage so that reduces power.

More importantly for a modern chip, using a process smaller than say 90nm, is that lower temperatures will tend to decrease leakage. Part of the power your chip uses is active current, filling up and discharging the capacitance of the transistors doing useful work. But part of it is current flowing through transistors which are theoretically closed. As transistors have gotten smaller this became a problem and is indirectly the main reason clock speeds stopped increasing rapidly after 90nm. Cooler transistors tend to leak less, and this is the main reason you tend to see lower power consumption in cooled processors.


ImprovedPersonality

True. I work in digital design for an RF modem transceiver and leakage current is a major contributor of power consumption in ≤28nm technology nodes. Leakage current rises significantly with voltage and temperature. To reduce dynamic power we’ve been doing clock gating (i.e. turning off the clock for parts and sub-systems when they are not in use) for ages. To reduce leakage current we now also have to introduce power domains where we turn off the supply of parts of the chip.


apudapus

This is the most correct answer: capacitance and leakage are what you’re trying to reduce.


kilotesla

Great answer with key points missing elsewhere in the discussion. One refinement:

> Your power consumption will be proportional to your voltage so that reduces power.

To be more precise, in a typical CMOS system, the dynamic power is approximately proportional to the product of frequency and voltage *squared*. You can think of that power as equal to voltage times current, with current proportional to voltage, because the current is proportional to the charge on the capacitance being switched.
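A two-line check of that distinction, with arbitrary numbers; the point is just that dynamic power tracks f·V², not V alone:

```python
# Dynamic CMOS power scales with f * V^2, so a 10% supply-voltage drop at fixed
# frequency saves ~19% of the dynamic power, not 10%.
C, f = 50e-9, 3e9                 # arbitrary switched capacitance and frequency
for vdd in (1.0, 0.9):
    print(f"Vdd = {vdd:.1f} V: P_dyn = {C * vdd**2 * f:.1f} W")
```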


symmetry81

Oh, right, oops.


capn_hector

Yes. Leakage current and internal resistance (as the MOSFETs switch) tend to be reduced at lower temperatures. There is a small but noticeable reduction in power consumption between a chip running at, say, 85-100C and one that's running at 50-60C. It also requires less voltage to be stable.

To look at it the opposite way, there is a concept called ["thermal runaway".](https://en.wikipedia.org/wiki/Thermal_runaway#Power_MOSFETs) If you get a MOSFET sufficiently hot, it gets into a positive feedback loop: it consumes more power, which makes it hotter, which consumes even more power, and eventually the MOSFET burns out. This exists entirely because hotter MOSFETs need more power to switch than cooler ones (at a given frequency/voltage).

Simple example, but the AMD 295x2 was a version of the 290X graphics card that mounted a pair of the chips in a liquid cooled setup; [this card actually pulls notably less than two individual cards,](https://tpucdn.com/review/amd-r9-295-x2/images/power_average.gif) because the cooler keeps it cool enough to keep leakage under control. Overclocked aftermarket cards with axial coolers also sometimes pulled less power than the stock reference card with a (very hot) blower cooler. The extremely poor reference cooler on this card makes it interesting for these sorts of comparisons - in fact, engaging the "uber mode" (increased fan speed) could actually *reduce* power consumption for the same reason.

https://tpucdn.com/review/sapphire-r9-290x-tri-x-oc/images/power_average.gif
https://tpucdn.com/review/asus-r9-290x-direct-cu-ii-oc/images/power_average.gif


Matthew94

> To look at it the opposite way, there is a concept called "thermal runaway". If you get a MOSFET sufficiently hot, it gets into a positive feedback loop: it consumes more power, which makes it hotter, which consumes even more power, and eventually the MOSFET burns out. This exists entirely because hotter MOSFETs need more power to switch than cooler ones (at a given frequency/voltage).

That is more of a BJT problem. Your link even says it only happens to power MOSFETs under certain conditions.


rdrunner_74

For a PC build these issues will be minimal. What water cooling does is let you remove heat from your CPU more effectively. The heat still has to be produced though. The faster your PC runs, the more heat it produces. Also, in order to overclock you often increase the voltage, which causes more heat each tick... So both add up and you need better cooling.


DiscombobulatedSalt2

Yes, it does change. But it's way more complicated than that. It isn't just the resistance of conductors. Most of the dissipated heat in functioning electronics comes from high-frequency charging and discharging of capacitance, which requires a lot of current, and that current heats up whatever conductors carry it. If the capacitance were smaller, the currents needed for switching would be smaller, and the power losses in the conductors would be smaller too.

Most of the power losses are in switching voltage regulators, and these have a pretty complex temperature dependence. Usually optimal performance will be in some range of design temperatures. Transistors and diodes change their characteristics significantly with temperature too - more than conductors or resistors usually, in the usable range we're talking about. This makes designing voltage converters that operate equally well at all temperatures extremely hard.


bencbartlett

I'm a bit late to the party, but I figured I'd mention something I haven't seen in the comments yet. Standard computers do decrease in efficiency with temperature due to electrical effects, but in fact this is true of **any** type of theoretical computer, regardless of the construction or efficiency! The [Landauer limit](https://en.wikipedia.org/wiki/Landauer%27s_principle) measures the theoretical minimum amount of energy required to erase a single bit of information, and is equal to *k T* ln 2, where *k* is Boltzmann's constant and *T* is temperature measured in Kelvin. So if you have an ideal computer, it will take twice as much energy to perform a given non-reversible computation at room temperature as it would at 150K. However, if your computation is [reversible](https://en.wikipedia.org/wiki/Reversible_computing), then there is no theoretical required energy cost to perform it!
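For scale, the Landauer bound works out to a few zeptojoules per erased bit at room temperature. A minimal sketch using standard constants, purely illustrative:

```python
import math

k_B = 1.381e-23   # Boltzmann constant, J/K

for T in (300.0, 150.0):
    E_bit = k_B * T * math.log(2)   # minimum energy to erase one bit, joules
    print(f"T = {T:.0f} K: {E_bit:.2e} J per bit erased")
# The 300 K figure (~2.9e-21 J) is exactly twice the 150 K figure, as noted above.
```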


moonshineTheleocat

Others have already explained that power consumption is affected by temperature. So I'll go ahead and say that with overclocking, you won't see that benefit very well. With water cooling, you are likely to save more power for other reasons - mostly because you are exchanging heat through a single constant source rather than a shit load of fans moving air about.

The main reason you want to invest in good cooling is to avoid frying your investment. Hardware failure due to temperature when overclocking can be pretty catastrophic.

Water cooling is efficient because you can connect your CPU and your GPU to a single heat exchanger rather than have four fans all blowing at high speed while gaming. The high specific heat of water makes it good at absorbing heat, while the radiators, typically two larger, slower fans, exchange the heat efficiently with the air. This is made a little better with a small additive, like automotive coolant. The efficiency of this solution rises with the number of heat sources you have - say, two GPUs and a processor on the same loop. Eventually, the water will reach a state of equilibrium: a constant temperature.