Submitted by terjeboe t3_10842e7 in askscience
JamesTKierkegaard t1_j3ueydg wrote
Reply to comment by kilotesla in Will water ice melt faster if allowed to drain, or remain in the meltwater? by terjeboe
It's not glass compared to air that's the issue so much as glass compared to water or ice. The air is the environment in this situation, so if the system is in a glass container the heat transfer to the air will be negligible compared to what happens at the surface of the water and ice (which again depends greatly on the shape of the container). If it's a thin metal container then water remaining will probably win in most configurations, simply because the wall will act as a convective exchange surface. Realistically, radiation is going to be a meager source of heat loss; even hot-water radiators used to heat houses only supply about 5% of their heat through actual radiation, and that's at higher temperatures.
kilotesla t1_j3uolby wrote
>It's not glass compared to air that's the issue so much as glass compared to water or ice. The air is the environment in this situation, so if the system is in a glass container the heat transfer to the air will be negligible compared to what happens at the surface of the water and ice (which again depends greatly on the shape of the container).
In a series circuit with a very-low-resistance resistor, a medium-value resistor, and a large resistor, fed by a voltage source, the voltage drop across the medium-value resistor is affected far more by the large resistor than by the smallest one. If we have 1 ohm, 33 ohms, and 1000 ohms in series, the drop across the 33 ohm resistor is about 3% of the source voltage, and it stays about 3% even if we shrink the 1 ohm resistor to 0.1 ohms. We can't conclude that the 33 ohm resistor will have a large voltage drop just because it is huge compared to 0.1 ohms. That reasoning doesn't work.
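The divider arithmetic above is easy to check; a minimal sketch (the `divider_fraction` helper is just for illustration):

```python
# Series voltage divider: fraction of the source voltage dropped across
# one resistor is its resistance over the total series resistance.
def divider_fraction(r_target, *others):
    """Fraction of the source voltage dropped across r_target."""
    return r_target / (r_target + sum(others))

# 1, 33, 1000 ohms in series: ~3% of the source across the 33 ohm resistor.
f1 = divider_fraction(33, 1, 1000)
# Shrink the smallest resistor to 0.1 ohms: still ~3%.
f2 = divider_fraction(33, 0.1, 1000)
print(f"{f1:.4f}  {f2:.4f}")  # ~0.0319 in both cases
```

The 1000 ohm resistor dominates the total, so the small resistor is irrelevant to the split.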
The glass outer surface will be very close to the same temperature as the water. The heat flow per unit area is determined by the temperature difference between the water and ambient. If the outer glass surface were at 1 C instead of zero, the temperature difference with respect to ambient would not change significantly. And the surface temperature wouldn't even be that high.
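Putting rough numbers on that, in the thermal-resistance analogy: conduction through a thin glass wall is a tiny resistance compared to the natural-convection air film outside it. The wall thickness and film coefficient below are typical handbook-style values assumed for illustration, not measurements:

```python
# Per-unit-area thermal resistances (m^2*K/W) for a glass container in air.
# All values are assumed typical figures, for illustration only.
k_glass = 1.0    # W/(m*K), conductivity of soda-lime glass (typical)
t_glass = 0.003  # m, assumed 3 mm wall thickness
h_air = 8.0      # W/(m^2*K), assumed natural-convection film coefficient

r_glass = t_glass / k_glass  # conduction resistance of the wall
r_air = 1.0 / h_air          # convection resistance of the outside air film

print(f"glass: {r_glass:.4f}, air film: {r_air:.4f}")
# The temperature drop splits in proportion to resistance, so almost all
# of the water-to-ambient difference falls across the air film, and the
# outer glass surface sits within a fraction of a degree of the water.
```

That's why the glass barely moves the outer-surface temperature: it's the 1 ohm resistor in the series circuit above.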
>Realistically, radiation is going to be a meager source of heat loss; even hot-water radiators used to heat houses only supply about 5% of their heat through actual radiation, and that's at higher temperatures.
5% is way too low. Modern "radiators" have fins which enhance convection but not radiation, so convection is typically larger, but radiation is still about 25%, even counting just the outward-facing surface.
In the range of temperatures we are talking about, radiation is reasonably approximated by a linear function of temperature difference. Yes, I know that's counterintuitive given the fourth power, but the flux goes as T1^4 - T2^4, not (T1-T2)^4. Natural convection, on the other hand, is nonlinear enough that it drops as a fraction of overall heat transfer when the temperature difference gets smaller.
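A quick numerical check of that near-linearity, comparing the exact Stefan-Boltzmann difference against the standard linearization h_r = 4*sigma*Tm^3 at the mean temperature (the 0 C surface and 20 C ambient are assumed example temperatures):

```python
# Radiative flux between two gray surfaces (unit emissivity assumed):
# exact sigma*(T1^4 - T2^4) vs. linearized 4*sigma*Tm^3*(T1 - T2).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiative_flux(t1, t2):
    """Exact net radiative flux, W/m^2 (temperatures in kelvin)."""
    return SIGMA * (t1**4 - t2**4)

def linearized_flux(t1, t2):
    """Linear approximation using h_r evaluated at the mean temperature."""
    tm = 0.5 * (t1 + t2)
    return 4 * SIGMA * tm**3 * (t1 - t2)

t_surface = 273.15  # 0 C, ice-water surface (example)
t_ambient = 293.15  # 20 C room (example)

exact = radiative_flux(t_surface, t_ambient)    # negative: heat flows in
approx = linearized_flux(t_surface, t_ambient)
print(f"exact: {exact:.1f} W/m^2, linearized: {approx:.1f} W/m^2")
```

Over a 0-20 C span the linearized value agrees with the exact one to well under 1%, which is why treating radiation as linear is fine here.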