°K to °C conversion: hardware or software?

This is not used for conversion but for normalisation. We normalise our input ranges because we want better, more precise measurement. For example, at 10mV/K, assuming 0V represents 0K, 5V would represent 500K. That's roughly 227°C. If what we want to measure never reaches that temperature, then we are wasting a lot of resolution on a range which will never be measured. Software cannot rectify this, since the data is already sampled: how is the software supposed to know whether 120K is really 120.2K or 119.9K if the hardware can only tell it 120K? Also, very few things we usually want to measure reach down to zero K, so again we are wasting resolution by building hardware that can measure what will never be measured. So people 'scale' the hardware to pick the appropriate range, for example:

-20°C to 80°C for room temperature

-40°C to 10°C for freezers

50°C to 400°C for ovens

then you scale again in software to convert to a human-readable form.
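As a rough sketch of that second, software step (assuming a 10-bit ADC and the room-temperature range above; the constants and names here are illustrative, not from any particular DAQ), it is just a linear map from counts back to °C:

#include <stdio.h>

/* Hypothetical scaling: hardware arranged so that -20C reads as ADC
   count 0 and 80C reads as full scale (1023 on a 10-bit converter). */
#define ADC_MAX    1023
#define T_MIN_C   (-20.0)
#define T_MAX_C     80.0

static double counts_to_celsius(unsigned counts)
{
    return T_MIN_C + (T_MAX_C - T_MIN_C) * counts / ADC_MAX;
}

int main(void)
{
    printf("%.2f C\n", counts_to_celsius(512)); /* prints 30.05 C */
    return 0;
}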

Reply to
slebetman

Oh yes there is. At least that's how we've always been taught in my country.

It's degrees Celsius, but it's just kelvins (without degrees).

Reply to
slebetman

BTW, note that there's no '°' used with kelvins.

Anyway, if you need a 60°C span, then you have a span of 600mV at 10mV/K. If you scale and offset the voltage with a good op-amp, precision resistors, and an accurate and stable voltage reference, you can have a span of 5000mV, which is better than a factor of eight improvement in resolution with a given ADC. OTOH, you have new sources of error to consider.
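To put numbers on that (a sketch only, assuming an ideal 10-bit, 0-5V ADC; the figures are illustrative):

#include <stdio.h>

int main(void)
{
    const double full_scale_mv = 5000.0;  /* 0-5V ADC input range      */
    const int    counts        = 1 << 10; /* 10-bit converter          */
    const double span_c        = 60.0;    /* required temperature span */

    /* Raw sensor at 10mV/K: a 60C span only exercises 600mV of the ADC. */
    double lsb_raw    = span_c / (counts * 600.0 / full_scale_mv);

    /* After op-amp scale and offset, the same 60C span fills all 5000mV. */
    double lsb_scaled = span_c / counts;

    printf("raw:    %.3f C per LSB\n", lsb_raw);    /* ~0.488 C */
    printf("scaled: %.3f C per LSB\n", lsb_scaled); /* ~0.059 C */
    return 0;
}

The ratio of the two is 5000/600, i.e. about 8.3.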

Whether it makes any difference for your application is something you have to decide-- it's not a matter of taking a popularity poll. High resolution (and low noise) helps in control applications by allowing you to approach the performance of an analog circuit. For example, if you're implementing digital PID control and you have crap ADC resolution then your derivative signal will be crap. You can filter it so the actuator doesn't flap all over the place, but that has other undesirable effects-- once the resolution is gone, it's gone.
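A toy illustration of the derivative problem (assumed numbers: the ~0.488°C/LSB raw resolution from above, 100ms sampling, and a true slope of 1°C/s):

#include <math.h>
#include <stdio.h>

/* Feed a smooth 1 C/s ramp through a coarse quantizer and take the
   backward difference, as a digital PID derivative term would. */
static double quantize(double t_c, double lsb)
{
    return lsb * floor(t_c / lsb);
}

int main(void)
{
    const double dt  = 0.1;   /* 100 ms sample period (assumed) */
    const double lsb = 0.488; /* raw-sensor resolution in C     */
    double prev = quantize(25.0, lsb);

    for (int k = 1; k <= 10; k++) {
        double cur = quantize(25.0 + 1.0 * k * dt, lsb);
        /* true slope is 1 C/s, but this prints 0 on most samples and
           ~4.9 C/s whenever the quantized reading steps by one LSB */
        printf("d/dt = %.2f C/s\n", (cur - prev) / dt);
        prev = cur;
    }
    return 0;
}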

Best regards, Spehro Pefhany

--
"it\'s the network..."                          "The Journey is the reward"
speff@interlog.com             Info for manufacturers: http://www.trexon.com
Embedded software/hardware/analog  Info for designers:  http://www.speff.com
Reply to
Spehro Pefhany

I have to measure two temperatures: the first one outdoor, the second one indoor.

So I was thinking of using two very common sensors: an AD590 for the outdoor measurement and an LM335 for the indoor one.

The measurement will be managed by a National Instruments USB DAQ card; the temperature will be displayed on a PC.

The unit to use must be °C; the sensors read out in °K, so of course I have to make a conversion.

Then I have two options, the first "software" and the second "hardware":

1 - Acquire the voltage related to °K (10mV/°K); for the AD590 I have to use an I/V converter (of course), then convert it to °C with a simple software statement in the source code (see the sketch after the two options)

2 - Acquire the voltage "scaled" to 0...5V, with -20°C -> 0V and 40°C -> 5V
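For what it's worth, here is the kind of one-line conversion option 1 needs. A minimal sketch, assuming the usual arrangement of the AD590's 1uA/K output through a 10k resistor to give 10mV/K (component values assumed, not taken from any particular schematic):

#include <stdio.h>

/* Option 1 in software: with 10mV/K at the DAQ input, the conversion
   to Celsius is one statement. */
static double volts_to_celsius(double volts)
{
    double kelvin = volts / 0.010; /* 10mV per kelvin           */
    return kelvin - 273.15;        /* kelvin -> degrees Celsius */
}

int main(void)
{
    printf("%.2f C\n", volts_to_celsius(2.9815)); /* 298.15K -> 25.00 C */
    return 0;
}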

Thanks to Google I saw that the first solution is rarely used, while the second is more common but of course involves more components.

I was wondering which solution is better, and why the "software" solution is rarely used.

Thanks

Reply to
__frank__

"__frank__" wrote in message news:qZZaf.23478$ snipped-for-privacy@twister1.libero.it...

Actually you use both. No. 2 does _not_ perform °K to °C conversion; it simply scales the output range to make best use of the ADC. The resulting reading from the ADC will be '0' for -20°C, and a software subtraction or addition will then be used to generate the required temperature reading, so the software solution _is_ being used. What is being done is that the I/V conversion factor is chosen to give an output covering the full span of the ADC for the required reading range, to make best use of the available resolution of the ADC. This is independent of the need to scale the temperature. The resulting output range then has a voltage subtracted from it, to bring the bottom of the used range to the bottom of the ADC range. If (for instance) you have a 10-bit ADC, you have potentially (ignoring ADC errors) 1024 'points' across the required 60°C range. If instead you wanted to do the complete solution in software, and set the I/V factor so that the maximum 40°C 'point' was at 5V with 0K at 0V, you would only have about 196 points (1024 × 60/313) across the same -20 to +40°C range.
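A quick check of those figures (a sketch assuming an ideal 10-bit, 0-5V ADC; it just counts how many ADC codes fall inside the -20 to +40°C window for each scaling):

#include <stdio.h>

int main(void)
{
    const double span_k = 60.0; /* -20..+40C is 253.15..313.15K */

    /* (a) scaled and offset in hardware: -20C -> 0V, +40C -> 5V */
    printf("scaled+offset: %4.0f points\n", 1024.0 * span_k / 60.0);

    /* (b) 0K -> 0V, +40C (313.15K) -> 5V, offset removed in software */
    printf("0K..40C span:  %4.0f points\n", 1024.0 * span_k / 313.15);

    /* (c) plain 10mV/K: 0K -> 0V, 500K -> 5V */
    printf("plain 10mV/K:  %4.0f points\n", 1024.0 * span_k / 500.0);

    return 0;
}

This prints roughly 1024, 196 and 123 points respectively.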

Best Wishes

Reply to
Roger Hamlett

Because the datasheets & appnotes for the devices were probably written when software was expensive, due to the cost of the hardware to run it on, and most instruments were all-analog and did not contain microcontrollers. Doing it in hardware nowadays would be silly.

Reply to
Mike Harrison

BTW, note that there's no '°' used with Celcius

Slurp

Reply to
Slurp

I read in sci.electronics.design that " snipped-for-privacy@yahoo.com" wrote (in ) about 'Re: °K to °C conversion: hardware or software?', on Sat, 5 Nov 2005:

Well, it's 'Celsius' and it does have the degree sign. But 'K' can have it or not, and they mean different things: '100°K' is 100 degrees above absolute zero, whereas '100 K' is simply any temperature difference of 100 kelvin.
--
Regards, John Woodgate, OOO - Own Opinions Only.
If everything has been designed, a god designed evolution by natural selection.
http://www.jmwa.demon.co.uk Also see http://www.isce.org.uk
Reply to
John Woodgate

I read in sci.electronics.design that John Devereux wrote about 'Re: °K to °C conversion: hardware or software?', on Sun, 6 Nov 2005:

I sympathise, without agreeing.

You are still confusing the scale and the unit; I don't find it easy to explain. The kelvin is certainly a unit. The ratio of two temperature differences is potentially just as meaningful, in an appropriate context, as the ratio of two Kelvin temperatures.

I don't know of any scale based on that, but that's a side issue.

You can look at the matter like this. Any temperature difference can be expressed in kelvins. But a temperature difference from absolute zero is distinguished by having the ° sign included.

Reply to
John Woodgate

I read in sci.electronics.design that " snipped-for-privacy@yahoo.com" wrote (in ) about 'Re: °K to °C conversion: hardware or software?', on Sun, 6 Nov 2005:

Did anyone say they had? Such a notation could have been introduced IF the idea of distinguishing between temperature differences and actual temperatures had preceded the move to the Kelvin scale and the kelvin unit.

Yes, and I tried to explain why.

Yes, which is not quite rigorous. But to make it rigorous, the concept of a unit 'celsius' would have to be introduced, identical to the kelvin.

See the explanation above.

This (C°) is a notation used only (AFAIK) by meteorologists. Everyone else uses K, except for Americans, of course. (;-)

Reply to
John Woodgate

I read in sci.electronics.design that John Devereux wrote about 'Re: °K to °C conversion: hardware or software?', on Sun, 6 Nov 2005:

I think they changed the rules at some point, to distinguish clearly between absolute temperatures and temperature differences.

No, degrees ('degrees Kelvin') are associated with a *scale* - the Kelvin scale in this case. Plain units are not associated with a scale, which is why expressing temperature differences in degrees is incorrect.

There is a similar thing with weighted noise levels. The difference between 80 dB A-weighted and 90 dB A-weighted is 10 dB, not 10 dB A-weighted. It's not weighted at all; it's a constant 10 dB at all frequencies.

Reply to
John Woodgate
