# °K to °C conversion: hardware or software?

• posted

This is not used for conversion but for normalisation. We normalise our input ranges because we want better, more precise measurement. For example, at 10mV/K, assuming 0V represents 0K, 5V would represent 500K. That's roughly 227C. If what we want to measure never reaches that temperature, then we are wasting a lot of resolution on a range which will never be measured. Software cannot rectify this, since the data is already sampled: how is the software supposed to know whether 120K is really 120.2K or 119.9K if the hardware can only tell it 120K? Also, very few things we usually want to measure reach down to zero K, so again we are wasting resolution by building hardware that can measure what will never be measured. So, people 'scale' the hardware to pick the appropriate range, for example:

-20C to 80C for room temperature

50C to 400C for ovens

then you scale again in software to convert to a human-readable form.
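The resolution argument can be checked with a little arithmetic. A minimal sketch, assuming a hypothetical 10-bit, 0-5V ADC (neither figure is from the posts above):

```python
ADC_COUNTS = 2 ** 10          # assumed 10-bit ADC: 1024 codes over 0-5 V

# Unscaled: 10 mV/K over 0-5 V covers 0..500 K
full_span_k = 5.0 / 0.010     # 500 K
res_unscaled = full_span_k / ADC_COUNTS   # K per ADC step, ~0.49

# Scaled: the same 0-5 V now spans only -20..80 C (a 100 K span)
scaled_span_k = 80 - (-20)    # 100 K
res_scaled = scaled_span_k / ADC_COUNTS   # K per ADC step, ~0.098

print(res_unscaled, res_scaled)
```

Scaling the room-temperature range buys a factor of five in resolution here, with no change to the ADC itself.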

• posted

Oh yes there is. At least that's how we've always been taught in my country.

It's degrees Celsius, but it's just kelvins (without degrees).

• posted

BTW, note that there's no '°' used with kelvins.

Anyway, if you need a 60°C span, then you have a span of 600mV with

10mV/K. If you scale and offset the voltage with a good op-amp, precision resistors, and an accurate and stable voltage reference, you can have a span of 5000mV, so better than a factor of eight improvement in resolution with a given ADC resolution. OTOH, you have new sources of error to consider.
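The gain and offset the op-amp stage needs fall straight out of the endpoints. A sketch, assuming the 60°C span runs from 20°C to 80°C (the window is my example, not from the post):

```python
# Map a 10 mV/K sensor's output over an assumed 20..80 C window onto 0..5 V.
T_LO_C, T_HI_C = 20.0, 80.0            # assumed measurement window
v_lo = (T_LO_C + 273.15) * 0.010       # sensor output at window bottom, V
v_hi = (T_HI_C + 273.15) * 0.010       # sensor output at window top, V

gain = 5.0 / (v_hi - v_lo)             # ~8.33 V/V
offset = -gain * v_lo                  # output-referred offset, V

def v_out(v_in):
    return gain * v_in + offset

print(round(v_out(v_lo), 6))   # 0.0
print(round(v_out(v_hi), 6))   # 5.0
```

In hardware the gain is set by resistor ratios and the offset by the voltage reference, which is where the new error sources come in.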

Whether it makes any difference for your application is something you have to decide-- it's not a matter of taking a popularity poll. High resolution (and low noise) helps in control applications in allowing you to approach an analog circuit in performance. For example, if you're implementing digital PID control and you have crap ADC resolution then your derivative signal will be crap. You can filter it so the actuator doesn't flap all over the place, but that has other undesirable effects-- once the resolution is gone, it's gone.
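The derivative point can be shown numerically. A sketch with assumed numbers throughout (a 0.5 K/s ramp sampled at 10 Hz, quantised coarsely vs finely):

```python
def quantize(x, step):
    """Round a reading to the nearest ADC step."""
    return round(x / step) * step

dt = 0.1                                   # assumed 10 Hz sample period, s
ramp = [0.5 * dt * n for n in range(20)]   # true temperature ramp, K

derivs = {}
for step in (0.5, 0.01):                   # coarse vs fine ADC step, K
    samples = [quantize(t, step) for t in ramp]
    derivs[step] = [(b - a) / dt for a, b in zip(samples, samples[1:])]

print(derivs[0.5][:8])    # jumps between 0 and ~5 K/s
print(derivs[0.01][:8])   # stays near the true 0.5 K/s
```

With the coarse ADC the derivative term sees nothing, then a huge spike, then nothing again; the true slope is unrecoverable no matter how you filter.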

Best regards, Spehro Pefhany

```--
"it's the network..."                          "The Journey is the reward"
speff@interlog.com             Info for manufacturers: http://www.trexon.com```
• posted

I have to measure two temperatures: the first one outdoor, the second one indoor.

So I was thinking of using two very common sensors: the AD590 for the outdoor measurement and the LM335 for the indoor one.

The measurement will be managed by a National Instruments USB DAQ card; the temperature will be displayed on a PC.

The unit to use must be °C; the sensors read in °K so, of course, I have to make a conversion.

Then I have two options, the first "software" and the second "hardware":

1 - Acquire the voltage related to °K (10mV/°K) - for the AD590 I have to use an I/V converter, of course - then convert it to °C with a simple statement in the source code

2 - Acquire a voltage "scaled" to 0...5V: -20°C -> 0V and 40°C -> 5V
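For what it's worth, the software side of option 1 is trivial once the voltage is acquired (for the AD590, after its current output has gone through the I/V converter). A minimal sketch; the function name is mine and the DAQ read itself is omitted:

```python
def kelvin_volts_to_celsius(v: float) -> float:
    """Convert a 10 mV/K sensor voltage to degrees Celsius."""
    kelvin = v / 0.010        # 10 mV per kelvin
    return kelvin - 273.15    # K -> C

# e.g. 2.9815 V corresponds to 298.15 K, i.e. 25 C
print(round(kelvin_volts_to_celsius(2.9815), 2))   # 25.0
```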

Thanks to Google I saw that the first solution is rarely used, while the second is more common but of course involves more components.

I was wondering which solution is better, and why the "software" solution is rarely used.

Thanks

• posted

"__frank__" wrote in message news:qZZaf.23478\$ snipped-for-privacy@twister1.libero.it...

Best Wishes

• posted

Because the datasheets & appnotes for the devices were probably written when software was expensive due to the cost of the hardware to run it on, and most instruments were all-analog and did not contain microcontrollers. Doing it in hardware nowadays would be silly.

• posted

BTW, note that there's no '°' used with Celsius

Slurp

• posted

I read in sci.electronics.design that " snipped-for-privacy@yahoo.com" wrote (in ) about 'Re: °K to °C conversion: hardware or software?', on Sat, 5 Nov 2005:

Well, it's 'Celsius' and it does have the degree sign. But 'K' can have it or not, and they mean different things. '100°K' is 100 degrees above absolute zero, whereas '100 K' is simply any temperature difference of 100 kelvin.
```--
Regards, John Woodgate, OOO - Own Opinions Only.
If everything has been designed, a god designed evolution by natural selection.```
• posted

I read in sci.electronics.design that John Devereux wrote (in ) about 'Re: °K to °C conversion: hardware or software?', on Sun, 6 Nov 2005:

I sympathise, without agreeing.

You are confusing the scale and the unit still. I don't find it easy to explain. The kelvin is certainly a unit. The ratio of two temperature differences is potentially just as meaningful, in an appropriate context, as the ratio of two Kelvin temperatures.

I don't know of any scale based on that, but that's a side issue.

You can look at the matter like this. Any temperature difference can be expressed in kelvins. But a temperature difference from absolute zero is distinguished by having the ° sign included.

• posted

I read in sci.electronics.design that " snipped-for-privacy@yahoo.com" wrote (in ) about 'Re: °K to °C conversion: hardware or software?', on Sun, 6 Nov 2005:

Did anyone say they had? Such a notation could have been introduced IF the idea of distinguishing between temperature differences and actual temperatures had preceded the move to the Kelvin scale and the kelvin unit.

Yes, and I tried to explain why.

Yes, which is not quite rigorous. But to make it rigorous, the concept of a unit 'celsius' would have to be introduced, identical to the kelvin.

See the explanation above.

This (C°) is a notation used only (AFAIK) by meteorologists. Everyone else uses K, except for Americans, of course. (;-)

• posted

I read in sci.electronics.design that John Devereux wrote (in ) about 'Re: °K to °C conversion: hardware or software?', on Sun, 6 Nov 2005:

I think they changed the rules at some point, to distinguish clearly between absolute temperatures and temperature differences.

No, degrees ('degrees Kelvin') are associated with a *scale* - the Kelvin scale in this case. Plain units are not associated with a scale, which is why expressing temperature differences in degrees is incorrect.

There is a similar thing with weighted noise levels. The difference between 80 dB A-weighted and 90 dB A-weighted is 10 dB, not 10 dB A-weighted. It's not weighted at all; it's a constant 10 dB at all frequencies.

