Oven temperature calibration problem

I have a problem calibrating the oven temperature for cheap benchtop toaster ovens.

I decided against using a thermocouple since the wires would leave the oven door slightly open and could disturb the air flow inside the oven.

I took two new oven thermometers from different manufacturers and put them on an aluminum pan in the oven.

I took some standard 63/37 rosin core wire solder and melted a blob on the aluminum pan.

I adjusted the oven temperature to find the melting point of the solder. The transition is extremely sharp: a couple of degrees either way means the difference between molten and solid solder.

The melting point of 63/37 solder is supposed to be 361 degrees F. I verified the melting point is extremely sharp. However both oven temperature gauges read 345 degrees F.
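For the record, the numbers can be sanity-checked in a couple of lines (a sketch; 183 C / 361.4 F is the standard handbook figure for the Sn63/Pb37 eutectic):

```python
def f_to_c(deg_f):
    """Convert Fahrenheit to Celsius."""
    return (deg_f - 32.0) * 5.0 / 9.0

solder_mp_f = 361.4     # Sn63/Pb37 eutectic: 183 C
gauge_reading_f = 345.0

# The gauges read about 16 F (roughly 9 C) below the eutectic point.
offset_f = solder_mp_f - gauge_reading_f
print(f"gauge offset: {offset_f:.1f} F ({offset_f * 5.0 / 9.0:.1f} C)")
print(f"eutectic point: {f_to_c(solder_mp_f):.1f} C")
```

That 9 C is well outside what one would expect from two independent dial thermometers agreeing by coincidence, which is the puzzle.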

Where is the discrepancy coming from? Are the two oven thermometers both miscalibrated by the same amount? Or is it possible that contamination has changed the melting point of the solder?

Could contamination change the melting point and still leave it extremely sharp?

Any other ideas on how to calibrate the temperature would be highly welcome.

Thanks.

Reply to
Steve Wilson

The melting points of pure elements are used for temperature calibrations in that sort of range. Lead, at 327C, and cadmium, at 321C, are the obvious candidates. Zinc, which melts at 419.5C, might also be useful.

formatting link

Anthraquinone (C14H8O2) which melts at 286C might also be useful.
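Collected for convenience, the fixed points mentioned here with their Fahrenheit equivalents (handbook values; treat the last digit as approximate, and note the usually quoted figure for zinc is 419.5 C):

```python
# Melting-point fixed points (deg C) usable as oven calibration references.
CAL_POINTS_C = {
    "Sn63/Pb37 eutectic solder": 183.0,
    "anthraquinone (C14H8O2)": 286.0,
    "cadmium": 321.1,
    "lead": 327.5,
    "zinc": 419.5,
}

def c_to_f(deg_c):
    """Convert Celsius to Fahrenheit."""
    return deg_c * 9.0 / 5.0 + 32.0

# Print the references in ascending order of melting point.
for name, mp_c in sorted(CAL_POINTS_C.items(), key=lambda kv: kv[1]):
    print(f"{name:<26} {mp_c:6.1f} C = {c_to_f(mp_c):6.1f} F")
```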

You can get high temperature thermistors with tight tolerances (and make the tolerances tighter by careful calibration).

formatting link

--
Bill Sloman, Sydney
Reply to
bill.sloman

Unlikely. Alloys with sharp melting points are almost always eutectic. Move away from the eutectic ratios, and the melting point is not sharp, as a rule.

Clifford Heath.

Reply to
Clifford Heath

Even if the toaster ovens aren't advertised as radiant-heat models, the ones I've used are pretty compact, so I suspect the heating elements sit close to your thermometers and do a lot of direct radiant heating, where the emissivity and the distance from element to thermometer both matter. That will give a false reading compared to the tray temperature, and since the blob of solder is probably a different distance from the element than the thermometers, that introduces an error too.

I would fold up a little cover from aluminum foil to enclose the thermometers and solder, with windows just big enough to read each thermometer and see the solder on the side facing the door, so there is no direct line-of-sight path from a heating element to the inside of the box. That gives you a little oven within the oven, so the internal temperature should be much more uniform.

In the end, however, you will put a reasonably flat circuit board of some moderately dark color, with pretty short components on the solder pads, into the oven, so the real "calibration" is whatever you need to set the oven dial to in order to get uniform soldering, no matter what your thermometers said :-).
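The emissivity point can be illustrated with Stefan-Boltzmann arithmetic (an illustrative sketch with made-up element and object temperatures, not measured values):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def net_radiant_flux(t_element_k, t_object_k, absorptivity):
    """Net radiant power per unit area absorbed by a small object with a
    direct view of a hot element (gray-body, unity view factor assumed)."""
    return absorptivity * SIGMA * (t_element_k**4 - t_object_k**4)

# A shiny metal thermometer case (absorptivity ~0.1) vs a dull solder
# blob (~0.6), both at 456 K (183 C), facing an 800 K element:
shiny = net_radiant_flux(800.0, 456.0, 0.1)
dull = net_radiant_flux(800.0, 456.0, 0.6)
print(f"shiny: {shiny:.0f} W/m^2, dull: {dull:.0f} W/m^2")
```

With these assumed surface properties, the dull surface absorbs six times the flux of the shiny one, so the two objects equilibrate at different temperatures under direct radiation; hence the foil box.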

--
Regards, 
Carl Ijames
Reply to
Carl

I did this with a number of different ovens. The result was the same.

I used an aluminum drip pan turned upside down on a lower grate. Yes, it is only a few inches away from the lower quartz heating elements, but they cycle on and off with a very low duty cycle. Mostly off.

The heavy aluminum drip pan provides a fairly uniform temperature surface. The two oven temperature gauges are side-by-side and both read the same temperature +/- several degrees. I can move the solder blob anywhere on the drip pan and get the same results.

My point is that the oven temperature gauges are made by the millions, perhaps tens of millions. Any vendor will have some sort of accuracy specification that must be met. That's presumably why the two gauges read the same.

However, 63/37 solder has an extremely sharp melting point that is the lowest value for any tin/lead ratio. Presumably if contamination can affect the melting temperature, it would be higher than 361 degrees F.

63/37 solder has been made for decades, and is made by the ton. You can easily tell if the ratio is correct by monitoring the melting temperature.

So the problem is which one is correct? The temperature gauges, or the solder melting point?

Thanks.

Reply to
Steve Wilson

...

Alloys usually melt at a lower temperature than their pure-metal components.

Why can't they both be correct? The solder is resting on the pan, and has significant heat conductivity to the pan as well as radiative contact with other surfaces in the chamber. But, the contact with the pan and radiation from other surfaces aren't the same for the other sensors. You could couple these better in a liquid bath (boric acid, for instance, stays liquid in this temperature range) or with a conductive block with a few thermal wells... but that makes the solder hard to view.

Any oven, with heat elements constantly injecting heat, has thermal gradients.

Reply to
whit3rd

Until you reach a new eutectic point, contamination always lowers the melting point (something about entropy).
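The "something about entropy" is standard colligative freezing-point depression; for a dilute contaminant the textbook first-order result is:

```latex
\Delta T_m \approx \frac{R\,T_m^2}{\Delta H_{\mathrm{fus}}}\; x_{\mathrm{impurity}}
```

where $x_{\mathrm{impurity}}$ is the mole fraction of impurity, $R$ the gas constant, $T_m$ the melting point, and $\Delta H_{\mathrm{fus}}$ the enthalpy of fusion. The depression is proportional to impurity concentration, which is why a sharp, un-shifted transition argues for clean eutectic solder.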


--
  When I tried casting out nines I made a hash of it.
Reply to
Jasen Betts

Steve Wilson wrote

Why not take the hot molten solder pot out, put a thermocouple in the solder, and read the temperature when it solidifies?

Later, melt it again to get the thermocouple back.

I use an induction heater:

formatting link

Reply to
<698839253X6D445TD

That was my thought (radiation). Steve, put a little PCB in the oven and stick a thermocouple right on it. Measure T as the solder melts. (You can get TCs with ridiculously small wire that will fit through any oven door seal.)

George H.

Reply to
George Herold

Yes, with the additional assumption that the solder in the pot is close to isothermal, or that the wire is insulated. Otherwise you'll get some average of the temperature along the part of the wires that is electrically connected to the solder, so it will read low. You can think of it as a continuous array of thermocouples wired in parallel with some distributed series resistance.
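The "continuous array of thermocouples in parallel" picture can be sketched numerically (a toy lumped model with made-up temperatures and segment resistances, not a real calibration):

```python
def indicated_temp(junction_temps_c, series_res_ohm):
    """Lumped model of a bare thermocouple wire wetted by solder: each
    wetted segment acts as a junction behind some series resistance, and
    the meter sees the conductance-weighted average of the segment
    temperatures."""
    conductances = [1.0 / r for r in series_res_ohm]
    weighted = sum(t * g for t, g in zip(junction_temps_c, conductances))
    return weighted / sum(conductances)

# Isothermal pot: the weighting doesn't matter, the reading is exact.
print(indicated_temp([183.0, 183.0, 183.0], [0.5, 1.0, 2.0]))  # 183.0

# Gradient along the wetted length (cooler away from the junction):
# the average is pulled below the true junction temperature, i.e. it
# reads low, as described.
print(indicated_temp([170.0, 178.0, 183.0], [0.5, 1.0, 2.0]))
```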

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Phil Hobbs

All you need is a drop of solder. This can be seen in the link you snipped:

formatting link

Reply to
<698839253X6D445TD

You were talking about a solder pot.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

You don't care about absolutes; all you care about is repeatability. It's enough for you to note the oven temperature setting that melts the solder, and then mark that dial setting. The oven thermometers are probably bimetals with fairly large time constants, and the oven thermostat is probably equally crude, and cheap. The technology they use is good enough for an oven application because the food mass being cooked has an even longer time constant. But they're not good enough for a minuscule solder blob test. For that you would use something else not worth going to the trouble to describe.

Reply to
bloggs.fredbloggs.fred

Phil Hobbs

Small one?

Maybe we could agree to a closing statement that E = approximately m.c^2? That would show we are able to reach consensus.

Or else that human made global warming is a scam?

Reply to
<698839253X6D445TD

Thanks for the comments. You started me thinking, and it looks like the problem is solved.

I changed to a different oven. This has an internal fan to circulate the hot air:

Black and Decker TO4314SSD Toaster Oven

formatting link
I took the bread pan and laid it on its side, then put the thermometer and 63/37 solder inside. This shields the items from the direct heat of the top elements, and reduces the heat from the lower elements.

The thermometer now reads 375 F when the solder melts, where before it read 345 F.

So the problem appears to be getting a uniform heat distribution. I am now satisfied that the readings are about the best they can be in this configuration.

If your wife likes baking and doesn't have an oven thermometer, I can recommend this model:

Canadian Tire Oven Thermometer

formatting link

Unlike the other one from Walmart, it has the Fahrenheit scale on the outside where there's more room. The graduations are further apart and easier to read, and most of the recipes are in Fahrenheit. They are very inexpensive - around CAD$6.90.
Reply to
Steve Wilson

Thanks for the reply. You are probably correct. I changed to a different oven and used the bread pan to shield the items from direct radiation. The thermometer now reads 375 F when the solder melts instead of 345 F in the original configuration. The problem is getting uniform heat distribution. I am satisfied that is about the best I will get with this measurement.

See my reply to whit3rd for more details and links.

Reply to
Steve Wilson

This seems to be wrong. The thermocouple voltage is generated at the junction of two dissimilar wires. Temperature gradients along chunks of identical wire shouldn't generate any potential difference.

Thermal conduction along the wires could change the temperature at the junction, but even insulated wires in a solder pot are going to get very close to isothermal inside the pot.

--
Bill Sloman, Sydney
Reply to
bill.sloman

But the wires are different all the way back to the meter, where the reference junction is.

So won't it be metalA/metalB in parallel with metalA/solder/metalB, with metalA/solder/metalB equivalent to a metalA/metalB junction?

Reply to
Lasse Langwadt Christensen

Yup. You can make quite reasonable thermocouples by soldering the wires together.

The resistivity of the solder is probably higher than the metal, but not that much higher. To a zero-order approximation you'll measure the temperature of the top of the solder, because contributions from lower down will get shorted out before they get there. Jan's tiny solder glob is probably a bit better but suffers from much larger temperature gradients, so it's not obviously a big win.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

Yes, but not if the thermocouple wires are insulated, as Phil Hobbs suggested would be desirable.

--
Bill Sloman, Sydney
Reply to
bill.sloman
