I know this is a bit off-topic for this group, but I thought somebody might have faced this problem before. We need to put a low-light-level video camera in a vacuum chamber to image a phosphor screen while the accelerator technician tunes the beam at a particle accelerator lab. In the past they have used "bullet cameras" for this and considered them expendable: if left on for more than 10 minutes, poof! I'd like to see if we can make the camera survive longer. It is 19 mm in diameter and about 40 mm long, in an anodized aluminum housing. The power dissipation is 792 mW (66 mA at 12 V DC).
After a bit of brainstorming, we came up with the idea of a copper sheet with a semicircular trough to match the camera's diameter, and a clamp to press the camera into the trough. We have some Bergquist material here (GapPad) that we use to conduct heat from chips to cold plates that we would put between the camera and copper sheet to aid conduction.
But I have NO IDEA how to figure out where it will reach thermal equilibrium as a function of the size of the sheet. The chamber is maybe 10 m^3, aluminum, and at room temperature. We have water cooling for other systems in the chamber, but the camera will be far from that equipment, and the fewer water lines we run, the better. So if a reasonably sized piece of copper sheet will do, plain radiation would be the easiest way to go. For example, two "wings" of, say, 100 cm^2 each would give 400 cm^2 of radiating surface (both sides of both wings). Does anybody know where tables of radiated power vs. temperature difference can be found? (Right, you already KNOW I'm not a physicist now.) I think the camera can safely be allowed to self-heat to a case temperature of 35 - 40 C.
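For what it's worth, here is the back-of-the-envelope check I tried, based on the Stefan-Boltzmann law rather than a lookup table. Everything except the 792 mW dissipation and the 400 cm^2 wing area is my own assumption: I guessed an emissivity for oxidized copper, treated the room-temperature chamber walls as a blackbody surround with a view factor of ~1, and ignored any conduction through the mount, so treat it as a rough sketch, not a thermal analysis.

```python
# Rough radiative-balance sanity check for the camera heat sink.
# Assumptions (mine, not measured): gray-body radiation from the wings
# only, emissivity ~0.6 for oxidized copper (blackened paint would be
# closer to 0.9, bare polished copper is far worse at ~0.03), chamber
# walls at 20 C acting as a blackbody surround, no conductive path.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(area_m2, emissivity, t_hot_k, t_amb_k):
    """Net power radiated to the surroundings (Stefan-Boltzmann law)."""
    return emissivity * SIGMA * area_m2 * (t_hot_k**4 - t_amb_k**4)

area = 400e-4            # 400 cm^2 total radiating surface, in m^2
eps = 0.6                # guess for oxidized copper
t_case = 273.15 + 40.0   # 40 C allowable case temperature
t_room = 273.15 + 20.0   # room-temperature chamber walls

p = radiated_power(area, eps, t_case, t_room)
print(f"Radiated power at 40 C case: {p:.2f} W "
      f"(camera dissipates 0.79 W)")
```

If I've done this right, the wings shed a few watts at a 20 C rise, comfortably more than the 0.8 W the camera dissipates, which suggests the copper-sheet idea is in the right ballpark. Somebody please check my emissivity guess, though.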
Thanks in advance for any tips or past experience!
Jon