--a question that doesn't have a simple answer :-)
The only direct comment I'll give is to your wavelength range: 8 -- 12 micrometers is used for low temperatures at (usually) low accuracy.
Sheesh. Three different answers. All correct. All incorrect. Those calling the others wrong are at least as wrong, and the opposing views are at least as right.
All these temperature sensors measure the power of the radiation absorbed by the sensor itself. From this measurement of absorbed power, you can infer the temperature of the source, _provided_ you know, among other things:
--the emission properties of the emitter (what you're trying to measure)
--the transmitting properties of the propagating medium
--the absorption properties of the sensor
--the ratio of emitting source area to surrounding field of view
--the temperature of the surroundings
--etc., for characteristics having less impact on measurement accuracy.
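To put rough numbers on how those factors enter, here's a sketch of the total-radiation case using the Stefan-Boltzmann law with a graybody emitter. This is my own illustration, not any particular instrument's model, and all the factor names and values are assumptions:

```python
# Illustrative only: invert the total-radiation power balance
#   P_abs = eps * tau * alpha * F * sigma * (T_src^4 - T_surr^4)
# where eps = emitter emissivity, tau = medium transmission,
# alpha = sensor absorptance, F = field-of-view fill factor.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def source_temperature(p_abs, eps, tau, alpha, fov, t_surr):
    """Infer the source temperature (K) from the absorbed power per unit area."""
    t4 = p_abs / (eps * tau * alpha * fov * SIGMA) + t_surr ** 4
    return t4 ** 0.25
```

Every one of the listed unknowns shows up as a factor you have to supply before the inversion means anything.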
The bolometer is one such sensor. It absorbs a wide range of radiation, and integrates its measurements by converting the total absorbed power to a local temperature. If you know its absorption spectrum, and its specific heat, and the temperature of its surroundings (into which it's radiating its own energy due to the absorbed heat), you can infer the total radiation power incident upon it. If, then, you know the transmission characteristics of the propagating medium (for instance, air and the lens system), you can infer the emitted power per area of the emitting source you're hoping to measure. If, finally, you know the emitting spectral characteristics of the emitter, you can infer its temperature.
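A toy version of the first step in that chain of inferences (my own numbers and names, not a real device): in steady state the bolometer sheds heat to its surroundings at a rate proportional to its temperature rise, so that rate must equal the power it absorbs.

```python
# Illustrative sketch: steady-state heat loss G * (T_sensor - T_surround)
# equals absorbed power. G (thermal conductance to the surroundings, W/K)
# and the absorptance are device properties you'd have to know.
def incident_power(t_sensor, t_surround, g_thermal, absorptance):
    """Infer incident radiant power (W) from the sensor's temperature rise."""
    p_absorbed = g_thermal * (t_sensor - t_surround)
    return p_absorbed / absorptance
```

From there, the medium and emitter corrections described above still have to be applied before you get a source temperature.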
The various semiconductors absorb radiation and either convert it directly to electrical energy or change their own characteristics, for instance, conductance, in a predictable fashion. Then all the external considerations I listed above apply before you can infer a temperature.
The thermopile is a string of thermocouples that generate a voltage based on the temperature of the sensor (bolometer) they are embedded into. Then all the external considerations I listed above apply before you can infer a temperature.
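The thermopile's own output relation is simple enough to sketch; the junction count and Seebeck coefficient below are made-up illustrative values, not a specific part:

```python
# Illustrative: n thermocouple junctions in series produce roughly
# n times one junction's Seebeck voltage for a small temperature
# difference between the hot and cold junctions.
def thermopile_voltage(n_junctions, seebeck_uv_per_k, t_hot, t_cold):
    """Open-circuit output in microvolts."""
    return n_junctions * seebeck_uv_per_k * (t_hot - t_cold)
```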
The typical garden-variety IR thermometer makes a boatload of assumptions about all these characteristics and takes a more or less educated guess at the emitter's temperature. For most common situations, you know the absorption characteristics of the sensor and the propagation medium (air and lenses) pretty accurately. Emitters, though, typically differ in their emission characteristics, sometimes very strongly, so the emitter is usually the problem. Thus, you'll find that many high-quality IR thermometers have an adjustment for emission characteristics.
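Here's a quick graybody illustration of why that adjustment matters (background radiation neglected, my own simplification): the instrument reports whatever temperature would produce the measured signal at its assumed emissivity.

```python
# Illustrative graybody model of the emissivity setting:
#   eps_set * T_indicated^4 = eps_true * T_true^4  (total-radiation model,
#   background neglected), so the reading scales as (eps_true/eps_set)^(1/4).
def indicated_temperature(t_true, eps_true, eps_set):
    """Reading (K) when the surface emissivity differs from the setting."""
    return t_true * (eps_true / eps_set) ** 0.25
```

So a shiny, low-emissivity surface read with the adjustment left at a high setting reads low, which is exactly the error the adjustment exists to remove.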
And BobW was right about the total amplitude not being accurate -- at least in some sense. That's why the higher-quality radiation thermometers use the two-color method. This, too, has to deal with emission, transmission, and absorption spectral characteristics. But the real problem area, the emitter's characteristics, is much easier to sort out using the ratio of two appropriate wavelengths.
And, of course, if you can get close to a black-body source for your emitter, the two-color method gives you a precise, accurate fundamental temperature measurement -- after you have accounted for the field of view, transmission medium, and sensor absorption characteristics.
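A back-of-the-envelope version of the two-color idea, in the Wien approximation to Planck's law (wavelengths and emissivity below are illustrative): for a graybody the emissivity cancels in the ratio of the two radiances, leaving temperature as the only unknown.

```python
import math

# Wien approximation: L(lam, T) ~ eps * lam^-5 * exp(-C2 / (lam * T)).
# Taking the ratio at two wavelengths cancels a graybody's emissivity;
# solving the log of the ratio for T gives the expression below.
C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(lam, t, eps=1.0):
    """Spectral radiance (arbitrary units) in the Wien approximation."""
    return eps * lam ** -5 * math.exp(-C2 / (lam * t))

def two_color_temperature(l1, l2, lam1, lam2):
    """Solve the Wien radiance ratio l1/l2 for temperature (K)."""
    return (C2 * (1 / lam2 - 1 / lam1)
            / (math.log(l1 / l2) - 5 * math.log(lam2 / lam1)))
```

Note that the emissivity passed to `wien_radiance` drops out entirely when you feed the two radiances back into `two_color_temperature` -- which is the whole attraction of the method for non-black emitters, so long as the emissivity really is the same at both wavelengths.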
John perry