By the sounds of your explanation, there is nothing wrong or unexpected with your results or the MOSFET. I assume you are using a digital multimeter with a 10 Mohm input resistance on the voltage range you were using.
If that is true, then take a look at the datasheet's drain-source leakage specification. It claims a maximum leakage current of 20 uA at 16 V drain-source with a 25 deg. C junction temperature. Ohm's law says the 4.5 V you measured across your meter's internal 10 Mohm resistance corresponds to a current of 450 nA. Your meter was likely functioning as a pull-up resistor, albeit a weak 10 Mohm one.

Is it possible your particular MOSFET sample, under your conditions, had a leakage current of 450 nA (with 12 V - 4.5 V = 7.5 V drain-source)? I would say so. The maximum spec is by no means a typical spec, and is probably quite a bit larger than typical. The other devices you tested probably have different actual leakage, hence the different results, but that doesn't mean they are broken either.

It doesn't really make much sense to measure voltages on high-impedance nodes anyway. When the MOSFET is off, the drain is close to floating (assuming no load is attached to the drain), though not totally: it is pulled toward the source potential through a small amount of leakage. When you attached your meter between 12 V and the drain, you made a high-impedance voltage divider. When you attached your meter from drain to ground, you effectively put the 10 Mohm meter resistance in parallel with the leakage "resistor", so you no longer had a voltage divider, just two resistances pulling toward ground. Hence the zero-volt reading.
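The arithmetic above can be sketched as a quick back-of-the-envelope model. This is only an illustration of the reasoning; the variable names and the idea of lumping the off-state MOSFET into an effective leakage "resistance" are my own, not anything from the datasheet:

```python
# Back-of-the-envelope model of the two measurements described above.
# All values come from the discussion, not from any datasheet.

V_SUPPLY = 12.0   # supply rail, volts
R_METER = 10e6    # typical DMM input resistance, ohms
V_READ = 4.5      # voltage the meter showed across itself, volts

# Ohm's law: the current that 4.5 V across 10 Mohm implies.
i_leak = V_READ / R_METER
print(f"Implied leakage current: {i_leak * 1e9:.0f} nA")  # 450 nA

# Lump the off-state MOSFET into an effective leakage "resistance"
# at this operating point (7.5 V across it, 450 nA through it).
r_leak = (V_SUPPLY - V_READ) / i_leak  # ~16.7 Mohm

# Meter from +12 V to drain: a high-impedance voltage divider.
v_drain = V_SUPPLY * r_leak / (R_METER + r_leak)
print(f"Drain voltage with meter as pull-up: {v_drain:.1f} V")        # 7.5 V
print(f"Meter reads its own drop: {V_SUPPLY - v_drain:.1f} V")        # 4.5 V

# Meter from drain to GND: both resistances now pull toward ground,
# nothing pulls the node up, so the drain sits at ~0 V.
```

Note that this effective resistance is only valid at one operating point; real leakage is not ohmic, but the divider picture still explains why the two meter hookups give such different readings.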