Measuring coolant temperature

My 1947 Hudson pickup has just been converted to 12 volts. For the gauges, I used Park Waldrop's instructions for the '48 - '50 models, i.e., install a 15 ohm, 5 watt resistor in the line to each gauge. I think the gauge is reading high, but I'm not sure, because the truck had overheating problems prior to the conversion and I had not had it on the road to verify whether the cooling system fixes were successful. My first question: does anyone know whether a hand-held infrared thermometer pointed at the temperature sending unit will give a reliable reading of the actual coolant temperature? When the infrared thermometer reads 165 degrees at the sending unit, the gauge is about 2/3 of the way to hot and the coolant at the top of the radiator measures about 150 degrees. When the reading at the sending unit is 182 degrees, the gauge reads just above 3/4 of the way to hot and the coolant at the top of the radiator measures about 165. It only gets that hot idling; if the engine is revved, it drops back to around 165 at the sender. So, does this methodology seem like a reliable way to measure actual engine temperature?
Second question: does anyone know what the temperature sending unit's resistance readings should be at various temperatures? I was getting a reading of 7.8 ohms at about 165 degrees (measured between the wire and ground with the ignition switch off). Does someone have a chart of temperatures and the correct ohm readings at those temperatures?

Peter Cohen
Pleasanton, CA

Comments

  • I hope someone out there understands how the temperature gauge system works on the 1947 Hudson...
    Since I didn't get an answer to the original question, I decided to take some measurements. Immersing the sending unit in water as it heated, I measured the following resistances:
    150 degrees: 11.2 ohms
    160 degrees: 11.1 ohms
    170 degrees: 11.1 ohms
    180 degrees: 10.6 ohms
    212 degrees: 10.3 ohms
    This really isn't much variation, so I remained skeptical, but it did tell me that increased temperature = lower resistance. (The spread is worked through in a short sketch at the end of this post.) I then obtained a 0 - 100 ohm potentiometer (variable resistor) and hooked it between the sensor wire and ground, in place of the sending unit. I dialed the potentiometer until the gauge read exactly in the middle. This produced a reading of 20.5 ohms cold, rising to 22.3 ohms hot. (The potentiometer was not robust enough to handle the current and would heat up when used this way; the higher resistance reading came after a minute or so.)

    In theory, since the 10.6 ohms produced by the sender at 180 degrees made the gauge read 3/4 high, and 20.5 ohms made it read in the middle, adding a 10 ohm resistor between the sender and the wire should have corrected the gauge reading. WRONG!

    What I found was that adding 10 to 20 ohms in series with the sender made virtually no difference in the position of the gauge. (I also found that adding 10 to 20 ohms in parallel with the sender made virtually no difference.)

    Next, I tried varying the resistance on the power input to the gauge. (Remember, my starting point was Park Waldrop's advice to add a 5W 15 ohm resistor to the input side.)  I tried using two 1W 10 ohm resistors in series (20 ohms resistance), then a single 10 ohm resistor, and then no resistor at all. None of this had a significant effect on the position of the needle.  (I ran these tests with the potentiometer installed, in place of the sender, dialed up to read 3/4 hot.)

    Something else I noticed: running the engine at operating temperature (180 degrees) with the sender connected as normal, the resistance measured between the sender's connector and ground fluctuated drastically, moving between 13 ohms at idle and 3 ohms with the engine revved up a bit.

    So now I am completely flummoxed as to understanding exactly how this circuit works, and what affects the readings.  Apparently, the temperature sender is not a simple thermistor.  But, what the heck is it?
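
    As a rough illustration (and only an illustration -- as becomes clear below, the sender is not actually a simple resistive element), here is a short Python sketch that interpolates the measured points above and shows how small the total variation really is. The linear interpolation is my own assumption, not anything from a Hudson manual:

        # Interpolate the measured sender resistance vs. temperature
        # (values copied from the list above) and report the total spread.
        data = {150: 11.2, 160: 11.1, 170: 11.1, 180: 10.6, 212: 10.3}  # deg F : ohms

        def resistance_at(temp_f):
            """Simple linear interpolation between the measured points."""
            temps = sorted(data)
            if temp_f <= temps[0]:
                return data[temps[0]]
            if temp_f >= temps[-1]:
                return data[temps[-1]]
            for lo, hi in zip(temps, temps[1:]):
                if lo <= temp_f <= hi:
                    frac = (temp_f - lo) / (hi - lo)
                    return data[lo] + frac * (data[hi] - data[lo])

        spread = max(data.values()) - min(data.values())
        print(f"estimated resistance at 165 degrees: {resistance_at(165):.1f} ohms")
        print(f"total spread from 150 to 212 degrees: {spread:.1f} ohms "
              f"(about {spread / max(data.values()):.0%} of the 150-degree value)")

    Run as written, this reports a spread of less than one ohm over the whole range, which is consistent with the sender not behaving like an ordinary thermistor.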
  • Park_W Senior Contributor
    edited January 2015

    The Google source discusses a thermistor type sensor, but the '50 and earlier (i.e., King-Seeley) gauges are not that type.  Instead there's a bi-metallic strip and electrical contacts in both the sensor and in the gauge.  As the contacts open and close with fluctuation of the bi-metallic strip, an intermittent voltage is supplied to the gauge, with the length of time of the "pulses" varying with temperature.  That's why, Peter, if you measure the voltage at the gauge you'll see it fluctuate in a regular manner.  But the needle movement is damped, so you don't see the needle wiggle unless you look very closely.

    The reason for this kind of system is that it's pretty much independent of voltage fluctuations with varying engine speed. There's very little resistance in the system, so the current supplied to the gauge doesn't change, just the length of time that the current is applied. I measured the current on a 6v car (my '47 C8), and the recommended resistance to insert is based on bench-testing a mockup of the system with 12v applied. The resistor should reduce the current to what it is with 6 - 7v applied.

    In '51 Hudson changed the gauges to a resistive type, which doesn't have the bi-metallic strip arrangement in the sensor or gauge.  Thus the need for the little "instrument voltage regulator" on the '51 and later cars.  But here again, it's actually not a steady voltage but an "off and on" voltage supplied by a bi-metallic strip in the regulator. So here also, the gauge needle is damped to reduce wiggle in the needle.  Modern solid-state instrument voltage regulators supply a steady 5v, which generates a current that's equivalent to the "time averaged current" provided in the original make-and-break systems.
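
    For anyone who wants to see the "time-averaged" idea in numbers, here is a toy Python sketch of the scheme just described. The on-current and the duty-cycle mapping are made-up illustration figures, not Hudson or King-Seeley specifications; the point is only that the needle tracks the fraction of time the sender contacts stay closed, not the instantaneous resistance:

        # Toy model of the make-and-break scheme: the sender switches a roughly
        # constant "on" current, and temperature is encoded in the fraction of
        # each cycle the contacts stay closed.  All numbers are assumed.
        ON_CURRENT_A = 0.5  # assumed current while the sender contacts are closed

        def duty_cycle(temp_f, cold=140, hot=212):
            """Assumed linear mapping: contacts closed longer when hotter."""
            frac = (temp_f - cold) / (hot - cold)
            return min(max(frac, 0.0), 1.0)

        for temp in (150, 165, 180, 212):
            avg = duty_cycle(temp) * ON_CURRENT_A
            print(f"{temp:3d} degrees -> contacts closed {duty_cycle(temp):.0%} of the time, "
                  f"time-averaged gauge current about {avg:.2f} A")

    Because only the averaged current matters, a series dropping resistor that restores the original on-current makes the converted 12v system behave like the 6v one, which is the point of the recommendation above.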

  • Thank you for the explanation, Park. It sounds like further testing (perhaps using a heavier duty potentiometer on the power-source side of the gauge) may yield some results, although from what I have seen so far, it doesn't seem likely. Failing that, I guess Plan B would be to bend the needle so that it reads in the center at the temperature that now makes it read 3/4 high.
  • Park_W Senior Contributor
    edited January 2015
    Peter, I may have ID'd the problem: there's a wire that goes to one gauge, then a jumper carries power to the other gauge. As I recall, the recommended resistor value is for the wire that goes to the "first" gauge; that way it affects the voltage arriving at both gauges. Giving each gauge its own 15 ohm resistor results in less voltage drop, and thus high readings. I'll do a double check and confirm the resistor value (once the Packers game is over).
  • I do, in fact, have a resistor on both gauges. But the resistors are between the power wires and the gauges themselves.

  • Park_W Senior Contributor
    edited January 2015

    OK, so you've got the gauges powered separately. That actually makes things better. A 20 ohm resistance in each gauge's feed wire should be pretty accurate. It doesn't seem like much of a difference from 15, but look at it on a percentage basis.

    The 15 ohm figure is a compromise for when the two gauges are feeding off a single power wire.  The complication is that with each gauge having its own "make and break" contact points, it's likely they're not going to be opening and closing the circuit in unison, so the current is going to fluctuate somewhat randomly between 0 and about 0.6A.
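
    A back-of-the-envelope Ohm's law check may help show the "percentage basis" point. The gauge heater resistance below is an assumed round number (chosen so that a 20 ohm dropping resistor lands near the 6 volt baseline); it is not a measured Hudson figure, so treat the output as a sketch rather than a spec:

        # Rough comparison of one dropping resistor per gauge vs. one resistor
        # shared by both gauges.  GAUGE_OHMS is an assumed value -- measure your
        # own gauge before trusting any of these numbers.
        GAUGE_OHMS = 20.0

        def current_per_gauge(supply_v, series_ohms, gauges_on=1):
            """Current through one "on" gauge.  With a shared resistor and both
            senders' contacts closed at once, the resistor carries both currents
            and drops twice as much voltage."""
            total_ohms = series_ohms + GAUGE_OHMS / gauges_on
            return (supply_v / total_ohms) / gauges_on

        baseline = current_per_gauge(6.5, 0)   # original 6-volt behaviour
        print(f"6v baseline: {baseline:.2f} A per gauge")
        for ohms in (15, 20):
            own = current_per_gauge(12.8, ohms)                # resistor on each gauge
            both = current_per_gauge(12.8, ohms, gauges_on=2)  # shared, both gauges on
            print(f"{ohms} ohm: own resistor {own:.2f} A; shared resistor "
                  f"{own:.2f}-{both:.2f} A depending on how the contacts overlap")

    With these assumed numbers, 15 ohms per gauge runs the current a bit above the 6 volt baseline (a high-reading gauge), 20 ohms per gauge sits right on it, and a single shared 15 ohm resistor swings around the baseline depending on whether one or both gauges happen to be drawing current -- which is the compromise described above.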

  • Per Member
    Park,

     My understanding of the system is that the sender unit (screwed into the head of the engine) is a make-and-break unit. However, the gauge is not. The current through the bimetallic strip in the gauge heats it and causes it to bend, which moves the needle; there is no set of contact points inside the gauge. The gauge is damped (it responds slowly to changes in current), so it does not wiggle as the sender supplies current for a few seconds, then stops, then supplies current again, and so on. (A toy simulation of this damping is sketched at the end of this post.)

     When I installed a 1951 engine in a 1949 car, I found that a 20 ohm resistor was about right: the temperature gauge reads in the middle when the engine is at normal operating temperature. However, it reads backwards, so when the engine has not yet reached operating temperature, the gauge reads high. I do not have a voltage stabilizer supplying the gauge, so the reading is off a bit when the car is not charging, which I have gotten used to.

                    Per
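
    (The toy simulation mentioned above -- every number in it is an assumption for illustration, not a Hudson or King-Seeley figure. It just shows that a slow, first-order gauge settles near the time-average of the pulsed current instead of following each make-and-break cycle.)

        # First-order ("damped") gauge driven by a pulsed sender current.
        # Time constant, cycle period, duty cycle and current are all assumed.
        ON_CURRENT_A = 0.4   # current while the sender contacts are closed
        DUTY = 0.6           # fraction of each cycle the contacts stay closed
        CYCLE_S = 2.0        # assumed make-and-break period, seconds
        TAU_S = 15.0         # assumed gauge time constant, seconds
        DT = 0.05            # simulation step, seconds

        needle, history, t = 0.0, [], 0.0
        while t < 120.0:                       # two simulated minutes
            closed = (t % CYCLE_S) < DUTY * CYCLE_S
            drive = ON_CURRENT_A if closed else 0.0
            needle += (drive - needle) * DT / TAU_S   # slow drift toward the drive
            history.append(needle)
            t += DT

        ripple = max(history[-40:]) - min(history[-40:])   # last full cycle
        print(f"time-averaged drive: {DUTY * ON_CURRENT_A:.2f} A")
        print(f"needle settles near {history[-1]:.2f} A with only about "
              f"{ripple:.3f} A of wiggle per cycle")

    With these numbers the needle parks within a few percent of the 0.24 A average and the per-cycle wiggle is far too small to see, which matches the damped behaviour described above.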
  • The 1947 gauge also read backwards when I installed a stepdown sending unit. That led me to go back to the splasher sender. But I can't recall why I switched sending units in the first place, nor whether the one that is now installed is the one that was there originally. I do have another splasher sender, but it looks terrible from the outside. I guess I'll try that one before modifying the needle, if the 20 ohm resistor has no effect.

    Peter
  • Per Member
    Peter,

     I once had a 1941 Hudson where the sender unit became unreliable. Since that was not a situation I liked, I asked around and was told that the opening and closing of the contact points in the sender unit would eventually wear those points out. Therefore I switched to the newer sender (the type Hudson introduced in 1951) and lived with the reversed readings in order to get increased reliability.

                    Per
  • Park_W Senior Contributor

    Per, of course you're right on there being no make/break contacts in the gauge.  I should've taken a quick refresher look at the manual before trusting my ancient memory!  Fortunately, that difference doesn't change anything in my suggestions to Peter.
  • Update... I obtained a 5 watt 0 - 100 ohm potentiometer and wired it into the circuit. Tweaking the resistance did not make a lot of difference, and even zero ohms of resistance did not get the reading low enough.  So, it was time to turn my attention to the gauge itself. The gauge is held on by 2 screws that are easily accessible without removing the instrument cluster. I was hoping for a screw adjuster on the back, but no such luck. So, I removed the faceplate (which is crimped on in 4 positions), and upon examining the inside of the unit, the adjustment mechanism became obvious.
    There are two adjusters. The one on the left (when viewed from the front of the gauge) adjusts the bi-metallic strip, and the one on the right adjusts the contact. It is not necessary to take off the faceplate to do the adjustment. There are two holes on the back that become accessible when you remove the gauge from its mounting plate. There are four teeth on each adjuster that allow you to move it with a small screwdriver. I did all my adjusting on the bi-metallic strip side. (I will attempt to attach a photo. Hopefully it will be visible to the forum readers.)

    I actually brought the wiring down so that I could hook up the gauge and leave it hanging under the dash while I got the engine up to temperature and made the adjustments. (A significant amount of movement of the adjuster is required.) I decided to retain the 5 watt, 15 ohm resistor that Park had originally recommended, and all testing was done with that in place.

    Use caution... The needle is very thin and it is easy to bend it so that it drags on the gauge.  Lots of tweaking and testing was required before the final re-installation of the unit into the dash, and every time you touch the unit, you have another chance to bend it.