A cathode heater is a heated wire filament used to heat the cathode in a vacuum tube or cathode ray tube. The cathode must reach its operating temperature before the tube can function properly, which is why older electronics often needed some time to "warm up" after being powered on; the same delay can still be observed in the cathode ray tubes of some televisions and computer monitors. The cathode is heated to a temperature at which electrons are 'boiled out' of its surface into the evacuated space in the tube, a process called thermionic emission. The temperature required for modern oxide-coated cathodes is around 800–1000 °C (1472–1832 °F).
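The strong temperature dependence of thermionic emission is often described by the Richardson–Dushman law, J = A·T²·exp(−W/kT), where W is the cathode's work function. The sketch below illustrates why operating temperature matters so much; the work function value is an assumed round figure of the right order for an oxide-coated cathode, not a measured property, and the theoretical Richardson constant overstates real oxide-cathode emission.

```python
import math

# Richardson-Dushman law: J = A * T**2 * exp(-W / (k * T))
K_B = 8.617e-5       # Boltzmann constant, eV/K
A0 = 1.20e6          # theoretical Richardson constant, A/(m^2 K^2)
WORK_FUNCTION = 1.0  # eV -- assumed value, typical order for oxide cathodes

def emission_density(temp_k: float) -> float:
    """Thermionic emission current density (A/m^2) at temperature temp_k (K)."""
    return A0 * temp_k**2 * math.exp(-WORK_FUNCTION / (K_B * temp_k))

# Compare the low and high ends of the 800-1000 degC operating range
j_low = emission_density(800 + 273.15)
j_high = emission_density(1000 + 273.15)
print(f"J at 1073 K: {j_low:.3g} A/m^2")
print(f"J at 1273 K: {j_high:.3g} A/m^2")
print(f"ratio: {j_high / j_low:.1f}x")
```

The exponential factor dominates: a 200 °C rise in cathode temperature multiplies the available emission current severalfold, which is why the heater must bring the cathode fully up to temperature before the tube conducts normally.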
The cathode is usually in the form of a long narrow sheet metal cylinder at the center of the tube. The heater consists of a fine wire or ribbon made of a high-resistance metal alloy like nichrome, similar to the heating element in a toaster but finer. It runs through the center of the cathode, often coiled on tiny insulating supports or bent into hairpin shapes to provide enough surface area to produce the required heat. The ends of the wire are electrically connected to two pins protruding from the end of the tube. When current passes through the wire it becomes red hot, and the radiated heat strikes the inside surface of the cathode, heating it. The red or orange glow seen coming from operating vacuum tubes is produced by the heater.
Because there is little room inside the cathode, the heater wire is often built touching it, so the wire is insulated with a coating of alumina (aluminum oxide). Alumina is not a very good insulator at high temperatures, so tubes carry a rating for the maximum voltage between cathode and heater, usually only 200–300 V.
Heaters require a low-voltage, high-current source of power. Miniature receiving tubes for line-operated equipment used on the order of 0.5 to 4 watts of heater power; high-power tubes such as rectifiers or output tubes used on the order of 10 to 20 watts, and broadcast transmitter tubes might need a kilowatt or more to heat the cathode. The voltage required was usually 5 or 6 volts AC, supplied by a separate 'heater winding' on the device's power supply transformer, which also supplied the higher voltages required by the tubes' plates and other electrodes. A more common approach, used in transformerless line-operated radio and television receivers such as the All American Five, was to connect all the tube heaters in series across the supply line. Since all the heaters were rated at the same current, each dropped its rated heater voltage, and the ratings were chosen so that the drops summed to approximately the line voltage. Battery-operated radio sets used direct-current power for the heaters, and tubes intended for battery sets were designed to use as little heater power as possible, to economize on battery replacement. Radio receivers were built with tubes drawing as little as 50 mA for the heaters, but these types were developed at about the same time as the transistors that replaced them. Where leakage or stray fields from the heater circuit could couple into the cathode circuit, direct current was sometimes used for heater power, eliminating a source of noise in sensitive audio or instrumentation circuits.
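The series-string arithmetic can be sketched for a typical All American Five tube complement. All five heaters are rated for the same 150 mA current, so each drops its rated heater voltage (reflected in the first digits of the tube type numbers), and the drops sum to roughly the AC line voltage. The lineup below is one common complement, used here for illustration.

```python
# Series heater string in a transformerless "All American Five" receiver.
# Every heater is rated for the same series current, so each drops its
# rated voltage; the sum approximates the ~117 V AC supply line.
HEATER_CURRENT_A = 0.150  # shared series current, amperes

heaters = {           # tube type: rated heater voltage (volts)
    "12BE6": 12.6,    # converter
    "12BA6": 12.6,    # IF amplifier
    "12AV6": 12.6,    # detector / first audio
    "50C5":  50.0,    # audio output
    "35W4":  35.0,    # rectifier
}

total_v = sum(heaters.values())
total_w = total_v * HEATER_CURRENT_A
print(f"string voltage: {total_v:.1f} V")  # -> 122.8 V, close to the line
print(f"heater power:   {total_w:.1f} W")
for tube, volts in heaters.items():
    print(f"  {tube}: {volts:.1f} V, {volts * HEATER_CURRENT_A:.2f} W")
```

Because the string current is fixed by the tubes' common rating, no transformer or dropping resistor is needed; any excess over the exact line voltage is simply tolerated by the heaters.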