Why is it destructive? To get a discharge, you need free electrons. To get them, you need to liberate them from the cathode material. One way is to provide a high electric field, which practically pulls them out by force. That is how cold cathodes work. The side effect is that the high electric field also accelerates the ions from the plasma, which then hit the cathode material with high momentum (because they are rather heavy) and practically sandblast it off, while the released material lands somewhere and darkens that region out, aka sputtering. There are various designs to both reduce that abrasion and to make the material practically settle back. The other drawback of this type of electrode is the high voltage drop across this region, which consumes power but does not generate light; instead it heats up the cathode (by the ion bombardment).
Hot cathodes use a different approach: by heating the cathodes, the heat is what provides the electrons with the energy to get released readily, so they immediately contribute to the light generation, eliminating the area with high field, so there is nothing accelerating the ions anymore. But for that the cathode has to be kept warm at the emission temperature. If not, the heat won't be sufficient to liberate the electrons, so some of the energy has to be supplied by elevating the field. Normally cathodes are designed at a balance, where the field is low enough to keep the ion speed low enough to limit sputtering, but not so low that the emission coating evaporates too easily. If you operate the lamp at too low a current without providing auxiliary cathode heat, the temperature will drop and the sputtering will become faster. The problem is, keeping the cathode warm also costs power that does not generate light. And this tends to be a lower percentage with longer lamps.
Why are fluorescents underdriven in battery lanterns? There is a problem when designing a rather low power battery lantern, let's say 130lm: If you use a 4W lamp and drive it at full power, you get the full lifetime, but the efficacy would just be that of a 4W lamp, so you need to feed it 4W from very expensive batteries (assume single-use ones). So such a lantern may be cheap on lamp cost (because the lamp will last 5..8k hours), but rather expensive on battery cost (needs the full 4W). But if you use an 8W rated lamp and drive it at 2..3W, the lamp cathode will be underdriven, resulting in shorter lamp life, but it consumes just 2..3W from the batteries. So what you get is 30..50% savings on battery cost, at the expense of the lamp lasting a few 100's of hours. That means you pay more for the lamp wear, but save many times more on the battery cost (don't forget a 30% lower drain tends to yield 30% higher usable capacity with most batteries, so double the battery runtime). By the way, the same is the reason why the lamps are driven at DC (the lamp itself is what rectifies the current): at DC only one side is the cathode, so heating only one side is enough. This by itself means about a factor of 2 of total power reduction without any detrimental effect on the lamp life, so the 2..3W the F8T5 is usually operated at means in fact operation at 50..66% of the rated load, so still fairly close to the rated hot cathode operation. It just takes longer to warm the electrodes up, so the life limitation comes mainly from starting (at battery power the turn ON time per start is a few minutes, not the 3 hours the general lighting rating assumes).
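The running-cost trade-off above can be sketched numerically. All prices, lamp lives, and battery capacities below are illustrative assumptions (the text only gives the power levels and the rough 5..8k vs few-hundred-hour lives), so treat this as a back-of-envelope model, not real pricing:

```python
# Rough cost-per-hour comparison for a ~130 lm battery lantern.
# All prices, lifetimes, and capacities are illustrative assumptions.

def cost_per_hour(lamp_price, lamp_life_h, drain_w, battery_wh, battery_price):
    """Total running cost per hour: lamp wear plus battery drain."""
    lamp_cost = lamp_price / lamp_life_h           # lamp wear per hour
    battery_cost = battery_price * drain_w / battery_wh  # cost of the Wh drawn
    return lamp_cost + battery_cost

# Option A: F4T5 driven at its full rated 4 W -> long lamp life, high drain.
a = cost_per_hour(lamp_price=2.0, lamp_life_h=6000, drain_w=4.0,
                  battery_wh=10.0, battery_price=1.0)

# Option B: F8T5 underdriven at 2.5 W -> short lamp life, low drain.
# The lower drain also raises usable capacity with most primary cells,
# modeled here as ~30% more Wh extracted from the same battery.
b = cost_per_hour(lamp_price=2.5, lamp_life_h=500, drain_w=2.5,
                  battery_wh=13.0, battery_price=1.0)

print(f"full-drive F4T5:  {a:.3f} $/h")
print(f"underdriven F8T5: {b:.3f} $/h")
```

With these placeholder figures the battery term dominates both options, so the underdriven lamp roughly halves the cost per hour even though its wear cost per hour is more than ten times higher.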
So for a lantern designed to deliver what an F4T5 is rated to deliver, about 8W lamps are used. The F4 and F6 are then used for even lower light outputs, so power levels around 1..1.5 and 1.5..2W. Below that, fluorescents do not make any efficacy sense anymore and you are better off with incandescents (all of the above is written from the perspective of the 70's through the 90's, before LEDs were available).