Author Topic: Why did they even bother with 50-65% output residential ballasts?  (Read 3254 times)
Luminaire
Member
Offline


Why did they even bother with 50-65% output residential ballasts? « on: April 21, 2011, 11:03:07 PM » Author: Luminaire
Many residential-market T12 ballasts have a ballast factor ranging from 0.50 to 0.65, often driving a pair of lamps.

Why did they even exist? Why not just drive one lamp at 0.95 to 1.00?

The standard output for commercial-use T8 is now 0.87, and even residential ones have the same output. There are some special 0.61-output ballasts, but they're designated XL (extra low) output and hardly used.
Logged
Medved
Member
Offline

Gender: Male

Re: Why did they even bother with 50-65% output residential ballasts? « Reply #1 on: April 22, 2011, 02:17:01 AM » Author: Medved
Households usually use lamps closer to the illuminated area and do not need as intense illumination, so they require less intense light sources (a 4' tube needs only about 1000 lm). When more light is needed, it is usually needed over a larger area, so two tubes are needed anyway.
And the cheapest way is to take the most common commercial lamp (F40T12) and drive it at reduced output (~25W). With an RS ballast this does not harm the lamp, as the ballast keeps the electrodes at the correct temperature by auxiliary heating - so in fact there is no need for special long but lower-output lamps (which would be more expensive, as the household market is way smaller than the professional one).
Then some lamps intended only for home use (e.g. low CCT, higher CRI phosphors) were rated directly for 25W (this offered higher efficacy, as the lower-current filament consumes less power for its heating).

No more selfballasted c***

Luminaire
Member
Offline


Re: Why did they even bother with 50-65% output residential ballasts? « Reply #2 on: April 22, 2011, 02:43:45 AM » Author: Luminaire
But they're not installing those 0.6 to 0.7 ballast factor ballasts into the residential T8 fixtures they make now.
Medved
Member
Offline

Gender: Male

Re: Why did they even bother with 50-65% output residential ballasts? « Reply #3 on: April 22, 2011, 12:13:51 PM » Author: Medved
There are two aspects that come to my mind:
1) The spread of fluorescent tube technology made lamps cheaper, so more affordable, which allowed a wider selection of tubes to be marketed, so tubes with less output per unit length became available (T5, ...)
2) T12s operated quite cold, so at normal room temperature they were at the optimum mercury pressure, and reducing the wattage didn't affect it either (as the optimum lies between room temperature and the full-power temperature). T8s need to warm up quite a bit above their ambient, so at reduced wattage they operate colder than designed. This has an impact on the mercury pressure: it is lower than designed, and that hurts the efficacy. So such systems could not compete in efficacy with T5s (e.g. the F21T5).
Try to run an ALTO lamp on reduced wattage...

Silverliner
Administrator
Member
Offline

Gender: Male

Rare white reflector


Re: Why did they even bother with 50-65% output residential ballasts? « Reply #4 on: April 22, 2011, 06:08:41 PM » Author: Silverliner
I don't see why low power shoplight ballasts and even current generation CFLs are allowed. A power factor of 50%-70% is very wasteful; you have to generate more electricity to power them than the actual wattage recorded on the watthour meter, due to the higher current draw. A 2-lamp shoplight running 40 W tubes at 25 W apiece will consume 40-50 volt-amps per lamp (up to 100 VA for the whole unit). You will have to supply 100 W worth of electricity to run such awfully underpowered fluorescent tubes.

Administrator of Lighting-Gallery.net. Need help? PM me.

Member of L-G since 2005.

Collector of vintage bulbs, street lights and fluorescent fixtures.

Electrician.

Also a fan of cars, travelling, working out, food, hanging out.

Power company: Southern California Edison.

Luminaire
Member
Offline


Re: Why did they even bother with 50-65% output residential ballasts? « Reply #5 on: April 22, 2011, 10:33:51 PM » Author: Luminaire
Quote from: Silverliner
I don't see why low power shoplight ballasts and even current generation CFLs are allowed. A power factor of 50%-70% is very wasteful; you have to generate more electricity to power them than the actual wattage recorded on the watthour meter, due to the higher current draw. A 2-lamp shoplight running 40 W tubes at 25 W apiece will consume 40-50 volt-amps per lamp (up to 100 VA for the whole unit). You will have to supply 100 W worth of electricity to run such awfully underpowered fluorescent tubes.
PF and BF are unrelated.

They can still make high power factor, low-BF ballasts, such as the GE-232-MVPS-XL, which drives F32T8s at 61% BF while having 0.99 PF, as well as 0.9 BF (90%), 0.5 PF ballasts, such as most residential-grade T8 ballasts.

I just don't see why they made shop lights with 50-65% output then, but 85 to 100% output now.
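To illustrate why the two figures are independent: ballast factor scales the light output relative to a reference ballast, while power factor only relates real watts to volt-amps on the line. A minimal sketch, using the 0.61 BF / 0.99 PF ballast named above plus assumed input wattages and a typical catalog lumen rating (the numbers are illustrative, not measured):

```python
# Sketch: ballast factor (BF) and power factor (PF) are independent.
# BF scales delivered lumens; PF relates real power (W) to apparent power (VA).

def light_output(rated_lumens, ballast_factor):
    """Approximate lumens delivered: rated lamp lumens scaled by BF."""
    return rated_lumens * ballast_factor

def apparent_power(input_watts, power_factor):
    """VA drawn from the line for a given real input power and PF."""
    return input_watts / power_factor

# Two-lamp F32T8 system, ~2850 lm per lamp (typical catalog value, assumed).
# Low-BF, high-PF ballast (0.61 BF, 0.99 PF; ~43 W input assumed):
print(light_output(2 * 2850, 0.61))   # reduced lumens
print(apparent_power(43, 0.99))       # VA barely above the real watts

# High-BF, low-PF residential ballast (0.90 BF, 0.50 PF; ~60 W input assumed):
print(light_output(2 * 2850, 0.90))   # near-full lumens
print(apparent_power(60, 0.50))       # twice the VA of the real watts
```

Either quantity can be high or low independently of the other, which is exactly the point about the GE-232-MVPS-XL versus a typical residential ballast.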
Silverliner
Administrator
Member
Offline

Gender: Male

Rare white reflector


Re: Why did they even bother with 50-65% output residential ballasts? « Reply #6 on: April 23, 2011, 12:31:36 AM » Author: Silverliner
Well, the reason for these low power shoplights is simple: they're made to be cheap, and you get what you pay for.

Medved
Member
Offline

Gender: Male

Re: Why did they even bother with 50-65% output residential ballasts? « Reply #7 on: April 23, 2011, 12:55:53 AM » Author: Medved
@Silverliner:

A power factor of 0.5 does not mean that you have to GENERATE twice as much electricity; it means the power line (transferring the power from the generator to the load) is loaded with twice the current.
But the low power factor "stops" at the nearby power distribution transformer, where the power factor is corrected (phase lag/lead by capacitor/inductor banks, harmonic distortion by combining multiple phases). Lines upstream are then operated at optimum PF, so the low power factor affects only the last few miles.

And why make low power factor ballasts? Because they are simpler and lighter, so for a given budget they can be more efficient (dissipating less heat) and/or more reliable (fewer components to fail).
And with some product types (mainly electronic self-ballasted CFLs) HPF is not achievable, as they could not dissipate the extra heat a PFC circuit would create, and the already tight space would get even tighter...
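The current-versus-energy distinction above can be put in numbers. A minimal sketch, assuming a 120 V branch circuit and a two-lamp shoplight drawing 50 W of real power (both figures are illustrative assumptions, not from the thread):

```python
# Low PF doesn't double the energy generated; it doubles the RMS current
# on the wiring between the load and the point where PF is corrected.

def line_current(watts, volts, pf):
    """RMS line current for a given real power, line voltage, and power factor."""
    return watts / (volts * pf)

watts, volts = 50, 120.0   # two 25 W lamps on one shoplight ballast (assumed)
print(line_current(watts, volts, 1.0))   # current at unity PF
print(line_current(watts, volts, 0.5))   # same watts, twice the amps at 0.5 PF
```

The watthour meter still records 50 W either way; only the amps on the last stretch of wiring change.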
« Last Edit: April 23, 2011, 01:09:12 PM by Medved »


Luminaire
Member
Offline


Re: Why did they even bother with 50-65% output residential ballasts? « Reply #8 on: April 23, 2011, 01:39:26 AM » Author: Luminaire
Ok, maybe you didn't read my question right. Forget power factor. That is irrelevant and that's not what I'm talking about.

I asked why make low BALLAST FACTOR. 
Medved
Member
Offline

Gender: Male

Re: Why did they even bother with 50-65% output residential ballasts? « Reply #9 on: April 23, 2011, 01:08:35 PM » Author: Medved
Quote from: Luminaire
Ok, maybe you didn't read my question right. Forget power factor. That is irrelevant and that's not what I'm talking about.

I asked why make low BALLAST FACTOR.

I'm sorry for the confusion; that was a response to Silverliner's "A power factor of 50%-70% is very wasteful, ..."

© 2005-2024 Lighting-Gallery.net | SMF 2.0.19 | SMF © 2021, Simple Machines | Terms and Policies