Voltage stabiliser in chipped loco....?

Zerogee

Hi, this may be a very silly question, but please bear with me.....
A few weeks ago I DCC chipped my black Dingler DR Ts3, some of you may recall the threads about it that I posted on here. The only thing that went wrong is that stupidly I didn't check the lighting voltage first - I set the decoder (Massoth LS) down to 5 volts, thinking that would surely be low enough, and still managed to blow all the bulbs out.... :@ I haven't managed to replace them yet, it's going to be a tricky job as they seem to be tiny grain-of-wheat bulbs that are hardwired in - they appear to be 3 volt, and make the LGB bulbs look robust!
That little disaster aside, I now need to put a similar decoder into the new green version of the Ts3 that I've just acquired. Obviously I've learned my lesson about the bulb voltages, but here is where the silly question comes in:
The loco, as supplied in analogue form, has a voltage stabiliser package on a little circuit board - this is a pic of the one that I took out of the black loco:

[photo: the voltage stabiliser board removed from the black loco]


This seems to supply 3 volts to the lighting system; there is no directional lighting - the lights are either all on or all off.
Now, when I do the install in the new green loco, would there be any problems if I hedge my bets by leaving this voltage stabiliser IN the circuit rather than removing it, and simply connect the inputs of the stabiliser to the interior light output of the LS? That way, whatever voltage the LS puts through its lighting output, the stabiliser will still drop it to the voltage required for the bulbs. I know this is theoretically redundant, as the LS lighting output could just be set down to 3 volts, but my question is: would the belt-and-braces approach of leaving it in actually do any harm?

Jon.
 
If it were me I'd leave it in circuit. As I've said before on other threads, I don't rely on adjusting the decoder CVs to drop the function output voltage, for the very reason you've discovered - i.e. if you install a new decoder and forget to make the adjustments, you blow the bulbs!
 
Thanks Nick - the really annoying thing is that I DIDN'T forget to alter the lighting CV - what I neglected to do was to test what voltage the bulbs actually needed, and instead assumed that 5 volts would be low enough.... :@

Now, as we all know, ASSUME makes an ASS out of U and ME.... ;)
Too late, after the event, I checked the output of the voltage stabiliser that I had removed and found it was putting out only 3 volts. Put 5 volts through a 3 volt sub-micro grain-of-wheat bulb, and you get a nice bright flash followed by darkness.....:crying:

Glad you agree with my thoughts, though, that leaving the stabiliser in-circuit can't do any harm.

Jon.
 
Slightly off topic, but still on topic (bear with me...), the dimming function of the Massoth decoder (or any other decoder that uses this function) is not really suitable for 5V light bulbs, as it achieves the dimming by applying maximum voltage for short periods of time to the bulb, i.e. on/off, to achieve the effect of a lower voltage. Consequently, the bulb is still subjected to the maximum voltage, just for very short periods of time, which in turn significantly reduces the life span of the bulb.
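
For anyone who wants to see the arithmetic, here's a minimal sketch of what that on/off (PWM) dimming does, using purely illustrative figures - a 20 V function-output supply and a simple resistive bulb, not anything Massoth-specific:

```python
# Minimal sketch of PWM-style "dimming" as described above.
# Assumed figures only: a 20 V function-output supply and a purely
# resistive bulb; real decoder/track voltages will differ.

V_SUPPLY = 20.0      # assumed decoder function-output voltage (volts)
V_TARGET_AVG = 3.0   # average voltage we want the bulb to see

duty = V_TARGET_AVG / V_SUPPLY    # fraction of each cycle the output is ON
v_avg = V_SUPPLY * duty           # average voltage across the bulb
v_rms = V_SUPPLY * duty ** 0.5    # RMS voltage (sits above the average)
v_peak = V_SUPPLY                 # bulb still sees the full supply while ON

print(f"duty cycle : {duty:.1%}")      # 15.0%
print(f"average V  : {v_avg:.1f} V")   # 3.0 V
print(f"RMS V      : {v_rms:.1f} V")   # 7.7 V
print(f"peak V     : {v_peak:.1f} V")  # 20.0 V
```

Note that the RMS and peak figures sit well above the 3 V average - that gap is what the rest of this thread goes on to debate.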

I would suggest, if you're not replacing the 5V bulbs in LGB locos with 24V ones or using a separate Massoth voltage stabiliser, using a decoder with dedicated stabilised and adjustable low-voltage outputs, such as the Zimo MX695.
 
I concur with Nick.

However, I reckon Mark is also correct about the voltage?

DCC is effectively a frequency modulated format, whereas DC is amplitude....
 
Yes, I was going to mention that the lighting voltage control is probably PWM driven in many decoders (just like the motor output), as this is easy to achieve through programming. So unless some sort of smoothing is applied, you're likely to be applying the full voltage across the load, even if only for a small part of the duty cycle. However, I must admit I've not put a 'scope across a decoder function output to prove this.
I just avoid the problem and always use the function outputs at full voltage, with suitable resistors for the lighting etc. Resistors are cheaper than bulbs and LEDs!
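
As a rough illustration of that full-voltage-plus-resistor approach (the figures below are assumptions for the sketch - say an 18 V function output and a 3 V, 50 mA grain-of-wheat bulb - not measurements from either Ts3):

```python
# Rough sketch of sizing a series dropper resistor for a bulb fed from a
# full-voltage function output. All figures are assumed examples.

V_OUT = 18.0     # assumed decoder function-output voltage (volts)
V_BULB = 3.0     # bulb rated voltage
I_BULB = 0.050   # assumed bulb current at rated voltage (amps)

v_drop = V_OUT - V_BULB    # voltage the resistor has to drop
r = v_drop / I_BULB        # required resistance (ohms)
p = v_drop * I_BULB        # power dissipated in the resistor (watts)

print(f"resistor : {r:.0f} ohm (round up to the next standard value)")
print(f"power    : {p:.2f} W (pick a resistor rated comfortably above this)")
```

The same sum works for LEDs, just with the LED's forward voltage and a few milliamps in place of the bulb figures.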
 
bunnyrabbit03 said:
the dimming function . . is not really suitable for 5V light bulbs, as it achieves the dimming by applying maximum voltage for short periods of time to the bulb, i.e. on/off, to achieve the effect of a lower voltage. Consequently, the bulb is still subjected to the maximum voltage, just for very short periods of time, which in turn significantly reduces the life span of the bulb.
That's news to me. It implies that operating lamps on AC, 3V rms (equivalent to about 8.5V pk-pk) will significantly shorten life compared to DC.
Are you sure?

My lab work with incandescent lamps suggested that thermal effects were the most significant factor - you would expect a device that emits light because it is heated to depend on temperature.
I don't understand why voltage, rather than power, would determine lamp life under stable thermal conditions.

Can anyone please explain?

Thnx

Don
 
Jon, remember to set your light voltage before you connect the lighting, as decoders have a habit of switching the lights on as soon as you power up the decoder for the first time, thus supplying the lights with full voltage.

I normally set the voltage as low as possible; you can always turn it up later.

How did I find that out? I'll leave you to fill in the gap :rolleyes:
 
I agree that most decoders supply maximum voltage to the bulbs for short periods of time, and that all the so-called voltage control CVs do (some decoder manufacturers describe them as dimming CVs) is control how long the maximum voltage is supplied before a break of no voltage. I don't see why that should significantly affect the life of bulbs though; as Don says, power is going to be the most significant factor. I've not heard lots of moaning about bulbs blowing quicker because DCC is used.
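
To put that power argument into numbers, here's a small sketch assuming a constant (already hot) filament resistance - a simplification, since a real filament's resistance changes with temperature, which is exactly the thermal effect Don mentions:

```python
# Sketch of the power argument: a PWM drive and smooth DC with the same
# RMS voltage dissipate the same average power in a resistive load.
# Assumes a constant hot-filament resistance; illustrative figures only.

R = 60.0          # assumed hot filament resistance (ohms)
V_SUPPLY = 20.0   # assumed supply voltage during the ON part of the cycle
DUTY = 0.0225     # duty cycle chosen so the RMS works out at 3 V

v_rms = V_SUPPLY * DUTY ** 0.5   # RMS of the PWM waveform
p_pwm = v_rms ** 2 / R           # average power delivered by the PWM drive
p_dc = 3.0 ** 2 / R              # power from a steady 3 V DC supply

print(f"PWM RMS voltage   : {v_rms:.2f} V")  # 3.00 V
print(f"PWM average power : {p_pwm:.3f} W")  # 0.150 W
print(f"3 V DC power      : {p_dc:.3f} W")   # 0.150 W
```

On that simplified model the PWM drive and a steady 3 V DC supply deliver the same average power into the filament; whether the brief full-voltage pulses stress the bulb in some other way is the question left open above.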

If this was a significant issue then I'm sure Massoth would have included voltage stabilisation in their 'Onboard Adapter', but they declined to do so.

Having said that, if a voltage stabiliser is already present and there is space, I would leave it in - not for the bulb-life issue, but so that misprogramming the CVs is not a problem. I know some people always recommend ripping out the existing electronics when installing DCC, but I don't agree with that when the existing electronics include voltage stabilisation, such as, in the case of LGB, boards that support the latest 'DCC Interface'.
 