Dell Laptop Transformer Powering 8 Cameras?

I was taught long ago by a supervisor that most ANALOG cameras (except for the BIG PTZ domes that use 24 volts) should either be powered independently by a 12V / 2A transformer (like this one), or, in the case of 4 cameras together, by at least a 12V / 4A transformer with a 4-tip adapter cable (like this one), so that each camera receives at least 1 amp of current. He also taught me that if I am using Cat5e network cable for power, I should tie pairs together, let's say "white-blue and blue" for one leg and "white-orange and orange" for the other, so that electricity travels over 2 of the 4 available pairs, NOT just 1.

Last week I went to a customer's house to inspect a job done by another company, since he was complaining that it was the second time the SAME CAMERA had died, and that another one had stopped working recently as well. Upon inspecting the power connections, I saw that the other technician had used a Dell-branded laptop 19.5V / 3.34A transformer to power 8 (EIGHT) cameras in total!!! And worse, he only used 1 pair of the network cable for each camera. He had cut the power cable and joined all the camera cables together with electrical tape.

The thing is, if I do the math, 3.34 amps split between 8 cameras is only about 0.41 amps per camera. The cameras should NOT have powered on, but the strange thing is that they were all working OK for a couple of months before they started "dying". It seems like "delayed internal electronic decay caused by insufficient electrical current" or a "sustained voltage/current drop over a long period of time" to me (for lack of a better scientific explanation). Why is that (electrically speaking)???

Also, why am I seeing more and more technicians "saving" on independent power transformers?? To me, it is simply bad customer service that will make the customer NOT buy from him again once he finds out that corners were cut. Using independent transformers does not raise the cost much either, so what the technician did in this case doesn't make sense to me.


The typical analog camera (no IR, no heater) is going to use less than 3 watts. That's about .25 amps @ 12VDC.
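To put numbers on it, here is a quick sketch in Python; the 3W-per-camera figure and the Dell supply rating are the ones quoted in this thread:

```python
# Rough sanity check of the numbers in this thread.
camera_watts = 3.0       # typical analog camera, no IR, no heater
nominal_volts = 12.0

# Current one camera draws at its nominal 12 VDC supply:
amps_per_camera = camera_watts / nominal_volts
print(f"per-camera draw at 12 V: {amps_per_camera:.2f} A")   # 0.25 A

# The Dell laptop supply from the original post:
supply_volts, supply_amps = 19.5, 3.34
supply_watts = supply_volts * supply_amps
print(f"Dell supply capacity: {supply_watts:.1f} W")         # ~65.1 W

# Total load of 8 such cameras:
total_load = 8 * camera_watts
print(f"8-camera load: {total_load:.0f} W")                  # 24 W

# In raw watts, the supply is not undersized.
assert supply_watts > total_load
```

So in terms of wattage the Dell brick could carry all 8 cameras with room to spare; the suspect part is the 19.5V output, not the amperage.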

From an electrical standpoint, it doesn't matter how you connect the cameras to the power supply, as long as you have a good/solid connection. The 4-tip adapter cable makes it easier to manage the wiring, but is no better than just twisting all the wires together with a wirenut and electrical tape.

Although 12VDC is the common supply voltage, most of the internal components operate at 5V, 3.3V, or even less. The "cheap" way to do voltage regulation is to throw off the excess voltage as heat. If you're powering cameras with a 19.5V supply, it's *possible* there could be overheating problems, because the cameras weren't designed to properly regulate 19.5V (or whatever is left after the voltage drop on the cable) down to 5V or less.
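To illustrate how much extra heat that could mean: in a simple linear regulator, everything above the internal rail is dissipated as heat in proportion to the load current, P = (Vin - Vout) * I. A rough sketch, assuming a 5V internal rail and a 0.25A draw (both illustrative figures, not from any camera's spec sheet):

```python
# Heat dissipated by a simple linear regulator: (Vin - Vout) * I.
# Assumed figures: 5 V internal rail, 0.25 A load current.
vout, current = 5.0, 0.25

for vin in (12.0, 19.5):
    heat = (vin - vout) * current
    print(f"Vin = {vin:4.1f} V -> {heat:.2f} W dissipated as heat")
# At 12 V in: 1.75 W of heat. At 19.5 V in: ~3.6 W -- roughly
# double, in parts that were never sized for it.
```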

I agree that this job sounds like it was done poorly, but it's hard to say exactly what would be causing the issue from what you describe.

All electrical components (and subsequent assemblies like cameras) operate within a power supply range.

Even though spec sheets describe power requirement as specific values, these are typically 'nominal' and ideal for production use.

Think about a battery powered flashlight: as the batteries drain, the light gets dimmer. The light does not go out until the batteries are well under the 'nominal' values of a fresh set.

So the answer to this:

" It seems like a "delayed internal electronic decay caused by insufficient electrical current" or "sustained voltage/current drop for a long period of time" to me (for lack of a better scientific explanation). Why is that (electrically speaking) ???"

Is probably a combination of physical changes to the devices. Conductors oxidize, temperatures change, and components break down (are more inefficient) due to less than nominal electrical supply.

I've found that underpowering some electronic items is usually far worse than over powering them.

Interesting. As in permanent damage? Which devices? Do you know what sub component failed exactly?

Conductors oxidize... due to less than nominal electrical supply.

Am I reading that right?

In my experience 99% of electronic equipment out there is not permanently damaged by being underpowered.

In this case, assuming simple parallel wiring at the transformer, each camera is seeing several volts over spec, depending on the voltage drop in the cable. However, the actual power delivered might not be that far off, since .41 amps at 19.5 volts is actually about the same power as .67 amps at 12 volts.

Would have to agree with B then that it's more likely a symptom of too much voltage than too little power. (DC to DC converter overheating?)
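The equal-power observation above is easy to check:

```python
# Checking the equal-power claim: same wattage at two different voltages.
p_at_19v = 0.41 * 19.5   # watts drawn at 19.5 V
p_at_12v = 0.67 * 12.0   # watts drawn at 12 V
print(round(p_at_19v, 2), round(p_at_12v, 2))  # both ~8 W
assert abs(p_at_19v - p_at_12v) < 0.1
```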

Now if the hack had only used a parallel-series config (4 parallel runs, each with two cameras in series), that might have worked better. Only joking.

No.

Conductors oxidize due to general exposure to corrosive environments and air.

Componentry is damaged due to less than nominal electrical supplies.

In the same way, EEs and electrical distribution guys usually prefer a blackout over a brownout any day of the week.

Ok, I guess I didn't get why you were referencing oxidation and temperature changes if it wasn't specific to under-powering.

This answer: "...probably a combination of physical changes to the devices. Conductors oxidize, temperatures change, and components break down (are more inefficient) due to less than nominal electrical supply."

So do you mean the normal aging of components, combined with components breaking down due to underpowering, probably led to this specific failure? Or can you reword it?

In any event do you still hold to the underpower theory, since there is evidence that the power requirement would have been met by the Dell supply? Also the fact that they worked for some time before failure suggests they weren't severely underpowered.

As for brownouts, I will concede that some equipment can be damaged, as Scott indicates. My own personal experience is that brownouts didn't cause anything to permanently break. I've only seen two of any length, though.

On the other hand, when the power comes back on before you can shut everything off first, the voltage spike has caused 5-10 devices to fail (over the years), often hard drives or power supplies.

Yea, over the years I have run into several components that fried due to an incorrect power supply being used to insufficiently power a unit.

Feel free to correct me if I am wrong here, but from what I have known, devices use watts of power, not necessarily volts or amps. So when a camera is rated to draw .25A at 12VDC, what they really mean is that it will use 3W of power. If you supply more than 12VDC, it will take LESS current (amps) to reach 3W (volts x amps = watts).

For example, you supply 16VDC, then the current draw would only be .1875A to reach the 3W needed.

Now, if you supply too LITTLE voltage, let's say 8VDC, then you will need to draw .375A to reach the 3W needed. Where that could be a major issue is if the internal connections (wires, traces, etc) aren't large enough to handle that current load, then they will overheat and possibly fail.
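The trade-off above follows directly from I = P / V; a quick sketch using the same 3W figure as the rest of this thread:

```python
# Current needed to deliver a constant 3 W at different supply voltages.
watts = 3.0
for volts in (8.0, 12.0, 16.0):
    print(f"{volts:4.1f} V -> {watts / volts:.4f} A")
# 8 V needs 0.375 A, 12 V needs 0.25 A, 16 V needs 0.1875 A:
# lower voltage forces more current through the same wires and traces.
```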

It is very common for conductors to fail due to higher than expected current draws (amps), but rarely the same is true for higher voltages. Most wire can handle much more voltage than you think. It's the amperage that burns up wire.

It's the amperage that burns up wire.

True.

Most wire can handle much more voltage than you think.

Also true, but only because I have a hard time thinking of infinite quantities.

Now, if you supply too LITTLE voltage, let's say 8VDC, then you will need to draw .375A to reach the 3W needed. Where that could be a major issue is if the internal connections (wires, traces, etc) aren't large enough to handle that current load, then they will overheat and possibly fail.

Mostly true, since simply 'needing to draw' .375A does not make a device actually draw .375A, only lowering the resistance does that. Which is essentially what a switching power regulator will do, but a working one will not self-destruct by increasing the current to the point of melting the wires or components.

And while I admit to being convinced by Scott and others that underpowering electronics can cause failure, I still think it is pretty rare (and probably not the reason for OP's failure)

Go ahead and take any old but working camera and just underpower the hell out of it. I mean really starve it, all the way down to 0 volts and back. Even leave it semi-powered for a couple of years or so. Let me know when it fails.

That is very baffling. I have seen similar setups and was disappointed that the installer would jeopardize their customer's setup. Also, when it came to using Cat cable, I usually saw the (blue and white-blue) and (brown and white-brown) pairs used for power.

A quick check of a sample analog camera shows a DC input spec of 12V +10%/-15%. 19.5VDC would definitely damage the power conversion circuitry in the camera over time, if not immediately. The only real test is a voltmeter at the camera while it is operating. Also, using only 2 small conductors (22 or 24 AWG) for power can cause a significant voltage drop depending on the cable length, e.g. 100' of 22AWG will convert roughly 15% of the transmitted power to heat, so 3W at the camera would require ~3.45W at the power supply. If you have any poor/corroded connections, even more power is required.
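For anyone who wants to estimate that cable loss themselves, here is a sketch using the commonly quoted ~16.1 ohms per 1000 ft for 22 AWG copper; the exact percentage lost depends on the camera's actual current draw, which is why cameras with IR LEDs fare worse on the same cable:

```python
# Estimating voltage drop on a small-gauge power run.
# 22 AWG copper is roughly 16.1 ohms per 1000 ft; a 100 ft run uses
# 200 ft of conductor (out and back on one pair).
ohms_per_1000ft = 16.1
run_ft = 100
loop_resistance = ohms_per_1000ft * (2 * run_ft) / 1000   # ~3.2 ohms

for amps in (0.25, 0.40):
    drop = amps * loop_resistance          # volts lost in the cable
    heat = amps ** 2 * loop_resistance     # watts burned in the cable
    print(f"{amps:.2f} A -> {drop:.2f} V drop, {heat:.2f} W lost in cable")
# Loss grows with the square of the current, so higher-draw cameras
# (IR illuminators, heaters) lose noticeably more in the same cable.
```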