Dahua CVI Bandwidth VS Analog

Hello, we have a reseller telling us that Dahua's CVI uses less bandwidth than their analog.

Is that true? If the quality is better, then the bandwidth must be higher...

We have a bunch of clients with low upload speeds... maybe 1-2 Mbps.
Right now we just use Dahua analog DVRs there...

* Our main use for these DVRs is remote viewing of the video, so bandwidth is very important to us...

Anyone know how the bandwidth of analog and cvi compares?

Thanks


This seems like an odd comparison.

If you're comparing bandwidth between CVI and analog directly, you're talking about the cable between the camera and the DVR. As long as you have a decent cable the bandwidth there is really not relevant to anything else.

Upload speed is presumably talking about Internet bandwidth, which is the DVR to the Internet. That would not technically be dependent on the camera to DVR technology or bitrate. A DVR could theoretically produce a stream of any practical resolution/bitrate.

Maybe a new H.264-based DVR using CVI would use less network bandwidth than an old-school analog DVR with MJPEG streaming, but that's not a comparison of analog to CVI.
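To illustrate why the codec matters far more than the camera technology: MJPEG compresses each frame independently and typically needs on the order of ten times more bits per pixel than H.264. A rough sketch with ballpark bits-per-pixel figures (illustrative assumptions, not measurements of any specific DVR):

```python
# Estimate stream bitrate from resolution, frame rate, and a
# codec-dependent bits-per-pixel budget. The bpp values below are
# rough rules of thumb, not measured values.

def bitrate_mbps(width: int, height: int, fps: int, bpp: float) -> float:
    return width * height * fps * bpp / 1e6

mjpeg_vga  = bitrate_mbps(640, 480, 30, bpp=0.43)    # old-school MJPEG
h264_1080p = bitrate_mbps(1920, 1080, 30, bpp=0.05)  # modern H.264

print(round(mjpeg_vga, 1))   # ~4.0 Mb/s for VGA MJPEG
print(round(h264_1080p, 1))  # ~3.1 Mb/s for 1080p H.264
```

So an H.264 DVR can push six times the pixels of a VGA MJPEG unit at a lower bitrate, which is the only sense in which "better quality at lower bandwidth" holds up.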

I find it hard to believe that bandwidth would be lower. In our Dahua HDCVI 2.0 Test, camera bandwidth was relatively high, with a lot of the CVI cameras running over 4 Mb/s in the daytime, higher than most IP cameras in the same scene.

We don't have them listed in that chart, but a 900TVL camera in that scene usually runs well under 1 Mb/s. This assumes an average quantization of 28, VBR, and no cap.
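For context, those bitrates translate directly into per-camera storage load. The conversion is plain arithmetic; only the ~4 Mb/s and ~1 Mb/s figures above are carried over, and decimal units are assumed:

```python
# Convert a continuous stream bitrate (Mb/s) into storage per day.
# Uses decimal units (1 GB = 1000 MB), as most storage vendors do.

def gb_per_day(bitrate_mbps: float) -> float:
    seconds_per_day = 24 * 3600
    megabits = bitrate_mbps * seconds_per_day
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

print(gb_per_day(4.0))  # 43.2 GB/day for a 4 Mb/s CVI camera
print(gb_per_day(1.0))  # 10.8 GB/day for a ~1 Mb/s 900TVL camera
```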

It's also impossible to make a blanket "A is lower than B" statement, since the scene, compression, brightness, sharpness, camera performance, and a lot of other variables impact it. Do you know what cameras/DVRs they were talking about, specifically?

The good news for remote viewing is that pretty much all of their CVI DVRs will send a secondary stream, up to VGA resolution. So you wouldn't necessarily be losing anything on the remote view side by switching, and you'd gain higher resolution recorded video.
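Given the OP's 1-2 Mbps upload links, it's worth sketching how many of those secondary streams would actually fit. The ~256 kbps per-substream figure and the 20% headroom reserved for protocol overhead are illustrative assumptions, not Dahua specs:

```python
# How many remote-view substreams fit in an upload link, reserving
# some headroom for protocol overhead and other traffic.
# The per-stream bitrate is an assumed value, not a Dahua spec.

def max_streams(upload_kbps: float, stream_kbps: float,
                headroom: float = 0.8) -> int:
    return int(upload_kbps * headroom // stream_kbps)

print(max_streams(1000, 256))  # 3 substreams on a 1 Mbps link
print(max_streams(2000, 256))  # 6 substreams on a 2 Mbps link
```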

Ethan, how was the bandwidth measured for the CVI cameras: through VLC on an RTSP stream off the DVR, by disk space consumption, or by remote client app monitoring?

Via the VMS client, and confirmed using the DVR itself, which has a real-time bitrate display.

Thanks for the info.

They were saying that remote viewing bandwidth for CVI would be the same as analog but the remote viewing picture quality would be much better. :)

I was wondering how that was possible...

Thanks again.

Their claim could only be possible if you take into account that CVI cameras are progressive scan while analog SD is interlaced. However, if you had two cameras in the same scene, one CVI and one SD, both set to the same substream parameters, they should in fact have the same bitrate and image quality.

...one CVI and one SD, both set to the same substream parameters, they should in fact have the same bitrate and image quality.

Do you mean same bitrate and resolution?

Why couldn't CVI cameras have better image quality?

Because they are both "dumbed down" to a lower resolution via the substream. To compare apples to apples, they should have the same parameters for the given substream.

ex. CIF/7fps/VBR/256k

So, they will have the same framerate and resolution. They are both being encoded by the same device. Why would they NOT have the same bitrate?

"if you had two cameras in the same scene, one CVI and one SD, both set to the same substream parameters, they should in fact have the same bitrate and image quality."

Why would you set them to the same 'substream parameters'? Also, bitrate and image quality can vary based on the sensor used and encoder tuning, so it's dangerous to automatically conclude that.

So if both substreams were set to CIF/7fps/VBR/256k then you should be able to assume very similar image quality. I haven't tried it, but at that low a resolution, any difference should be imperceptible. You won't have much variance from sensors, since the DVR is doing all of the encoding.
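To put a number on the apples-to-apples argument: at identical substream settings the DVR's encoder receives the same pixel rate from either source, so the bit budget per pixel works out the same. A sketch assuming NTSC CIF dimensions (352x240):

```python
# Bit budget for a CIF/7fps/256k substream. The same math applies to
# a CVI or an SD source, since the DVR scales both down to CIF
# before encoding. CIF dimensions here assume NTSC (352x240).

width, height, fps = 352, 240, 7
bitrate_bps = 256_000

pixels_per_second = width * height * fps
bits_per_pixel = bitrate_bps / pixels_per_second
print(round(bits_per_pixel, 2))  # ~0.43 bits per pixel either way
```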

"So if both substreams were set to CIF/7fps/VBR/256k then you should be able to assume very similar image quality."

That's like saying you are going to drive a Ferrari and a Ford Escort at 20mph. I guess, but it misses the point of buying two fundamentally different products.

Except that we're talking about cameras and not cars.

CVI cameras are designed to deliver 720p or 1080p.

SD cameras are designed to deliver 480p.

You are positing running all cameras at 240p.

This is analogous to taking a car that can do 150mph and a car that can run at 80mph and running them both at 20mph.

Returning to Kenny's original question, the only way Dahua CVI is using less bandwidth is if they are running at far less than their designed / max capable resolution. Or if the 'analog' here is some other unknown system that has different sensors, different encoding process, etc.

Analog SD isn't progressively scanned.

A better analogy would be to take a Ferrari and an Escort and give them both a lawn mower motor. They would both be piss slow.

Jon, how are you helping to address the original question in this discussion?

I will divert from discussing your car analogy in my future replies.

Let's return to a key assertion of the OP:

If the quality is better, then the bandwidth must be higher...

This would seem logical, but on the other hand, look at any bandwidth comparison table on IPVM and you will find wildly varying bitrates for the same frame rates and resolutions, even for the same manufacturer.

Moreover, the subjective quality assessment does not seem to correlate to the bitrates. Meaning that the cameras with the highest bitrates don't necessarily have the highest quality.

My contention against that theory is based upon the fact that you have a singular encoding device in a DVR vs comparing various IP cameras (your IPVM example). It's apples vs. apples instead of apples vs. bananas.

Even if the encoder is the same, the imager and the image processing done on the camera side can vary (low light performance, gain control, WDR support, etc.).

My contention against that theory is based upon the fact that you have a singular encoding device in a DVR.

In actual fact, a singular encoding device is not stipulated. The OP says only that the reseller claims that Dahua's CVI uses less bandwidth than their analog.

If Dahua CVI DVRs use less bandwidth than Dahua SD DVRs, then the claim would be true, no?

B,

While I agree that there may be a difference between a traditional SD DVR and the newer CVI DVR, it would be foolish to buy an SD-only DVR these days, if you could even find one. CVI DVRs are priced as low as SD DVRs, and they accept SD, CVI, and IP, so there would have to be a good reason not to use a CVI DVR.

I happen to have a CVI DVR here and a good mix of CVI and SD cameras that I will test out tomorrow. I will reply back with my findings so we can put this to bed with an actual test.

Not to sound like a p***k, but do you have a Dahua analog-only DVR also?

Of course no one should buy one these days, but the eager distributor was comparing analog DVRs/cameras to CVI DVRs/cameras, so to test his statement we would need to compare both, no?

I do have one on hand. It is a very old model, but if it would make you happy, I could test it as well.

But at the end of the day, the fact is... you are NOT going to get analog-sized bandwidth with IP quality. :)

Thanks all

What is IP quality?

VGA MJPEG stream @ 4 Mb/s

I can beat that in quality AND stream size with CVI/1080p/H.264

Jon, please stop.

I think we all understand you can choose unrealistic settings to get unconventional results, but throwing out a counterexample of MJPEG encoding in 2015 is unhelpful.

What Kenny is alluding to is both using the same codec - H.264 - which, as you know, is ubiquitous in surveillance today.