Camera Color Fidelity Shootout

Author: Figen Bekisli, Published on Sep 04, 2013

The surveillance industry is obsessed with more and more pixels. But color fidelity - accurately rendering yellow as yellow or blue as blue - can be critical for identifying individuals or objects.

We decided to test 5 camera manufacturers - Axis, Avigilon, Bosch, Dahua and Sony - to see how faithfully their cameras could reproduce color.

The Test

We used a ColorChecker (aka Macbeth chart) and shot all cameras side by side simultaneously. Here's a preview of the results with the models blacked out.

This was a fairly simple low light indoor scene, but notice how much color fidelity varied across the cameras.
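
The idea behind chart-based testing can be sketched in code. A common way to score a patch is the CIE76 color difference (delta E) between the captured color and the chart's published reference value, computed in CIELAB space. This is a minimal Python sketch, not the method used in this test; the "captured" patch values below are hypothetical, and the reference values are commonly published ColorChecker sRGB figures.

```python
import math

# Minimal sketch (not this test's exact method): score a ColorChecker patch
# by the CIE76 color difference (delta E) between the captured color and
# the chart's published reference value, computed in CIELAB space.

def srgb_to_lab(rgb):
    """Convert an 8-bit sRGB triple to CIELAB (D65 white point)."""
    # sRGB -> linear RGB (inverse gamma)
    lin = []
    for c in rgb:
        c /= 255.0
        lin.append(c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = lin
    # linear RGB -> XYZ (sRGB matrix, D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # XYZ -> Lab
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    """CIE76 delta E: Euclidean distance between two Lab colors."""
    return math.dist(lab1, lab2)

# Commonly published sRGB references for two ColorChecker patches, and
# hypothetical values read off one camera's image of the chart.
reference = {"red": (175, 54, 60), "blue": (56, 61, 150)}
captured = {"red": (190, 70, 55), "blue": (60, 55, 170)}

for patch, ref_rgb in reference.items():
    de = delta_e76(srgb_to_lab(ref_rgb), srgb_to_lab(captured[patch]))
    print(f"{patch}: delta E = {de:.1f}")
```

A delta E near zero means the patch matches the reference; as a rule of thumb, values under ~2 are barely perceptible while values above ~10 correspond to visibly wrong colors.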

Scenes

We also tested in a variety of conditions, including different forms of outdoor street lighting, to see how LED, mercury vapor, and sodium lights impacted performance.

We concluded with real world examples of a subject wearing different colored shirts.
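A rough intuition for why narrowband sources are so destructive to color: the color a camera captures is approximately the per-channel product of the surface color and the illuminant color, so a near-monochromatic source like sodium pushes very different surfaces toward a single hue. This toy Python model makes the effect visible; the illuminant RGB values are illustrative only, and real camera pipelines are far more complex.

```python
# Toy model: captured color ~= surface reflectance x illuminant, per channel.
# The illuminant RGB values below are illustrative, not measured spectra.

def render(surface_rgb, illuminant_rgb):
    """Approximate the captured color as surface x illuminant, per channel."""
    return tuple(round(s * i / 255) for s, i in zip(surface_rgb, illuminant_rgb))

white_led = (255, 250, 244)  # broad spectrum, nearly neutral
sodium = (255, 140, 0)       # near-monochromatic ~589 nm low pressure sodium

# Hypothetical shirt colors, standing in for the test subject's wardrobe
shirts = {"red": (180, 40, 40), "green": (40, 150, 60), "blue": (40, 60, 170)}

for name, rgb in shirts.items():
    print(f"{name} shirt  LED: {render(rgb, white_led)}  sodium: {render(rgb, sodium)}")
```

Under the broad-spectrum LED every shirt keeps a distinct hue; under sodium the blue channel drops to zero for all of them and the shirts collapse toward dark orange/brown tones that differ mostly in brightness.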

Here's one example of a major manufacturer whose camera made a range of colors look basically the same:

Contrast that to another major manufacturer that handled the same scene, at the same time, dramatically better:

Who Won?

There were clear winners and losers here. Who was it? Was it Avigilon - who claims to deliver the best image quality? Axis - who touts their conformance with HDTV standards? Dahua - the up and coming Chinese company? Or did the old guard - Bosch or Sony - pull it out? Answers and detailed analysis inside.

Comments (42)

This is awesome, great job! The only thing we could have asked for was more cameras to be tested but we know you can't test them all!

We are going to look at mixing this into standard camera testing, as now we have a benchmark to compare against and an understanding of which scenes problems are most likely to arise in.

Good to hear. This is a good factor that is so difficult to explain from a data sheet.

Amazing test! I had also noticed quite a few differences in fidelity, even amongst a single brand, which already made me wonder how the comparison between brands would turn out.

It just shows how big a difference there is across the wide range of cameras on the market, and that it's one of the things you really need to test when you have a demo camera in your hand. You can't conclude that, because it's the same brand, you will probably get the same results.

PS: Looks like someone got a new wardrobe as well :)

I knew that the Sony would perform well as I've been doing a lot of Sony lately and I'm quite familiar with its capability of producing very-very close to spot-on colour.

It's totally the opposite of their commercial products like CyberShot and HandyCam, which I also sold for over 5 years. There, their colour is always very "punched-up": red is very red, etc. And because of this, colour bleeding becomes very apparent.

Love this review. It would, however, need a different colour chart without the black borders between the patches to get an idea of colour bleeding, especially between colours like red and orange, as well as purple and blue.

Great article and information... I'm hoping you'll expand this type of testing with other brands.

Thank you for taking on a great topic for evaluation. In meeting with customers it is sometimes difficult to communicate this in an effective manner as they do not have the time, or ability, to see their cameras under these different lighting conditions. I have found that most often, they have just accepted that their cameras will not represent colors well, and that under high pressure sodium, metal halide, and fluorescents "it is what it is". What your service has been able to show effectively is that there are better products for specific desired features.

This article in particular was very thorough, well presented, and lacked any bias. As a professional scientifically trained myself, I say great job Figen Bekisli.

Robert, thanks. I am curious how this article 'lacked any bias'. Do you believe other IPVM camera tests are biased? Specifically?

Perhaps the bias-challenged article's tone is due to the tabula rasa of Mr. Bekisli...

Ah, correction, that would be Ms. Bekisli... Perhaps the comically challenged Mr. Dearing would be better off contemplating the source of his own bias next time...

In any event, sincere apologies to Ms. Bekisli...

Ha, I'm sure she will appreciate the correction. Figen joined us at the end of July and this is her first report.

To be clear, all IPVM articles are tested, researched and written under a standard approach. We do not hire new people and let them do their own thing.

Secondly, I object to the 'bias' reference because such claims almost always circle around a specific report either being favorable or unfavorable to the brands a commenter is selling.

If people are going to make allegations/allusions of bias, be specific or say nothing.

John, my comment was meant as a compliment to Figen. The report was very well done and presented in a very clear manner with little subjective input. The colors were either represented well, or they were not, under the different conditions. In other studies there can be more subjective measurements, where the researcher's (presenter's) past experiences can influence the results. I have not found this often in the IPVM service; however, I have found it difficult to eliminate in any study. It doesn't matter to me if you were to look at studies on global warming, the economy, or baseball statistics. The exact same data can be presented in different manners based on the researcher's past or belief system of what is important.

This report had a clear reference point of measurement and was able to present the results in a clear, objective manner. Good report.

Sorry for the delay in replying.

Robert, the bias I more frequently see is that people will praise us when our tests reach positive conclusions for products the commenter sells but criticize us when they do not.

By and large I agree with what has been said so far. Very precise and detailed information in these findings.

It would, however, be interesting to compare the MSRP pricing for each of these cameras. I believe being able to present this information to existing and/or potential end users could provide a lot of value…allowing them to see how they might be able to get the most "bang for their buck".

Any thoughts?

Great test John, thanks.

Totally agree with you, Russ Porter.

It is a "the more you pay, the better you get" scenario.

Also, I am surprised that Pelco Sarix cams are not in the shootout.

And Panasonic was not included. And Mobotix was not included. And Hikvision was not included. And Arecont, etc.

Please don't be surprised. There will always be 'major' manufacturers in any given test that will not be covered. It's just not feasible for us to cover all of them, every time.

I guess this kind of relates to the other post, “Why Are There So Many Damn Manufacturers?”

We would love to see a similar study using other brands of cameras and making a comparison with these brands; for example Vivotek, Geovision, D-MAX, Panasonic, Samsung and Hikvision.

We'd love to be able to test every brand. And we are going to test more cameras and, as we do, include color fidelity with them. However, this is going to take time, as we have to balance testing a variety of products and technologies.

A general comment on people requesting other brands, etc. As a rule, when I do camera testing, I always check color with the MacBeth chart, at least indoors to see if anything is majorly off. I'll expand that into other lighting settings.

Thanks everyone, also, the readership is way more excited about this test than I expected!

John,

You guys are doing a great job, no doubt about it.

I also agree with your comments about not being able to cover them all, but there would be much better clarity for everyone if we try to compare "apples to apples" in such a technical shootout.

Here's the problem. If we test 10 manufacturers instead of 5, it will take at least 50% longer, and probably close to double the amount of time. But we will not get twice the number of member sign ups, twice the number of renewals, etc.

Indeed, this report has very weak readership, ~30% less than an average test.

I think this is great research and I am happy that a small number of you really like it, but we need to be smart about where we invest our resources. Right now, members overall are voting with their time and money that this is not an area to spend a lot more resources on.

Hey there John,

What medium did you use to capture the images and was the media player a faithful representation of what the camera could do?

Hi Darren, we used multiple methods for the test. We viewed them in VLC media player, the camera's web interface, as well as Exacq. We viewed them on multiple monitors, in case one was calibrated differently, and three of us looked at these images.

The biggest differences were between different monitors, but even those were relatively minor. Major problems, such as not being able to differentiate between dark or bright colors, were no better or worse depending on monitor.

Great job guys... It's good to see cameras under real world lighting situations. I would also like to see the MSRP of the tested cameras.

John,

Thanks for doing this test, it's always great to see the quality differences. Just a quick question: would the Axis P3367-E have a similar result to the Q1604?

Dustin, we really can't say. The P3367 uses a different imager than the Q1604, or even the rest of the P33 domes. As we saw in this test, different series perform quite differently, like the Q1604 vs. the M1114.

We didn't test all of them formally, but we've seen major differences between other Axis cameras, as well. If you take this image from our test of the M30 series, for example, you can see that all three of the M30 domes are different, too, based on the color of the back wall and skintone.

I've added online pricing to the cameras at the end of the post. They price out as follows:

Ethan - Sony G6V series cameras SNC-VB600, SNC-VB600B and now the new SNC-EB600 in the G6E series all use the same sensor and IPELA Engine, so I would expect all three of these cameras to have the same results in this color fidelity test. The VB600B and EB600 would be at a lower price point than the VB600 tested.

A member raised an interesting point about the role of the IR cut filter in color fidelity. IR light is known to warp color rendition and is the reason why most professional cameras block IR light during the day (see our day/night camera tutorial). However, the quality and/or range of IR light blocked may differ across manufacturers/models, impacting color fidelity.

This member said he added on an IR cut filter on a camera having color fidelity issues and that significantly helped reduce the problem. Of course, buying and installing such a filter has issues of its own, including cost, form factor restrictions and issues in low light but it's an interesting example.

One interesting test IPVM should perform is an integrated IR camera without a cut filter (cheaper kit or consumer cameras) to see how much its color fidelity differs from the professional d/n cameras we tested here.

It's difficult to actually take anything away from this. I mean, you tested two $800-900 range cameras, two $500 cameras, and a $300 camera. While I can appreciate the Dahua being included as a point of reference, if it's a shootout I feel it should be added as a side note: 'By the way, the $280 Dahua came in here in the shootout.' But to say that an Axis M1114 should be compared with a Sony VB600 is kind of odd... Of course the Bosch and Sony cameras are going to excel in this shootout.

Well, the $280 Dahua 'beat' ~$550 Axis and Avigilon cameras in color fidelity (despite being half the price). That says something to me.

We do the tests because we don't naively assume that just because something is more expensive, it is better in every way. Under your logic, the Dahua should have been worse than the Axis and Avigilon cameras, right?

In any event, what would you like to see tested? How would you do this?

When we do our own tests, we try and keep the feature sets alike- not so much the price point. I agree that price does not indicate quality. However, the M1114? Who uses this camera, or would agree it belongs next to the Sony VB600? Q1604, sure. I feel there should be 2-3 tests in varying degrees of feature sets and price points. It doesn't take a terrible amount of time to conduct each test and document the differences. It does take time though... Lastly, I would use your buying power to purchase more Arecont, Indigo Vision, Panasonic, even Verint cameras and give them a shake at some of these tests. Our distributors often let us purchase x amount of cameras for testing with the return allowance that they are not damaged. As long as you can plan your tests, you should be able to buy, test, and return in under 30 days.

One value of testing a high end Axis vs low/mid level Axis is to understand the tradeoffs on color fidelity. As such, I am happy we did that even if you found no value in it.

We have Arecont and Panasonic cameras in house as well as many others. As I have explained to earlier commenters, it is a tradeoff of time in testing more cameras in an individual shootout vs doing different types of tests. Over time, we will test other combinations of cameras.

I understand. By the time you get the tests done though, many of the cameras may be EOL.

We're continuously testing new cameras so I am not concerned.

I have no idea regarding which cameras or brands will be included in the test. Why don't you make a poll ahead of your test and see the statistics? Thanks

Who do you want to include?

We can try a poll. Ultimately, though, we have years of daily reader statistics that show what people are interested in. The reason we don't test the companies we don't cover is that past coverage / mentions resulted in low traffic.

As an example, I would prefer if the Panasonic camera was included in the test instead of, let us say, Dahua; others may like Arecont. I think the poll will make most of us happy. Thanks

The poll is NOT going to make most of you happy. The only thing that might make more people happy is testing more cameras at the same time. However, there is a tradeoff involved. Right now, we test between 4 to 6 cameras per shootout. If we increased to 8 - 10, we would cover more cameras, which would make some happier, but then be able to do fewer tests, which would decrease other members' happiness.

We have lots of reader statistics and it's clear that people overall prefer more tests with fewer manufacturers than the opposite.

As for Panasonic and Arecont, we do test their cameras periodically and will do so in the future again. Dahua is a hot up-and-comer - a key reason we are testing them right now. Panasonic needs to improve because overall they have fallen behind other 'big brands' like Axis, Sony and Bosch (see our test of Panasonic's SV-509).

I am happy to discuss this further but the color fidelity test report is an inappropriate forum to do so. Please start a new discussion.
