Camera Color Fidelity Shootout

Author: Figen Bekisli, Published on Sep 04, 2013

The surveillance industry is obsessed with more and more pixels. But color fidelity - accurately rendering yellow as yellow or blue as blue - can be critical for identifying individuals or objects.

We decided to test cameras from 5 manufacturers - Axis, Avigilon, Bosch, Dahua and Sony - to see how faithfully they could reproduce color.

The Test

We used a ColorChecker (aka Macbeth chart) and shot all cameras side by side, simultaneously. Here's a preview of the results with the models blacked out.

This was a fairly simple, low light indoor scene, but notice how much color fidelity still varied across the cameras.
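
For readers who want to put a number on these differences, below is a minimal sketch (not how this test was scored) of comparing patch colors sampled from a camera image against ColorChecker reference values using the CIE Delta E metric. The reference values shown are approximate and the measured values are hypothetical; in practice, use the chart vendor's published data.

import numpy as np

def srgb_to_lab(rgb):
    """Convert an sRGB triplet (0-255) to CIELAB (D65 white point)."""
    c = np.asarray(rgb, dtype=float) / 255.0
    # sRGB -> linear RGB
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    # linear RGB -> XYZ (sRGB primaries, D65 white point)
    m = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = m @ lin
    # XYZ -> CIELAB
    t = xyz / np.array([0.95047, 1.0, 1.08883])
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

def delta_e76(measured_rgb, reference_rgb):
    """Delta E 1976: Euclidean distance in CIELAB. Roughly 2-3 is a just-noticeable difference."""
    return float(np.linalg.norm(srgb_to_lab(measured_rgb) - srgb_to_lab(reference_rgb)))

# Approximate sRGB reference values for two ColorChecker patches (illustrative only).
reference = {"red": (175, 54, 60), "blue": (56, 61, 150)}
# Hypothetical values sampled from one camera's image of the chart.
measured = {"red": (160, 70, 65), "blue": (70, 66, 140)}

for patch, ref in reference.items():
    print(patch, round(delta_e76(measured[patch], ref), 1))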

Scenes

We also tested in a variety of conditions, including different forms of outdoor street lighting, to see how LED, mercury vapor, and sodium lights impacted performance.
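
As a rough illustration of why some street lighting is so hard to correct, here is a sketch of a simple gray-world automatic white balance (an assumption for illustration, not how any of the tested cameras work). Under near-monochromatic lighting such as high pressure sodium, one channel records almost nothing, and no per-channel gain can restore color differences the sensor never captured.

import numpy as np

def gray_world_awb(image):
    """image: float array (H, W, 3) in [0, 1]; returns a white-balanced copy."""
    means = image.reshape(-1, 3).mean(axis=0)        # per-channel scene averages
    gains = means.mean() / np.maximum(means, 1e-6)   # scale each channel toward gray
    return np.clip(image * gains, 0.0, 1.0)

# Hypothetical 2x2 "scene" lit by high pressure sodium: strong red/green, almost no blue.
sodium_lit = np.array([[[0.80, 0.55, 0.02], [0.70, 0.50, 0.02]],
                       [[0.75, 0.52, 0.02], [0.65, 0.48, 0.02]]])
print(gray_world_awb(sodium_lit).round(2))           # channels rescale, but lost blue detail cannot return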

We concluded with real world examples of a subject wearing different colored shirts.

Here's one example of a major manufacturer whose camera made a range of colors look basically the same:

Contrast that with another major manufacturer that handled the same scene, at the same time, dramatically better:

Who Won?

There were clear winners and losers here. Who came out on top? Was it Avigilon - who claims to deliver the best image quality? Axis - who touts its conformance with HDTV standards? Dahua - the up and coming Chinese company? Or did the old guard - Bosch or Sony - pull it out? Answers and detailed analysis inside.

[Member-only report body redacted. The full report covers key findings, recommendations, white balance settings, indoor and outdoor scene results (including LED, mercury vapor and sodium street lighting), WDR on vs. off, and test parameters.]
Comments (42)

This is awesome, great job! The only thing we could have asked for was more cameras to be tested but we know you can't test them all!

We are going to look to mix this into standard camera testing, as we now have a benchmark to compare against and an understanding of the scenes in which problems are most likely to arise.

Good to hear. This is a good factor that is so difficult to explain from a data sheet.

Amazing test! I had also noticed quite a few differences in color fidelity, even amongst a single brand, which already made me wonder how the brands compare to each other.

It just shows how big the differences are across the wide range of cameras on the market, and that it's one of the things you really need to test when you have a demo camera in your hand. You can't conclude that, just because it's the same brand, you will probably get the same results.

PS: Looks like someone got a new wardrobe as well :)

I knew that the Sony would perform well as I've been doing a lot of Sony lately and I'm quite familiar with its capability of producing very-very close to spot-on colour.

It's totally the opposite of their commercial products like CyberShot and HandyCam, which I also sold for over 5 years. There, the colour is always very "punched up": red is very red, etc. And because of this, colour bleeding becomes very apparent.

Love this review. It would need a different colour chart, without the black borders between patches, to get an idea of colour bleeding, especially between colours like red and orange, as well as purple and blue.

Great article and information... I'm hoping you'll expand this type of testing with other brands.

Thank you for taking on a great topic for evaluation. In meeting with customers it is sometimes difficult to communicate this in an effective manner, as they do not have the time or ability to see their cameras under these different lighting conditions. I have found that most often, they have just accepted that their cameras will not represent colors well, and that under high pressure sodium, metal halide, and fluorescents "it is what it is". What your service has been able to show effectively is that there are better products for specific desired features.

This article in particular was very thorough, well presented, and lacked any bias. As a scientifically trained professional myself, I say great job, Figen Bekisli.

Robert, thanks. I am curious how this article 'lacked any bias'. Do you believe other IPVM camera tests are biased? Specifically?

Perhaps the bias-challenged article's tone is due to the tabula rasa of Mr. Bekisli...

Ah, correction, that would be Ms. Bekisli... Perhaps the comically challenged Mr. Dearing would be better off contemplating the source of his own bias next time...

In any event, sincere apologies to Ms. Bekisli...

Ha, I'm sure she will appreciate the correction. Figen joined us at the end of July and this is her first report.

To be clear, all IPVM articles are tested, researched and written under a standard approach. We do not hire new people and let them do their own thing.

Secondly, I object to the 'bias' reference because such claims almost always circle around a specific report either being favorable or unfavorable to the brands a commenter is selling.

If people are going to make allegations/allusions of bias, be specific or say nothing.

John, my comment was meant as a compliment to Figen. The report was very well done and presented in a very clear manner with little subjective input. The colors were either represented well under the different conditions, or they were not. In other studies there can be more subjective measurements, where the researcher's (presenter's) past experiences can influence the results. I have not found this often in the IPVM service; however, I have found it difficult to eliminate in any study. It doesn't matter whether you look at studies on global warming, the economy, or baseball statistics. The exact same data can be presented in different manners based on the researcher's past or belief system of what is important.

This report had a clear reference point of measurement and was able to present the results in a clear, objective manner. Good report.

Sorry for the delay in replying.

Robert, the bias I more frequently see is that people will praise us when our tests reach positive conclusions for products the commenter sells but criticize us when they do not.

By and large I agree with what has been said so far. Very precise and detailed information in these findings.

It would, however, be interesting to compare the MSRP pricing for each of these cameras. I believe being able to present this information to existing and/or potential end users could provide a lot of value, allowing them to see how they might be able to get the most "bang for their buck".

Any thoughts?

Great test John, thanks.

Totally agree with you, Russ Porter.

It is a "the more you pay, the better you get" scenario.

Also, I am surprised that Pelco Sarix cams are not in the shootout.

And Panasonic was not included. And Mobotix was not included. And Hikvision was not included. And Arecont, etc.

Please don't be surprised. There will always be 'major' manufacturers in any given test that will not be covered. It's just not feasible for us to cover all of them, every time.

I guess this kind of relates to the other post, “Why Are There So Many Damn Manufacturers?”

We would love to see a similar study using other brands of cameras and comparing them with these brands; for example, Vivotek, Geovision, D-MAX, Panasonic, Samsung and Hikvision.

We'd love to be able to test every brand. We are going to test more cameras and, as we do, include color fidelity with them. However, this is going to take time, as we have to balance testing a variety of products and technologies.

A general comment on people requesting other brands, etc. As a rule, when I do camera testing, I always check color with the MacBeth chart, at least indoors to see if anything is majorly off. I'll expand that into other lighting settings.

Thanks everyone, also, the readership is way more excited about this test than I expected!

John,

You guys are doing a great job, no doubt about it.

I also agree with your comments about not being able to cover them all, but there would be much better clarity for everyone if we tried to compare "apples to apples" in such a technical shootout.

Here's the problem. If we test 10 manufacturers instead of 5, it will take at least 50% longer, and probably close to double the amount of time. But we would need to get twice the number of member sign ups, twice the number of renewals, etc.

Indeed, this report has very weak readership, ~30% less than an average test.

I think this is great research and I am happy that a small number of you really like it, but we need to be smart about where we invest our resources. Right now, members overall are voting with their time and money that this is not an area to spend a lot more resources on.

Hey there John,

What medium did you use to capture the images and was the media player a faithful representation of what the camera could do?

Hi Darren, we used multiple methods for the test. We viewed them in VLC media player, in the cameras' web interfaces, and in Exacq. We viewed them on multiple monitors, in case one was calibrated differently, and three of us looked at these images.

The biggest differences were between different monitors, but even those were relatively minor. Major problems, such as not being able to differentiate between dark or bright colors, were no better or worse depending on monitor.
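
(As an illustrative aside, here is a rough sketch of how one could sample patch colors directly from each camera's stream with OpenCV, removing the media player and monitor from the comparison entirely. This is not how the test images were captured; the stream URLs and patch coordinates below are placeholders.)

import cv2

cameras = {
    "camera_a": "rtsp://192.168.1.10/stream1",   # placeholder stream URLs
    "camera_b": "rtsp://192.168.1.11/stream1",
}
patch_roi = (100, 200, 40, 40)                   # x, y, width, height of one chart patch

for name, url in cameras.items():
    cap = cv2.VideoCapture(url)
    ok, frame = cap.read()                       # frame is BGR, uint8
    cap.release()
    if not ok:
        print(name, "no frame")
        continue
    x, y, w, h = patch_roi
    patch = frame[y:y + h, x:x + w]
    b, g, r = patch.reshape(-1, 3).mean(axis=0)  # average BGR over the patch region
    print(name, "average RGB:", round(r), round(g), round(b))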

Great job, guys... It's good to see cameras under real world lighting situations. I would also like to see the MSRP of the tested cameras.

John,

Thanks for doing this test; it's always great to see the quality differences. Just a quick question: would the Axis P-3367-E have a similar result to the Q1604?

Dustin, we really can't say. The P3367 uses a different imager than the Q1604, or even the rest of the P33 domes. As we saw in this test, different series perform quite differently, like the Q1604 vs. the M1114.

We didn't test all of them formally, but we've seen major differences between other Axis cameras, as well. If you take this image from our test of the M30 series, for example, you can see that all three of the M30 domes are different, too, based on the color of the back wall and skin tone.

I've added online pricing to the cameras at the end of the post. They price out as follows:

Ethan - Sony G6V series cameras SNC-VB600, SNC-VB600B and now the new SNC-EB600 in the G6E series all use the same sensor and IPELA Engine, so I would expect all three of these cameras to have the same results in this color fidelity test. The VB600B and EB600 would be at a lower price point than the VB600 tested.

A member raised an interesting point about the role of the IR cut filter in color fidelity. IR light is known to warp color rendition and is the reason why most professional cameras block IR light during the day (see our day/night camera tutorial). However, the quality and/or range of IR light blocked may differ across manufacturers/models, impacting color fidelity.

This member said he added an IR cut filter to a camera having color fidelity issues, and that it significantly helped reduce the problem. Of course, buying and installing such a filter has issues of its own, including cost, form factor restrictions and issues in low light, but it's an interesting example.

One interesting test that IPVM should perform is on an integrated IR camera without a cut filter (cheaper kit or consumer cameras), to see how much its color fidelity differs from the professional d/n cameras we tested here.

It's difficult to actually take anything away from this. I mean, you tested two $800-900 range cameras, two $500 cameras, and a $300 camera. While I can appreciate the Dahua being included as a point of reference, if it's a shootout I feel it should be added as a side note: 'By the way, the $280 Dahua came in here in the shootout.' But to say that an Axis M1114 should be compared with a Sony VB600 is kind of odd... Of course the Bosch and Sony cameras are going to excel in this shootout.

Well, the $280 Dahua 'beat' ~$550 Axis and Avigilon cameras in color fidelity (despite being half the price). That says something to me.

We do the tests because we don't naively assume that just because something is more expensive, it is better in every way. Under your logic, the Dahua should have been worse than the Axis and Avigilon cameras, right?

In any event, what would you like to see tested? How would you do this?

When we do our own tests, we try and keep the feature sets alike - not so much the price point. I agree that price does not indicate quality. However, the M1114? Who uses this camera, or would agree it belongs next to the Sony VB600? Q1604, sure. I feel there should be 2-3 tests across varying feature sets and price points. It doesn't take a terrible amount of time to conduct each test and document the differences. It does take time, though... Lastly, I would use your buying power to purchase more Arecont, IndigoVision, Panasonic, even Verint cameras and give them a shake at some of these tests. Our distributors often let us purchase x number of cameras for testing, with an allowance to return them as long as they are not damaged. As long as you can plan your tests, you should be able to buy, test, and return in under 30 days.

One value of testing a high end Axis vs low/mid level Axis is to understand the tradeoffs on color fidelity. As such, I am happy we did that even if you found no value in it.

We have Arecont and Panasonic cameras in house as well as many others. As I have explained to earlier commenters, it is a tradeoff of time in testing more cameras in an individual shootout vs doing different types of tests. Over time, we will test other combinations of cameras.

I understand. By the time you get the tests done though, many of the cameras may be EOL.

We're continuously testing new cameras so I am not concerned.

I have an idea regarding which cameras or brands should be included in the test. Why don't you run a poll ahead of your test and see the statistics? Thanks

Who do you want to include?

We can try a poll. Ultimately, though, we have years of daily reader statistics that show what people are interested in. The reason we don't test the companies we don't cover is that past coverage / mentions resulted in low traffic.

As an example, I would prefer if a Panasonic camera were included in the test instead of, let us say, Dahua; others may like Arecont. I think the poll would make most of us happy. Thanks

The poll is NOT going to make most of you happy. The only thing that might make more people happy is testing more cameras at the same time. However, there is a tradeoff involved. Right now, we test between 4 and 6 cameras per shootout. If we increased to 8 - 10, we would cover more cameras, which would make some happier, but we would then be able to do fewer tests, which would decrease other members' happiness.

We have lots of reader statistics, and it's clear that people overall prefer more tests with fewer manufacturers rather than the opposite.

As for Panasonic and Arecont, we do test their cameras periodically and will do so in the future again. Dahua is a hot up and comer - a key reason we are testing them right now. Panasonic needs to improve because overall they have fallen behind other 'big brands' like Axis, Sony and Bosch (see our test of Panasonic's SV-509).

I am happy to discuss this further but the color fidelity test report is an inappropriate forum to do so. Please start a new discussion.
