Readers Respond: 134 Votes, 25 Comments Examined

Published Feb 15, 2010
PUBLIC - This article does not require an IPVM subscription. Feel free to share.

Over 100 people voted and more than 25 left detailed comments in response to our recent question/discussion. In this post, we review and respond to the comments offered.

134 people responded to our survey on "How Critical Should We Be of Video Surveillance Products?". Here are our results:

Non-manufacturers

For the 104 non-manufacturers responding, such as integrators and end-users, the breakdown was:

  • 68%: Just Right
  • 9%: Too Critical
  • 23%: Not Critical Enough

Manufacturers

For the 30 manufacturers responding, the breakdown was:

  • 60%: Just Right
  • 7%: Too Critical
  • 33%: Not Critical Enough

We are glad to see that readers value critical coverage and are pleasantly surprised by the strong number of manufacturers who want us to be even more critical (given that their products are the subject of such criticism).

Reader Comments and Our Response

Below are 25 reader comments and our feedback. We found them to be very interesting and thought-provoking. Thank you!

Reader comment: "Highlight what works, not just what does not. Start accepting sponsored ads (j/k!)"

Our response: At the beginning of our test summaries, we usually lead with 1 or 2 key positives. In our conclusions, we note ideal/best fits. In the middle, a significant portion of each report is critiques (things that either do not work or work poorly under certain conditions or for users with certain skill sets). We always link to the product's marketing material/datasheets for those who want the 'pitch'. That being said, we will keep this in mind. If there are specific ways to highlight what works, please share.

As for accepting ads, not taking them makes it easier for manufacturers to trust us. Most media/analysts are constantly hitting up manufacturers for money, which creates a questionable relationship.

Reader comment: "Be clearer about the methodologies and criteria used.  How does your testing/analysis relate to the real world applications of the products?  For what types of customers and environments? Supporting what use cases?  Etc."

Our response: We have guidelines but have not shared them broadly enough, nor have we refined them as much as we should. See our existing guidelines: IP camera guidelines, exposure setting guidelines and VMS evaluation criteria. We will revise these guidelines in the next few weeks and provide more opportunity for feedback.

Reader comment: "Deeper analysis of bandwidth & CPU performance in client PCs, particularly with multiple H.264 streams & scenes with movement and/or marginal lighting conditions"

Our response: We think performance on client PCs is an important point that we have not covered. We think such testing will be especially valuable for use of multiple megapixel cameras. We plan to add this in the next few months.

Every camera test includes marginal lighting conditions (with new tests including measured tests at 0.3 and 1 lux as well as outdoor street lighting).

Reader comment: "More evaluations of video analytics: think about typical applications like perimeter crossing or loitering in sensitive areas. Ease of set-up & false alarm rate."

Our response: We only have 1 video analytics test so far (on VideoIQ). This week, we are starting a new test on video analytics and plan to increase this coverage over the next few months. We will likely do fewer tests of video analytics than of VMSes or cameras, as market demand is much higher for the latter.

Reader comment: "Less emphasis on the camera's HTML user interface, bearing in mind that the general idea is to integrate most if not all camera-specific parameters into a commercial VMS interface, or in any case only adjust camera parameters during the initial set-up phase."

Our response: We spend significant time on the web interface because many of the high-end features of cameras are only available for configuration there.

Reader comment: "John's nirvana seems to be consumer-ready products at dirt cheap prices.  I'm a security system integrator, and if that happens I'm out of a job. The reality is that this stuff is complex, the market is complex, and installation and configuration should be left to professionals.  While I enjoy the product reviews, I think ease of use should be measured with a yardstick, not a whip."

Our response: I think that's a good point. While I don't literally dream of 'consumer-ready products at dirt cheap prices', I believe these are two key values for users. As such, we focus on them. At the same time, we spend extensive time in both screencasts and reports explaining complexities to help integrators.

Reader comment: "Benchmarking and comparison of multiple products and suppliers is what the industry needs.  I would look forward to seeing more benchmark comparisons in the VCA (video content analysis) field.  All too many suppliers make their own statements of performance without critical review."

Our response: We think benchmarking is valuable but want to have the detailed research to support comparisons that provide novel insights rather than simply rehash marketing specifications. As we finish testing products in a given category, we will release benchmark comparisons.

Reader comment: "We need full disclosure of all system features, functions and benefits (and issues) - all measured on the same yardstick."

Our response: See our existing guidelines: IP camera guidelines, exposure setting guidelines and VMS evaluation criteria. We agree that disclosure is important. If there are specific features or functions you want disclosure on, please suggest them.

Reader comment: "Maybe more cost justification on given products. Cost based on features that address very limited market expectations can often be a deal breaker.  ROI on a single component, not just on the 'system', seems to be the building block of value to an end user."

Our response: We agree. We did a business case analysis on megapixel cameras last year and have a general examination of video surveillance ROI. Those noted, we could definitely do more of these. Any suggestions on specifics would be welcome.

Reader comment: "Megapixel vs HD race update.  More products have been introduced since the last article about a year ago.  Thanks for a great site."

Our response: We have a new megapixel comparison report being released next Monday that will provide a major update. Agreed that it is time.

Reader comment: "Include actual clips of the video under the test conditions."

Our response: We do provide actual clips of the video for every camera test. For an example, see our premium service preview, which includes downloads from one MP camera.

Reader comment: "1. I'd like to learn more about analog cameras, e.g., WD cameras, speed dome cameras... 2. Could you compare the performance of different ICs (ASIC, SoC, DSP) in IP cameras and DVRs?"

Our response: We do very little testing on analog cameras (one test on EzWatch's low-end kit, just to get a baseline of very inexpensive analog). As IP is the growth market, our focus is analyzing those products - both advantages and disadvantages (like the crazy debate on ease of installation).

Reader comment: "I am relatively new to the site as a subscriber of two months.  I was attracted to this site because of positive and negative information in the reviews and if IP Video Market stops offering honest criticism to become a clearing house of product commercials then I would not have a need to renew my subscription.  I would like to see the reviews arrive at more of a conclusion such as a rating system (similar to Consumer Reports).  While it is true that a rating system risks angering manufacturers that suffer from poor ratings, if the rating criterion is clear, then they will know what they need to do to improve their ratings.  Perhaps a process that allows manufacturers a way to respond to criticism would add to the fairness and keep companies with poor reviews involved.  If a manufacturer decides that they do not want to participate in reviews because of bad results instead of responding to criticism and correcting problems then that says a considerable amount about their company."

Our response: I do not like rating systems for a number of reasons: (1) ratings obscure differences in value for different applications (a product might rate a 9/10 for a community bank but only a 2/10 for use on a military base); (2) doing ratings presumes testers have outstanding comparative testing and understanding, a high bar that most testers do not reach, instead simply guessing subjectively; (3) worse, ratings benefit manufacturers most because they allow them to use reviews in marketing promotions. Often such ratings have hidden funding from manufacturers (especially, and not surprisingly, the top-rated ones). That being said, I am definitely open to discussing ways of better summarizing or comparing products.

Reader comment: "I would like to see a summary comparison where the results (positive and negative) are tabulated according to the features like consumer reports."

Our response: See our response to the rating system comment above. The same concerns apply to feature-by-feature tabulations, though we are open to better ways of summarizing results.

Reader comment: "Criticism on products - including 'tricks' that skew claims of fact or specs - is awesome!  Keep everyone honest and make comparisons more fair and just. Criticism on marketing - which is more general - sometimes goes overboard.  Marketing is not always evil or misleading as it is usually positioned.  Some would argue it is necessary to fuel growth, or at a minimum, to survive."

Our response: I have grown somewhat more tolerant of generic marketing claims. What concerns me are marketing claims that significantly over-reach or misrepresent (see our debate on 'riskiest companies'). I think some of these more extreme cases of over-marketing hurt everyone including other manufacturers who either need to respond in kind or lose ground in public opinion. In these cases, we think our critiques can be beneficial for the overall market.

Reader comment: "Ruthlessly expose the marketing claims of manufacturers and the lack of uniform standards in critical performance metrics such as lux ratings for cameras."

Our response: We try to be relentless but not ruthless. We want to be fair to all sides and mainly push back when we believe a manufacturer is hiding performance problems (though this is a minority of cases).

Reader comment: "Would like to see more standardized tests.  Especially of resolution, at, say, 3 different light levels.  Both IP and analog cameras are usually well short of their advertised levels in my experience. Also, total file sizes of a standard moving and fixed image would be helpful, because, for example, H.264, JPEG, etc. have many possible implementation subsets and compression efficiencies."

Our response: We do our camera tests at 0.3 and 1 lux as well as an outdoor intersection scene to provide measured references.

Reader comment: "I believe that you show a bias against IP-based products in favor of analog manufacturers.  The tone and quantity of your critical articles about IP products seem overly vehement."

Our response: A site called "IP Video Market Info" that spends 99% of its coverage on IP is probably not biased against IP video. The IP manufacturers obviously say good things about IP. They also spend significant money buying or writing articles in trade magazines, which then also say good things. We see our site as an independent balance that pushes critical debates on IP's flaws. We think this fosters a stronger and more mature market. If we were vehemently against IP, we'd go pull coax cable.

Reader comment: "I am an end user of CCTV products and one of our critical needs is to know and understand all the operational functionality, shortfalls and upsides of a given product/process. And the viewpoint for this is most often from the Control Room operator - what does this new gizmo give him/her; what advantage does it bring into the Control Room and how is his/her job going to be made easier? I look forward to your reviews with keen interest and find them refreshing, useful and thought provoking, so keep it up!"

Our response: Good point. We have done a poor job covering the basics as most of our focus has been on researching newer products and trends. We think your suggestion is valuable and will remember to provide more coverage on the fundamentals.

Reader comment: "We'd like to see tests for Avigilon, Lumenera, Iqeye PRO, Luxriot, Bosch, SeeTec. Also, it would be great if Ipvideomarket started posting comparisons between IP cameras with 'for' and 'against' in one table."

Our response: We are speaking with, or already testing with, most of the vendors on the list. For most, it's simply a matter of time. The Luxriot test should be released next week.

Reader comment: "More group tests please. Tests like cameras from different manufacturers against each other showing simple pros and cons. Same for software, storage and the rest."

Our response: Group tests are valuable, but we want to have the detailed research to support comparisons that provide novel insights rather than simply rehash marketing specifications. As we finish testing products in a given category, we will release group test results.

Reader comment: "Maybe organise (paginate) each review for consistent layout and points covered, for easier quick product-to-product comparisons of your reviews, e.g. http://techreport.com/articles.x/17402 (see bottom for pagination; this layout is often used for IT product reviews)"

Our response: For cameras and VMSes we do have standard organizations, though we lay out the complete review on a single page (12-page reviews are often used to maximize advertising revenue, a concern we do not have). For cameras, it's usually: (1) overview, (2) pricing, (3) physical, (4) configuration/optimization, (5) image quality, (6) recommendations.

Reader comment: "1. Examining the company and its product within the targeted marketplace to understand overall positioning and chances for success.  I think you do this REALLY well, John, and your insights show your many years of experience. 2. Real-world testing.  This is an area that's SO difficult to cover, especially in, for example, the area of analytics testing to determine reliable and replicable metrics around false positive mitigation AND positive detection in multiple scenarios.  For example, I believe perimeter detection/verification is a complete application in itself: there are specific capabilities (and economics) required there that may not be applicable in a generic Public Safety application. Anyway, I think it requires more resources (people, locations, time) than you may have - and am wondering if you should hook up with some independent company that may have more infrastructure."

Our response: We could certainly work with other companies to do tests. Ultimately, though, something must cover the costs. If we partnered with another company, we would have to either generate more subscriptions, charge higher prices or have the other organization fund it (in exchange for what?). We are a growing, profitable business and plan to continue expanding the depth and sophistication of our research and testing. We think this is the best way to remain independent and true to the needs of users and integrators of products.

Reader comment: "I value your research but wish the subscription model was more affordable. I feel like we end up being unpaid beta testers for every new product out there. The mfr says it is ready for deployment and we end up pointing out all the shortcomings during the install - on our time."

Our response: We agree that a lot of integrators wind up being unpaid beta testers. This happens to us frequently. Our results help integrators and users avoid bad decisions, which can easily save far more than the cost of our subscription (about the price of a cube camera for a year). We'd be happy to charge whatever rates provide sufficient revenue to cover costs. However, independent research dedicated to finding flaws needs to be paid for by users and integrators. The service could be made cheaper by getting manufacturers to fund it, but I can assure you that the value of the service would collapse as manufacturers pushed for more favorable treatment. That being said, we are happy to entertain other ideas on pricing that do not revolve around manufacturer funding. Also, see our subscription preview to see the value the service provides.

Conclusion