How to Specify a Preferred IP Camera

Published Aug 12, 2011

Often users find a camera they trust and prefer to specify that camera rather than risk an RFP respondent low-bidding a 'junk' camera that leaves the end user stuck. This issue was raised in our WDR megapixel shootout, where one vendor's performance was clearly superior to the others. How do you ensure you get the highest quality without simply requiring a specific model number? In this note, we provide recommendations on best practices for a fair, thorough, unbiased RFP.

Only Specify Where Needed

First, the specification should only require the specific functionality where it is needed - not for all cameras. We see this frequently: a customer finds a camera they like and calls for it at every location. This is overkill and can wind up costing a lot more money (e.g., the Sony WDR camera (CH-140) costs a few hundred dollars more than the non-WDR version (CH-120)). If you are going to deploy 100 cameras, only a fraction will really need WDR. Don't overspec.
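
To make the cost impact concrete, here is a rough illustration (the $300 premium and the 20-camera count are assumptions for the example, not quoted prices):

  • 100 cameras × $300 WDR premium = $30,000 if WDR is specified everywhere
  • 20 cameras × $300 WDR premium = $6,000 if WDR is specified only where needed
  • Difference: roughly $24,000 saved by specifying selectively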

Difficulties in Specifying Performance Rather than Hardware

Spec'ing WDR without calling out a specific product is difficult because any manufacturer can claim WDR, and there is no simple way to check or disprove the claim. Specifying WDR is much harder than specifying hardware elements such as resolution, lens F-stop, day/night support, etc., because WDR is not based on a single physical feature. To overcome this inherent risk, people often specify the individual implementation of WDR for the camera they prefer (e.g., if you want Sony, you say the camera must support View-DR and XDNR - Sony's specific implementations / marketing terms). However, this is effectively the same as hard spec'ing - and worse, because it obscures that fact.

Require Acceptance Testing

Require a specific acceptance testing procedure. A vendor-neutral way of forcing an RFP responder to deliver a quality solution is holding them to a clear acceptance process. For instance, you can specify that the system will not be accepted and final payment will not be made until the integrator passes WDR tests where the camera delivers a clear image of facial details in direct sunlight (e.g., like the scenes we used). This forces the integrator to choose a camera whose WDR really works or risk significant penalties and an unfinished project.

Hard Specifying Based on Evidence

The main problem with hard specification is that it is often based on a cursory review of the market or on which salesperson the RFP writer likes best (sometimes this is mundane, sometimes unethical). The resulting product selection then tends to cost far more than it needs to, underperform relative to available options or, worse, both.

End User Bake Offs

Some larger end users conduct sophisticated bake offs where cameras are brought on site and tested in the planned field locations. If an organization is willing and able to do so, this can be a good way to find the 'best' solution. However, some bake offs are poorly run, including letting vendors 'trick out' their equipment to beat the test or letting vendors choose which tests are run. Both are very bad ideas, as they will invariably result in vendors rigging the test.

Independent Test Results

If there are independent, comprehensive test results, those can be very useful and efficient ways of justifying a hard specification (or a short list of approved products). This is done in some industries but has not historically been the case in video surveillance. Our tests certainly are independent and are becoming much more comprehensive and sophisticated (especially with our expansion to large group shootouts). This is something to consider, especially as we further grow our shootout program to cover more camera models.

Conclusion

Here are the big 4 recommendations on specifying a preferred camera:

  • Only specify in locations where you really need the camera's performance.
  • Hold integrators to clear acceptance tests for key performance metrics (WDR, low light, etc.).
  • If you have the time and money, conduct rigorous bake offs that you control completely.
  • Consider independent testing as a justification source.