VMS Testing Plan 2013/2014
By John Honovich, Published Jul 23, 2013, 12:00am EDT
We are embarking on a major new VMS testing program in which we will perform in-depth tests of features and products at a level never done anywhere before. To lead that effort, Sarit Williams has joined IPVM. She previously led QA for a VMS developer. Now, she will focus solely on testing and researching VMSes.
This note explains our testing approach and timetable for the next year.
Previously on IPVM
Our last major round of VMS testing finished in 2010 - 2011, producing more than a dozen VMS test reports and a VMS competitive comparison.
Step 1 - Feature Comparisons / Shootouts
This time around, we are starting with feature comparisons / shootouts, where we look at specific VMS areas and break them down. For instance, we have already done multi-streaming and exporting, taking 6 VMSes and examining the low-level details of how they implement these features.
The purpose of this is to deliver fundamental research. No one has ever publicly compared and contrasted in depth how VMSes work on a feature-by-feature basis. For instance, people talk about exporting, but what does it really consist of? What makes it better or worse? What options exist? This initial research develops a foundation for understanding these specifics.
We anticipate about a dozen similar feature comparisons / shootouts. Update November 2013: the following are already done:
- Live Monitoring
- Search / Investigation
- Camera Discovery / Addition
- Camera Configuration
- Alert Management
- Remote Monitoring Performance
- Health Monitoring
- VMS Load Testing
Ultimately, we plan to test up to 30 VMSes. However, for the feature comparisons, we are limiting each to 6 to keep them manageable. (The more VMSes we test, the longer each report takes and the fewer we can release.) We believe 6 is enough to show patterns while keeping the tests manageable to execute.
There are 4 we consider a priority, based simply on usage and member interest:
- Genetec and Milestone (the 'Old Guard')
- Avigilon and Exacq (the 'Up and Comers')
This does not mean they are the 'best'. Indeed, many aspects are surprisingly lacking. However, it does cover companies that have broad appeal (whether it is to use or to beat them).
Beyond that, we will rotate 2 others in (so far we have done Network Optix and VideoInsight but we might also use OnSSI Ocularis and a few others over time).
Step 2 - Individual VMS Tests
After we complete the initial round of feature tests - likely some time in November - we will switch to individual VMS tests, where each report focuses on a single manufacturer. We will use the feature tests as a baseline for comparison (e.g., when we test Luxriot, we can say, "Luxriot's exporting is better/worse than competitors because it has/lacks specific elements X, Y, and Z previously examined in the exporting comparison").
There are clearly dozens of VMSes worth testing, and we plan to do so - far more than the 6 we cover in the feature tests.
Step 3 - First Half 2014
Before ISC West, we plan to release an all-new IPVM VMS Competitive Comparison that breaks down, feature by feature, how VMSes match up, letting specifiers and users know where the unstated weaknesses are and which VMSes perform better than their marketing / sales hype suggests.
Obviously, that is just the beginning. After Step 3, we will continue to expand, with more advanced testing as well as testing of new VMSes. This, though, should give an overview of our general approach for the next year. Let us know what questions or comments you have.