Criteria for Evaluating VMS Systems

Published May 25, 2011 00:00 AM
PUBLIC - This article does not require an IPVM subscription. Feel free to share.

For our VMS Competitive Comparison, we identified 9 key differentiators to contrast the capabilities of Video Management Systems. In this update, we explain how we ranked each differentiator.

Inside the competitive comparison itself, we provide a 23-minute webcast that explains each one. This update provides a written summary of the criteria.

Levels

We assigned one of 6 levels to each differentiator:

  • None: System does not support the capability
  • Weak: System supports it but it is of very limited use
  • Moderately Weak: System supports it but with major constraints
  • Moderate: Basic functionality or support, but not outstanding compared to competitors
  • Moderately Strong: High degree of functionality, but lacking compared to a few competitors
  • Strong: At the highest level of functionality compared to competitors

Metrics for Each Differentiator

We used the following metrics to judge each differentiator:

Pricing

The strongest rating was provided to the systems with the lowest up-front license costs (including base, per-channel and add-on license charges). Secondary consideration was given to maintenance charges. No consideration was given to value, i.e., features per price. While value is important, it is relative to the features and functionalities one needs. We recommend users compare price against the features they need to determine the value for their own application.

Simplicity

The strongest rating was provided to systems that (a) made commonly used functions easy to access (such as selecting cameras to view, adding a camera, exporting video, etc.), (b) had one or a minimal number of client applications and (c) had 'intelligent' defaults or wizards that simplified setup and reduced the technical knowledge required to start running a system.

3rd Party IP Camera Support

The strongest rating was provided to systems that supported the largest total number of 3rd party camera manufacturers. Support for 'standards' was not a factor, as industry support is still too limited (we will likely increase the weight on standards later in 2011 as support grows). Support for the largest camera manufacturers was not a practical differentiator, as every VMS that supported many 3rd party cameras already supported the largest manufacturers (e.g., Axis).

3rd Party System Support

The strongest rating was provided to systems that supported the greatest number of access control systems, PSIM systems, PoS systems and video keyboards/matrixes. Additional weight was given to VMS systems that supported large providers (e.g., supporting Lenel is more valuable than supporting almost any other access control system).

Alarm/Event Management

The rating was based on whether a system had alarm management integrated inside its VMS client software. Additional strength was credited to systems that allowed prioritizing, forwarding and acknowledging alerts and reviewing response procedures for alerts.

Analytics

The strongest rating was provided to systems that supported the broadest range of 3rd party video analytic systems. Weight was deducted from systems that, in our view, supported 'motion detection plus' offerings rather than true 'intelligent video'.

Enterprise Management

The strongest rating was provided to systems offering centralized account management, configuration and live monitoring across multiple servers. Specifically, strong systems eliminate the need to manually update passwords on multiple servers and allow batch updates to the configurations of multiple cameras.

Redundancy

The strongest rating was provided to systems that supported redundancy for their (A) management/enterprise server and (B) recording of specific video channels. No weight was provided to supporting RAID/redundant storage as all VMS systems can support that through COTS hardware.

Video Distribution

The strongest rating was provided to systems that allowed video to be shared or pushed from one client workstation to another over a network. We gave no weight to supporting multiple monitors from a single PC, as this is very common across systems.

Accuracy of Levels

We believe that most 'ratings' are accurate within 2 levels. For example, we may rate something 'weak' while another informed user might rate the same performance 'moderately weak'.
 
While we have tested these features, we may certainly be slightly off on some details (though we are confident that we are not significantly off - e.g., rating something 'strong' when it should really be 'moderately weak'). Secondly, the exact placement from one level to the next is subject to some interpretation (e.g., is a system really 'moderately strong' or 'strong'?).

Discussion

Please use the comments to ask questions or provide suggestions on the criteria selected. 

Please do not comment on specific manufacturers' performance in this thread. I will transfer such comments to the comparison thread, as that is the appropriate forum for such discussion.

The Comparison

Read the VMS Competitive Comparison now.