Subscriber Discussion

VMS Benchmarking - Quantitative Test Metrics For Key Feature Validation

KD
Krishnaiah Dayyala
Jun 29, 2013

The challenge is that VMS vendors all claim their product is best in class. How can a customer validate that? To that end, we are in the process of VMS benchmarking. I am looking for key features, their testing methods, test metrics (numeric values), and test tools. I am also looking for camera stream simulation software for stress testing. Requesting the necessary inputs from the IPVM team.
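
To give an idea of the kind of stress harness we have in mind, a rough sketch would simply loop one recorded clip out as N simulated RTSP cameras. This is only an illustration: it assumes ffmpeg is installed and an RTSP server such as MediaMTX is already listening on port 8554, and the clip name and camera count are placeholders.

    # stress_streams.py - spawn N looping RTSP "cameras" from one recorded clip.
    # Assumes ffmpeg is on PATH and an RTSP server (e.g. MediaMTX) accepts
    # publishing at rtsp://127.0.0.1:8554/. Clip name and count are placeholders.
    import subprocess

    CLIP = "reference_clip.mp4"   # placeholder test clip
    CAMERAS = 50                  # scale up toward 1000 in steps

    procs = []
    for n in range(CAMERAS):
        cmd = [
            "ffmpeg", "-re", "-stream_loop", "-1", "-i", CLIP,
            "-c", "copy", "-f", "rtsp",
            f"rtsp://127.0.0.1:8554/cam{n:04d}",
        ]
        procs.append(subprocess.Popen(cmd, stdout=subprocess.DEVNULL,
                                      stderr=subprocess.DEVNULL))

    print(f"Publishing {CAMERAS} simulated camera streams; Ctrl+C to stop.")
    try:
        for p in procs:
            p.wait()
    except KeyboardInterrupt:
        for p in procs:
            p.terminate()

The VMS under test would then be pointed at those URLs as generic RTSP cameras.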

JH
John Honovich
Jun 29, 2013
IPVM

Krishnaiah, let's break this into 2 discussions. I've started a new discussion on IP camera stream simulation software.

As for benchmarking, what features are you thinking of benchmarking?

KD
Krishnaiah Dayyala
Jul 04, 2013

Hi John,

Sorry for the delayed response. We would like to test the enterprise VMS for the following features, with quantitative confirmation:

  1. Live video view/stream: response time/latency for a given server configuration. What is the current gold standard in the market, considering various resolution, frame rate, and bandwidth settings?
  2. Recording throughput (a rough measurement sketch follows this list)
  3. UI experience: key metrics for evaluating and benchmarking it against the best of breed
  4. Application security
  5. Dynamic bandwidth optimization based on network availability
  6. Storage optimization: clear metrics vs. best-in-class enterprise VMSes
  7. Stress testing of the VMS with at least 1,000 cameras loading simultaneously
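
For item 2, the simplest vendor-neutral check we could think of is to sample how fast the VMS storage directory grows while all cameras are recording. A rough sketch only: the storage path and sampling interval below are placeholders, and it assumes the VMS writes recordings to an ordinary file system.

    # record_throughput.py - sample growth of the VMS storage folder over time.
    # STORAGE_DIR is a placeholder; point it at the VMS recording location.
    import os, time

    STORAGE_DIR = r"D:\VMS\Recordings"   # placeholder path
    INTERVAL_S = 60

    def tree_bytes(root):
        total = 0
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                try:
                    total += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass  # file may be rotated or deleted mid-walk
        return total

    prev = tree_bytes(STORAGE_DIR)
    while True:
        time.sleep(INTERVAL_S)
        cur = tree_bytes(STORAGE_DIR)
        # delta can dip if the VMS prunes old footage during the interval
        mbps = (cur - prev) * 8 / 1_000_000 / INTERVAL_S
        print(f"sustained recording throughput: {mbps:.1f} Mbit/s")
        prev = cur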

Requesting your inputs.

Thanks, Krish Dayyala

SW
Sarit Williams
Jun 29, 2013

Krishnaiah,

First, great question. I myself wanted to create a matrix of some kind measuring VMS features and assign value indicators to each, to have some kind of gauging mechanism to compare them and thus attempt to remove biased opinions.

The problem is that benchmarking, by definition, assumes there are standards in place against which we could compare and contrast the VMS in question. There are no such standards I am aware of, though IPVM has a general post on how to rate VMSes.

Moreover, VMSes are chock full of features, and vendors implement those features in different ways. So if we were to take a simple feature such as layouts, for example:

VMS A offers layouts ranging from 2 to 36 cameras per view

VMS B offers layouts ranging from 2 to 100 cameras per view

What would the ratings be in this case? Should VMS B be rated higher or better? Could an end user meet all of their needs with, say, 36-camera layouts?

Another way to look at it: you may ask, does the VMS we are considering offer a minimum of 18-camera layouts? If so, Option A would work just fine.

Are you in the process of choosing a best-of-breed VMS for internal implementation? If so, it may be good to start with what you currently have and ask questions that will help you define the requirements to look for in a future VMS:

  • Are you upgrading a current system, or will this be a new install?
  • If an analog system is currently in place, you may want to note the pain points you want to avoid in the next system.
  • You may want to look at VMSes that support hybrid installs, so you can still use analog cameras and maybe add some IP cameras.
  • If it's a large organization, enterprise management might be key:
    • Active Directory support
    • Central management of user/server/camera configurations
    • Storage considerations
  • Many more...

CL
Carl Lindgren
Jun 29, 2013

That's exactly what we did for our VMS evaluations. I created a document with a large number of questions/factors and boxes labeled 0-9 for use in our evaluations. A "0" meant the system didn't offer that feature/function and "1" through "9" were awarded based on our judgement of how well the VMS performed that function. Many questions/factors also had a section for measured data like bit rates, latency and time to create clips and blank lines for comments.

The numerical results were put into a spreadsheet and averaged across all reviewers for each manufacturer to obtain a single score. The top three manufacturers moved on to Phase 2, where the evaluation questions were expanded and refined, with factors weighted for importance. Two out of three were invited to continue to the RFP process. In their cases, each system had its pluses and minuses: while one system was better in some areas, the other shone in other areas, each effectively cancelling out the other. Our purchase decision was based on our estimation of the relative importance of each factor, additional features and functions promised during ongoing discussions, experience in the casino vertical, plus similar criteria for the competing integrators.
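
The spreadsheet math behind that was just an average per manufacturer across reviewers, with weights added for Phase 2. A minimal sketch of the roll-up (the criteria, weights, and scores below are invented placeholders, not our actual data):

    # score_rollup.py - average 0-9 scores per manufacturer across reviewers,
    # with per-criterion weights (Phase 2 style). All data below is an
    # invented placeholder, not the actual evaluation results.
    from statistics import mean

    weights = {"live latency": 3.0, "clip export": 2.0, "layouts": 1.0}

    # scores[manufacturer][reviewer][criterion] = 0..9 (0 = feature missing)
    scores = {
        "VMS A": {"reviewer1": {"live latency": 7, "clip export": 5, "layouts": 9},
                  "reviewer2": {"live latency": 6, "clip export": 6, "layouts": 8}},
        "VMS B": {"reviewer1": {"live latency": 8, "clip export": 4, "layouts": 6},
                  "reviewer2": {"live latency": 9, "clip export": 3, "layouts": 7}},
    }

    def weighted_score(per_criterion):
        total_w = sum(weights.values())
        return sum(per_criterion[c] * w for c, w in weights.items()) / total_w

    for vms, reviewers in scores.items():
        final = mean(weighted_score(r) for r in reviewers.values())
        print(f"{vms}: {final:.2f} / 9")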

I think John has a copy of our evaluation criteria document.

KD
Krishnaiah Dayyala
Jul 04, 2013

Thanks, Carl, for sharing your valuable thoughts. This is exactly what we are looking for.

John, could you please share the evaluation criteria document, with guidelines on how the operator should rate each item numerically?

Thanks, Krish Dayyala

SW
Sarit Williams
Jun 29, 2013

I'll see if I can get my hands on that, Carl. Thanks for the details; I do think there is value in that. There is a trade-off, though: it could take months to go through the different VMSes if one is not sure what to look for. I suggest starting with requirements: understand what your user base, environment, and needs are, to find the solution that works best. Also look at scalability and how flexible a VMS is. Your needs 5 months from now will likely change, so be sure the VMS is constantly improving; look at prior releases for indications of bug fixes, features added, and whether it is geared for small or large scale implementation.

KD
Krishnaiah Dayyala
Jul 04, 2013

Thanks, Sarit; I look forward to your inputs.

I like the point you made about scalability. Here we considered: a) how well it is designed architecturally, and b) how well it can be scaled up using additional servers/network resources.

Thanks, Krish Dayyala

CL
Carl Lindgren
Jun 29, 2013

Sarit,

Phase 1 lasted a week for each system while Phase 2 was about a month for each. Total time was about 9 months for both phases.

SW
Sarit Williams
Jun 29, 2013

Thanks. And this is when you started with requirements/needs, so it could be exponentially longer without first looking at what is needed.

I'm curious: would you say load and performance were reviewed as part of this 9-month duration?

Also, did you know which cameras you wanted to use prior to looking at VMSes, or did it come up during the process? Meaning, did camera selection play a huge part in selecting a VMS, or would you be willing to change the cameras previously considered based on the winning VMS (due to an integration/feature limitation; no dewarping for fisheye cameras comes to mind)?

CL
Carl Lindgren
Jun 29, 2013

Yes, we invited a number of manufacturers to submit systems for evaluation based on already-determined factors. We sent them a list of system objectives and requirements. Then I wrote our Phase 1 evaluation form which was tweaked a couple of times during the course of our evaluations.

Phase 2 criteria were based on the Phase 1 criteria, but expanded according to what we learned in Phase 1 and discussions among ourselves regarding the relative importance of each criterion.

We evaluated a couple of encoders during Phase 1 and more during Phase 2. We also evaluated a number of IP cameras during Phase 2.

SW
Sarit Williams
Jun 29, 2013

I commend your due diligence. May I ask which VMS was finally selected? And having done that, is the selected VMS the right one now that it has been implemented? Or are there gotchas that even going through a 9-month process didn't uncover?

CL
Carl Lindgren
Jun 29, 2013

On question 1: IndigoVision.

We'll know the answer to question 2 around the end of the year or early next year ;-)

HC
Hernan Carzalo
Jul 01, 2013
IPVMU Certified

Carl, is Indigo still based on a multicast architecture?

CL
Carl Lindgren
Jul 01, 2013

Hernan,

We'll be using TCP/IP unicast for record and a combination of UDP unicast and UDP multicast for "Live" monitoring. The biggest issue is IV's encoder backplane limitation: 320Mbps total, including a 40Mbps maximum for multicast streams. On a 20-encoder chassis we would've had to either limit the number of encoders per chassis to 6 (2x6@3Mbps = 36Mbps) or limit "Live" multicast for all inputs to 2Mbps.

I absolutely refuse to display our analog cameras at 2Mbps. The picture quality suffers immensely (keeping in mind that all cameras are 30fps).

MP IP cameras will use TCP/IP unicast for record and UDP multicast for live.

CL
Carl Lindgren
Jul 02, 2013

That should be a 20-channel (10x2-channel) chassis. Actually, for multicast viewing we could run 13 channels @ 3.0Mbps (total 39Mbps) or 14 channels at a mixture of 7 @ 3.0Mbps (subtotal 21Mbps) and 7 @ 2.5Mbps (subtotal 17.5Mbps) - total 38.5Mbps. Or 16 channels @ 2.5Mbps (total 40Mbps). Fully-populated encoder chassis must be run at 2.0Mbps/channel to avoid exceeding the 40Mbps multicast limit.

It makes for some interesting contortions to meet our goal of 50% @ 2.5Mbps and 50% @ 3.0Mbps. Additional chassis add to both cost and footprint but would solve the multicast limit problem. Fully populating all chassis (40 = 800 channels) requires grouping the cameras so that some clients view multicast while others view unicast streams. That also brings in limitations on how many cameras can be viewed simultaneously without exceeding the number of available streams or the chassis' bandwidth limits.

It would have been nice if all channels could be viewed at full resolution simultaneously but the only way to do that would be to cut bit rate for live view or cut the number of encoders per chassis. IV servers do not re-distribute streams on their own and their proxy servers (that do) add quite a bit to the cost. That was an issue that came up when discussing Axis P3300-series cameras, which are not capable of delivering two streams @ 30fps simultaneously.
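
The arithmetic behind those mixes is easy to script. A throwaway sketch like the one below lists which channel mixes fit under the multicast limit; only the 40Mbps cap, the 20-channel chassis, and the 2.0/2.5/3.0Mbps rates come from the numbers above, the rest is just enumeration.

    # multicast_budget.py - enumerate channel mixes that fit under the 40 Mbps
    # per-chassis multicast cap discussed above (rates in Mbps).
    from itertools import product

    CAP_MBPS = 40.0
    RATES = (3.0, 2.5, 2.0)
    MAX_CHANNELS = 20  # 10 x 2-channel encoders

    fits = []
    for counts in product(range(MAX_CHANNELS + 1), repeat=len(RATES)):
        channels = sum(counts)
        if channels == 0 or channels > MAX_CHANNELS:
            continue
        total = sum(n * r for n, r in zip(counts, RATES))
        if total <= CAP_MBPS:
            fits.append((channels, total, counts))

    # show the densest mixes first
    for channels, total, counts in sorted(fits, reverse=True)[:5]:
        mix = ", ".join(f"{n} @ {r}Mbps" for n, r in zip(counts, RATES) if n)
        print(f"{channels} channels ({mix}) = {total:.1f} Mbps")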

KD
Krishnaiah Dayyala
Jul 04, 2013

Carl, Sarit,

Very good discussion.

Once we do this exercise, I will share our experiences and final results, at least the key stats that are clearly measurable; for example, how many cameras per CPU core at bandwidth x and fps y?

Thanks, Krish Dayyala

SW
Sarit Williams
Jul 07, 2013

Krish,

We (IPVM) don't have a copy of Carl's evaluation matrix. Also, regarding the specific features you listed: those are good, but they are not as simple as they sound to test; depth will be particularly important here.

Just a couple of examples:

1. Application security: some implement it via LDAP, Active Directory, or built-in user management (and some implementations are more granular than others).

2. Cameras per CPU core at bandwidth x and fps y: be especially cognizant of the cameras used (canned video or a real-time scene?), the network infrastructure, and the VMS's low-level architecture (CPU vs. GPU, any hidden dynamic throttling by the VMS?), because in some cases it may be comparing apples to bananas.
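
One way to keep that comparison honest is to measure CPU the same way on every VMS: ramp the camera count in steps and log host CPU at each step. A minimal sketch, assuming the psutil package is installed; the step counts and settle time are placeholders, and cameras are added manually (or via a stream simulator) between samples.

    # cpu_per_camera.py - log host CPU while the camera count is ramped on the VMS.
    # Assumes the 'psutil' package is installed.
    import psutil, time

    STEPS = [10, 25, 50, 100]   # placeholder camera counts
    SETTLE_S = 120              # let recording stabilise before sampling

    for cameras in STEPS:
        input(f"Add cameras until {cameras} are recording, then press Enter...")
        time.sleep(SETTLE_S)
        cpu = psutil.cpu_percent(interval=30)      # averaged over 30 s
        cores_used = cpu / 100 * psutil.cpu_count()
        print(f"{cameras} cameras: {cpu:.1f}% CPU "
              f"(~{cameras / max(cores_used, 0.01):.1f} cameras per core)")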

CL
Carl Lindgren
Jul 07, 2013

Sarit,

I did send John a copy of our Phase 2 evaluation form. Let's see if it fits here:

No, it didn't show properly. If anyone wants to give me their email address, I'll send them a copy. PM me on LinkedIn.
