IndigoVision Rates Avigilon Worst In Class

Oh IndigoVision....

IndigoVision ran their own shootout vs Avigilon, Axis and Bosch.


IndigoVision won!

Obviously, a manufacturer run shootout is ridiculous.

For those who care, IndigoVision rated:

  • Bosch a 7/10
  • Axis a 5/10
  • Avigilon a 4/10

But wait! You can vote for who the best was and win a trip to Rio de Janeiro (well, unless you pick anyone but IndigoVision; in that case, don't pack your bags).

Also funny is this IndigoVision claim:

"IndigoVision is also known for having the most open, end-to-end video security solution in the world. Every major camera manufacturer can be seamlessly integrated and managed with IndigoVision’s Control Center video management software."

Ironically, IndigoVision's VMS does not have Profile-based ONVIF support, only the legacy, archived conformance (last submitted in 2011), which is much more likely not to work.

IndigoVision just issued a press release announcing Profile S support for its VMS. Finally.

Now, it can do a VMS shootout and rate Avigilon's VMS 1.x ONVIF support worst in class.

I view it as a plus. Now if only ONVIF would add Multicast via ONVIF stream to their profile requirements, I'd be happy as a clam.

Well, at least it's a step in the right direction. I'd say many VMSes are going this way. I know of one specific VMS that's changing its entire platform in the near future, and if I'm not mistaken, ONVIF Profile S support is part of this change.

Hi John, any plans for a new test of IndigoVision like the recent ones for Avigilon, Exacq, Genetec, Lenel and Milestone? The last test of IndigoVision, I believe, was in 2010. BTW, they do claim to be positioned as the most open end-to-end solution on the market in their Second-Interim-Report-2014 (available on their site).


No plans right now. Niche company, not a lot of overall interest.

As for being the most open end to end solution, that's just marketing. It's also doublespeak because by framing it as 'most open end to end solution', they are only comparing to other 'solution' providers like Avigilon, Verint or DVTel, not VMSes like Genetec, Milestone, Exacq, etc.

Second place gets a set of steak knives.

  • Bosch a 7/10
  • Axis a 5/10
  • Avigilon a 4/10

Ok, but I couldn't find 'hide nor hair' of IndigoVision's score...

It's the denominator...

Jim, yes, IndigoVision did not give themselves a score, but their self-assessment was unrestrained:

"We think the pictures speak for themselves. Rich, high definition video and image clarity, in all lighting conditions."

As the premier shooter-outer in the industry, can you critique their methodology / analysis? Specifically these questions:

  1. Do you think the camera selection was a fair representation of the class?
  2. Do you think that the scene selection was reasonable (if limited)?
  3. For the images that were provided, was the commentary reasonable?

A couple of ways Indigo could win without lying:

  1. Start with twenty cameras, only include the ones Indigo beats.
  2. Start with twenty scenes, only include the ones that look better for Indigo.

Maybe IPVM could do the same cameras and similar scenes and see how close the results are, a shootout shootdown?

A few thoughts:

  • On camera selection, I am pretty sure Axis would have submitted / recommended their P3384 dome for such a low light / WDR shootout.
  • They put all the cameras at a 1/15s slow shutter. This is weird because they are touting 2K, which is 30fps. You can't deliver a full 30fps at a 1/15s shutter (a quick back-of-the-envelope check is below this list). Of course, they did this because their (and competitors') cameras would have looked really bad at 1/30s, with half the light.
  • The test was done in their office so it's likely they have been optimizing their settings for this environment, either explicitly for this test or just in general as that is their workplace.
  • It would have been interesting to see a bandwidth analysis, given their aggressive bandwidth savings claims (of course, properly normalized at the same quantization level).
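
To put rough numbers on the shutter point, here is my own back-of-the-envelope illustration (not from IndigoVision's materials): the exposure time per frame caps the achievable frame rate, so a 1/15s shutter can't yield more than 15 distinct frames per second.

```python
# Hypothetical illustration: maximum frame rate achievable for a given exposure time.
# A camera cannot start a new frame until the previous exposure has finished.
def max_fps(exposure_seconds: float) -> float:
    return 1.0 / exposure_seconds

print(max_fps(1 / 15))  # 15.0 -> a 1/15s shutter supports at most ~15fps
print(max_fps(1 / 30))  # 30.0 -> a 1/30s shutter is needed for a true 30fps
```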

As for testing IndigoVision's camera, there's not a lot of interest in them.

I give them credit from a marketing perspective - proclaiming that you are better than 3 of the biggest players in the industry is a way to draw attention, if not credibility.

Hi John,

Yes, it's always suspect when a manufacturer conducts their own tests and makes them public. Of course, as you have said before, these are usually not made public.

However, most vendors' reps have fancy slideshows and videos when presenting new products where they compare them to other known brands. When brand A comes to see you, their cameras are clearly better than brand B's. Next week brand B comes to see you and, lo and behold, their cameras are better than brand A's.

The best way to gauge the real winner is over a sustained period of time running multiple tests and not just low-light tests. At the end of the day the customer will appreciate low-light performance but there are many other factors that need to be taken into consideration.

I myself often request equipment from vendors who are not part of our portfolio simply to conduct in-depth tests. If the vendor that's not part of our portfolio turns out to be better in a given instance, I document this and take it back to our vendors. I also inform the other manufacturer and give them the results.

In the event they are significantly better across a number of categories and could potentially fill or replace a spot in our portfolio, I take it upwards.

Ultra 2K camera?

It's their marketing term for 1080p. Here is the camera datasheet.

I guess they want to make people think / associate it with 4K - "Hey you know 4K? Well our camera is sort of half of that...."

Also, IndigoVision has a classically inane 'up to' claim:

"IndigoVision’s SMART.core technology gives highly optimised H.264 compression and ACF+, reducing storage costs by up to 95%."

Although I can't agree or disagree with their "tests", not having other cameras for comparison, I will say that their 12000 camera does provide excellent picture quality. We have been testing a beta version for a number of weeks. That said, I would love to see how it compares with its competition when tested by an unbiased tester.

I wouldn't be surprised if it's a solid camera. It's brand new, compared to the Axis and Avigilon models, which are around 2 years old, so that means better imagers and chips are available.

It would be like Axis testing one of their brand new models vs one of IndigoVision's older domes. IV would get smoked.

For what it's worth, IV cameras can provide at least eight Unicast streams at 30fps plus a Multicast stream. Although that capability is probably not needed for many applications, I really don't see any other cameras with that capability.

"It would be like Axis testing one of their brand new models vs one of IndigoVision's older domes. IV would get smoked."

Seems a bit unsporting if I do say so myself; why not get the Axis domes smoked too?

Jim, the IV dome gets smoked, it doesn't start out that way. :)

C'mon now, what'all gave me away? ;)

Sorry, wasn't trying to be a jerk, I swear I didn't even notice it says undisclosed... :(

For anyone who has yet to vote in the 'competition', remember in order to win you will need to submit to IndigoVision whatever your choice is.

Although how the winner is chosen is not stated, my intuition tells me that the best camera to choose might be IndigoVision, if for no other reason than it will make the required 'publicity agreement' a little less awkward, e.g. ("Although I actually voted for Avigilon, I have always respected IV...").

Also note that the airfare is listed as one person and "one return trip", which is sure to spark energetic dialog about IV with one's significant other...

Update: It gets worse; the winner may be announced after the date by which the trip must be taken: "return trip... which must be taken before 31/12/2015..."

The winner will be announced by 31/12/2015...

Maybe if you are feeling lucky you can just go to Rio, and then get reimbursed after you win...

I think you should enter, Rukmini. Pack your bags and arrange transportation to the airport now, since you won't know if you've won until you get to Rio...

It's amazing the quantity of crazy there is in this industry...

I would really like it if the other manufacturers replied with tests using the same cameras to see what the results look like.

Ulrich, that's awesome! Could you imagine if each manufacturer published their own shootout results :)

Btw, this does happen privately / internally and often gets presented by RSMs in their sales decks. However, what's different is that IV has turned it into a marketing campaign and a contest.

Indeed, some open warfare would make for great sideline viewing though.

Has IndigoVision VMS been tested by IPVM? I cannot find it in any comparisons. Out of our 50-plus offices with a VMS installed, we have about five Indigo installations that the local management chose. I think the sales team wooed them with their compression claims, "10 year future proofing", extreme hardware life claims, open camera architecture, etc.

The company appears to have semi-slow growth and only approx. $52 million in revenue (2013). Given our spend of roughly $10+ million annually, I would be hard pressed to be a customer that is 20% of their annual revenue.

Are there any reasons to seriously look at Indigo? Anybody a strong supporter or have experience with the product for large scale enterprise use?

B, we tested IndigoVision's VMS a number of years ago. It was solid back then.

Their revenue growth was slow and they also had a management crisis a few years ago (e.g., IndigoVision's CEO Resigns / Removed).

IndigoVision has good technology, they've just been very pricy historically and designed their system more for the UK market (heavy on encoders, late on multi-megapixel).

Carl (above) uses them for a fairly large scale casino operation and IV does have quite a number of enterprise users.


We are using IndigoVision. We chose them after a series of comprehensive evaluations of a number of VMSes that specialize in the casino surveillance vertical, including Avigilon, Genetec, Pelco Endura, Dallmeier and Geutebruck.

Our tests of VMSes included ease of use, ability to accommodate third party devices, clip creation time and capabilities, User Management, flexibility of the GUI, encoder latency, video quality and efficiency, ability to integrate with the eConnect POS interface and capability of additional features like motion search, bookmarks, Monitor Wall control, etc. We also interviewed all of the manufacturers to determine how dedicated they were to customer support and how open they were to suggestions to improve their systems.

The first stage of our evaluations lasted approximately a week per system. Three systems (IndigoVision, Dallmeier and Geutebruck) were chosen to move on to a second, month-long set of more comprehensive tests. IndigoVision was the final winner and installation was completed in December.

Carl, can you tell us about 3rd party camera integration?


They can integrate with third party cameras via their ONVIF streams or via a separate software called IP Camera Gateway that accesses the cameras' proprietary streams directly. We chose not to buy the latter as it has to be installed on separate server(s).

Although IV Control Center only has the legacy ONVIF 1.x conformance right now, it is my understanding that the next upgrade release will be ONVIF Profile S conformant, which should enhance support for newer devices.

Through our limited third party camera tests, we've found that we can't utilize every ONVIF-compliant camera on the market at this time, but we can use a number of them. The major impediment is our choice not to purchase their IP Camera Gateway software or their Proxy Server software (which redistributes a single stream to multiple endpoints). Without either of those, we require that each third party device be able to distribute two ONVIF streams: one TCP/IP Unicast for recording and one Multicast for viewing by multiple simultaneous users.
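
To make concrete what "two ONVIF streams" means in practice, here is a rough sketch of my own (using the community python-onvif / zeep library, with placeholder address and credentials) that asks a device for both a unicast and a multicast stream URI. Whether the camera actually serves the multicast variant is up to the device, which is exactly the limitation described above.

```python
from onvif import ONVIFCamera  # community python-onvif-zeep package (assumed installed)

# Placeholder address / credentials for illustration only
cam = ONVIFCamera('192.168.1.64', 80, 'admin', 'password')
media = cam.create_media_service()
token = media.GetProfiles()[0].token  # use the camera's first media profile

def stream_uri(stream_type: str, protocol: str) -> str:
    """Request a stream URI of the given type ('RTP-Unicast' or 'RTP-Multicast')."""
    req = media.create_type('GetStreamUri')
    req.ProfileToken = token
    req.StreamSetup = {'Stream': stream_type, 'Transport': {'Protocol': protocol}}
    return media.GetStreamUri(req).Uri

unicast_uri = stream_uri('RTP-Unicast', 'RTSP')    # e.g. for the recorder
multicast_uri = stream_uri('RTP-Multicast', 'UDP')  # e.g. for multiple live viewers
print(unicast_uri, multicast_uri)
```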

While that doesn't make IV Control Center totally open at this time, it is far more open than many other casino-oriented systems like Honeywell, Dallmeier and Synectics.

I have not experienced their compression technology first-hand, but I was involved in a test where latency was measured with IndigoVision cameras on IndigoVision's VMS and with 3rd party cameras on both IV and Milestone.

The results from IV to IV were better than 3rd party to IV. Latency to Milestone was fairly equal over 3 brands (including IV).

This was quite a while ago and the tests were conducted in a virtualised VBlock environment.

@Ulrich ... thanks for your suggestion re: copyright footer. We are doing a major upgrade and I will make sure your suggestion is included.


IndigoVision switched to H.264 a few years ago. We didn't test their IP cameras for latency during our evaluations but their encoders tested at 140ms to 150ms, which is among the lowest latencies we found. PTZ latency is very acceptable - we have installed 13 of their 11000 HD PTZs and have no problems controlling them.

Hi Carl,

Yes, IV to IV is quite low, latency-wise.


We did recently test a Bosch, a Sony and a JVC PTZ and didn't notice any appreciable latency increase over IV's own PTZs, although control of the Bosch and Sony PTZs was a little "flaky". By that, I mean the PTZ continued to move for perhaps 1/2 second after the IV joystick was released. The JVC PTZ did not have the same issue.

We will repeat our testing once IV has "upgraded" to Profile S. The consensus is that ONVIF incompatibility was a likely culprit. The Sony PTZ had other issues - primarily an inability to provide Multicast via its ONVIF stream. Sony claims they will be adding that capability this summer.


Sure, it could be an ONVIF-related issue or simply something with the camera(s). The latency I experienced was not that dramatic either, simply that IV to IV was slightly better, while to Milestone the results were somewhat less disparate. Again, this was tested in a virtualised environment rather than directly, so that may also have an impact on the results, depending on how well the VMSes run in virtualised environments.

PS: The results over the virtualised environment varied from 180ms to 450ms depending on camera and VMS.

I can concur on Sony (it was a 6th generation static camera, shortly after initial release) and its results were the slowest.


That jibes with our tests. IndigoVision and Dallmeier had the lowest latency, while Avigilon and Pelco had the highest at ~500ms and ~350ms respectively. That was feeding analog cameras into their encoders.

Hi Carl,

Of course, tests vary depending on tests conducted and infrastructure used.

On a side note: We tested JVC about 2 years ago and while the quality was good there was some jitter. The test only involved RTSP through VLC so it was not tested through any VMS.

What I liked about JVC as a geek was their very logical and in-depth camera menu system.


Unfortunately, the scuttlebutt I've heard is that JVC Kenwood is considering exiting the video surveillance business. Despite being a fair-size electronics manufacturer (2013 Net Sales of $3.3B USD and Net Income of $12.2M USD), they are a very small factor in this business.

Well, I'm biased! It appears to me that the IndigoVision 2K has the best image.

With that said, I have no idea if the image was taken directly out of the camera or if additional processing was done to make it look good. Or any other tweaks on the camera.

But I'm hoping to win the trip!

Hmm... why is Avigilon's FOV wider in the low light image?

It probably shouldn't be; someone just forgot to crop and zoom it, that's all. ;)

Have a look, Alex is right: the worst-in-class Avigilon clearly gets the worst FOV (biggest), and Indigo clearly gets the best (smallest).

In every direction Avigilon shows the same or more scene:

More bricks, more trees, more ground, more (or at least same) car, more cowbell.

Whether there is some legit explanation or not, why would Indigo even want to look like they're cheating?

@John, with your new all-in-one calculator, the rumor is that one will be able to upload custom scenes, like these ones. Would one then be able to back into the actual focal length used for each vari-focal, roughly? With a bit of trial and error, no doubt.

"Whether there is some legit explanation or or not, why would Indigo want to even look like they're cheating?"

It's a marketing stunt. I don't think they care or think that far ahead.

As for the calculator, it would be able to automagically deduce which image has a wider FoV. The plan is that the user inputs the AoV / HFoV of the image and we use that to display / manipulate it.
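
For what it's worth, if you know (or assume) the sensor width, backing a focal length out of a horizontal FoV is just the standard pinhole relation f = w / (2 * tan(HFoV / 2)). A quick sketch of my own, using an assumed 1/3" sensor width since none of the actual sensor specs are given here:

```python
import math

def focal_length_mm(hfov_deg: float, sensor_width_mm: float) -> float:
    """Pinhole-model focal length for a given horizontal field of view and sensor width."""
    return sensor_width_mm / (2 * math.tan(math.radians(hfov_deg) / 2))

# Assumed example: a 1/3" sensor is roughly 4.8 mm wide; a ~70 degree HFoV
# would correspond to roughly a 3.4 mm focal length.
print(round(focal_length_mm(70, 4.8), 1))
```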