Most software packages have no idea they are running on a virtual machine. If a VMS vendor says they do not support virtualization, it is likely because they have not tested it. It is important to have the right hardware in place, such as a processor that supports 64-bit VMs. Another issue you will run into is graphics processing: if the application offloads work to a graphics processor, you will need a compatible graphics card in the host and something like RemoteFX to handle the pass-through.
I don’t think you will have any problems running your VMS in a virtual environment, provided it ships as a standard Windows installer and is not something that only comes on the vendor's own hardware.
It is very commonplace with large systems. You can purchase one large server (host) and run multiple VMs on it, distributing the VMS load across them. It really cuts down on hardware and increases efficiency in hardware utilization.
If the VMS developer does not officially support running on VMs, the prudent step is not to do it. The reason is that any time in the future you run into an issue (whether search is slow, video is jerky, a camera cannot connect, etc.), they are likely to blame it on virtualization, right or wrong. Then you will have a bigger headache.
There are many VMSes that support virtualization, but they typically list specific platforms and versions rather than taking a generic "support anything" approach.
VMS developers, feel free to share a specific comment about what VMs you run on, including any technical issues to be aware of. Bonus if you can link to an official page/URL explaining your support.
The answer depends a lot on both the virtualization technology as well as the VMS and configuration. IT departments love virtual servers because pooled hardware resources are a lot easier to manage and can be more efficient against their budgets. However if they're unfamiliar with VMS software/systems they may be unprepared for the nature of the workload and their cluster could struggle with things like streaming, encoding/decoding, etc.
Be sure to point out the lack of support and reasons why. Then ask for a pilot on the virtualized hardware with a solid plan B to fall back to real hardware if the virtualized solution doesn't work.
I can speak to Genetec Security Center fully supporting installation on VMware or Hyper-V, but one catch is that there is an additional one-time license fee they require in order to support the system if you call in with an issue. I'm curious if any other manufacturers do something similar...
IPVMU Certified | 06/03/14 10:25pm
We use Video Insight on virtual servers for many customers and have never had a problem with functionality or support. You just need to educate the customer IT manager on the high level of resources that may be needed for an IP video installation so they aren't surprised when they have to start allocating more processing power and storage to your virtual server.
Thank you all for your great comments! Very helpful to me and my customer.
Disclosure: written by an employee of Vicon, a VMS manufacturer.
From the manufacturer's point of view, the issue we typically deal with is not the actual running on a virtual machine, to which (as said above) the application is indifferent.
We see three main concerns:
- Most VMs built by IT departments are sized for typical database/web server workloads, and tend to fall below the spec we require to handle the massive throughput video demands
- Because of no. 1, the number of VMs you can carve out of a physical server is lower for video-intensive applications than for others (which increases the cost per VM)
- The shared nature of VM resources (drives and network cards) is not friendly to I/O-intensive video systems
What we do is offer guidelines for those who consider a VM a must:
- Emphasize that the VM must be built according to at least the standard minimum requirements
- Specifically ask that the recording drives be dedicated ones, not shared with other VMs (so I/O is maximized)
- Ask for a physical NIC in place of a virtual one so all bandwidth is available
In many cases where a VM is considered a must, these guidelines put everyone on a level playing field and eliminate technical issues.
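As a rough illustration of why throughput drives these guidelines, here is a back-of-the-envelope sizing sketch. The camera count, bitrate, and retention period are hypothetical placeholders, not figures from any vendor spec:

```python
# Rough sizing sketch for a video-recording VM (all figures hypothetical).
cameras = 100
bitrate_mbps = 4.0        # average per-camera stream bitrate, Mbit/s
retention_days = 30

aggregate_mbps = cameras * bitrate_mbps             # sustained network + disk write load
write_mb_per_s = aggregate_mbps / 8                 # Mbit/s -> MB/s
storage_tb = write_mb_per_s * 86_400 * retention_days / 1_000_000  # MB -> TB (decimal)

print(f"Sustained load: {aggregate_mbps:.0f} Mbit/s ({write_mb_per_s:.0f} MB/s to disk)")
print(f"Storage for {retention_days} days of retention: {storage_tb:.1f} TB")
```

Even this modest 100-camera example produces a continuous write load well beyond a typical database or web server VM, which is why dedicated drives and a physical NIC matter.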
I am running a complete VMware environment dedicated to our camera platform. I am running all virtual servers with a direct-attached 130 TB iSCSI array. No big problems so far. It runs surprisingly well considering the recorder VMs have 6 GB of memory and write directly to disk rather than archiving.
No problems with playback, and I have 3 VMs sitting on standby as well as config files saved off for each production VM. If needed, just turn on a spare VM and copy the config over, all set. Each recorder runs about 50 cameras and sits at about 3.5 GB of memory in use.
Still trying to figure out the best way to carve out the LUNs, though. I gave each recorder its own LUN rather than one large shared LUN, so that in case of corruption I wouldn't lose a bunch of video. Not sure if there is really a best way to handle storage, but it is working well so far with close to 300 cameras.
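Using the approximate figures from this deployment (300 cameras, about 50 per recorder, a 130 TB array), the LUN-per-recorder split works out roughly as follows; this is just the arithmetic behind the layout described above, not a recommendation:

```python
# Per-recorder LUN sizing using the rough figures from this deployment.
total_tb = 130             # total iSCSI array capacity, TB
cameras_total = 300        # approximate camera count
cameras_per_recorder = 50  # cameras handled by each recorder VM

recorders = -(-cameras_total // cameras_per_recorder)  # ceiling division
lun_tb = total_tb / recorders
print(f"{recorders} recorder VMs, ~{lun_tb:.1f} TB per dedicated LUN")
```

The trade-off is that a corrupted LUN only takes down one recorder's worth of video, at the cost of less flexible capacity sharing between recorders.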
IPVMU Certified | 06/08/14 11:30pm
We work with a "VMware Ready" solution: VMs with iSCSI storage, tested in quite large scenarios (2.5K channels, 1K channels, etc.).
The quick answer is yes, it works. I have worked with Pivot3 in Europe and we've installed VMSes like Milestone and Genetec on virtual machines, both Citrix and VMware. There is no major issue installing them on virtual servers; virtualization is just sharing hardware resources between logical OSes, and if the IT department is involved, they will know how to manage it. Give them the full specs (CPU, RAM, HDD, bandwidth for cameras) and it should be OK.
Genetec will add a license fee, but for Milestone, nothing was asked. For other VMSes, you would have to ask.
Hope this helps.
IPVMU Certified | 06/09/14 02:21pm
For other technical references from a VMS manufacturer: Milestone wrote two papers on the subject, a 2009 post and a 2011 post.
The interesting thing is to contrast the older one with the newer one, as VM software and system hardware evolved in the time between the two papers.
We deployed a large multi-state, five-city project using ONSSI RC-E software on virtual machines. The client provided Flexpod and VMware machines. It was very successful, with a few bumps in the road.
The advantages are numerous.
VM vs. pizza boxes: a VM is dynamic. If you need more CPU, you allocate it; if you need more storage, you allocate it. Rack-mount machines only have the CPUs you bought, unless you physically change the chipset.
With VM you can allocate any of the resources in any fashion as you need them. Slice it and dice it any way you need.
Hot standby is a big advantage and the cost is minimal as opposed to a hot standby physical machine.
One of the big problems was disk availability on boot-up and disk latency. If the machine loses its disks, you have paperweights. With the proper configuration this is not an issue.
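One way to configure around the boot-ordering problem is to gate the recorder on a storage health check so it never starts with its disks missing. A minimal sketch, where the mount point and free-space threshold are hypothetical; it is demonstrated against `/` (always a mount point) so it runs anywhere:

```python
import os
import shutil

def volume_ready(path, min_free_gb=50):
    """True once the recording volume is mounted and has free headroom."""
    if not os.path.ismount(path):
        return False
    free_gb = shutil.disk_usage(path).free / 1e9
    return free_gb >= min_free_gb

# Demonstrated against "/" so the sketch runs anywhere; a real deployment
# would check the dedicated video volume instead (e.g. a /mnt/video01
# placeholder path), with a sensible free-space threshold.
print(volume_ready("/", min_free_gb=0))
```

In production, a check like this could run as a pre-start step for the recorder service (for example, a systemd ExecStartPre), retrying until the iSCSI volume appears rather than letting the recorder come up against missing disks.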
ONSSI was very helpful throughout the process, they even studied the setup for reference.
One big thing is planning: you have to plan with the hardware providers, the software manufacturer, and the client. PLAN, PLAN, PLAN!
"In preparing for battle I have always found that plans are useless, but planning is indispensable."
Dwight D. Eisenhower