Cloud Storage For 50 Cameras

I am looking for a cloud video storage solution/company to store 6 months' worth of archived video from an exacq NVR. The customer requires that the company has been in business for a minimum of 7 years and replicates the data across several locations. All 1MP cameras, for approximately 90TB of data, with a daily transfer of about 500GB over a 100MB+ uplink.
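For scale, here is a quick back-of-the-envelope check (assuming the "100MB+" uplink means a 100 Mbps circuit and decimal units - both assumptions, not stated in the post). The daily transfer alone ties up the link for most of a working day:

```python
# Can a 100 Mbps uplink move 500 GB/day? (Assumes "100MB+" = 100 Mbps.)
daily_gb = 500
uplink_mbps = 100

megabits_to_move = daily_gb * 8 * 1000   # GB -> megabits, decimal units
seconds = megabits_to_move / uplink_mbps
hours = seconds / 3600
print(f"{hours:.1f} hours/day to move {daily_gb} GB")  # ~11.1 hours
```

So roughly 11 hours of saturated uplink per day, before any protocol overhead or other traffic on the circuit.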

Any suggestions would be greatly appreciated...


Hi Joe,

I have done this for myself (as a test) using Hikvision and Axis cameras, but I have not had any clients with a fast enough connection to warrant a trial of the service.

Companies like http://www.camcloud.com/ are doing it with Axis, but these companies will probably not meet your criteria of 7 years.

Others on this forum may have the experience you are looking for and I hope someone does. If you have the time, try playing around with AWS and the cloud settings on various cameras.

Best of luck!

Thanks, I have been looking at AWS as an option and am awaiting a call back from them. Everything I have found so far suggests I pretty much have to build and manage a VM from them myself. And it seems they also want to charge for transfer usage, not just the amount stored. I assume they will all want to do the same thing.

That's the catch with these services - nothing is free. They all use AWS or a similar service (Azure, whatever Google has, etc.) and then add their own costs.

AWS gives you a small VM (Linux I think, Server 2012 might be an additional cost) to try for free for a month or two if you don't want to pay for something untested.

Using http://calculator.s3.amazonaws.com/index.html, I did a quick calc for you:

Windows Server 2012 w/ 8 CPUs and 15GB RAM - $550/mo

90TB magnetic storage - $5150/mo

You could always increase storage as it becomes necessary, but you're looking at $5,700 USD/mo for operation costs with AWS. Azure (https://azure.microsoft.com/en-us/pricing/calculator/) looks to cost around $2,000/mo for a similar package.
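Annualizing the figures quoted above (numbers from this thread, not live AWS pricing) puts the scale of the bill in perspective:

```python
# Annualizing the AWS estimate quoted above (thread numbers, not live pricing).
vm_monthly = 550        # Windows Server 2012 instance, 8 CPUs / 15GB RAM
storage_monthly = 5150  # 90 TB magnetic storage

total_monthly = vm_monthly + storage_monthly
print(total_monthly)           # 5700
print(storage_monthly * 12)    # 61800 - the storage line alone is ~$62k/yr
```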

OUCH! Thanks for the info. I think I can rent brick-and-mortar space, purchase hard storage, and utilize AirFiber wireless to move the data around cheaper than that. That's $62k a year! A nice service contract. I'll try to remember to post back what AWS tells me if/when they get in touch.

If I expected to have to retrieve archived video infrequently, I would use Amazon Glacier. It's $0.007 per GB per month, so for 90TB it's about $630.

There are no data import fees.

They charge you $0.50 per 1,000 retrievals.

The catch? Retrievals can take from 3-5 hours from initiation, hence the Glacier.
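The storage math above works out as follows, using the post's $0.007/GB-month rate and 1 TB = 1000 GB (the decimal convention is an assumption):

```python
# Glacier storage cost from the post above: $0.007 per GB-month.
rate_per_gb_month = 0.007
stored_gb = 90 * 1000            # 90 TB, taking 1 TB = 1000 GB

monthly_cost = stored_gb * rate_per_gb_month
print(round(monthly_cost))       # 630
```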

That's pretty interesting. Thanks for sharing.

Google's Nearline Storage (alternative to Glacier) makes the storage available within 3 seconds: https://cloud.google.com/storage/docs/nearline

But with any option you choose, be sure to calculate the full lifecycle costs of writing, deleting, and accessing (not just the $/GB/month).

Might want to investigate the Glacier option a little more:

"How I ended up paying $150 for a single 60GB download from Amazon Glacier"

Interesting article about a guy backing up his cherished music collection through Glacier, only to realize that in all scenarios in which he would actually want to retrieve the data, he would want to retrieve ALL of it, which explains the $150 charge.

This is because he was using Glacier as a backup to files he was using locally.

Nothing could be further from the OP's use case:

Music Backup

  1. Small Storage Requirement
  2. Initial Large Upload
  3. Occasional Additions to Catalog
  4. Redundant Copies of Local Files
  5. If needed, Entire Dataset Would Likely Be Requested.

Video Archive

  1. Large Storage Requirement
  2. Empty Data Store Initially
  4. Regular, Predictable Additions to Catalog
  4. No Local Redundancy
  5. If needed, Small Clips Would Likely Be Requested.

In short, although Glacier may or may not be the best solution, I'm not sure what part of this particular article applies to the OP.

Was there something I missed?

I think you might have glossed over this part of the article:

Glacier data retrievals are priced based on the peak hourly retrieval capacity used within a calendar month. You implicitly and retroactively "provision" this capacity for the entire month by submitting retrieval requests. My single 60GB restore determined my data retrieval capacity, and hence price, for the month of January, with the following logic:

  • 60.8GB retrieved over 4 hours = a peak retrieval rate of 15.2GB per hour
  • 15.2GB/hour at $0.011/GB over the 744 hours in January = $124.40
  • Add 24% VAT for the total of $154.25.
  • Actual data transfer bandwidth is extra.

Had I initiated the retrieval of a 3TB backup this way, the bill would have been $6,138.00 plus tax and AWS data transfer fees.
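The article's peak-hour pricing model can be reproduced directly. The rates here are the article's figures under the old Glacier retrieval pricing (since retired by AWS), not current pricing:

```python
# Old Glacier retrieval pricing: peak GB/hour in the month, billed for
# every hour of that month. Rates are the article's, not current AWS pricing.
def glacier_retrieval_cost(gb_retrieved, hours_taken,
                           rate_per_gb=0.011, hours_in_month=744):
    peak_gb_per_hour = gb_retrieved / hours_taken
    return peak_gb_per_hour * rate_per_gb * hours_in_month

print(round(glacier_retrieval_cost(60.8, 4), 2))   # 124.4 (pre-VAT)
print(round(glacier_retrieval_cost(3000, 4), 2))   # 6138.0 for a 3 TB restore
```

Note the perverse scaling: the cost depends on the peak rate, not the total retrieved, so a fast small restore can cost as much as a slow large one.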

The bill wasn't $150 after he retrieved the full catalog; the bill was $150 when he was expecting it to be less than $1 based on the amount of data retrieved at that point. However, because the full-month price was based on a single peak retrieval, he could pull the entire music catalog without incurring extra fees once he worked out the bugs in the retrieval software. Even then, the bill was still 185 times higher than what he had expected based on the simple pricing table.

The broader point being, before I was going to rely on any solution long-term for something like this, I'd want to test a couple of cycles of backup and restore to verify functionality, time to complete request, price, etc.

"I think you might have glossed over this part of the article."

Glacier data retrievals are priced based on the peak hourly retrieval capacity used within a calendar month.

Again, this does not apply in this use case, as you can retrieve up to 5% of your entire datastore per month FREE. The OP said the data would need to be accessed "probably never", so 5% seems plenty. Point 5.
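For reference, that 5% free retrieval allowance (pro-rated daily under the old Glacier pricing model) works out to a sizable monthly quota on a 90TB archive:

```python
# 5% of average monthly storage retrievable free under old Glacier pricing.
stored_tb = 90
free_monthly_tb = stored_tb * 0.05
print(free_monthly_tb)                  # 4.5 TB/month at no retrieval charge
print(free_monthly_tb * 1000 / 30)      # ~150 GB/day pro-rated (1 TB = 1000 GB)
```

For pulling occasional short clips from a surveillance archive, that quota would be hard to exhaust.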

The broader point being, before I was going to rely on any solution long-term for something like this, I'd want to test a couple of cycles of backup and restore to verify functionality, time to complete request, price, etc.

Sound advice.

Anyway, as far as I'm concerned Glacier was already dead in the water, once the OP's requirement of instant access was established (see below).

The key with cloud storage, beyond the data storage cost and the bandwidth utilization, is the ability to play video back without needing an IT person to get involved in the data recovery, redirect it to local storage, and then open it with the player app. Searching video would be nearly non-existent. It looks like AWS doesn't support the Windows SMB protocol that most NVRs, like the exacq I am trying to work with, need. Perhaps with some custom programming against an API and a few redirects you could get the standard cloud storage environments to work, but from what I am seeing they all require a web-based upload. I need a file system, as in FTP or SMB, in order for the exacq to write to the virtual space, and thus also retrieve from that same space for playback.

It appears that AWS is working on it with NFSv4, but it's currently in limited release: https://aws.amazon.com/blogs/aws/amazon-elastic-file-system-shared-file-storage-for-amazon-ec2/ - and even so, that may require a 3rd-party client app to be installed on the exacq server to mount the storage as a drive.

Here is another calculator that is a little easier to use: https://aws.amazon.com/tco-calculator/

Also, from what I can tell, they charge for the initial upload of data, which would be progressive in the video surveillance world. This calculator is very interesting for comparing on-premise TCO with cloud.

How often do you expect to need access to archived video?

Not exactly sure. Day to day, access would occur daily, but that would be on local storage. Viewing video older than the 30 days, probably never - but it would need to be available near-instantaneously, and in such a manner that I am not having to use the playback app to continuously open individual files looking for events.

Looks like if you use an Amazon Storage Gateway, and if the Exacq NVR supports iSCSI, you may be able to write to Glacier transparently as shown here:

The "backup software" is not anything special, just an iSCSI writer. Retrieval would still be manually requested and possibly delayed for hours, but would restore to an Exacq-accessible drive so you could use the normal tools.

I'm not sure how to get 90TB in the cloud for under $1000-2000/month otherwise.

Azure has support for SMB 3.0, but that is gonna be a pricey storage bill on a monthly basis - probably more than the business case can justify. And any custom programming to try to map stuff over just seems like a good way to rack up service calls. What is their goal, or the concern, that the cloud storage is trying to address?

Also, I should point out that traditionally SMB has had major security issues. While in theory SMB 3.0 should be secure, you or the customer are going to need to keep an eye out for security alerts for SMB - and for patches that can potentially break it. That's gonna need to be figured into the service contract costs.

Looks like SMB 3.0 works with MS Server 2012, with no Windows 7 support. I'll have to test it. The customer wants the redundancy in multiple locations so no tampering can occur, but once the costs come in they will more than likely forgo that.

Are they concerned about employee tampering? Because there are much easier ways to achieve a reasonable level of prevention at the OS level rather than having to do it by moving the data to the cloud.

So the customer is out on this as an option, but all the help has been invaluable. And to keep up with the suggestions: while the Google option appears less expensive, exacq only supports the SMB file transfer method, which again doesn't appear to be supported directly by Google. So I would need another device - and I'm not sure where to look for one - to create this SMB file share.

storagemadeeasy.com claims they can do it, may be wicked expensive.

Interesting, someone else to look at - but anything that says SYNC throws a flag, as I don't want 90 here and 90 there. I just want it there, transferred daily.

This from the whitepaper:

The Storage Made Easy Cloud File Server can be used as either a SaaS service or an onsite Cloud Appliance to address these issues. Firstly, any Clouds or SaaS services added to the File Server have information indexed and recorded, and made available from a single unified explorer view. As users create new content and upload it to any of the Cloud and SaaS services that have been added to the Cloud File Server, each service's API is used to securely capture, store and index the files. Note that no content needs to be copied. All of this is stored and accessed from the original repository.

It sounds like you get a local machine that proxies all the shares. I don't see pricing - that's what scares me. What the CTO might pay for a solution might be more than the CSO would.

So I spoke to this organization, and it appears they have the means to do this with their API and OpenStack Swift. However, their business model is based on multiple users plus whatever cloud storage provider you go with, so they were not too interested in a single user, i.e., the VMS system. Waiting to receive a cost worksheet.

Eagle Eye Networks has the ability to do this; we have customers that are recording 1MP cameras for 90 days, and their system has its own bandwidth management.

Eagle Eye is a VMS in itself, and this request is specifically for just cloud storage to use with the existing VMS (exacq), which is the dominant VMS for this customer.

You are going to have to reach out to datacenter storage facilities for this. Microsoft, Google, and Apple are the more noted names; others such as Equinix, Atos, and Cisco are out there. Do a simple search for data center service providers. Your physical location will be a factor to consider when choosing which service will work best.

I have experience working with Microsoft Azure and found it acceptable. They do have some new features that may be better suited for video storage.

Hi Joe,

There's an app for this purpose in our app store: 6 months' video retention cloud storage for $39.99. Check www.angelcam.com/apps/cloud-recording for more details.

Any RTSP and H.264 cameras/NVRs are currently supported.

[Note: poster is from Duranc]

Hi Joe, you should check out www.duranc.com - they offer custom plans for long-term storage as well.

Some other thoughts.... I know some of this doesn't exactly fit your request, but with the cost of doing something, the customer may be open to it .... "close enough..."

How about looking at your local providers? If you are in an area of fiber connectivity, Time Warner and others offer off site storage solutions. Maybe stream the video live rather than as backup? Install your own Exacq servers in the local site. More exacq servers, handling fewer cameras, and lots of storage.

This sounds like a sophisticated user. What are they doing on their IT side? If they are asking you to do this on the video side, they probably have the need on the regular data side. This is just data, stop treating it as video.

That is a good option and I have looked into it, but the cost of the initial connection is there. Our local providers have fiber "readily" available, but the cost to bring it into the building is a different story. I know what you mean about it just being data, but it has to be in a form that is readily available to the NVR, so that it is sourced from within the NVR as seamless storage - which is what has proven to be a very costly cloud storage option. Most providers use a web interface to get data to the cloud, or synchronized folders. So the storage must be local, have an intermediary device to store and forward, or use some other customized solution.

Joe, are you looking for cloud storage and archives without any local device installation, but that can pull the feeds from the NVR directly into the cloud?

Essentially, yes. Exacq supports SMB 3.0 using Windows Server 2012, and from what I have found, only Microsoft Azure offers it ready to go: https://azure.microsoft.com/en-us/services/storage/files/

So what kind of circuit are you going to use to get this "to" the cloud? I would think fiber is going to be necessary no matter what. The moment you need a T1, or multiples, to get the "up" speed, you are into fiber.

100MB fiber to the internet, which is different than going direct to the local data center.

And that fiber provider does not have a colo site to offer and some type of off site storage service? I would think they offer it directly or are in bed with someone.

On the file side, I'm not familiar with how Exacq does it, but what I use records files in five-minute chunks. The files are named by camera, date, and time. I can offload those files directly and store them any way I want. I can import them back into the DVR and tell the DVR to "reindex" them into its search directory. So with a little knowledge of date and time, you can get them back.
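As a sketch of what that retrieval could look like scripted - using a hypothetical filename pattern like "CAM01_20160115_1430.mp4", not Exacq's or any real DVR's actual format - finding the 5-minute chunks covering a time window is straightforward:

```python
from datetime import datetime

# Locate offloaded 5-minute clips by camera and time range. The filename
# pattern CAMERA_YYYYMMDD_HHMM.mp4 is a made-up example; adapt the parser
# to whatever scheme the DVR actually writes.
def clips_in_range(filenames, camera, start, end):
    hits = []
    for name in filenames:
        cam, date, hhmm = name.rsplit(".", 1)[0].split("_")
        ts = datetime.strptime(date + hhmm, "%Y%m%d%H%M")
        if cam == camera and start <= ts < end:
            hits.append(name)
    return hits

files = ["CAM01_20160115_1430.mp4", "CAM01_20160115_1435.mp4",
         "CAM02_20160115_1430.mp4"]
print(clips_in_range(files, "CAM01",
                     datetime(2016, 1, 15, 14, 30),
                     datetime(2016, 1, 15, 14, 35)))
# ['CAM01_20160115_1430.mp4']
```

This gets you the files, but as noted, it does nothing for event search - you would still reimport and reindex to get the DVR's normal search tools.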

Since this is not user friendly, that is why I suggested putting an Exacq DVR in the colo site. That way retrieval would be the same as any other Exacq user interface - very user friendly.

The file structure has been the pesky problem with the indexing requirements. I have not looked into colo, but I would imagine that is fairly costly as well, in that I am still paying for the bandwidth, the hard fixed cost of storage, rack space, and self-maintenance.

There certainly are cost-effective ways to get the files there and back; however, the video playback and search functions are pretty much non-existent with most methods, short of individual playback or support from IT to merge them - a manual process. Not to mention a 6-month data purge interval.