The general rule of thumb is to not exceed 80% of total capacity. So if you have a 100Mbps network, you would not want more than 80Mbps of continuous throughput on that network.
Switch speeds are obviously important, but you also need to look at the server NIC/HDD/VMS as well. Almost any modern system can handle more than 100Mbps on decent hardware, but may top out at 250, 300, or 400Mbps. Reaching 800Mbps of sustained throughput on a 1Gbps server is rare, if not unheard of. Also note that when looking at server specs, some ratings count video/data in and out separately, while others list a total *combined* figure.
A 2MP camera should be in the area of 3-4Mbps throughput on average. Sixty-one of those then comes to about 244Mbps peak. GigE would definitely be my recommendation, but I don't know why you'd need 10G unless there was a lot of other data on the same segment/VLAN (which I wouldn't recommend).
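The arithmetic above can be sketched as a quick calculation, using the assumed figures from this thread (61 cameras at 3-4Mbps each, and the 80% planning rule mentioned earlier):

```python
# Rough aggregate-bandwidth sketch. Per-camera bitrates are assumptions
# from the discussion above, not measured values.
CAMERAS = 61
AVG_MBPS = 3.0       # typical 2MP stream, low end of the 3-4 Mb/s range
PEAK_MBPS = 4.0      # high end of the range
GIGE_CAPACITY = 1000 # GigE raw rate in Mb/s

avg_total = CAMERAS * AVG_MBPS
peak_total = CAMERAS * PEAK_MBPS

print(f"average aggregate: {avg_total:.0f} Mb/s")
print(f"peak aggregate:    {peak_total:.0f} Mb/s")
# Check against the 80% rule of thumb for a GigE segment
print(f"within 80% of GigE? {peak_total <= 0.8 * GIGE_CAPACITY}")
```

Even at the peak figure, 61 cameras land well under the 800Mbps planning ceiling for a GigE segment.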
Fast Ethernet -> 100Mb/s
Gigabit Ethernet -> 1,000Mb/s
Gigabit Ethernet uplinks are pretty common and not a material premium, so going for GigE should be a simple option.
You could have a gigabit-enabled central switch and use Fast Ethernet access switches at the edge with gigabit uplinks. This would be cost effective and still provide you with some overhead to expand in the future if you needed to.
Does a 1Gbps link actually deliver a full 1Gbps?
Yes and no. If you're referring to the raw bit rate, then throughputs approaching 1Gb/s can be achieved without much drama on a GbE link with decent equipment. However, running without headroom can lead to other problems when capacity is exceeded, so pushing more than 800Mb/s on average is likely to be counter-productive.
If, however, you are referring to the information bit rate, as most are when they talk of video streams, you have to keep in mind that you are talking about (usually) an H.264 payload wrapped inside an RTP packet, which is itself inside a TCP/IP packet inside an Ethernet frame. All of these encapsulations require headers; they are one of the tradeoffs that packet-switched networks make. On top of that there are TCP acknowledgements and handshakes with every connection and reply.
So, from an information bit rate standpoint you are not likely to get closer than 750Mb/sec of actual data across the link, even if the network is working at 100% utilization.
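A back-of-envelope sketch of the per-packet header overhead illustrates the point. The header sizes below are typical values (standard Ethernet MTU, TCP without options), not exact for every configuration; note that headers alone only account for a few percent, and the rest of the gap down to ~750Mb/s comes from ACK traffic, handshakes, and smaller-than-MTU packets:

```python
# Goodput estimate for H.264-over-RTP-over-TCP/IP-over-Ethernet.
# All sizes in bytes; assumed typical values, not measured.
ETH_OVERHEAD = 18 + 20 + 8  # Ethernet header+FCS, preamble, interframe gap
IP_HEADER = 20
TCP_HEADER = 20             # no TCP options assumed
RTP_HEADER = 12
MTU = 1500                  # standard Ethernet MTU (IP packet size)

payload = MTU - IP_HEADER - TCP_HEADER - RTP_HEADER
wire_bytes = MTU + ETH_OVERHEAD
efficiency = payload / wire_bytes

print(f"video payload per full-size packet: {payload} bytes")
print(f"per-packet efficiency: {efficiency:.1%}")
print(f"best-case goodput on a loaded GigE link: {1000 * efficiency:.0f} Mb/s")
```

This best case ignores ACKs, retransmissions, and partial packets, which is why real installs see noticeably less usable throughput than the raw link rate.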
In our last install that was a similar size, we used 10/100 switches with Gigabit uplink ports (Netgear M4100 series and a Cisco unit) for PoE and a full 24-port Gigabit switch on the back-end.
If your selected VMS provider has an online calculator, it can estimate what your bandwidth usage will be. A rough estimate for 61x 2MP cameras is around 305Mbps (about a third of the capacity of a Gigabit network). You are well within the limits of Gigabit.
With 61 cameras, I agree that a Gig network should be sufficient.
However, it is important to emphasize that when sizing networks, estimating the worst case bandwidth scenario is important. On average, each camera might consume 3Mb/s but most cameras consume significantly more bandwidth at night, so worst case it might be 10Mb/s per camera.
Best thing is to test those cameras with the settings you plan (fps, compression level, CODEC, etc.) at night to see what that might be.
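The worst-case sizing point above can be sketched with the figures assumed in this reply (3Mb/s average, 10Mb/s worst case at night):

```python
# Hypothetical worst-case sizing check. The 3 and 10 Mb/s per-camera
# figures are the assumptions from the post above, not measurements.
CAMERAS = 61
AVG_MBPS = 3.0
NIGHT_MBPS = 10.0  # worst case, e.g. noisy low-light scenes

def aggregate(per_camera_mbps, cameras=CAMERAS):
    """Total load in Mb/s for a given per-camera bitrate."""
    return per_camera_mbps * cameras

avg_load = aggregate(AVG_MBPS)
worst_load = aggregate(NIGHT_MBPS)

print(f"average load:    {avg_load:.0f} Mb/s")
print(f"worst-case load: {worst_load:.0f} Mb/s")
# 610 Mb/s still fits under the 80% (800 Mb/s) GigE guideline,
# but with far less headroom than the daytime average suggests.
```

Sizing to the night-time number rather than the daytime average is what keeps the network under the 80% ceiling in the worst case, not just on paper.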
Concur with the comments above. You should in general be using 1-gig copper today unless you have a good reason to back off to 100 megabit (PoE and outdoor switches are good reasons; "the switches at Walmart are cheaper" is not).