There is nothing wrong with AMD CPUs for video surveillance. The difference is that their single-threaded performance is weaker and they do tend to use more power.
I know that GeoVision cards have required Intel CPUs and chipsets for a long time. If you try to use an AMD CPU with a VIA, ATI, etc. chipset, the software will crash. That's the only for-sure issue I'm aware of.
My experience is limited, but I use an AMD machine for one of my demo stations, and we have a smattering of customers with AMD machines. No issues so far, though this is a small sample set of people running our software.
I can't think of any reason they would cause any issues in a standard deployment.
We used mostly AMD machines when we were rolling our own VideoInsight machines.
I think more so than the CPU, some systems have issues with certain motherboard chipsets. Back when I was working at a technical school, the high-end video workstations (I forget what system and cards they were using) had a very specific list of supported chipsets and motherboards; I think we tried ONCE to get it running on an unsupported board and it was a nightmare. I don't know exactly what the issue was; maybe something to do with the maximum *sustained* bus speed, since some of these systems were dealing with uncompressed component (albeit SD) video.
As far as things like GeoVision, it may (and this is just a wild guess) have to do with their licensing and the way the GV drivers will only work with a legit GV card; perhaps the chipset type is used in there. Or the software may make use of certain instruction sets (things like MMX, SSE) that are unique to Intel (yes, I know MMX and SSE are not unique to Intel; those are just a couple of examples that came to mind).
That aside, I agree, there's little reason AMD chips SHOULDN'T work in most cases.
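Just to illustrate the "gating on vendor/instruction sets" guess above: here's a minimal sketch of how software *could* refuse to run based on CPU identity. This is purely hypothetical (GeoVision's actual check is unknown); the Linux-only `/proc/cpuinfo` parsing and the function name are my own inventions for illustration.

```python
# Hypothetical sketch of vendor/feature gating -- NOT GeoVision's actual
# mechanism. Parses /proc/cpuinfo-style text (Linux format).
def cpu_vendor_and_flags(cpuinfo_text):
    """Return (vendor_id string, set of CPU feature flags)."""
    vendor, flags = None, set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("vendor_id"):
            vendor = line.split(":", 1)[1].strip()
        elif line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
    return vendor, flags

# Example input mimicking one /proc/cpuinfo stanza:
sample = "vendor_id\t: GenuineIntel\nflags\t\t: fpu mmx sse sse2"
vendor, flags = cpu_vendor_and_flags(sample)
print(vendor)           # GenuineIntel
print("sse2" in flags)  # True
```

A vendor string check ("GenuineIntel" vs "AuthenticAMD") would explain a hard crash on AMD, while a feature-flag check would not, since AMD chips of that era also report MMX/SSE.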
AMD's FX-8350 is ~$150 cheaper than the i7-3770.
The FX-8350 uses close to DOUBLE the power - which is actually a nasty problem because:
1. need better case for airflow - $$
2. need better PSU to ensure reliability - $$
3. need better CPU cooler (the stock HSF really is rubbish when dealing with the 125W TDP part) - $$
4. need to buy a discrete GPU (no integrated graphics) - $$ and more watts wasted
We tried it a few times and it ended up costing us more than the i7-3770 because of the extra costs above to get everything running nice and cool.
Most of our systems get used as both clients and servers, so the CPU is always at 50%-80% load decoding the live-view streams. As such, the heat production is 24/7 and relentless; less heat is always better.
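A quick back-of-the-envelope check on how the power gap eats into the price gap under that 24/7 load. Using TDP as a stand-in for sustained draw (an assumption; real draw varies) and an assumed $0.12/kWh electricity rate:

```python
# Rough sketch: does the FX-8350's ~$150 price advantage survive 24/7 load?
# Assumptions: TDP approximates sustained draw (FX-8350 = 125 W, i7-3770 = 77 W)
# and electricity costs $0.12/kWh -- plug in your own numbers.
def annual_energy_cost(watts, price_per_kwh=0.12):
    hours_per_year = 24 * 365
    return watts / 1000 * hours_per_year * price_per_kwh

extra_per_year = annual_energy_cost(125) - annual_energy_cost(77)
print(f"Extra electricity per year: ${extra_per_year:.2f}")   # ~$50
print(f"Years to erase a $150 price gap: {150 / extra_per_year:.1f}")  # ~3
```

So even before the case/PSU/cooler/GPU extras, the electricity alone erases the sticker savings in roughly three years of continuous operation.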
Unfortunately this does not look like it's going to change soon, as Intel will be down to 14nm by the time AMD gets below 28nm.