Good question. I still see some IP cameras in North America with a max frame rate of 30fps but in Europe it is 25fps, presumably because it follows the old NTSC vs PAL standard differences.
Safety certifications likely still differ (UL vs CE).
That's all I can think of but I bet others will have some insights.
I guess it depends on what camera is being used. If a PAL sensor (Europe) is being used, then only 25 interlaced frames will be generated by the camera, as it takes advantage of the mains frequency in the region: PAL is 50 Hz (cycles), hence 50 fields or 25 interlaced frames. In countries with 60 Hz mains, such as North America, it is 60 fields or 30 interlaced frames. A lot also has to do with the readout speed of the CCD or CMOS. Once you go into the megapixels (millions of pixels), the readout speed drops because the sensor can only read out so many millions of pixels every second, i.e. the frame rate (IPS); a 16 megapixel sensor can only be read out about 5 times every second, hence 5 IPS. Traditionally, the higher the megapixel count, the lower the number of images per second. The same goes for storage calculations based on the number of pixels and bit depth.
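To make the readout argument concrete, here is a minimal sketch. The 80 Mpixel/s readout limit is an assumed figure chosen so that a 16 MP sensor works out to ~5 images per second, as in the comment above; real sensors vary.

```python
# Assumed readout limit (not from any datasheet): ~80 Mpixel/s.
READOUT_PIX_PER_S = 80e6

def max_fps(megapixels: float) -> float:
    """Readout-limited frame rate for a given sensor resolution."""
    return READOUT_PIX_PER_S / (megapixels * 1e6)

def raw_bitrate_mbps(megapixels: float, bit_depth: int, fps: float) -> float:
    """Uncompressed video data rate in megabits per second."""
    return megapixels * 1e6 * bit_depth * fps / 1e6

for mp in (2, 5, 16):
    fps = max_fps(mp)
    print(f"{mp} MP sensor: ~{fps:.1f} fps max, "
          f"~{raw_bitrate_mbps(mp, 12, fps):.0f} Mbit/s raw at 12-bit")
```

The point is simply that, for a fixed readout budget, frame rate and resolution trade off against each other, and raw storage needs scale with pixel count, bit depth, and frame rate.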
As a manufacturer that moved from analogue to IP, it has simplified our logistics and production, as the regionalisation issues have diminished significantly: we now only have to manufacture one type of each camera model for the global market. For cameras with a video output built in, we have the option to switch between PAL and NTSC, but for the IP feeds there are no longer any regional variations due to TV standards - we can have 30fps at 1080p in any country.
Jean-Pierre's first comment is correct for TV based systems but not for IP - we don't rely on mains frequencies to drive camera synchronisation as there is no need to "sync" to a scanning CRT monitor anymore. Most IP cameras use PoE or DC which means we use internal sync generators to set the frame rates - not external.
There are still different national standards and languages to consider but as a rule the camera is the same for all countries now.
This is not quite correct about having to worry about NTSC or PAL. You will need to change the frame rate to match the 50 Hz or 60 Hz flicker from fluorescent lights when used in those conditions. Recently I had to adjust an Axis Q1604 to 25ips to match my 50 Hz fluoro light flicker, then to 12.5ips because I didn't need such a high frame rate.
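A minimal sketch of the anti-flicker rule of thumb behind this: fluorescent lights flicker at twice the mains frequency, so a flicker-free exposure should be an integer multiple of the half-cycle period (1/100 s on 50 Hz mains, 1/120 s on 60 Hz). The function name and structure here are my own, for illustration only.

```python
def flicker_free_shutters(mains_hz: int, count: int = 4) -> list:
    """Shutter times (seconds) that are whole multiples of the light's
    flicker period, so exposure integrates complete flicker cycles."""
    half_cycle = 1.0 / (2 * mains_hz)  # light flickers at 2x mains frequency
    return [round(n * half_cycle, 4) for n in range(1, count + 1)]

print(flicker_free_shutters(50))  # multiples of 1/100 s
print(flicker_free_shutters(60))  # multiples of 1/120 s
```

This is also why frame rates like 25 or 12.5 fps pair naturally with 50 Hz lighting: the frame period lines up with whole flicker cycles.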
We deliberately degrade the colour accuracy for the NA market, who are used to NTSC (That's a joke btw...).
Flicker is a problem. But you can run 30fps with 50Hz flicker if the camera is designed correctly.
Running at 25fps rather than 30fps does give just a little bit extra low light sensitivity.
Expanding on Hugh's statement about fps and low light sensitivity, I have seen this occur when certain cameras increase / lengthen their shutter when moving from 30fps to 25fps. However, many cameras that we have tested just keep their shutter at 1/30s regardless (unless one manually configures it to be longer). Of course, other cameras default to really slow shutters regardless (like 1/8s, 1/6s, etc.)
NTSC... PAL.... Whatever happened to SECAM?
How can anyone forget NTSC (Never The Same Color), PAL (Perfect At Last), and SECAM (System Essentially Contrary to the American Method)?
No offence intended, with all due respect, enjoy
John, good point (you beat me to it). Following this thread, the incremental extra time per frame afforded by running 25 Hz instead of 30 Hz (about 7 msec) can be used inside the camera imaging functions (e.g. shutter speed) or outside the camera such as for compression or analytics. In general, if your camera is hitting the wall, running at 25 Hz may relieve some small amount of stress.
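The per-frame budget mentioned above is easy to check: the frame period is simply the reciprocal of the frame rate.

```python
# Frame period at 25 fps vs 30 fps, and the extra time gained per frame.
period_25 = 1 / 25   # seconds per frame at 25 fps
period_30 = 1 / 30   # seconds per frame at 30 fps
extra_ms = (period_25 - period_30) * 1000

print(f"25 fps: {period_25 * 1000:.1f} ms/frame")   # 40.0 ms
print(f"30 fps: {period_30 * 1000:.1f} ms/frame")   # ~33.3 ms
print(f"Extra budget: {extra_ms:.1f} ms/frame")     # ~6.7 ms
```

So the extra headroom is about 6.7 ms per frame, in line with the "about 7 msec" figure quoted above.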
Thank you for the excellent feedback and insight. I assumed most manufacturers would want to streamline their production runs, and it seems that most are doing just that.