Oh, for those of you who are developers or desktop power users, here's a good blog post from a programmer using the Seiki 4K monitor.
Trend: 4K Monitors for Surveillance
4K is one of the next growth areas for IP cameras. Indeed, with major new announcements at CES 2014, it may impact surveillance monitor selection first.
Historically, some objected to multi-megapixel cameras because of monitor resolution constraints. Digital zoom, especially for investigations, still made them useful, but being able to display multiple cameras live at native resolution, without zooming, holds clear benefits, including the potential to make it easier for guards to spot problems (and lose focus less quickly).
I can't wait to get my hands on a 4K monitor for development work. I currently have a GeForce GTX 670 card, which seemed to be the best bang for the buck at the time, and I am now a little anxious to see how well it will handle a 4K display. After all, one 4K screen is roughly equivalent to driving 4 x 1080p screens, so the GPU is going to be busy. Apparently, the new Mac Pro can drive 3 x 4K displays, but the cost of a Mac Pro is a tad higher than your average Dell.
I'd be interested to hear about Ray's real life experience with the 4K system (or anyone else for that matter).
We're using the Seiki 4K 120 Hz monitor as one of three displays on our 3D CAD workstation (the other two displays are a low res ~ 20" monitor for status and other text, and a 1080p monitor for lower resolution concurrent 3D work, typically for subassemblies). There's not much to say about it, except that it's a great price point and the display is crisp and clear. There was nothing remarkable about the setup -- we just plugged it in and the system defaulted to 4k on that display.
Morton, we've also had good luck with GeForce graphics. Since this was a recent build, we're using the GeForce GTX 780 which not surprisingly has the horsepower to manage these displays.
Slightly off topic: during the build we ran across a quirk in which Windows 8.1 consistently crashed very early in the installation process, but two different Linux live CDs ran with no issues, and their system-testing suites exposed no hardware problems. You probably know the drill: Google a lot, try this, try that, and see what fixes it. Several hours later, moving the graphics card to PCI-e slot 2 got us past that challenge. The surprise was that usually it's Linux that delivers some hardware or driver incompatibility that takes a while to run to ground, but this time Linux validated the hardware while Windows 8.1 was stopped dead in its tracks.
Casino applications for 4k monitors wouldn't be that different from any other vertical. In fact, I'm not so certain of any real advantage to 4k monitors other than one large monitor having the ability to replace four smaller monitors for Monitor Walls.
One point is that if the observer is sitting the same distance from the same-size screen, there would be very little benefit to 4K over 2K. My reasoning is that the maximum perceivable resolution of a monitor is the point just before individual pixels can be seen (or jagged edges on 2K cameras displayed full screen). By that reasoning, if an observer is far enough away from a monitor that they can't see its individual pixels, any increase in pixel count would not lead to increased perceived sharpness of the images.
Video walls for casino surveillance typically deploy 40"-46" monitors mounted 5' or more away from observers. At that distance with that screen size, even the sharpest eyes can't see the individual pixels, so increasing the pixel count to 4K would provide negligible benefit. That said, if four, say, 42" monitors were replaced with one 84" 2K monitor, the individual pixels could become evident at the same distance.
Keeping in mind that one 84" 4K monitor would occupy approximately the same wall area and have the same pixel count as four 42" monitors, replacing the four with the one would yield exactly the same pixel density.
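The viewing-distance reasoning above can be sanity-checked with a quick back-of-envelope calculation. This is a rough sketch assuming 16:9 panels and the common ~1 arcminute acuity figure for 20/20 vision; the helper name is mine:

```python
import math

def min_viewing_distance_in(diagonal_in, h_pixels, acuity_arcmin=1.0):
    """Distance (inches) at which one pixel subtends `acuity_arcmin`
    of visual angle; beyond it, individual pixels blend together."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pitch_in = width_in / h_pixels                   # pixel pitch
    return pitch_in / math.tan(math.radians(acuity_arcmin / 60))

# 42" 1080p wall monitor: pixels blend at roughly 5.5 feet
print(round(min_viewing_distance_in(42, 1920) / 12, 1))   # → 5.5

# An 84" 4K panel has the same pixel pitch, hence the same distance;
# an 84" 2K panel doubles the pitch, so pixels show out to ~11 feet
print(round(min_viewing_distance_in(84, 1920) / 12, 1))   # → 10.9
```

Under those assumptions, the numbers line up with the 5'-plus mounting distance mentioned above for 40"-46" monitors.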
We currently have our monitor wall displaying a mix of 1x1 and 2x2 salvos. We see no need or valid reason to display larger salvos, because it would be impossible for an observer to process the information from even 16 (four 2x2 salvos) simultaneous panes and, of course, as you increase the number of panes in a salvo you lose resolution for each pane.
The only possible application for an 84", 4k monitor that would allow increased detail would be to display an image that is larger than 42" from a single camera and, all else being equal, the camera itself would need to have >2k resolution.
To expand a bit on this, replacing four 42" 2K monitors with one smaller-than-84" 4K monitor on our Monitor Wall would make little sense either. We would be forced either to display fewer than 16 cameras or to accept lower resolution per camera than we currently get.
While I have stated and believe that one observer cannot view 16 simultaneous images, the one advantage of displaying that many is that observers would only have to shift their eyes to different areas of the wall to concentrate on the cameras displayed there rather than having to call them up. That's a lot faster. 2x2 salvos per monitor on 2x2 or 3x2 walls allow an observer to focus on one display at a time with no more than four images within their area of concentration. Tracking action on four simultaneous images is considered to be within the capability of a typical operator.
Carl, thanks for the feedback.
I agree: "If an observer is far enough away from a monitor that they can't see its individual pixels, any increase in pixel count would not lead to increased perceived sharpness of the images."
What we have done is buy the Seiki $499 39" 4K monitor and a same-size 1080p model from them. We'll run a few scenarios with different distances / positioning of the monitors to see how much 4K helps in making out details, watching surveillance feeds, etc. It won't address your 84" monitor scenario, but I do think it will help some users get a sense of potential applications.
If you or anyone else have input on this test, let us know.
Any specific reason for choosing the NVIDIA Quadro K5000? The AMD FirePro W7000 also supports four 4K monitors and has similar GPU power to the K5000, but it is basically half the price. I'm just wondering, actually, since I don't have any experience with those graphics cards.
Tiago, that was a quote from Ray Bernard.
Ethan recently bought a 4K graphics card. What did you buy, Ethan?
Oops... I overlooked that part.
Btw, checking some reviews on newegg.com, the AMD FirePro W7000 does not seem to be that good.
We bought an ASUS GT630-SL-1GD3-L. It's based on an Nvidia chipset, only dual output, though. It's driving a 40" 4k display currently, which we plan to start formally testing today.
Thanks, Ethan. The 40" 4K monitor is manufactured by Seiki, I assume... based on what I read above?
Looking forward anyway to the results of this test.
It is indeed that monitor. Or TV, really. They also have a 50" available now for $899, and a 60" for a cool $2,000.
By the way, for those interested, the 4K Wikipedia entry has been kept up to date with 4K displays (TVs, monitors, and projectors) as they're announced or released.
A performance perspective on 4K monitors.
We recently upgraded our lab to include 4K monitors for some of our products, so naturally, I tried out a couple of VMS clients with them as the target output.
What I discovered is that CPU use increased by 30-50% due to the larger 'workspace' that the VMS sees, and that was with a higher-end video card.
My test was four 2MP cameras, switching the output between a 1080p monitor and a 4K one.
So keep this in mind when your customers ask for 4K monitor output: prepare for more CPU horsepower.
Great info, Mike!
I just tried this out with Exacq, which is not necessarily typical of all VMSes.
Four 1080p streams open in a 2x2 view:
- Monitor set to 1080p: 23-24% CPU
- Monitor set to 4k: 27-28% CPU
So clearly there is an increase. I'll be curious to see if it is more or less in other clients. I'll do this on the side of other tests, as we're using other clients. I'll report back if/when I find anything interesting.
"So clearly there is an increase."
A 15% increase seems pretty modest considering 4K is rendering 4x as many pixels.
In this case, the 4K quad could be rendered bit-for-bit, whereas the 1080p version needs to be down-scaled.
On my 4K monitor, either the graphics card or the monitor itself can do the 1080p down-scaling. I'm unsure whether one affects the CPU more than the other.
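One way to square the "modest" increase with the 4x pixel count: assume, very roughly, that the client's CPU splits into a decode stage fixed by the camera streams and a scaling/composition stage that grows with output pixels. Plugging in the Exacq readings above (the cost model itself is my assumption, purely illustrative):

```python
cpu_1080, cpu_4k = 23.5, 27.5                 # midpoints of the Exacq readings
pixel_ratio = (3840 * 2160) / (1920 * 1080)   # 4K draws 4x the pixels

# Model: total = decode + scale, where only scale grows with output pixels:
#   cpu_4k / cpu_1080 = (1 - s) + s * pixel_ratio  ->  solve for s
s = (cpu_4k / cpu_1080 - 1) / (pixel_ratio - 1)
print(f"~{s:.0%} of client CPU attributable to output scaling")  # → ~6%
```

Under that rough model, only ~6% of the client's CPU goes to output scaling, which is why quadrupling the output pixels adds only ~17% total; a much larger jump (as with Milestone below) would suggest a pipeline that does more work per output pixel.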
I was using 5MP cameras at 10 FPS with Milestone; the 2x2 salvo is the same.
The CPU went from 40% to the mid-60s simply by changing the monitor.
What might be interesting on Exacq is the GPU % change, since they do use the GPU a little bit.