Most VMSes have default settings for each "driver" - ONVIF is just another one of these drivers.
It's possible that the default stream settings for the ONVIF driver could be lower than those of the integrated driver for that particular camera.
That's certainly false!
A direct driver often gives access to configuring / integrating more advanced features, like PTZ, motion detection, analytics, events, etc. However, those capabilities don't affect video quality.
That doesn't make a lot of sense when it comes to basic video stream quality. At the end of the day, the camera is spitting out video over RTSP (or maybe MJPEG, it doesn't really matter). How you "find" that URL and connect to it for basic video streaming doesn't matter... It's a bit of a treasure hunt, but once you find it - via ONVIF, Wireshark, API docs, etc. - you should be seeing the exact same thing.
As John mentioned, it's possible that the ONVIF default is a lower-res stream or something else, but there is nothing inherent in ONVIF compliance that should directly impact video quality. ONVIF up to this point has been *mostly* about standardizing the protocols and/or URLs to access the most common features and functions on a device, so that you don't have to play "treasure hunt" every time you're connecting a new device into your system.
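To put the "treasure hunt" point another way: ONVIF (or the vendor API, or a packet capture) just tells you *where* the stream lives; the camera serves the same bits at that URL regardless of how you discovered it. Here's a minimal sketch of that idea - the camera IP and URL path below are made up for illustration, not from any real device:

```python
# Sketch: whether the RTSP URL comes from an ONVIF GetStreamUri call, the
# vendor's API docs, or a Wireshark capture, it points at the same stream.
from urllib.parse import urlparse

def same_stream(url_a: str, url_b: str) -> bool:
    """Two discovered URLs refer to the same stream if host, port, and
    path match -- the discovery method itself is irrelevant."""
    a, b = urlparse(url_a), urlparse(url_b)
    # RTSP defaults to port 554 when none is given explicitly
    return (a.hostname, a.port or 554, a.path) == (b.hostname, b.port or 554, b.path)

# Hypothetical URL reported by ONVIF for the camera's main stream
onvif_url = "rtsp://192.168.1.50:554/Streaming/Channels/101"
# The same URL dug out of the vendor's documentation (implicit port 554)
vendor_url = "rtsp://192.168.1.50/Streaming/Channels/101"

print(same_stream(onvif_url, vendor_url))  # True: same stream, same quality
```

Of course, if the two discovery paths hand back *different* URLs (say, a main stream vs. a sub stream), you'd see a quality difference - but that's a configuration difference, not an ONVIF one.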
John, the theory was that a driver written using the API from the camera manufacturer would somehow give better video quality than ONVIF, all other configuration being equal. I hope that wouldn't be the case, but I was curious about anyone else's experience.
Ken, I doubt that's the case. It could happen indirectly if there were some configuration error (e.g., the ONVIF stream somehow got set up with a lower resolution or a higher compression rate).
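That kind of indirect difference is easy to picture: the two drivers might just be requesting different encoder profiles from the camera. A toy sketch, with hypothetical default values a VMS might pick per driver:

```python
# Sketch: the quality gap comes from mismatched profile defaults,
# not from the ONVIF protocol itself. Values below are invented.
native_profile = {"resolution": (1920, 1080), "bitrate_kbps": 4096}
onvif_profile = {"resolution": (1280, 720), "bitrate_kbps": 2048}

def lower_quality(a: dict, b: dict) -> bool:
    """True if profile a requests fewer pixels or less bitrate than b."""
    pixels_a = a["resolution"][0] * a["resolution"][1]
    pixels_b = b["resolution"][0] * b["resolution"][1]
    return pixels_a < pixels_b or a["bitrate_kbps"] < b["bitrate_kbps"]

print(lower_quality(onvif_profile, native_profile))  # True
```

Align the two profiles and, by the reasoning above, the streams should be indistinguishable.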
Was there any theory or explanation of why that happened in those instances?