We can compare pixel size to the f-stop of the attached optics to get a ballpark estimate of whether pixel density alone causes significant degradation. This assumes decent lens quality; a poor lens will under-perform what the f-stop and pixel-size calculation predicts.
"... there IS a point of diminishing returns when you start to cram super-high resolutions onto super-small sensors."
It's known as the Rayleigh diffraction limit. Basically, a real optical system can't focus a point source of light down to a perfect point. Instead, a point source is blurred out to a disk of some size, known as an Airy disk. In today's camera systems, as long as each pixel is larger than the size of the optical system's Airy disk, the Rayleigh limit doesn't meaningfully degrade camera performance.
Wikipedia does a very good job of explaining this effect, presenting the math alongside a quick rule-of-thumb approximation: at f/8, making pixels smaller than about 4 um across would result in diffraction-limited, visible image degradation.
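To put numbers on that rule of thumb, here's a minimal sketch. The wavelength (green light at 550 nm) and the use of the Rayleigh criterion radius, 1.22 * wavelength * f-number, are my assumptions; the exact pixel-size threshold also depends on how many pixels per Airy disk you demand, which is why this lands near, but not exactly on, the ~4 um figure.

```python
# Rayleigh-criterion ballpark: radius of the first Airy minimum at the
# sensor, in micrometers. Wavelength choice (green, 550 nm) is assumed.

WAVELENGTH_UM = 0.55  # green light, in micrometers

def airy_radius_um(f_number, wavelength_um=WAVELENGTH_UM):
    """Radius of the first Airy minimum at the focal plane, in um."""
    return 1.22 * wavelength_um * f_number

for n in (2.8, 4, 8, 16):
    print(f"f/{n}: Airy radius ~ {airy_radius_um(n):.1f} um")
```

At f/8 this gives roughly 5.4 um, the same ballpark as the ~4 um rule of thumb; stopping down to f/16 roughly doubles it.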
Looking over these posts, I see one reference to a 1/4" 8 MP (3,224 x 2,448 pixel) sensor. A nominal 1/4"-type chip has an active area of only about 3.6 x 2.7 mm (the format name is historical, not a literal die dimension), which puts the pixel pitch near 1.1 um. I'm surprised to find that the math shows the Rayleigh diffraction limit would already be well in play for that chip behind an f/8 lens, and even a decent quality f/4 lens (Airy radius around 2.7 um) wouldn't fully rescue pixels that small.
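Running the arithmetic for that example makes the point concrete. The 3.6 x 2.7 mm active area is my assumption for a nominal 1/4"-type sensor, and the wavelength is again assumed green:

```python
# Pitch and diffraction check for the 1/4" 8 MP example. Sensor active
# width (3.6 mm for a nominal 1/4"-type chip) and wavelength (550 nm,
# green) are assumptions, not figures from the original post.

SENSOR_WIDTH_UM = 3600.0   # assumed active width of a 1/4"-type sensor
H_PIXELS = 3224            # horizontal pixel count quoted in the post
WAVELENGTH_UM = 0.55       # assumed green light

pitch = SENSOR_WIDTH_UM / H_PIXELS
for f_number in (4, 8):
    airy_radius = 1.22 * WAVELENGTH_UM * f_number
    verdict = "diffraction-limited" if airy_radius > pitch else "pixel-limited"
    print(f"pitch {pitch:.2f} um, f/{f_number}: "
          f"Airy radius {airy_radius:.2f} um -> {verdict}")
```

Under these assumptions the Airy radius exceeds the pixel pitch at both stops, so the sensor, not the diffraction-free lens, is the finer-grained element.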
Beyond the Rayleigh diffraction limit, it seems likely that the space microelectronics need to do their job well will have a much more substantial effect. That's because conscientious designers will match chips to appropriate optics and thereby avoid the Rayleigh limit in the first place. If the much-referenced relationship between chip size and image quality is real, I'd guess it's because a bigger chip has more room per pixel to pack in other "stuff" (I'm not a microelectronics guy, but let's speculate) that improves things like dynamic range, blooming, and sensitivity/efficiency.
What I'm trying to say is: well-matched optics will land the same number of photons per pixel regardless of pixel size (up to the Rayleigh diffraction limit), but it MIGHT be possible (although I'm not ready to stipulate it) that larger pixels can do more with those photons than smaller pixels do.
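A toy photon budget shows one way the "same photons per pixel" claim can hold. All of the following is my framing, not the original post's: take "well matched" to mean the two cameras share the same field of view, the same entrance-pupil diameter, and the same pixel count, so only the chip size (and hence focal length and pixel pitch) changes:

```python
# Toy photon budget. Assumptions (all mine): "well matched" = same field
# of view, same entrance-pupil diameter, same pixel count. The scene
# photon rate is an arbitrary made-up constant.

SCENE_RATE = 1.0e9  # arbitrary photon rate constant (made-up units)

def photons_per_pixel(sensor_width_mm, n_pixels_w, pupil_diameter_mm,
                      exposure_s=0.01):
    # Same field of view -> focal length scales with sensor width
    # (unit angular field assumed for simplicity).
    focal_mm = sensor_width_mm
    f_number = focal_mm / pupil_diameter_mm
    # Irradiance at the sensor falls as 1/f_number^2 ...
    flux_per_mm2 = SCENE_RATE / f_number ** 2
    # ... but pixel area shrinks with the chip, and the two cancel.
    pixel_pitch_mm = sensor_width_mm / n_pixels_w
    return flux_per_mm2 * pixel_pitch_mm ** 2 * exposure_s

big = photons_per_pixel(36.0, 3224, pupil_diameter_mm=5.0)   # big chip
small = photons_per_pixel(3.6, 3224, pupil_diameter_mm=5.0)  # small chip
print(big, small)  # identical (up to float rounding): pitch cancels out
```

One caveat worth noticing: keeping the same 5 mm pupil on the tiny chip implies an extremely fast f-number, which hints at where the practical limits of this matching lie.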
There's reason to be cautious. Manufacturers of high-end gear try to find things that justify their cost premium, and large-format imagers are widespread in high-end gear. Looking back at the golden age of analog audio, a lot of "golden ears" found effects that were unmeasurable with state-of-the-art equipment but helped buyers justify stratospheric cost premiums. In that vein, I'd want to see robust proof before accepting that larger-format imagers are inherently superior to smaller ones.
In any event, we can easily compare pixel size to the attached optics to get a ballpark estimate of whether there is significant Rayleigh degradation or not.
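That ballpark comparison fits in a one-line helper. Thresholding at one Rayleigh-criterion Airy radius per pixel is my simplification (stricter multi-pixel-per-disk criteria exist), and the example pixel pitches are hypothetical:

```python
# Minimal helper for the ballpark test: is diffraction, rather than pixel
# pitch, the limiting factor? The one-Airy-radius-per-pixel threshold and
# the 550 nm wavelength are simplifying assumptions.

def rayleigh_limited(pixel_pitch_um, f_number, wavelength_um=0.55):
    """True if the Airy radius exceeds the pixel pitch."""
    return 1.22 * wavelength_um * f_number > pixel_pitch_um

print(rayleigh_limited(6.0, 8))    # DSLR-class 6 um pixels at f/8 -> False
print(rayleigh_limited(1.1, 2.0))  # phone-class 1.1 um pixels at f/2 -> True
```

Big, coarse-pitch pixels shrug off f/8; tiny pixels are diffraction-affected even wide open.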