Richard, that's definitely interesting.
My main concern is how to do it in a fair and accurate way. We could certainly subject cameras to extreme conditions, but it would be much harder to fairly replicate or test for long-term endurance (i.e., does camera A tend to die after year 2 while camera B has no issues past 5 years).
I am open to discussing ways to do this. I also think a survey could be very interesting (maybe something like "which cameras have you had the most significant outdoor quality / reliability problems with?").
Sounds like we're mixing design quality and manufacturing quality. If a camera is designed well and manufactured correctly, it should pass most of the one-off tests you can reasonably throw at it (that is, after all, the design criterion).
What's more difficult to quantify is how consistently the camera is being made and how frequently units will fail in the field. Even poor manufacturing practices will result in some devices working fine. It's the 1 in 100 that needs to be replaced that really costs the integrator/customer and can make the lower price not worth it.
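To put a rough number on that point, here's a minimal back-of-the-envelope sketch. All the figures (unit prices, failure rates, $2,500 per replacement) are made-up assumptions, purely to illustrate how a small per-unit saving can be erased by even a 1% field-failure rate once truck rolls and labor are counted:

```python
# Hypothetical numbers only -- illustrating why a 1-in-100 field
# failure rate can wipe out a modest per-unit price advantage.

def deployment_cost(units, unit_price, failure_rate, replacement_cost):
    """Total expected cost for a deployment: purchase + field replacements."""
    expected_failures = units * failure_rate
    return units * unit_price + expected_failures * replacement_cost

UNITS = 500
REPLACEMENT = 2500  # assumed: truck roll + labor + hardware swap

cheap = deployment_cost(UNITS, unit_price=280, failure_rate=0.01,
                        replacement_cost=REPLACEMENT)
solid = deployment_cost(UNITS, unit_price=300, failure_rate=0.001,
                        replacement_cost=REPLACEMENT)

print(f"Cheaper camera:  ${cheap:,.0f}")   # 140,000 + 5.0 * 2,500 = $152,500
print(f"Reliable camera: ${solid:,.0f}")   # 150,000 + 0.5 * 2,500 = $151,250
```

Under these assumed numbers, the "cheaper" camera ends up costing more over the deployment, and that's before counting downtime or reputational cost.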
The standard IK10 and IP66 tests require that the testing be done 1) multiple times and 2) from a variety of angles. It cannot simply be a matter of hitting a sweet spot to make sure the unit passes. Most companies use an independent third party for these tests.
If the dome barely makes it through an impact test, to me that sounds like it is starting to break but is still intact. I don't think that would pass an IK10 test...