I can add very little to the answers you've already received, but I'd like to reinforce a few points.
John mentions selection guides. I've been looking at laptop upgrades, and it's surprising how inaccessible selection guides can be.
When I know what capabilities I'm searching for, I imagine a list of all products that dynamically re-orders the most relevant ones to the top as the customer selects desired characteristics. Why still show the other products? Sometimes all desired features don't exist in a single product, forcing choices. If the buyer can no longer see products with other capabilities, they never learn what they might have had.
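To make the idea concrete, here's a rough sketch of the re-ordering behavior I mean (product names and features are made up for illustration): products are scored by how many selected features they match and the full list is re-sorted rather than filtered, so near-misses stay visible.

```python
def rank_products(products, selected_features):
    """Return ALL products, best feature matches first, none hidden."""
    def match_count(product):
        # Score = number of desired features this product has.
        return len(selected_features & product["features"])
    return sorted(products, key=match_count, reverse=True)

# Invented example catalog.
products = [
    {"name": "Cam A", "features": {"ptz", "ir", "poe"}},
    {"name": "Cam B", "features": {"ir"}},
    {"name": "Cam C", "features": {"ptz", "wdr"}},
]

ranked = rank_products(products, {"ptz", "ir"})
print([p["name"] for p in ranked])  # Cam A rises to the top; B and C still listed
```

The point is the sort instead of a filter: a buyer who can't get PTZ and IR in one unit still sees what trading one away would buy them.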
Also, I'd like to second Mark's comments on the informative nature of the Axis website (I hadn't looked at Vivotek). Easy newbie access to white papers and a variety of educational material has been very helpful. It can also serve the vendor well, because of course the information explains fundamentals, setup, etc. in the context of their own product line. I'd call that a win/win.
It strikes me that a particular brand may not benefit that much from a customer rating system, but a distributor might have less stake in a particular product and might benefit from helping customers see other user feedback through a rating and comment system. After you've spent a bundle developing a new product, it's still costly to see it panned (the truth may not be very helpful after development costs are sunk). On the other hand, closed internal feedback should always be solicited and be easy to provide from every web page. The challenge might be how to categorize it to get it to the right internal party without a significant manpower or infrastructure drain.
Beyond that, as I support a company web site now, I would be interested in hearing what sort of approaches you use for testing. In a niche market, using/observing easily accessible test subjects who aren't the population likely to buy your product has modest benefits but significant limitations. We're reluctant to impose on our customers directly (maybe that's a personal problem?). I can see how this sort of outreach, as you're doing on IPVM, is really good and targeted, but lack of responses also demonstrates the limitations of this approach.
Wouldn't it be interesting if IPVM maintained a standing score of vendor web sites? It wouldn't reflect any IPVM management opinion, but would simply be a compilation of IPVM user votes. Or even a standing vote on vendors in general would be interesting.
Users could nominate any web site (relevant to IPVM) which would add it to the voting list.
Votes would be simple (terrific, good, usable, limited, sad).
One vote per registered IPVM user, which could be updated and edited any time the user wished.
Although internal IPVM mechanics would enforce one vote per user, the tool would preserve each voter's confidentiality.
Each vote could provide room for the user's comment (e.g., what about the site particularly supported the vote?).
Maybe other users could even vote on a user's comments, so that the most agreed-with rise to the top in some custom report.
The IPVM site would auto-score current info and be able to produce a report at a click, like best voted web sites and worst voted web sites (same list in opposite order), and also most relevant results based on user votes (if implemented).
In fairness, there should be some mechanism (I'm fuzzy here) so that if a vendor addresses someone's criticism, an IPVM tool lets them ask the anonymous voter for an updated opinion.
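The mechanics above could be sketched roughly like this (the scale, names, and data layout are my own invention, not anything IPVM actually has): keying votes by user enforces one editable vote each, and the same tally sorted both ways gives the best-voted and worst-voted reports.

```python
SCALE = {"terrific": 5, "good": 4, "usable": 3, "limited": 2, "sad": 1}

votes = {}  # (site, user_id) -> rating; one key per user per site = one vote

def cast_vote(site, user_id, rating):
    """Record a user's single vote for a site, or update it if they revote."""
    assert rating in SCALE
    votes[(site, user_id)] = rating

def site_scores():
    """Average score per site, computed on demand from current votes."""
    totals = {}
    for (site, _user), rating in votes.items():
        totals.setdefault(site, []).append(SCALE[rating])
    return {site: sum(r) / len(r) for site, r in totals.items()}

# Invented example usage.
cast_vote("vendor-a.example", "user1", "terrific")
cast_vote("vendor-a.example", "user1", "good")      # same user edits their vote
cast_vote("vendor-b.example", "user2", "limited")

# Best-voted list; reverse the sort for the worst-voted list.
print(sorted(site_scores().items(), key=lambda kv: kv[1], reverse=True))
```

Comments, voter confidentiality, and the "ask for a new opinion" handshake would hang off the same per-user vote record.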
Which reminds me, ... John, you have a lot of material here. I've read quite a bit of it, but I wonder if folks with limited time might benefit from a set of tools which let them view the most useful first? Have you given any consideration to user votes on discussions, either at the whole discussion level (x out of y users found this discussion valuable) or the contributor level? I think I'd be a loser there with my off-the-wall contributions, but that could save a lot of folks from reading my material, right?
If you ever did that, it might also be particularly interesting (dunno what users might think about this) if we as users were binned into different categories, since users might find that votes tabulated by a certain class of users are more informative to their own particular needs. For example, categories might include installers, distributors, manufacturers, and (folks like me) dilettantes (people who enjoy reading and learning but have no ownership stake in the industry). Users might find that viewing stuff I find valuable (by my category of votes) doesn't help them find what they want, but viewing stuff rated highly by installers is very informative to their particular needs.
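The category idea is just one extra field on each voter. A hypothetical sketch (every name here is made up): tag each voter with a self-declared category, then tabulate discussion ratings from only the category whose judgment matches the reader's needs.

```python
from collections import defaultdict

SCALE = {"terrific": 5, "good": 4, "usable": 3, "limited": 2, "sad": 1}

# Invented voters, each self-binned into a category.
voters = {"u1": "installer", "u2": "distributor", "u3": "dilettante"}

# Invented (user, discussion_id, rating) votes.
votes = [
    ("u1", "disc-42", "terrific"),
    ("u2", "disc-42", "usable"),
    ("u3", "disc-42", "sad"),
]

def tally_by_category(votes, category):
    """Average rating per discussion, counting only voters in one category."""
    sums = defaultdict(list)
    for user, disc, rating in votes:
        if voters[user] == category:
            sums[disc].append(SCALE[rating])
    return {d: sum(r) / len(r) for d, r in sums.items()}

print(tally_by_category(votes, "installer"))   # {'disc-42': 5.0}
```

So an installer browsing "most valuable discussions" could filter to installer votes and never be steered by my dilettante ratings.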