Controversial Ranking Responses To $4.6 Million School Surveillance Project

A member shared this award document and ranking of integrator / manufacturer responses for a $4.6 million awarded school surveillance project.

Here is the ranking:

I will ask the member to give more color in the comments.

We got the best score on reputation and price and still lost. They gave us 1 out of 5 on safety even though we don't have a single safety claim on our workers' comp. They didn't even give us a chance to dispute it.

Am I reading it wrong?

To me it looks like no single bidder had the best score on both reputation AND price.

Condolences in any event. :(

Robert, so you lost by 3 points then? I notice that half of the respondents got a 5 on safety, the other half got a 1, but no one got anything in between. Strange.

Robert, what products did you propose? Roughly how much lower was your bid than the winning one?

With that I have a question on safety. Do you have a documented company safety program? I only ask because I'm curious and want to know in case we ever run into the same issue.

NOTICE: This comment has been moved to its own discussion: Integrators, Do You Have A Documented Company Safety Program?

Did you submit your OSHA logs with the bid response? We did and received all 5 points in that category. I think a better way to score safety would be to rank respondents by their average EMR over the last 3 years. Your safety plan is important, but your EMRs show how safe you actually are.

"Your safety plan is important, but your EMRs show how safe you actually are."

In the same vein, the reputation of your "Goods and Services" would be, IMHO, less important than the actual quality of those "Goods and Services".

Reputation is something you rely on when you do not know the quality.

Look at Wade, 10 for quality of G+S, 1 for reputation of G+S???

On the pessimistic side, in cases where the 'fix is in', fudging one of these to favor whomever one wants is even easier than rigging the RFP.

Does anyone know how they defined the difference between the reputation and quality categories? I agree that it is a little confusing.

Our sales rep failed to submit our safety/EMR scores. The district didn't conduct any interviews or ask for any clarification about why we didn't submit them. As I stated before, we had no WC claims and all our OSHA logs are in order. Our bid was $1.4 million less, with Video Insight, Panasonic, and Dell equipment.

So if you got a 5 like half the other respondents, you would have won and they would have saved $1.4 million. Did you protest or challenge the award? It would seem to be a mutually beneficial move.

"So if you got a 5 like half the others..."

And the rest got a 1. Maybe the fact that many others apparently did not submit safety records either, for whatever reason, made it not seem out of place.

The reputation score also has a very clumpy distribution: 1, 5, 10, 15 as if there was some rote mechanism at work here, e.g. start at 15 and subtract 5 for liens, lawsuits, bond defaults. :)
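For what it's worth, the speculated rote mechanism above can be sketched in a few lines. Everything here is hypothetical: the starting value, the per-item deduction, and the list of "red flag" items are guesses from the thread, not anything from the actual RFP; a floor of 1 is needed to reproduce the lowest observed score.

```python
# Hypothetical deduct-from-15 reputation scoring, as speculated above:
# start at 15 and subtract 5 per red flag (e.g. lien, lawsuit, bond
# default), never dropping below a floor of 1. This reproduces the
# clumpy 1 / 5 / 10 / 15 spread seen in the tabulation.
def reputation_score(red_flags, start=15, deduction=5, floor=1):
    """Return start minus one deduction per red flag, clamped at floor."""
    return max(start - deduction * len(red_flags), floor)

print(reputation_score([]))                                   # 15
print(reputation_score(["lien"]))                             # 10
print(reputation_score(["lien", "lawsuit"]))                  # 5
print(reputation_score(["lien", "lawsuit", "bond default"]))  # 1
```

Such a mechanical rubric would explain why no one scored between the clumps, but of course the actual scoring method is unknown.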

I am quite curious about how they defined the 'Reputation of Vendor and Vendor's Goods and Services' especially in contrast to 'Quality of Vendor's Goods and Services'

Convergint got a 10 in reputation and a 14 in quality while 911 got a 15 in reputation but an 11 in quality.

So Convergint's stuff is better quality but doesn't have as good a reputation as 911's?

Is this based on the integrator's quality or perception thereof? Or the manufacturers they are recommending?

"On the pessimistic side, in cases where the 'fix is in', fudging one of these to favor whomever one wants is even easier than rigging the RFP."

If I were tasked to come up with a bullet-proof way of gaming an RFP system and awarding a contract to whomever I chose, I would be hard-pressed to design a better structure than they have in place here.

The weighting of the categories (price is twice as important as any other single criterion), combined with the obviously arbitrary assignment of scores, makes a mockery of the public bid process. This document is the handiwork of a consultant justifying their fee. Just curious, were the bidders aware of any of this when they submitted their documents? If I were Convergint, I would take the win, but I would skip the victory lap.
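To make the weighting complaint concrete, here is a minimal sketch of a weighted-criteria tally. The weights are hypothetical, loosely echoing the point totals mentioned elsewhere in the thread (15-point reputation/quality/experience criteria, a 5-point safety criterion) with price set at 30 so it outweighs any other single criterion two-to-one:

```python
# Hypothetical criterion weights (maximum points per category);
# price is worth twice any single non-price criterion.
WEIGHTS = {"price": 30, "experience": 15, "reputation": 15,
           "quality": 15, "safety": 5}

def total_score(ratings):
    """ratings maps each criterion to a fraction of full marks
    (0.0..1.0); each contributes that fraction of its weight."""
    return sum(ratings[c] * w for c, w in WEIGHTS.items())

# A bidder with a perfect price score but only 1-out-of-5 on safety
# still cedes just 4 of 80 possible points in that category.
perfect = {c: 1.0 for c in WEIGHTS}
print(total_score(perfect))  # 80.0
```

Under a structure like this, a handful of subjective one-point swings in the 15-point categories can easily outweigh a large price advantage, which is the crux of the objection.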

Ironic that the two Convergints couldn't be more Divergent.

Which is good since then there's little chance of confusing them.

The fact they awarded to Convergint speaks volumes about how "smart" they are. They will come back to you when Convergint does a poor job and gets kicked out. We go against Convergint all the time, and well... They might as well be Diebold, ADT, or Tyco.

Where's the controversy? You have to love a customer that takes the time to establish buying criteria and hopefully has the guts to award a project on something more than just a low price... Interesting ranking of some of the bigger integration companies in our industry.

Undisclosed 3, who is praising the RFP awarded to Convergint, works for Convergint.

You've got to (not) love a guy with the audacity to praise himself undisclosed.

I have thought about this a little. As long as everyone knew ahead of time that the ranking system was going to be applied, it is not such a big deal. If they did not know, I would have to throw a flag. Usually public bids are low bid.

Evaluation criteria seems to be pretty well spelled out in the RFP on page 3.

Thanks for that!

Here is the excerpt of page 3:

I am genuinely curious what products got what rating. For example, on Criteria 3, Convergint got a 14 but Preferred Technologies got an 8, even though Preferred says they pitched Axis and Genetec. What combination of products did the consultant find that was so much better than Axis and Genetec, especially since price was not a factor in this criteria?

It makes one wonder, doesn't it.

We, Preferred Technologies, Inc., scored a 5 out of 15 for Criteria 2: "Provides evidence of your experience in planning, staging and delivery of recent projects of similar scope and scale"

I guess the 4300 camera upgrade and public WiFi system that we completed at Bush and Hobby airports in Houston aren't worth much. I guess the 1100 cameras across the Houston METRO enterprise score pretty low. The several, very satisfied K-12 customer references that we included were of no value, either.

I have no way to validate this, but I heard through the grapevine that this criteria section was scored based on the number of email responses that the procurement person received from the listed references. I know all of our listed references would go out of their way to validate our reputation. What happens if the email from procurement ended up in the references' junk mail accounts? What happens if the return email from a reference ended up in the procurement person's junk mail?

And Criteria 3...we bid a specified VMS that is, without question, of higher quality than VI. We also bid cameras that strictly adhered to the specifications (aside: John, a good exercise for your camera class would be to read the specifications and use the Camera Finder Tool to see which manuf/models fit each camera type). I even built a matrix for each camera type to show compliance with every individual requirement. The matrices also showed lower cost camera options and where those options did not meet individual requirements, if any. Anyone with an ounce of industry knowledge and experience would not score a Genetec/Axis solution lower than a VI/whatever-camera solution for quality.

I very much appreciate FBISD's decision to procure using weighted selection criteria and not lowest price. However, I question their scoring within each criteria section.

We usually develop a relationship with a customer so that we can influence scoring or selection. We had no such relationship for this opportunity. Did Convergint? I hope we lost to them because they worked hard on those relationships and not because some folks without a strong understanding of this kind of project applied scores somewhat subjectively.


Is there a way to request more information about what each integrator / respondent submitted?