Such determinations must draw upon the direct experience and expertise of the officials responsible for the award decision and be clearly reflected in the agency record.  A recent Federal Circuit decision addressed the question of whether an agency’s best value decision and associated tradeoff judgment may be accomplished through the use of quantitative computer software, where the record of the tradeoff was in the proverbial numbers and not in a more conventional narrative format.  Specifically, the protester alleged that the numbers were insufficient to properly evaluate, much less justify, the agency’s award decision.  As discussed below, the Federal Circuit disagreed.

Croman Corp. v. United States involves a “best value” acquisition by the US Forest Service for heavy and medium exclusive-use helicopters for large fire support.  The solicitation stated that non-price factors were “significantly more important than price.”

To assist with the evaluation, the Forest Service utilized a “computerized optimization model” or “OM” to make the best value determination. As explained by the Forest Service:

“The OM assists the agency in its evaluation by providing a mathematical solution that recommends a set of awards based upon the importance the agency assigns to the evaluation factors the Forest Service is using in a given procurement. To run the OM, the Forest Service enters all relevant bid data, including prices, into the database, and programs the OM to incorporate the percentage weights assigned to each technical evaluation factor, reflecting the relative importance of each selection criterion. The OM thus provides a recommendation that is tailored to the objectives of the procurement for which it is being employed. Accordingly . . . the OM offers an overall objective of determining, for each line item, the overall best value to the Government.”
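The mechanics the agency describes, weighting technical factors, folding in price, and ranking offers per line item, amount to a weighted-scoring model. The sketch below is a toy illustration of that general approach; the factor names, weights, scales, and bid data are all invented, and the actual OM's inputs and algorithm are not part of the public record:

```python
# Toy sketch of a weighted-score "best value" ranking, loosely modeled on
# the OM description above. Factor names, weights, and bids are invented
# for illustration; the real OM's inputs and logic are not public.

def best_value(bids, tech_weights, price_weight=0.3):
    """Rank bids for a single line item by combined weighted score.

    bids: list of dicts with "offeror", "price", and per-factor ratings
          on a 0-100 scale.
    tech_weights: factor name -> weight; the technical weights here sum
          to 0.7 versus 0.3 for price, mirroring a solicitation in which
          non-price factors are "significantly more important than price".
    """
    max_price = max(b["price"] for b in bids)

    def combined(b):
        technical = sum(w * b["factors"][f] for f, w in tech_weights.items())
        # Cheaper offers earn a higher price score (also on a 0-100 scale).
        price_score = 100 * (1 - b["price"] / max_price)
        return technical + price_weight * price_score

    return sorted(bids, key=combined, reverse=True)

bids = [
    {"offeror": "A", "price": 9_500_000,
     "factors": {"past_performance": 90, "equipment": 85}},
    {"offeror": "B", "price": 8_200_000,
     "factors": {"past_performance": 70, "equipment": 75}},
]
tech_weights = {"past_performance": 0.4, "equipment": 0.3}  # sums to 0.7
ranking = best_value(bids, tech_weights, price_weight=0.3)
print([b["offeror"] for b in ranking])  # → ['A', 'B']
```

Note what such a model produces: a ranked list of numbers, not a narrative explaining why one offeror's technical edge justifies its higher price. Whether that numeric output, standing alone, satisfies the documentation requirement is the question at the heart of the protest.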

Approximately eleven months after the solicitation was issued, and after discussions with offerors, Croman was notified that it was not successful.  It filed a protest with the GAO in January 2012. Two other unsuccessful offerors also filed protests. In response, the Forest Service elected to take corrective action, which resulted in the dismissal of all protests by the GAO.

As is often the case, the agency’s re-evaluation did not result in a different award decision. The Forest Service explained: “On each of the previous OM summaries we have performed an abundance of confirmation checks to ensure the program is optimizing the inputs and providing the overall “Best Value” to the agency.  This OM for Large Fire Support has been no different in fact we have re-checked the inputs and outputs to ensure the program is working as expected and reconfirmed its application as being a valid tool.”  The Forest Service thus concluded:

The recommendations should be awarded, as modeled, without necessitating any human element changes.

Croman filed a second protest with the US Court of Federal Claims in February 2012, arguing “that many of the errors allegedly committed by the Forest Service in the initial evaluations and initial best-value tradeoff determinations were repeated during the corrective action.”  The Court of Federal Claims rejected these arguments, and Croman appealed to the Federal Circuit, asserting (among other things) that the Court of Federal Claims had erred in its determination that the agency made a proper tradeoff analysis.

Croman specifically asserted that the agency failed to follow FAR 15.308, which states that “the source selection decision shall be documented, and the documentation shall include the rationale for any business judgments and tradeoffs made or relied on by the SSA, including benefits associated with additional costs.” Croman argued that the record contained no declarations or the like by the SSA as to the relative strengths found in any proposal, “let alone whether these relative strengths were worth paying hundreds of thousands or even millions more to obtain” (perhaps suggesting that Croman’s price was lower than that of the awardees).  The Forest Service responded that its point-by-point comparisons for individual line items were sufficient to make and justify the tradeoffs underlying the re-evaluation and re-award decisions.

The Federal Circuit sided with the Forest Service.  The Court explained that the Forest Service’s documentation of the award “conveyed as much information, if not more” than Croman believed was required under the FAR.  Because the OM provided a “mathematical solution” that recommends the award based on relevant bid data, the Court reasoned, the OM “takes into consideration in its analysis the type of detail that is warranted in these cases.” Consequently, the Court concluded that the Forest Service’s decision not to award to Croman had a rational basis.

This decision is, indeed, unfortunate.  While computer modeling and analysis may be useful to the source selection team in certain complex acquisitions, best-value award decisions ultimately require the exercise of judgment (by a human) to determine whether technical superiority is worth a higher price.  What is more, the record must be sufficiently detailed to justify the award and to permit an evaluation of whether the award decision has a rational basis.  This case appears to establish that an agency may discharge its duty to exercise rational judgment through the use of quantitative computer modeling, which, unsurprisingly, leaves little room for a challenge if the agency merely follows its own model.  And this presumes that the quantitative record is sufficiently detailed to support such a challenge.  For instance, in the absence of an explanation, how would one challenge whether the inputs and outputs have been “optimized,” or whether the program is a “valid tool” for assessing best value?

The rule should not be that, if the agency follows its own model, then the award decision must have a rational basis.  Rather, if the agency intends to delegate significant source selection responsibility to a computer model that generates primarily quantitative results, the agency should be required to demonstrate that the use of the model has a rational basis.  Only then can there be a meaningful assessment of whether the tradeoffs were performed and whether the agency has, in fact, made a decision reflecting best value.

This decision also addressed a claim regarding bad faith, which we shall address in an upcoming post.