FEMME COMP INC. v. USA, No. 1:2008cv00409 - Document 115 (Fed. Cl. 2008)

Court Description: PUBLISHED OPINION (originally filed under seal on September 12, 2008). Signed by Judge Margaret M. Sweeney. (kb1)

FEMME COMP INC. v. USA Doc. 115

In the United States Court of Federal Claims

Nos. 08-409C, 08-419C, 08-432C, 08-454C, and 08-474C
(Filed Under Seal: September 12, 2008)
(Reissued: September 30, 2008)*

FEMME COMP INC., TECHNICAL AND PROJECT ENGINEERING, LLC, L-3 SERVICES, INC., DATA SYSTEMS ANALYSTS, INC., and BEARINGPOINT, INC., Plaintiffs, v. THE UNITED STATES, Defendant, and SAVANTAGE FINANCIAL SERVICES, INC. and BOOZ ALLEN HAMILTON INC., Defendant-Intervenors.

Postaward Bid Protest, 28 U.S.C. § 1491(b); Cross-Motions for Judgment on the Administrative Record, RCFC 52.1; Permanent Injunction; Standing; Evaluation Scheme; Competitive Range; Discussions; Proposal Evaluations; Adjectival Ratings; Quantity and Quality of Evaluated Strengths; Crosswalks; Parent and Affiliate Corporations; Small Business Participation; Buying-in; Price Reasonableness; Best Value Tradeoffs; Weight of Evaluation Factors; Elevation of the Price Factor

Terrence M. O’Connor, Arlington, VA, for Femme Comp Inc.; James S. DelSordo, Manassas, VA, for Technical and Project Engineering, LLC; W. Jay DeVecchio, Washington, DC, for L-3 Services, Inc.; Holly E. Svetz, Vienna, VA, for Data Systems Analysts, Inc.; and Kevin P. Connelly, Washington, DC, for BearingPoint, Inc., for plaintiffs.

Kenneth D. Woodrow, United States Department of Justice, Washington, DC, for defendant.

Timothy Sullivan, Washington, DC, for Savantage Financial Services, Inc., and Marcia G. Madsen, Washington, DC, for Booz Allen Hamilton Inc., for defendant-intervenors.

* This reissued Opinion and Order incorporates the redactions proposed by the parties on September 26, 2008. Where one party proposed a broader redaction than another party, the court chose to make the broader redaction in all but a handful of instances. Moreover, the court redacted the awardees’ prices as requested by defendant and Booz Allen Hamilton Inc. Redactions are either indicated with a bracketed ellipsis (“[. . .]”) or clearly designated “redacted.”

OPINION AND ORDER

SWEENEY, Judge

This consolidated postaward bid protest comes before the court on the parties’ cross-motions for judgment on the administrative record, plaintiff Femme Comp Inc.’s motion to supplement the administrative record, defendant’s motion to strike, and defendant-intervenor BearingPoint, Inc.’s motion to strike. Plaintiffs Femme Comp Inc. (“Femme Comp”), Technical and Project Engineering, LLC (“TAPE”), L-3 Services, Inc. (“L-3 Services”), Data Systems Analysts, Inc. (“Data Systems”), and BearingPoint, Inc. (“BearingPoint”) protest the Army Contracting Agency’s award of five contracts pursuant to the Program Management Support Services 2 acquisition. Two of the five successful offerors, Savantage Financial Services, Inc. (“Savantage”) and Booz Allen Hamilton Inc. (“Booz Allen”), have intervened in defense of the award. For the reasons set forth below, the court denies as moot Femme Comp’s motion to supplement, grants defendant’s motion to strike, grants BearingPoint’s motion to strike, denies the motions for judgment on the administrative record filed by Femme Comp and TAPE, grants the cross-motion for judgment on the administrative record filed by Savantage, and grants in part and denies in part the motions for judgment on the administrative record filed by L-3 Services, Data Systems, BearingPoint, defendant, and Booz Allen.
Due to the length of this opinion, the court provides the following table of contents:

I. BACKGROUND
   A. The Procurement Process
      1. The Solicitation
         a. The Content of the Proposals
         b. Evaluation of the Proposals
      2. The Source Selection Plan
         a. Source Selection Organization
         b. Source Selection Process
         c. Source Selection Ratings
      3. Initial Proposals, the Competitive Range, and Discussions
      4. Final Proposals
      5. Contract Award and Debriefing
   B. Procedural History
      1. Protests Before the Army and the Government Accountability Office
      2. Proceedings Before the Court of Federal Claims
II. LEGAL STANDARDS
   A. Bid Protests
   B. Judgment on the Administrative Record
III. DISCUSSION
   A. The Army’s Evaluation Scheme
   B. The Competitive Range
      1. The Army Considered Price When Establishing the Competitive Range
      2. Femme Comp’s Evaluated Deficiency Did Not Lead the Army to Exclude Femme Comp’s Proposal From the Competitive Range
      3. The Army Properly Evaluated Femme Comp’s Proposal
      4. Because the Army Properly Excluded Femme Comp’s Proposal From the Competitive Range, Femme Comp Lacks Standing to Protest the Army’s Contract Awards
   C. The Army’s Discussions With the Offerors
      1. Discussions With TAPE
      2. Discussions With Data Systems
      3. Discussions With BearingPoint
      4. The Army Conducted Meaningful Discussions
         a. The Army’s Performance Risk Discussions Were Proper
         b. The Army’s Price Discussions Were Proper
   D. The Army’s Evaluation of the Nonprice Factors
      1. The Army Properly Evaluated TAPE’s Technical Proposal
      2. The Army Properly Considered the Strengths of Subcontractors
      3. The Army Improperly Evaluated the Offerors’ Program Managers’ Onsite Decision-making Authority
      4. The Army Was Not Required to Perform Crosswalks Between the Offerors’ Proposed Labor Categories and the Labor Categories Described in the Solicitation
      5. The Army Properly Evaluated Systems Research’s Parent and Affiliate Corporations
      6. The Army Improperly Evaluated the Small Business Participation Factor
      7. The Army Properly Evaluated Data Systems’ 2006 Small Business Participation
   E. The Army’s Evaluation of the Price Factor
      1. The Army Was Not Required to Take Additional Action to Mitigate the Potential for Buying-in
      2. BearingPoint’s Assertion That the Army Was Required to Take Additional Action to Assess Price Reasonableness Is Untimely
      3. The Army Was Not Required to Perform Crosswalks Between the Offerors’ Proposed Labor Rates and Their Technical Volumes
      4. BearingPoint’s Contention That the Army’s Use of the Highest Proposed Labor Rates to Calculate the Offerors’ Total Prices Is Untimely
   F. The Army’s Best Value Tradeoffs
      1. Tradeoffs Concerning L-3 Services
         a. L-3 Services’ Arguments
         b. Defendant’s Arguments
      2. Tradeoffs Concerning Data Systems
      3. Tradeoffs Concerning BearingPoint
      4. The Source Selection Authority’s Best Value Tradeoffs Were . . .
   G. Plaintiffs Have Established Prejudice on Behalf of the Unsuccessful Offerors Competing Under the Unrestricted Portion of the Solicitation
   H. Award of a Permanent Injunction
      1. Success on the Merits
      2. Irreparable Injury
      3. Balance of Hardships
      4. Public Interest
IV. CONCLUSION

I. BACKGROUND1

In Part I.A of this opinion, the court generally describes the procurement process from the issuance of the solicitation on June 8, 2007, through the award of the contracts on March 20, 2008.
Additional, more specific facts, identified in the parties’ motions and responses, will be described within the court’s discussion in Part III. In Part I.B, the court describes the procedural history before the United States Government Accountability Office (“GAO”) and the United States Court of Federal Claims (“Court of Federal Claims”). A. The Procurement Process 1. The Solicitation On June 8, 2007, the Army Contracting Agency–Information Technology, E-Commerce & Commercial Contracting Center (“Army”) issued solicitation number W91QUZ-07-R-0007 for the Program Management Support Services 2 acquisition. AR 48, 53. The acquisition concerned “the full range of information technology-related program management support services” required by the Office of the Army Chief Information Officer/G-6 (“CIO/G-6”) and the Program Executive Office, Enterprise Information Systems (“PEO EIS”), “in support of various missions and of command, control, and communications activities.” Id. at 53. In particular, the solicitation enumerated nine functional areas in the Performance Work Statement: Program/ 1 The court derives the facts solely from the administrative record (“AR”). -4- Project/Product Management; Business Process Reengineering; Information Systems Security; Contingency Planning; Physical Security; Enterprise Design, Integration, and Consolidation; Systems Operation and Maintenance; Integrated Logistics Support; and other Information Technology-related Services. Id. at 54. Offerors were required to “be capable of performing ALL tasking listed under the nine Functional Areas . . . .” Id. at 343; see also id. at 200 (defining “offeror” as “the combined contractor/subcontractor team”), 272 (noting that “[t]he offerors’ team[s] must be capable of performing ALL nine functional areas”), 363 (requiring offerors to indicate which “Functional Area(s) the Offeror, or its subcontractor(s)[,] will be performing . . . .”). The acquisition was a follow-on to the Program Management Support Services Blanket Purchase Agreements, which were to expire on September 3, 2008. Id. at 53. L-3 Services and BearingPoint were the incumbent contractors. Id. The Army intended to award five Indefinite Delivery, Indefinite Quantity (“IDIQ”) contracts pursuant to the solicitation. Id. at 359, 378. Three of the contracts were to be awarded without regard to the size of the offeror. Id. at 378. The other two contracts were to be awarded to small business offerors. Id. The Army reserved the right to increase or decrease the number of contracts it awarded. Id. The contracts were to have a base period of one year, with four oneyear option periods. Id. at 50. Upon award of the contracts, the Army anticipated issuing Firm Fixed Price and Time and Materials task orders to procure the services of the contract awardees. Id. at 50, 54. The contracts had a guaranteed minimum of $20,000 each, and the total amount of all task orders under the contracts was not to exceed [. . .].2 Id. at 50. a. The Content of the Proposals Prospective offerors were required to submit their proposals in six separate volumes, with the following titles: (I) Corporate Capability; (II) Performance Risk; (III) Technical/ Management; (IV) Price; (V) Small Business Participation; and (VI) Contract. Id. at 348-49. Offerors were required to “present all information relevant to the factor/subfactor in the appropriate section,” id. at 348, and were further cautioned that “[i]nformation shall be confined to the appropriate volume to facilitate independent evaluation,” id. 
at 347. Moreover, offerors 2 The Army took the contract ceiling price [. . .] from the Independent Government Estimate, which was “based on price information, including labor hours and labor rates, from current and previous PMSS task orders, GSA schedules contracts and other contracts which program management offices have utilized to satisfy their requirements,” as well as “projected expenditures based on the level of effort from previous contracts.” AR 5; see also id. at 7 (“[T]he Independent Government Estimate (IGE) was developed using historical data and will be used to determine if labor rates are significantly high. . . . The IGE was developed using actual rates and labor hours currently being performed by the CIO/G-6 and the PEO EIS.”). Labor hours were “based on estimated future needs as compared to existing task order requirements and/or estimates from knowledgeable personnel and existing inhouse contracts.” Id. at 5. Further, the contract ceiling price included a ten percent contingency factor “to facilitate unforeseen requirements.” Id. -5- were cautioned to provide accurate information because if the Army identified or discovered “that the proposal information [was] not accurate or misrepresent[ed] the Offeror’s status or capabilities, that information [might] be used by the contracting officer as part of the Offeror’s responsibility determination and could result in the Offeror not being eligible for award.” Id. at 343. Volume I, Corporate Capability, had two subsections. Within the first subsection, Knowledge of and Experience With [Department of Defense (“DoD”)]/Army Acquisition Policies and Processes (“Knowledge and Experience”), an offeror was to address “[i]ts knowledge of and experience with DoD and Army acquisition policies and processes such as the DoD 5000 series, information assurance (IA), Independent Validation and Verification (IV&V), [and] Department of Defense Information Technology Security Certification and Accreditation Process/Department of Defense IA Certification and Accreditation Process (DITSCAP/ DIACAP).” Id. at 349-50. Within the second subsection, Performance Capability and Resources to Support PEO EIS and CIO/G-6 Programs (“Performance Capability”), an offeror was to address its (1) “corporate capability to provide [program management] support for the nine (9) functional areas as listed in the [Performance Work Statement] and for the PEO EIS and CIO/G6 programs described in the [Performance Work Statement] . . . , to include the Offeror’s worldwide capability to support contingencies, deployments, exercises, and other emergencies”; (2) “current corporate quality business processes/certifications . . . [,] identify[ing] the source of the award or certification in a manner verifiable by the Government”; and (3) “resources currently available, to include back reach capability, to support task orders, including management of simultaneous task orders . . . .” Id. at 350. When describing its capability in the nine functional areas, an offeror was to use the table provided in an appendix to the solicitation, which required an offeror to designate which functional areas it would perform and which functional areas its subcontractor or subcontractors would perform. Id. at 350, 363. Offerors were to include, among other things, two categories of documents in Volume II, Performance Risk. First, an offeror was required to “submit three relevant . . . 
contracts under which the Offeror was responsible for or performed program management and at least two of the other functional areas as listed in the Performance Work Statement.” Id. at 351. The offeror must have had at least nine months of experience on the contract as the prime contractor, subcontractor, or teaming partner to use it as a reference, and the “[c]umulative annual dollar value of task orders against any single contract” must have been at least $4 million. Id. at 35152. Further, the offeror must have served as the prime contractor on at least one of the contracts. Id. at 352. Second, an offeror was to “submit two relevant . . . contract references for each Major Subcontractor[] . . . that will be performing the functional areas as listed in the Performance Work Statement . . . .” Id. at 351. A Major Subcontractor was “a subcontractor that [was] anticipated to perform 20% of the total contract earned revenue.” Id. at 350. The Major Subcontractor must have been “responsible for, or performed, two or more of the functional areas listed in the Performance Work Statement” and must have had at least nine months of experience on the contract as the prime contractor, subcontractor, or teaming partner to use it as a reference. Id. at 352. As part of the information submitted concerning its own contracts and those of its -6- Major Subcontractors, an offeror was required to provide up to five points of contact for each contract. Id. at 351. The offeror was to provide each of those points of contact with a specified past performance questionnaire and ensure that the questionnaire was returned to the Army for evaluation. Id. at 353. Volume III, Technical/Management, had two subsections. The first subsection, Understanding of the [Program Management] Support Required (“Understanding Support”), obliged an offeror to address its “overall program management approach to demonstrate its understanding of the scope and support required for the PEO EIS and CIO/G-6 programs.” Id. at 354. The second subsection, Program Management Plan, required an offeror to address its (1) “program management structure and the tools to be used for management of the support functions”; (2) “approach to identify, track, monitor, manage, and mitigate the risk associated in program management support”; (3) “methodology for recruitment, training, and retention of qualified and security-cleared personnel”; (4) “approach to manage and coordinate subcontractor efforts”; and (5) “monetary disincentives . . . that are to be applied when the contract performance metrics are not met.” Id. The contract performance metrics included: (1) meeting or exceeding small business participation goals ninety-five percent of the time; (2) participation in at least fifty percent of the task order competitions; and (3) receiving ninety-five percent of comments from clients expressing overall satisfaction. Id. at 71. To assist offerors in submitting Volume IV, Price, the Army provided two partially completed spreadsheets with the solicitation. See id. at 354. One spreadsheet contained a Master Labor Rate Table, in which an offeror was to provide all “fully loaded labor rates proposed by the Offeror’s team, both prime and subcontractor rate(s), for each labor category . . . .” Id. at 355. A fully loaded labor rate was to “include all direct, indirect, general and administrative costs and profit associated with providing the required skill.” Id. at 51. 
Offerors were directed to use the labor category descriptions included as an attachment to the solicitation to assist them in determining the proper labor rates to include in the table. Id. at 354; see also id. at 123-38 (Labor Category Descriptions). The second spreadsheet contained a Price Model, and offerors were required to enter, among other data, the highest proposed fully loaded labor rate, whether it was the prime or subcontractor rate, for each listed labor category.3 Id. at 354-55. The spreadsheet would then calculate, based on a predetermined number of labor hours for each labor category, id. at 109-10, “the extended proposed price for the base year,” id. at 108. The spreadsheet would also calculate, “using a predetermined 3.2% escalation rate,” the extended proposed price for each subsequent year and the total proposed labor price for the entire five-year term of the contract. Id. The labor rates were fixed for each year of the contract, but successful offerors could, “throughout the life of the contract, propose additional labor categories, rates and descriptions,” and if the Army determined that the proposed new categories, rates, and descriptions were “fair and reasonable,” it would incorporate them into the Master Labor Rate Table. Id. at 51. 3 There were ninety-six listed labor categories for which offerors were to propose government-site labor rates and contractor-site labor rates. AR 109-10. -7- All offerors, regardless of their size, were required to submit Volume V, Small Business Participation. Id. at 356. Offerors were required to have overall small business participation of at least twenty percent. Id. Offerors were also presented with the following “non-mandatory objectives for participation”: Small Disadvantaged Business participation of six percent; Woman-Owned Small Business participation of five percent; Service-Disabled Veteran-Owned Small Business participation of three percent; Veteran-Owned Small Business participation of three percent; HUBZone Small Business participation of one percent; and a positive goal for Historically Black Colleges and Universities/Minority Institution (“HBCU/MI”) participation. Id. Offerors were instructed to identify the names of the qualifying small businesses and the supplies or services that they would provide. Id. Further, large business offerors were “required to submit the small business goals for fiscal year 2006 and the goals for fiscal year 2006 that were actually achieved” in a table provided in an appendix to the solicitation. Id. at 356; see also id. at 377 (containing the required table that allowed the offeror to input its “Fiscal Year 2006 Small Business Subcontracting Goal” and “Fiscal Year 2006 Small Business Subcontracting Actually Achieved” for each of the seven small business participation categories). In Volume VI, Contract, offerors were required to submit the contractual documents, corporate agreements, and, for large business offerors, small business subcontracting plans. Id. at 349, 357-58. The contractual documents included a certification of requirements, to be signed by the offeror, indicating that the offeror’s proposed labor categories met the requirements of the descriptions provided in the solicitation and that the proposed labor rates were “at or below the offeror’s, or its proposed subcontractor’s, commercial list price, GSA Schedule price, or the Schedule B price in a competitively awarded Federal contract.” Id. at 341-42, 357. 
In addition, the offeror was required to provide its Taxpayer Identification Number and the Taxpayer Identification Number of the offeror’s common parent, if any. Id. at 336-37. The corporate agreements were described in another part of the solicitation and consisted of two nondisclosure agreements with the two support contractors that the Army had retained to provide “acquisition support during the source selection process.” Id. at 345. The support contractors were identified as RTEC Services LLC and Acquisition Solutions, Inc. Id. b. Evaluation of the Proposals The Army indicated that it would award the contracts “in accordance with the competitive negotiation source selection procedures contained in” part fifteen of the Federal Acquisition Regulation (“FAR”), “through full and open competition,” to the offerors that submitted proposals that were the “most advantageous” and represented the “best value” to the government. Id. at 378. The Army intended to award the contracts without discussions. Id. at 343, 378. However, the Army indicated that it might seek clarifications pursuant to FAR section 15.306(a), id. at 378, and that it reserved the right to conduct discussions with the offerors, if necessary, id. at 343, 378; see also id. at 346 (“The Government may also hold discussions during proposal evaluations. . . . The Army may issue written Items for Negotiations . . . .”). Further, the Army -8- indicated that it might establish a competitive range for the purposes of efficiency pursuant to FAR sections 15.306(c)(2) and 52.215-1(f)(4). Id. at 378. The Army stated that “[a]ll award decisions [would] be based upon the evaluation of each Offeror’s complete proposal against the evaluation criteria . . . .” Id. at 379. The evaluation factors mirrored the subject matter of the first five volumes of the offerors’ proposals: (1) Corporate Capability; (2) Performance Risk; (3) Technical/Management; (4) Price; and (5) Small Business Participation. Id. Similarly, the first and third evaluation factors each had two subfactors that corresponded with the subections to be presented in the proposals. Id. The subfactors for the first evaluation factor–Corporate Capability–were (1) Knowledge and Experience and (2) Performance Capability. Id. The subfactors for the third evaluation factor– Technical/Management–were (1) Understanding Support and (2) Program Management Plan. Id. The Army then described, in more detail, the criteria within each factor and subfactor that it planned to evaluate. Id. at 380-83. Within the first Corporate Capability subfactor, the Army would assess an offeror’s “knowledge of and experience with DoD and Army acquisition policies and processes . . . to assess the Offeror’s ability to manage and support information technology programs.” Id. at 380. Within the second Corporate Capability subfactor, the Army would assess an offeror’s (1) “corporate capability to provide [program management] support for the nine (9) functional areas as listed in the [Performance Work Statement] and for the PEO EIS and CIO/G-6 programs described in the [Performance Work Statement,] . . . to include its worldwide capability to support contingencies, deployments, exercises, and other emergencies”; (2) “current corporate quality business processes/certifications”; and (3) “qualified resources currently available[,] to include back reach capability[,] to support task orders, as well as manage simultaneous task orders . . . .” Id. 
Next, within the Performance Risk factor, the Army would assess “the quality and relevancy of the Offeror’s past performance, as well as that of its proposed major subcontractor(s), as it relates to the probability of successful accomplishment of the required effort.” Id. at 381. The Army would “focus its inquiry on the past performance of the Offeror and its proposed major subcontractor(s), as it relates to the solicitation requirements,” including “aspects of cost, schedule, and performance.” Id. The Army indicated that it would also consider the similarity of the reference contracts to the instant solicitation. Id. Other possible “important consideration[s]” identified by the Army were “[r]ecognition for quality performance, . . . problem[s], or lack of relevant data in any element of the work” with the reference contracts. Id. The Army warned that it might consider past performance data obtained from sources not identified in an offeror’s proposal and that it might not “contact all of the sources provided by an Offeror.” Id. Then, within the first Technical/Management subfactor, the Army would assess an offeror’s “overall program management approach” to determine the offeror’s “understanding of the scope and support required for the PEO EIS and CIO/G-6 programs.” Id. Within the second Technical/Management subfactor, the Army would assess an offeror’s (1) “program management -9- structure and the tools to be used for management of the support functions”; (2) “approach to identify, track, monitor, manage, and mitigate the risk associated in program management support”; (3) “methodology for recruitment, training, and retention of qualified and securitycleared personnel”; (4) “approach to manage and coordinate subcontractor efforts”; and (5) “proposed disincentives . . . .” Id. at 381-82. For the Price factor, the Army would assess the Total Contract Life Price, which it defined as the sum of the price for the base year and the price for each of the four option years, adjusted, as necessary pursuant to FAR section 52.219-4, to include a preference for HUBZone small business concerns. Id. at 382. The Army would also ensure that the proposals did not contain unbalanced pricing pursuant to FAR section 15.404-1(g), and indicated that proposed labor rates “certified either at or below the offeror’s, or its proposed subcontractor’s, commercial list price, GSA Schedule price, or the Schedule B price in a competitively awarded Federal contract [would] not be deemed to constitute a concern under FAR 15.404-1(g).”4 Id. Finally, for the Small Business Participation factor, the Army would assess “the Offeror’s approach to meeting or exceeding the Government’s minimum mandatory requirement for overall Small Business participation and the non-mandatory objectives . . . for participation by Small Disadvantaged Business, Woman-Owned Small Business, Service-Disabled VeteranOwned Small Business, Veteran Owned Small Business, HUBZone Small Businesses, and HBCU/MI in the performance of the . . . contract.” Id. The Army indicated that it would consider “[t]he amount by which the proposed percentage exceeds the overall small business category percentage and the amount by which the proposed percentages exceed the small business subcategory percentages . . . .” Id. 
The Army would also assess (1) “the extent to which [small businesses] are specifically identified in the proposals; (2) “the extent of commitment to use [small businesses]”; (3) “the extent of participation of [small businesses] in terms of the value of the total acquisition; and (4) for the large business offerors, small business participation goals for fiscal year 2006 for the DoD and other federal government agencies, as well as the “actual goals achieved.” Id. In addition to describing the specific criteria within each factor and subfactor it would be evaluating, the Army identified the relative importance of the factors and subfactors: Factor 1 is more important than Factor 2. Factor 2 is more important than Factor 3. Factor 3 is more important than Factor 4. Factor 4 is more important than Factor 5. While the Price Factor [(i.e., Factor 4)] will be an important part of the integrated selection decision, the non-price evaluation factors, collectively, are significantly more important than the Price factor. Under Factor 1, Subfactor 1 is more important than Subfactor 2. Under Factor 3, the Subfactors are comparatively equal in importance. 4 [. . .]. -10- Id. at 379. “Significantly more important” meant that the factor was “substantially more important” and would be “given far more consideration” than another factor. Id. at 380. “More important” meant that the factor was “greater in value,” but not as highly valued as a “significantly more important” factor, and would be “given more consideration than another” factor. Id. “Comparatively equal” meant that the factors were “nearly the same in value” and that “any difference [was] very slight.” Id. The Army then made two specific statements about the Price factor. First, the Army “cautioned that an award may not necessarily be made to the lowest priced Offeror” and that “if the non-Price factors are evaluated as comparatively equal between two or more Offerors, Price may become a determinative factor.” Id. at 379. Second, the Army indicated that it would weigh, but not rate, the Price factor. Id. Thus, because Price was not rated, it would not be “scored or assigned an adjectival rating.” Id. Notably, the Army did not describe in the solicitation how it planned to derive the adjectival ratings of the other, nonprice factors or by what process it planned to use to compare the submitted proposals. 2. The Source Selection Plan The Source Selection Plan, prepared by the PEO EIS and the Information Technology, ECommerce & Commercial Contracting Center (“ITEC4”), id. at 385, 387, established the process to be followed by the Army in awarding the contracts, id. at 389. In particular, the Source Selection Plan described “the evaluation process, methodology, and techniques to be used for the source selection decision” and “define[d] the roles and responsibilities of the source selection organization.” Id. The ultimate goal was to “[f]acilitate an impartial, equitable and thorough evaluation of all offerors’ proposals.” Id. The provisions in the solicitation concerning the content and evaluation of proposals were incorporated by reference into the Source Selection Plan. Id. at 391. a. Source Selection Organization The source selection organization was to be composed of the Source Selection Authority, the Source Selection Advisory Council, the Source Selection Evaluation Board, and the Procuring Contracting Officer. Id. at 392. 
The Source Selection Authority would be “responsible for the proper and efficient conduct of the entire source selection process encompassing proposal solicitation, evaluation, selection and contract award,” and would bear the responsibility to “select the source for award . . . .” Id. In addition, among other responsibilities, the Source Selection Authority would be required to (1) “[e]nsure consistency among the [solicitation] requirements, notices to offerors, proposal preparation instructions, evaluation factors and subfactors, [solicitation] provisions or contract clauses and data requirements”; (2) “[e]nsure that the [Source Selection Plan] and evaluation criteria [were] consistent with the requirements of the solicitation and applicable regulations”; (3) “[e]nsure that proposals [were] evaluated based solely on the factors and subfactors contained in the [solicitation], and that the evaluation of proposals [were] consistent with the [Source Selection -11- Plan] and the requirements of the solicitation”; (4) “[e]nsure that the source selection process [was] conducted in accordance with applicable laws and regulations”; (5) “[a]pprove the contracting officer’s competitive range determination”; and (6) “document the supporting rationale in a source selection decision document” upon selecting a source for the award. Id. at 392-93. The Source Selection Advisory Council, which was to be composed of “senior Government personnel, including a person from ITEC4,” would report to the Source Selection Authority. Id. at 393. At the direction of a chairperson, the Source Selection Advisory Council was intended to, among other things, (1) “validate the strengths, weaknesses and deficiencies” identified in the initial and subsequent reviews of the proposals by the Source Selection Evaluation Board “prior to or concurrent with the [Source Selection Authority] review”; (2) “review and provide comments to the [Source Selection Authority] on the contracting officer’s” competitive range determination; and (3) “[i]dentify discriminating factors amongst offers to aid the [Source Selection Authority] in the selection process.” Id. at 394. The Source Selection Evaluation Board was tasked with the responsibility, at the direction of a chairperson, “for the conduct of a comprehensive and integrated evaluation of proposals in an impartial and equitable manner, and the production of summary facts and findings required in the conduct of the source selection process.” Id. Along with an overall chairperson, the Source Selection Evaluation Board was to be comprised of chairpersons for each factor and subfactor and a “team of evaluators.” Id.; see also id. at 410-11 (identifying the evaluation team members). The evaluation team members included employees of Acquisition Solutions, Inc. Id. at 410-11. The Source Selection Evaluation Board was to report to the Source Selection Advisory Council and, “as directed,” to the Source Selection Authority. Id. at 394. The chairperson of the Source Selection Evaluation Board was tasked with, among other things, (1) “[prescribing] the specific evaluation and rating procedures, and the methods by which an overall assessment [was] achieved, ensuring that they [were] consistent with the procedures and methods set forth in” the Source Selection Plan; (2) structuring the procedures “to identify the significant strengths, weaknesses, and risks associated with each offer to make it easier to distinguish significant differences between offers”; and (3) “[ensuring] . . . 
a uniform approach in the rating effort.” Id. at 395. The factor and subfactor chairpersons were to ensure that the ratings were “established as a result of a consensus of the evaluators, and not by vote.” Id. at 396. Further, for the factors that had subfactors, the factor chairperson was to receive input from each factor evaluator, “consider this input, obtain a consensus of the factor evaluators if they [had] the requisite knowledge across the factor, and assign the factor rating.” Id. At the conclusion of the evaluation process, the factor chairpersons were directed to provide the factor and subfactor ratings, along with a narrative summaries, to the chairperson of the Source Selection Evaluation Board. Id. -12- b. Source Selection Process Upon receipt of the proposals, the evaluators of the Source Selection Evaluation Board were to “read their applicable sections and determine if there [was] adverse past performance information, or ambiguities in the proposal or other concerns (e.g., deficiencies, weaknesses, errors, omissions, or mistakes).” Id. at 403. The evaluators were directed to “immediately report[]” to the contracting officer “[m]ajor problems, such as multiple offerors failing to meet a particular requirement.” Id. They were further directed to report their “[o]ther concerns . . . as items for negotiation.” Id. After receiving such reports from the evaluators, the contracting officer was charged with determining whether to seek “clarifications” from, or conduct “communications” with, the affected offeror. Id. According to the Army’s definitions in the Source Selection Plan: “Clarifications” are limited exchanges between the Government and offerors that may occur when award without discussions is contemplated. Offerors may be given the opportunity to clarify certain aspects of proposals . . . or to resolve minor or clerical errors. “Communications” are exchanges between the Government and offerors for the purpose of determining whether a proposal should be placed in the competitive range. “Communications” regarding adverse past performance information will be conducted with offerors if that information is the determining factor preventing them from being placed in the competitive range. “Communications” will also be conducted with offerors whose exclusion from, or inclusion in, the competitive range is uncertain. Id. The Army also indicated that “Item for Negotiation” forms were not to be used for “clarifications” or “communications.” Id. at 403, 415-16. Next, the evaluation teams for each factor and subfactor were required to “convene to discuss the offerors’ proposals” and “share their views on the offeror[s’] strengths, weaknesses, risks and deficiencies related to their assigned evaluation factor(s)/subfactor(s) and to reach a final rating for each factor and subfactor.” Id. at 403; accord id. at 398. Then, the Source Selection Evaluation Board was to “identify and document proposal deficiencies, strengths, weaknesses, risks, and associated items for negotiation” and “assign the appropriate rating to each factor and subfactor . . . .” Id. at 403-04. The factor chairpersons were to forward the assigned ratings and associated narrative summaries to the chairperson of the Source Selection Evaluation Board, who would then review the ratings and summaries, prepare the Initial Evaluation Report, and forward the Initial Evaluation Report to the Source Selection Advisory Committee and the Source Selection Authority. Id. at 398, 404. 
The contracting officer was also to receive a copy of the evaluation results so that he could establish, with the approval of the Source Selection Authority, “a competitive range comprised of all the most highly rated proposals.” Id. -13- After the contracting officer established the competitive range, the Army planned on conducting discussions with those offerors within the competitive range. Id. But see id. at 343 (indicating, in the solicitation, that the Army intended to award the contracts without discussions), 378 (same). “Discussions are exchanges between the Government and offerors within the competitive range that are undertaken with the intent of allowing the offeror to revise its proposal.” Id. at 404. To initiate discussions with offerors within the competitive range, the Source Selection Evaluation Board was to complete forms indicating an offeror’s “deficiencies, weaknesses, errors, omissions, mistakes, ambiguities, adverse past performance information or any aspect of the proposal that could be altered or explained to materially enhance the proposal’s potential for award,” and then “forward these items” to the offeror. Id. The Source Selection Evaluation Board was then required to evaluate the offerors’ responses and proposal revisions and document the evaluation results in a supplemental report, referred to as the Interim Evaluation Report, that addressed each factor and subfactor. Id. at 398, 404. The Source Selection Evaluation Board would then present its findings to the Source Selection Advisory Committee and the Source Selection Authority. Id. at 398. At any time during discussions, an offeror could be eliminated from the competitive range if the offeror was “no longer considered to be among the most highly rated offerors.” Id. at 398, 404. When discussions were complete, the contracting officer was instructed to set a deadline for all offerors remaining within the competitive range to submit a Final Proposal Revision. Id. The Source Selection Evaluation Board was then required to evaluate the offerors’ Final Proposal Revisions and document the results in another supplemental report, referred to as the Final Evaluation Report, that “addresse[d] each factor and subfactor.” Id. at 404. The Source Selection Evaluation Board would then present its findings to the Source Selection Advisory Committee and the Source Selection Authority. Id. at 398. “[B]ased upon a comparative assessment of proposals against all source selection criteria in the solicitation,” the Source Selection Authority was required to “make the final determination of the offeror selected for award.” Id. at 405; accord id. at 398. Although the Source Selection Authority was entitled to “use reports or analyses prepared by the [Source Selection Advisory Committee] and [Source Selection Evaluation Board], the source selection decision” was to “represent the [Source Selection Authority]’s independent judgment.” Id. at 405. 
The Source Selection Authority was specifically instructed to document the source selection decision and, in the source selection decision document, to (1) “address each evaluation factor and how the selected offeror fared with regard to each factor”; (2) “include the rationale for the selection and any business judgments and tradeoffs made or relied upon by the [Source Selection Authority]”; (3) “explain how the successful proposal compared to other offerors’ proposals based on the evaluation factors and subfactors in the solicitation”; (4) “discuss the judgment used in making tradeoffs”; (5) explain any disagreement with the findings of the Source Selection Advisory Committee or the Source Selection Evaluation Board; (6) include, if the Source Selection Authority “determine[d] that the best value proposal [was] other than the lowest-priced proposal,” a justification for the payment of a “price premium regardless of the superiority of the proposal’s non-cost rating” that identified “what benefits or advantages the Government [was] -14- receiving for the added price and why it [was] in the Government’s interest to expend the additional funds”; and (7) include, if the Source Selection Authority “determine[d] the non-cost benefits offered by the higher-priced, technically superior proposal [were] not worth the price premium, an explicit justification” for that determination. Id. After the “appropriate legal review,” the contracting officer would make the contract awards and conduct any debriefings requested by unsuccessful offerors. Id. at 398, 405. Debriefings were not to “include point-bypoint comparisons with other proposals,” but instead were to be used to describe the “specific strengths and weaknesses of the individual proposal being considered.” Id. at 405. Further, the debriefing would include “[a]n explanation of the evaluation method . . . to ensure the offeror that its proposal was treated fairly, impartially, and objectively.” Id. c. Source Selection Ratings In addition to describing the evaluation process, the Source Selection Plan identified and defined the ratings for the four nonprice factors. Id. at 398-402. A “rating” was defined as “[t]he evaluators’ conclusions (supported by narrative write-ups) identifying the strengths, weaknesses, and deficiencies of an evaluation factor or subfactor,” which was to be “expressed as an adjective.” Id. at 402. In turn, a “strength” was “[a]ny aspect of a proposal that, when judged against a stated evaluation criteria, enhance[d] the merit of the proposal or increase[d] the probability of successful performance of the contract.” Id. A “weakness” was “[a] flaw in a proposal that increase[d] the risk of unsuccessful contract performance.” Id. And, a “deficiency” was “[a] material failure of a proposal to meet a Government requirement or a combination of significant weaknesses in a proposal that increase[d] the risk of unsuccessful contract performance to an unacceptable level.” Id. To rise to the level of a “significant weakness,” a flaw must have “appreciably increase[d] the risk of unsuccessful contract performance.” Id. Similarly, to be “significant,” a strength must have “appreciably enhance[d] the merit of a proposal or appreciably increase[d] the probability of successful contract performance.” Id. The Army did not define any categories other than significant strength, strength, weakness, significant weakness, or deficiency (e.g., moderate strength, minor strength, moderate weakness, minor weakness). 
For the Corporate Capability and Technical/Management factors and subfactors, the following ratings were to be used:5 5 The court reproduced the tables describing the Army’s ratings directly from the Source Selection Plan. All other tables in this opinion were generated by the court using information taken from various portions of the administrative record. -15- Color Adjective and Definition Blue Outstanding - A proposal which satisfies all the Government’s requirements with extensive detail to indicate feasibility of the approach and shows a thorough understanding of the problems, with an overall low degree of risk in meeting the Government’s requirements. Green Good - A proposal which satisfies all the Government’s requirements with adequate detail to indicate feasibility of the approach and shows an understanding of the problems, with an overall low to moderate degree of risk in meeting the Government’s requirements. Yellow Fair - A proposal which satisfies all the Government’s requirements with minimal detail to indicate feasibility of the approach and shows a minimal understanding of the problems, with an overall moderate to high degree of risk in meeting the Government’s requirements. Orange Marginal - A proposal which barely satisfies the Government’s requirements, with indications the approach may not be feasible, and which shows a lack of understanding of the problems, with a generally high degree of risk in meeting the Government’s requirements. Pink Susceptible to Being Made Acceptable - A proposal which, as initially proposed, cannot be rated acceptable because of minor errors, omissions, or deficiencies, which are capable of being corrected without a major rewrite or revision of the proposal. Red Unacceptable - A proposal which contains major errors, omissions, or deficiencies which indicate a lack of understanding of the problems or an approach which cannot be expected to meet requirements or involves a very high risk; and these conditions can not be corrected without a major rewrite or revision of the proposal. Id. at 398-401. For the Performance Risk factor, the following ratings were to be used: Color Adjective and Definition Blue Very Low Risk - Based on the offeror’s performance record, essentially no doubt exists that the offeror will successfully perform the required effort. Green Low Risk - Based on the offeror’s performance record, little doubt exists that the offeror will successfully perform the required effort. -16- Yellow Moderate Risk - Based on the offeror’s performance record, some doubt exists that the offeror will successfully perform the required effort. Pink High Risk - Based on the offeror’s performance record, substantial doubt exists that the offeror will successfully perform the required effort. Red Very High Risk - Based on the offeror’s performance record, extreme doubt exists that the offeror will successfully perform the required effort. White Unknown Risk - No relevant performance record identifiable; equates to a neutral rating having no positive or negative evaluation significance. Id. at 399-400. For the Small Business Participation factor, the following ratings were to be used: Color Adjective and Definition Blue Outstanding - Outstanding quality; proposed participation significantly exceeds all solicitation goals; very good probability of success with overall low degree of risk in achieving extent of proposed participation. 
Green Good - High quality; proposed participation significantly exceeds some solicitation goals, and meets or exceeds all goals; good probability of success with overall low to moderate degree of risk in achieving extent of proposed participation. Yellow Fair - Adequate quality; proposed participation meets or exceeds all solicitation goals; fair probability of success with overall moderate degree of risk in achieving extent of proposed participation. Pink Poor - Poor quality; proposed participation fails to meet one or more solicitation goals; low probability of success with overall moderate to high degree of risk in achieving extent of proposed participation. Red Unacceptable - Very poor quality; proposed participation fails to meet many solicitation goals; very low probability of success with overall high degree of risk in achieving extent of proposed participation. -17- Id. at 401-02. The Source Selection Plan did not indicate any specific number of significant strengths, strengths, weaknesses, significant weaknesses, or deficiencies that would be required to qualify a factor or subfactor for a particular rating. 3. Initial Proposals, the Competitive Range, and Discussions Twenty-five offerors submitted proposals, with proposed prices ranging from [. . .].6 Id. at 5040, 5076, 10474, 12729. In addition to Femme Comp, TAPE, L-3 Services,7 Data Systems, BearingPoint, Booz Allen, and Savantage, the offerors included Systems Research and Applications Corporation (“Systems Research”); RS Information Systems, Inc. (“Wyle”);8 Binary Group, Inc. (“Binary”); [. . .]; SI International, Inc. (“SI Int’l”); [. . .]; and others. Id. at 5079-80. Of the aforementioned thirteen offerors, only TAPE, Savantage, Binary, and [. . .] submitted proposals as small businesses. Id. at 5080-81. The Source Selection Evaluation Board evaluated the proposals, documenting their strengths, weaknesses, and deficiencies, and assigned ratings. See id. at 4557-5019 (containing Consensus Evaluation Documents for the nonprice factors of each proposal). In addition, in evaluating the offerors’ proposed prices, the Price evaluation team of the Source Selection Evaluation Board took the following steps: 1) Verified if the Master Labor Rate Table was complete and accurate 2) Verified if the Price Model was complete and accurate 6 Only one proposed price was in excess of the contract ceiling price [. . .]. 7 More precisely, L-3 Communications Titan Corporation submitted a proposal. See, e.g., AR 3570 (containing Standard Form 33, “Solicitation, Offer and Award”). The offeror was a wholly owned subsidiary L-3 Communications Holdings, Inc. See id. at 3575. According to paragraph six of its complaint and the corporate disclosure statement that it filed on June 16, 2008, pursuant to Rule 7.1 of the Rules of the United States Court of Federal Claims (“RCFC”), L-3 Services is a wholly owned subsidiary of L-3 Communications Corporation, which, in turn, is a wholly owned subsidiary of L-3 Communications Holdings, Inc. Accordingly, both the offeror and L-3 Services share a common parent. 8 RS Information Systems, Inc. submitted an initial proposal in July 2007. AR 1581, 1834. On January 17, 2008, during the procurement process, RS Information Systems, Inc. was acquired by Wyle and commenced operations as Wyle Information Systems. See Wyle Completes Acquisition of RS Information Systems, http://www.wylelabs.com/news/2008/01-17. html (Jan. 17, 2008). 
Despite the acquisition, RS Information Systems, Inc.’s February 8, 2008 final proposal did not bear its new name. See AR 8747, 8875. However, on March 20, 2008, the Army awarded a contract to Wyle Information Systems, LLC, and not RS Information Systems, Inc. See id. at 10790. Because the parties generally refer to this offeror as Wyle, the court will do the same. -18- 3) Verified that the Offeror’s formulas/quantity of hours were correct by inserting offeror data into the Price Model template (i.e., make certain that Offerors had not altered the formulas, quantity of hours, or estimated costs as established by the Price Model spreadsheet) 4) Verified that the highest rate for each labor category in the Master Labor Rate Table was used in Price Model 5) Verified whether or not the proposed rates in the Price Model were certified with an “X” according to the reasonableness criteria explained below in Section 3 and indicated in the Price Model 6) Verified that all proposed subcontractors and corresponding rates were provided by cross-referencing the subcontractors identified in the corporate capability proposal with the subcontractors identified in the Master Labor Rate Table 7) Verified whether or not the Offeror’s proposed fixed rate mark-up percentages applicable to ODC’s for Travel and Materials, as required 8) Reviewed the proposal in its entirety for pricing notes, assumptions, and other applicable information Id. at 5040-41. The Price evaluation team noted some common issues, including Master Labor Rate Tables lacking equivalent labor categories and the identities of all of the subcontractors mentioned in the Corporate Capability volume, the failure of offerors to transfer the highest proposed labor rate from the Master Labor Rate Table to the Price Model, proposed prices not rounded to two decimals, and the improper provision of pricing notes and pricing assumptions. Id. at 5041-42. In addition, the Price evaluation team concluded that “[w]ith the submission of twenty-five proposals, there is adequate price competition to support the reasonableness of the Total Contract Life Price” and that “there is no indication of unbalanced bidding.” Id. at 5042. Subsequent to its review, the Source Selection Evaluation Board briefed the Source Selection Authority and the Source Selection Advisory Council concerning its findings from November 5 through November 13, 2007, and three additional meetings were held later in the month. Id. at 5076, 10474. The contracting officer also attended the briefings and meetings and ultimately concurred with the findings of the Source Selection Evaluation Board. Id. at 5078. Based on the foregoing information, and after comparing “the proposals under all of the criteria set forth in the solicitation, including price and the non-price factors,” as well as “the relative weight of the criteria,” the contracting officer indicated that he used his independent judgment to reach a competitive range determination. Id. Altogether, the contracting officer placed twelve offerors within the competitive range, including all of the plaintiffs except for Femme Comp. Id. at 5078-80. To support his determination, the contracting officer prepared “[t]hirteen -19- comparisons . . . that document[ed his] comparison of each of the excluded offerors with all of the offerors that [were] included in the competitive range.” Id. at 5078. There is no evidence that the contracting officer compared the proposal of each offeror against the proposals of all other offerors. 
The Source Selection Authority concurred with the contracting officer’s competitive range determination. Id. at 5079.

In the meantime, the Army began discussions with those offerors within the competitive range by transmitting to them Items for Negotiation.9 See, e.g., id. at 5084-939, 6036-504, 6549-730, 6734-7421, 7518-928, 7973-8174, 8404. Items for Negotiation were prepared for weaknesses, deficiencies, adverse past performance, and other matters identified by the evaluators. See id.; see also id. at 404 (requiring evaluators to identify “deficiencies, weaknesses, errors, omissions, mistakes, ambiguities, adverse past performance information or any aspect of the proposal that could be altered or explained to materially enhance the proposal’s potential for award”). While most of the Items for Negotiation transmitted to the offerors identified weaknesses or other matters, the evaluators did note some deficiencies. See, e.g., id. at 5216-17 (Booz Allen), 5334-35 (Systems Research), 5659-60 (Wyle). Offerors responded to the Items for Negotiation through January 29, 2008. See id.; see also id. at 8404 (indicating that all Items for Negotiation closed on January 31, 2008).

9 The Army prepared Items for Negotiation for Femme Comp, but did not transmit them to Femme Comp for a response because Femme Comp’s proposal was not within the competitive range. See AR 5940-87.

Once it had received all of the responses to the Items for Negotiation, the Source Selection Evaluation Board reevaluated the proposals, see id. at 8175-403 (containing Consensus Evaluation Documents for the nonprice factors of each proposal); see also id. at 8404-05, 12772-77 (containing the Price Factor Interim Evaluation), and briefed the Source Selection Advisory Council and the Source Selection Authority concerning its findings on February 4, 2008, id. at 8406-17, 10474.

4. Final Proposals

The twelve offerors within the competitive range were directed to submit final proposals by February 8, 2008. Id. at 10356, 10474. Thereafter, the Source Selection Evaluation Board evaluated the final proposals. See id. at 10149-355 (containing Consensus Evaluation Documents for the nonprice factors of each proposal); see also id. at 10356-57, 12778-81 (containing the Price Factor Final Evaluation). In its evaluation of the nonprice factors, the Source Selection Evaluation Board identified the strengths of each proposal under each factor and subfactor. The following table summarizes the Source Selection Evaluation Board’s findings for both the successful offerors–Systems Research, Wyle, Booz Allen, Savantage, and Binary–and plaintiffs TAPE, L-3 Services, Data Systems, and BearingPoint.10

Number of Strengths (Significant, Moderate, and Minor under each factor and subfactor)

Factors and Subfactors                               SR   Wyl   BAH   Sav   Bin   TAP   L-3   DSA   BPt
1.1: Corporate Capability: Knowledge & Experience    redacted
1.2: Corporate Capability: Performance Capability    redacted
2: Performance Risk                                  redacted
3.1: Technical/Management: Understanding Support     redacted
3.2: Technical/Management: Program Mgmt. Plan        redacted
5: Small Business Participation                      redacted

Id. at 10449-50, 10452-54, 10456-59.

10 For the purposes of this table, the offerors’ names have been abbreviated as follows: Systems Research (“SR”); Wyle (“Wyl”); Booz Allen (“BAH”); Savantage (“Sav”); Binary (“Bin”); TAPE (“TAP”); L-3 Services (“L-3”); Data Systems (“DSA”); and BearingPoint (“BPt”).
As can be seen, the Source Selection Evaluation Board found no weaknesses or deficiencies in any of the final proposals. Id. In addition to identifying the strengths of each proposal, the Source Selection Evaluation Board outlined each offeror’s specific small business participation goals. The following table summarizes the goals of the large business offerors:11

                          Army Goal   SR        Wyle      Booz Allen   L-3       DSA       BPt       SI Int’l
Mandatory
  Small Bus.              20%         [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]
  Disadvant.              6%          [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]
  Woman-owned             5%          [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]
  Service-disabled        3%          [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]
  Veteran-owned           3%          [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]
  HUBZone                 1%          [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]
Nonmandatory
  HBCU/MI                 Pos. Goal   [. . .]   [. . .]   [. . .]      [. . .]   [. . .]   [. . .]   [. . .]

Id. at 10160 (Data Systems), 10180 (Systems Research), 10219 (L-3 Services), 10293 (Wyle), 10329 (BearingPoint), 10348 (Booz Allen), 10525 (SI Int’l).

11 For the purposes of this table, some of the offerors’ names have been abbreviated as follows: Systems Research (“SR”); L-3 Services (“L-3”); Data Systems (“DSA”); and BearingPoint (“BPt”).

The final rating determinations of the Source Selection Evaluation Board, with which the Source Selection Authority concurred, id. at 10474, are presented in the following tables, beginning with a summary of the ratings for the nonprice factors for the successful offerors:

                                      Rating (Adjective and Color)
Factors and Subfactors                Systems Research   Wyle      Booz Allen   Savantage   Binary
1: Corporate Capability               [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
1.1: Knowledge and Experience         [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
1.2: Performance Capability           [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
2: Performance Risk                   [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
3: Technical/Management               [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
3.1: Understanding Support            [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
3.2: Program Management Plan          [. . .]            [. . .]   [. . .]      [. . .]     [. . .]
5: Small Business Participation       [. . .]            [. . .]   [. . .]      [. . .]     [. . .]

Id. at 10448-72. Similarly, the following table contains the Source Selection Evaluation Board’s ratings of the nonprice factors for the four plaintiffs that were found to be within the competitive range:

                                      Rating (Adjective and Color)
Factors and Subfactors                TAPE      L-3 Services   Data Systems   BearingPoint
1: Corporate Capability               [. . .]   [. . .]        [. . .]        [. . .]
1.1: Knowledge and Experience         [. . .]   [. . .]        [. . .]        [. . .]
1.2: Performance Capability           [. . .]   [. . .]        [. . .]        [. . .]
2: Performance Risk                   [. . .]   [. . .]        [. . .]        [. . .]
3: Technical/Management               [. . .]   [. . .]        [. . .]        [. . .]
3.1: Understanding Support            [. . .]   [. . .]        [. . .]        [. . .]
3.2: Program Management Plan          [. . .]   [. . .]        [. . .]        [. . .]
5: Small Business Participation       [. . .]   [. . .]        [. . .]        [. . .]

Id. And, the following table contains the Source Selection Evaluation Board’s ratings of the nonprice factors for the remaining offerors in the competitive range:

                                      Rating (Adjective and Color)
Factors and Subfactors                [. . .]   [. . .]   SI Int’l
1: Corporate Capability               [. . .]   [. . .]   [. . .]
1.1: Knowledge and Experience         [. . .]   [. . .]   [. . .]
1.2: Performance Capability           [. . .]   [. . .]   [. . .]
2: Performance Risk                   [. . .]   [. . .]   [. . .]
3: Technical/Management               [. . .]   [. . .]   [. . .]
3.1: Understanding Support            [. . .]   [. . .]   [. . .]
3.2: Program Management Plan          [. . .]   [. . .]   [. . .]
5: Small Business Participation       [. . .]   [. . .]   [. . .]

Id. In addition to rating the nonprice factors, the Source Selection Evaluation Board verified the final proposed prices of all twelve offerors within the competitive range, listed below from lowest to highest:

Offeror              Size              Price
Booz Allen           Large Business    [. . .]
Systems Research     Large Business    [. . .]
Wyle                 Large Business    [. . .]
Savantage            Small Business    [. . .]
Binary               Small Business    [. . .]
L-3 Services         Large Business    [. . .]
[. . .]              Large Business    [. . .]
SI Int’l             Large Business    [. . .]
TAPE                 Small Business    [. . .]
[. . .]              Small Business    [. . .]
Data Systems         Large Business    [. . .]
BearingPoint         Large Business    [. . .]

Id. at 10356, 12779. The contract ceiling price, as mentioned above, was [. . .], and the median proposed price was [. . .]. Id. at 12779. The Source Selection Evaluation Board determined that (1) “[i]n accordance with FAR 15.404-1(b), the Total Contract Life Prices [were] considered fair and reasonable”; (2) “[w]ith twelve offerors submitting a final proposal revision, there [was] adequate price competition to support a comparison of prices”; and (3) because every offeror certified its proposed labor rates, there was no concern with unbalanced bidding. Id. at 10356-57. There is no evidence that the Source Selection Evaluation Board attempted to determine whether the labor categories proposed by the offerors were substantially similar to the descriptions provided by the Army in the solicitation.

Subsequent to its review, the Source Selection Evaluation Board briefed the Source Selection Authority and the Source Selection Advisory Council concerning its findings during meetings held on February 13-14 and 19-20, 2008. Id. at 10474. Based upon the findings of the Source Selection Evaluation Board, the advice of the Source Selection Advisory Council, and an independent comparison of the proposals, “giving appropriate consideration to the evaluation criteria set forth in the solicitation and their relative weight,” the Source Selection Authority selected the contract awardees. Id. at 10474-75; see also id. (“I have conducted a careful comparison of the proposals under all of the criteria set forth in the solicitation, including price and the non-price factors.”). Under the “unrestricted full and open portion of the procurement,” the Source Selection Authority found that the proposals of Systems Research, Wyle, and Booz Allen were “the best value and most advantageous to the Government . . . .” Id. at 10474; accord id. at 10573. Under the “portion of the procurement reserved for small business,” the Source Selection Authority found that the proposals of Savantage and Binary were “the best value and most advantageous to the Government . . . .” Id. at 10474; accord id. at 10573. In the Source Selection Decision Document, the Source Selection Authority explained how she arrived at her best value determination:

As this is a best value procurement, I note that I carefully assessed the benefits of the strengths identified in the reports to determine whether the value justified a higher price considering the relative weights of the factors. Where I concluded that an offer with a higher rating in one or more factors was not the best value, my conclusion reflected a judgment that the higher price was not justified.
All of the proposals were good and low risk, or better.12 None had any weaknesses, and the successful offerors provided value based upon various strengths even where an unsuccessful offeror had the higher rating. In the circumstances and based upon an exhaustive comparison of the individual strengths of the successful offerors with those of the unsuccessful offerors, my judgment was that the delta, or difference, between the offerors did not warrant paying the higher price. While I exercised independent judgment in making the selections, I note that not a single member of the [Source Selection Advisory Council], which included representatives from the two customer organizations, PEO EIS and CIO/G6, disagreed. My advisors and I were completely in accord in this matter.

Id. at 10475 (footnote added). She then described the strengths of each offeror under each of the nonprice factors, id. at 10477-525, enumerated the evaluated price of each offeror, id. at 10525-26, and compared the unsuccessful offerors against each of the five awardees, id. at 10526-68; see also id. at 10477 (“Below is my factor-by-factor comparison of the successful offerors with the unsuccessful offerors.”). There is no evidence in the Source Selection Decision Document that the Source Selection Authority compared the proposal of each offeror against the proposals of all other offerors; only comparisons between the proposals of the successful offerors and the proposals of the unsuccessful offerors were included.

12 As detailed above, the Source Selection Authority’s statement that “[a]ll of the proposals were good and low risk, or better” is contradicted by the final rating determinations of the Source Selection Evaluation Board, with which the Source Selection Authority concurred. See AR 10448 (indicating that Booz Allen, Binary, and TAPE received ratings of [. . .] for the Performance Capability subfactor of the Corporate Capability factor); accord id. 10488-89, 10495-96, 10498-99.

5. Contract Award and Debriefing

Pursuant to the Source Selection Authority’s decision, the Army awarded contracts to Systems Research, Wyle, Booz Allen, Savantage, and Binary on March 20, 2008. See id. at 10589-665 (Booz Allen), 10675-752 (Systems Research), 10790-866 (Wyle), 10877-953 (Savantage), 10966-11040 (Binary). The contracts adopted the Master Labor Rate Tables of the awardees. See id. at 10666-74 (Booz Allen), 10753-89 (Systems Research), 10867-76 (Wyle), 10954-65 (Savantage), 11041-113 (Binary). The following table summarizes some of the labor rates of the large business awardees:13

                                                    Labor Rate
Labor Category                        Labor Hours   Systems Res.   Wyle      Booz Allen
Administrative Support                148,823       [. . .]        [. . .]   [. . .]
Computer Scientist                    21,805        [. . .]        [. . .]   [. . .]
Database Management Specialist        14,025        [. . .]        [. . .]   [. . .]
Functional Analyst                    7,983         [. . .]        [. . .]   [. . .]
Logistic Management Specialist        132,136       [. . .]        [. . .]   [. . .]
Program Manager - Senior              32,981        [. . .]        [. . .]   [. . .]
Program Management Engineer           128,673       [. . .]        [. . .]   [. . .]
Program Analyst                       121,775       [. . .]        [. . .]   [. . .]
Program Integrator/Representative     111,246       [. . .]        [. . .]   [. . .]
Senior Analyst                        244,998       [. . .]        [. . .]   [. . .]
Senior Applications Engineer          1,700         [. . .]        [. . .]   [. . .]
Systems Engineer - Intermediate       142,334       [. . .]        [. . .]   [. . .]
Systems Analyst                       625,645       [. . .]        [. . .]   [. . .]
Technical Director                    159,987       [. . .]        [. . .]   [. . .]
Training Specialist                   91,419        [. . .]        [. . .]   [. . .]

13 The labor categories included in this table, as well as a similar table infra, are those with the greatest number of labor hours throughout the five-year life of the contract, as well as a few other categories demonstrating the broad range of labor rates proposed by the awardees. The labor rates included in both tables are the highest proposed rates listed on the Master Labor Rate Tables for work to be performed at a government site in the first year of the contract. For the purposes of this table, Systems Research is abbreviated as “Systems Res.”

See id. at 109 (labor hours), 10666-74 (Booz Allen), 10753-89 (Systems Research), 10867-76 (Wyle). Also on March 20, 2008, the Army notified the unsuccessful offerors of the awards by letter. See id. at 10575-88. Thereafter, the Army debriefed several of the unsuccessful offerors, including Femme Comp on April 4, 2008; TAPE on April 7, 2008; L-3 Services on April 8, 2008; and Data Systems and BearingPoint on April 9, 2008. See id. at 11114-67, 11218-327, 11380-488. At each debriefing, the Army outlined the source selection process, explained the rationale for its decision, provided the ratings and prices of the five awardees and the unsuccessful offeror being debriefed, detailed the strengths and weaknesses of the unsuccessful offeror’s proposal, and responded to the unsuccessful offeror’s questions. Id.

B. Procedural History

1. Protests Before the Army and the Government Accountability Office

Beginning April 1, 2008, six unsuccessful offerors, including four of the plaintiffs here, filed protests with the GAO. See id. at 11564-12453, 12782-13503. TAPE filed a protest on April 11, 2008, id. at 11622; L-3 Services filed a protest on April 14, 2008, id. at 11738; and Data Systems and BearingPoint filed protests on April 21, 2008, id. at 12265, 12401. The Army submitted its agency reports to the GAO in response to the protests, along with a statement by the contracting officer, on May 12, 2008, id. at 12899-939 (protest of TAPE); May 14, 2008, id. at 12995-13075 (protest of L-3 Services); and May 22, 2008, id. at 13267-330, 13387-433 (protests of BearingPoint and Data Systems). Thereafter, TAPE, L-3 Services, Data Systems, and BearingPoint filed supplemental protests, TAPE and L-3 Services filed second supplemental protests, and L-3 Services filed a third supplemental protest. Id. at 11638-659, 11720-36, 11835-90, 12307-47, 12431-53.

While the six unsuccessful offerors were pursuing their protests before the GAO, Femme Comp decided to pursue an agency-level protest, filing a protest with the Army on April 14, 2008.14 See id. at 11494-563. In response to the protest, the contracting officer prepared a statement dated May 7, 2008, and legal counsel for the Source Selection Evaluation Board prepared an agency report dated May 9, 2008. Id. at 11527-62. The Army, in a decision prepared by the Principal Assistant Responsible for Contracting on May 19, 2008, denied Femme Comp’s protest. Id. at 11495, 11504. Thus, on June 3, 2008, Femme Comp filed the instant protest in the Court of Federal Claims. As a result, two days later, the GAO dismissed all of the protests concerning this procurement, indicating that it was not permitted to “decide a protest where the matter involved is the subject of litigation before a court of competent jurisdiction.” Id. at 12454.
14 The administrative record does not include a copy of Femme Comp’s protest; however, the date of its protest is included in the Army’s decision. See AR 11494.

2. Proceedings Before the Court of Federal Claims

The court conducted an initial status conference in Femme Comp Inc. v. United States, No. 08-409C, on June 6, 2008. During the month that followed, the court issued a protective order, granted the motions of Savantage and Booz Allen to intervene, and consolidated the later-filed cases of TAPE, L-3 Services, Data Systems, and BearingPoint with Femme Comp’s case. The court also issued an order that set forth a briefing schedule for the parties’ cross-motions for judgment on the administrative record. Briefing concluded on August 25, 2008, and the court heard argument on August 27, 2008. Subsequently, on September 2, 2008, the court issued a permanent injunction. This opinion elaborates on the court’s prior ruling.

II. LEGAL STANDARDS

A. Bid Protests

The Court of Federal Claims has “jurisdiction to render judgment on an action by an interested party objecting to . . . the award of a contract or any alleged violation of statute or regulation in connection with a procurement or a proposed procurement,” 28 U.S.C. § 1491(b)(1) (2000), and may “award any relief that the court considers proper, including declaratory and injunctive relief except that any monetary relief shall be limited to bid preparation and proposal costs.” Id. § 1491(b)(2). Interested parties are those “prospective bidders or offerors whose direct economic interest would be affected by the award of the contract or by failure to award the contract.” Am. Fed’n of Gov’t Employees v. United States, 258 F.3d 1294, 1302 (Fed. Cir. 2001) (citing 31 U.S.C. § 3551(2)(A)). The court reviews the procuring agency’s action pursuant to the standards set forth in 5 U.S.C. § 706. 28 U.S.C. § 1491(b)(4). Although section 706 contains several standards, “the proper standard to be applied in bid protest cases is provided by 5 U.S.C. § 706(2)(A): a reviewing court shall set aside the agency action if it is ‘arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.’” Banknote Corp. of Am. v. United States, 365 F.3d 1345, 1350 (Fed. Cir. 2004). Under this standard, the court may set aside a procuring agency’s contract award “if either: (1) the procurement official’s decision lacked a rational basis; or (2) the procurement procedure involved a violation of regulation or procedure.” Impresa Construzioni Geom. Domenico Garufi v. United States, 238 F.3d 1324, 1332 (Fed. Cir. 2001). “Contracting officers are ‘entitled to exercise discretion upon a broad range of issues confronting them’ in the procurement process.” Id. at 1332-33 (quoting Latecoere Int’l, Inc. v. U.S. Dep’t of the Navy, 19 F.3d 1342, 1356 (11th Cir. 1994)). Thus, when engaging in a negotiated procurement, a “protestor’s burden of proving that the award was arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law is greater than in other types of bid protests.” Galen Med. Assocs., Inc. v. United States, 369 F.3d 1324, 1330 (Fed. Cir. 2004). Moreover, when a contract is to be awarded on a “best value” basis, contracting officers have “even greater discretion than if the contract were to have been awarded on the basis of cost alone.” Id. (citing E.W. Bliss Co. v. United States, 77 F.3d 445, 449 (Fed. Cir.
1996) (“Procurement officials have substantial discretion to determine which proposal represents the best value for the government.”)). Accordingly, the court’s review is “highly deferential” to the agency’s decision. Advanced Data Concepts, Inc. v. United States, 216 F.3d 1054, 1058 (Fed. Cir. 2000). “The court is not empowered to substitute its judgment for that of the agency.” Citizens to Preserve Overton Park, Inc. v. Volpe, 401 U.S. 402, 416 (1971). In addition to showing “a significant error in the procurement process,” a protester must show “that the error prejudiced it.” Data Gen. Corp. v. Johnson, 78 F.3d 1556, 1562 (Fed. Cir. 1996); see also Bannum, Inc. v. United States, 404 F.3d 1346, 1351 (Fed. Cir. 2005) (holding that if the procuring agency’s decision lacked a rational basis or was made in violation of the applicable statutes, regulations, or procedures, the court must then “determine, as a factual matter, if the bid protester was prejudiced by that conduct”). “To establish prejudice . . . , a protester must show that there was a ‘substantial chance’ it would have received the contract award absent the alleged error.” Banknote Corp. of Am., 365 F.3d at 1350 (quoting Emery Worldwide Airlines, Inc. v. United States, 264 F.3d 1071, 1086 (Fed. Cir. 2001)); see also Data Gen. Corp., 78 F.3d at 1562 (“[T]o establish prejudice, a protester must show that, had it not been for the alleged error in the procurement process, there was a reasonable likelihood that the protester would have been awarded the contract.”). B. Judgment on the Administrative Record The parties filed cross-motions for judgment on the administrative record pursuant to RCFC 52.1. In ruling on such motions in bid protests, the court will not disturb an agency’s decision unless it was “‘arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.’” Bannum, Inc., 404 F.3d at 1351 (quoting 5 U.S.C. § 706(2)(A) (2000)).15 Moreover, the court makes “factual findings . . . from the record evidence as if it were conducting a trial on the record.” Id. at 1357; see also id. at 1356 (“[J]udgment on the administrative record is properly understood as intending to provide for an expedited trial on the administrative record.”). III. DISCUSSION Plaintiffs attack the instant procurement on a variety of fronts. The court has grouped together plaintiffs’ related contentions and addresses them in an order that mirrors the procurement process. The court begins by addressing the Army’s evaluation scheme and then continues with an examination of the competitive range determination, discussions, the evaluation of the nonprice factors, the evaluation of Price, and the best value tradeoffs. 15 The decision in Bannum was based upon RCFC 56.1, which was abrogated and replaced by RCFC 52.1. RCFC 52.1, however, was designed to incorporate the decision in Bannum. See RCFC 52.1, Rules Committee Note (June 20, 2006). -30- A. The Army’s Evaluation Scheme The court begins its discussion by examining TAPE’s contention that the Army’s evaluation scheme was flawed. Federal law requires that in solicitations for competitive proposals, contracting agencies are required to describe “all significant factors and significant subfactors” that “the agency reasonably expects to consider in evaluating” the proposals, as well as “the relative importance assigned to each of those factors and subfactors.” 10 U.S.C. § 2305(a)(2)(A) (2000); accord id. § 2305(a)(3)(A). 
Agencies have the “broad discretion” to identify the relevant factors and subfactors and their relative importance. FAR § 15.304(c); see also Maint. Eng’rs v. United States, 50 Fed. Cl. 399, 415 (2001) (noting that agencies have broad discretion in determining the scope of evaluation factors). While those factors may include technical excellence, management capability, personnel qualifications, prior experience, past performance, and small business participation, agencies must identify price as a factor that they will consider. 10 U.S.C. § 2305(a)(3)(A); FAR § 15.304(c)(1)-(5). In addition, agencies must indicate whether the nonprice factors, when combined, are significantly more important than, approximately equal to, or significantly less important than price. 10 U.S.C. § 2305(a)(3)(A)(iii). Agencies must evaluate proposals “based solely on the factors specified in the solicitation.” Id. § 2305(b)(1). “Evaluations may be conducted using any rating method or combination of methods, including color or adjectival ratings, numerical weights, and ordinal rankings.” FAR § 15.305(a). However, “[t]he rating method need not be disclosed in the solicitation.” Id. § 15.304(d). Regardless of the chosen rating method, agencies must document the “relative strengths, deficiencies, significant weaknesses, and risks” of the evaluated proposals. Id. § 15.305(a). Moreover, when conducting tradeoffs between technical factors and price, agencies must assess “each offeror’s ability to accomplish the technical requirements” and prepare “[a] summary, matrix, or quantitative ranking, along with appropriate supporting narrative, of each technical proposal using the evaluation factors.” Id. § 15.305(a)(3). Agencies “must treat all offerors equally, evaluating proposals evenhandedly against common requirements and evaluation criteria.” Banknote Corp. of Am. v. United States, 56 Fed. Cl. 377, 383 (2003), aff’d, 365 F.3d at 1345. Within these constraints, however, agencies have great discretion in evaluating the technical factors of a proposal. See E.W. Bliss Co., 77 F.3d at 449. An agency’s final award decision must “be based on a comparative assessment of proposals against all source selection criteria in the solicitation.” FAR § 15.308. The agency must document its decision, and its documentation must “include the rationale for any business judgments and tradeoffs made or relied on by the [source selection authority], including benefits associated with additional costs.” Id. However, the “documentation need not quantify the tradeoffs that led to the decision.” Id. In the instant solicitation, the Army described five factors upon which each proposal would be evaluated, in descending order of importance: Corporate Capability, Performance Risk, Technical/Management, Price, and Small Business Participation. AR 379. The Army also indicated that the Corporate Capability factor had two unequally weighted subfactors– -31- Knowledge and Experience being more important than Performance Capability–and that the Technical/Management factor had two equally weighted subfactors–Understanding Support and Program Management. Id. The Army indicated its intent to base its award decisions “upon the evaluation of each Offeror’s complete proposal against the evaluation criteria . . . .” Id. Although it described numerous criteria that it would evaluate under each factor and subfactor, the Army did not describe the precise rating scheme it would utilize in the solicitation. 
Instead, the Army described its rating scheme in a Source Selection Plan that was not available to the offerors. See id. at 385-464. In the Source Selection Plan, the Army defined “rating” as “[t]he evaluators’ conclusions (supported by narrative write-ups) identifying the strengths, weaknesses, and deficiencies of an evaluation factor or subfactor,” which was to be “expressed as an adjective.” Id. at 402. In turn, each adjective was assigned a color. Id. at 398-402. TAPE attacks the Army’s evaluation scheme in two ways. First, TAPE contends that the solicitation lacked any indication of how the Army would translate its evaluation of each factor and subfactor into adjectival/color ratings. Pl.’s Mot. J. Administrative R. (“TAPE Mot.”) 18. Second, TAPE asserts that the entire administrative record is devoid of a “description of how the Government would conduct a comparative analysis of the different offerors.” Id. at 18-19. Neither argument has merit. As an initial matter, the court notes that the Army was not required to describe its “rating method” in the solicitation. FAR § 15.304(d). Instead, federal regulations require that the Army conduct a “comparative assessment of proposals against all source selection criteria in the solicitation,” id. § 15.308, document the “relative strengths, deficiencies, significant weaknesses, and risks” of the evaluated proposal, id. § 15.305(a), and describe its “rationale for any business judgments and tradeoffs made or relied on,” id. § 15.308. Thus, to the extent that the rating method described in the Source Selection Plan is consistent with the terms of the solicitation, in that it requires the Army to consider only the factors identified in the solicitation and to afford those factors the weights indicated in the solicitation, the Army has acted within the bounds of the relevant statutes and regulations. See, e.g., Dismas Charities, Inc. v. United States, 61 Fed. Cl. 191, 205-06 (2004) (finding that the scoring scheme set forth in the Source Selection Plan was consistent with the terms of the solicitation and was therefore not arbitrary and capricious). In the instant solicitation, the Army properly described the factors and subfactors it intended to evaluate, indicated the weight it would ascribe each factor and subfactor, and provided that it would evaluate each proposal against the evaluation criteria. The solicitation therefore complied with the relevant regulations. Further, if the court construes TAPE’s arguments as an attack of the development or implementation of the Source Selection Plan, they still fall short. It is well within the Army’s discretion to determine how to implement the provisions of the solicitation. See E.W. Bliss Co., 77 F.3d at 449 (indicating that the “minutiae of the procurement process in such matters as technical ratings and the timing of various steps in the procurement” constitute “discretionary determinations of procurement officials that a court will not second guess”). It also bears noting that TAPE has not cited any statute or regulation that the Army violated in using its Source -32- Selection Plan. Accordingly, the court cannot conclude that the Army acted arbitrarily, capriciously, or not in accordance with the law. Moreover, to the extent that TAPE is attacking the contents of the solicitation itself, TAPE’s argument is untimely. 
“[A] party who has the opportunity to object to the terms of a government solicitation containing a patent error and fails to do so prior to the close of the bidding process waives its ability to raise the same objection in a bid protest action in the Court of Federal Claims.” Blue & Gold Fleet, L.P. v. United States, 492 F.3d 1308, 1313 (Fed. Cir. 2007). Because it must have been aware of the solicitation’s purported lack of detail concerning proposal evaluation during the procurement process, TAPE should have raised its objection prior to the end of the procurement process. B. The Competitive Range Next, Femme Comp contests its exclusion from the competitive range. A contracting agency is required to establish a competitive range if it intends to conduct discussions with offerors. FAR §15.306(c)(1). The competitive range is “comprised of all of the most highly rated proposals,” based upon “the ratings of each proposal against all evaluation criteria.” Id. Thus, “‘an agency may not exclude a technically acceptable proposal from the competitive range without taking into account the relative cost of that proposal to the government.’” Bean Stuyvesant, L.L.C. v. United States, 48 Fed. Cl. 303, 338 (2000) (quoting Meridian Mgmt. Corp., B-285127, 2000 C.P.D. ¶ 121 (July 19, 2000)). Ultimately, however, agencies have “broad discretion in determining the competitive range,” and the court will not disturb an agency’s determination unless it is “clearly unreasonable.” Birch & Davis Int’l, Inc. v. Christopher, 4 F.3d 970, 973 (Fed. Cir. 1993). After evaluating Femme Comp’s initial proposal, the Source Selection Evaluation Board assigned the following ratings, with which the contracting officer concurred: -33- Factors and Subfactors Rating (Adjective and Color) 1: Corporate Capability [. . .] 1.1: Knowledge and Experience [. . .] 1.2: Performance Capability [. . .] 2: Performance Risk [. . .] 3: Technical/Management [. . .] 3.1: Understanding Support [. . .] 3.2: Program Management Plan [. . .] 4: Price [. . .] 5: Small Business Participation [. . .] AR 4730-49, 5078, 5082. The Source Selection Evaluation Board also evaluated Femme Comp’s initial proposed price as [. . .]. Id. at 5082, 12728. Based upon these findings, the contracting officer eliminated Femme Comp from the competitive range. Id. at 5078-80; see also id. at 12578-604 (comparing Femme Comp’s proposal with the proposals of the twelve offerors within the competitive range). Femme Comp raises several challenges to the Army’s establishment of the competitive range. First, it argues that the Army improperly failed to consider price when determining which offerors to include in the competitive range. Pl. Femme Comp’s Mot. J. Administrative R. (“Femme Comp Mot.”) 12-16. Femme Comp then contends that the Army improperly automatically excluded it from the competitive range based upon an evaluated deficiency in its proposal. Id. at 17-18. Third, Femme Comp asserts that the Army’s evaluation of its initial proposal was unreasonable. Id. at 18-30. The court addresses each argument in turn. 1. The Army Considered Price When Establishing the Competitive Range Femme Comp argues that it is clear that the Army did not consider price when establishing the competitive range because had the Army done so, “the competitive range would have included more lower-priced initial offers.” Id. at 14-15. In support of this argument, Femme Comp developed its own rating system, converting the Army’s adjectival/color ratings to numeric scores. Id. at 15. 
Femme Comp assigned each factor and subfactor a numerical score, attempting to replicate the relative weights assigned to each factor and subfactor in the solicitation, such that the maximum possible score received by a proposal would be 100. Id. at 15, Attach. A. Femme Comp then applied its rating system to the twenty-five initial proposals and then ranked the proposals by their scores in descending order.16 Id. at 15, Attach. B. From this analysis, Femme Comp determined that “the competitive range did include all of the highest rated scores without regard to price except for one.” Because the rating system is of Femme Comp’s own creation, it is not part of the administrative record. Moreover, the rating system has no foundation in the solicitation, which does not assign specific numerical or percentage weights to the factors. For this reason, Femme Comp’s rating system is entirely arbitrary.

16 As noted by defendant, Femme Comp erroneously scored its own proposal, treating each of the Technical/Management subfactors as if it received [. . .] ratings and not [. . .] ratings. See Def.’s Mot. J. Administrative R. & Opp’n Pls.’ Mots. J. Administrative R. (“Def.’s Mot.”) 38 n.3 (citing Femme Comp Mot. Attach. A). However, defendant did not accurately describe the magnitude of the error. Femme Comp assigned scores of [. . .] for a [. . .] rating and [. . .] for a [. . .] rating. Thus, Femme Comp’s total score should be reduced by [. . .] points (six points for each subfactor), and not six points as suggested by defendant. See id.

Defendant has moved to strike the attachments to Femme Comp’s motion setting forth the scores assigned to each proposal pursuant to Femme Comp’s rating scheme, as well as the arguments in Femme Comp’s motion arising from those attachments. See generally Def.’s Mot. Strike Attachs. A & B Pl. Femme Comp’s Mot. J. Administrative R. Because Femme Comp’s rating scheme was not used by the Army and lacks a basis in the solicitation, the court grants defendant’s motion.

Setting aside its arbitrary rating system, Femme Comp argues that “in the context of the overall solicitation process,” the Army “did not take the time and make the effort to consider price at the competitive range stage.” Femme Comp Mot. 16. Yet, Femme Comp cites no evidence from the administrative record to support this argument. Instead, Femme Comp makes the extremely tenuous inference that because the Army did not seriously consider price at the discussion stage of the procurement, it must not have seriously considered price when establishing the competitive range. Id. Certainly, whatever the Army did or did not do during discussions has no bearing on the establishment of the competitive range. Further, the evidence in the administrative record supports the conclusion that the Army did consider and analyze price when establishing the competitive range. See AR 5078 (indicating that the contracting officer reached his competitive range determination after comparing “the proposals under all of the criteria set forth in the solicitation, including price and the non-price factors,” as well as “the relative weight of the criteria”), 12578-604 (comparing Femme Comp’s proposal to those proposals selected for the competitive range on each factor and subfactor, including Price).

Femme Comp also argues that although the Army had the broad discretion to determine the competitive range, it was not permitted to exclude Femme Comp’s technically acceptable proposal without considering its price. Femme Comp Mot. 13 (citing Bean Stuyvesant, L.L.C., 48 Fed. Cl. at 338), 16; Pl. Femme Comp’s Reply Mem. Supp. Its Mot. J. Administrative R. & Opp’n Def.’s Mot. J. Administrative R. 4. Femme Comp misinterprets the case law, which merely requires the Army to consider price before excluding a technically acceptable proposal from the competitive range. It does not require the Army to include all technically acceptable proposals in the competitive range that have a low price. Such a requirement would be contrary to the discretion afforded to the agency in establishing a competitive range consisting of the most highly rated proposals. Moreover, to the extent that Femme Comp relies upon the proposition that a proposal, without regard to price, “must be included within the competitive range unless it is so technically unacceptable that extensive revisions to the proposal would be required to make it technically acceptable,” Femme Comp Mot. 13, the court notes that the proposition is based upon a prior version of the FAR that required agencies to include proposals in the competitive range that had “a reasonable chance of being selected for award,” compare Labat-Anderson, Inc. v. United States, 42 Fed. Cl. 806, 840 (1999) (citing FAR § 15.609(a) (1991) (“The competitive range . . . shall include all proposals that have a reasonable chance of being selected for award.”)), with Bean Stuyvesant, L.L.C., 48 Fed. Cl. at 339-40 (citing FAR § 15.306(c) (1997) (“[T]he contracting officer shall establish a competitive range comprised of all of the most highly rated proposals.”)). Accordingly, the court cannot give the proposition any weight. In sum, Femme Comp has failed to establish that the Army failed to consider price in establishing the competitive range.

2. Femme Comp’s Evaluated Deficiency Did Not Lead the Army to Exclude Femme Comp’s Proposal From the Competitive Range

The court can dispose of Femme Comp’s second argument without much effort. Femme Comp contends that during its postaward debriefing, the Army advised it that proposals containing evaluated deficiencies were automatically excluded from the competitive range, which was tantamount to the Army applying an unstated evaluation factor. Femme Comp Mot. 17-18. While it is certainly an undisputed legal proposition that agencies are prohibited from relying upon “undisclosed evaluation criteria,” Banknote Corp. of Am., 56 Fed. Cl. at 386, there is no evidence that the Army actually excluded offerors from the competitive range due to evaluated deficiencies, despite what the Army may have indicated in the postaward debriefing. Indeed, as noted above, some of the offerors that were included in the competitive range had evaluated deficiencies. See, e.g., AR 5216-17 (Booz Allen), 5334-35 (Systems Research), 5659-60 (Wyle). Thus, Femme Comp’s argument must fail.17

17 Because the court’s conclusion is not dependent upon any statements made during the postaward debriefing, the declarations supplied by Femme Comp are irrelevant. For this reason, the court denies Femme Comp’s motion to supplement the administrative record as moot.

3. The Army Properly Evaluated Femme Comp’s Proposal

Lastly, Femme Comp contends that the Army “failed to give coherent and reasonable explanations as to how it evaluated the [Femme Comp] proposal regarding” the Corporate Capability and Technical/Management factors. Femme Comp Mot. 18-19. At issue are certain assigned strengths and weaknesses under these factors.18 First, the Source Selection Evaluation Board noted [. . .]
strengths under the Knowledge and Experience subfactor of the Corporate Capability factor, including the following: [. . .]. AR 4731. Under the same subfactor, the Source Selection Evaluation Board documented [. . .] and [. . .] weaknesses: [. . .]. Id. Second, with respect to the Performance Capability subfactor of the Corporate Capability factor, the Source Selection Evaluation Board highlighted [. . .] strengths, including the following: [. . .]. Id. at 4734. The Source Selection Evaluation Board also highlighted [. . .] strengths under this subfactor, including the following: [. . .]. Id. On the other hand, the Source Selection Evaluation Board documented [. . .] weaknesses under this subfactor: [. . .]. Id. at 4734-35. Further, the Source Selection Evaluation Board noted [. . .] weaknesses, including the following: [. . .]. Id. Third, in analyzing the Understanding Support subfactor of the Technical/ Management factor, the Source Selection Evaluation Board noted the following deficiency: [. . .]. Id. at 4744. But see id. at 5965 (identifying the same issue, in an Item for Negotiation, as a [. . .] and not a [. . .]). Based upon the above-described [. . .], Femme Comp first argues that [. . .]. Femme Comp Mot. 20-21. Specifically, Femme Comp contends that [. . .] amounted to examples of its experience. Id. at 21. Defendant responds that Femme Comp’s argument “only illustrates [Femme Comp]’s misunderstanding of the evaluation criteria and [. . .].” Def.’s Mot. 42-43. Specifically, defendant explains that under the Knowledge and Experience subfactor, the Army 18 In its motion, Femme Comp cites to the postaward debriefing slides when describing the strengths and weaknesses assigned by the Source Selection Evaluation Board. The court instead relies on the Consensus Evaluation Documents–the original source of the Source Selection Evaluation Board’s evaluations. -37- would evaluate not only Femme Comp’s knowledge and experience, but also Femme Comp’s “‘ability to manage and support information technology programs.’” Id. at 42 (quoting AR 380). Thus, according to defendant, the Source Selection Evaluation Board properly noted that “[. . .] is not the functional equivalent of implementation or managing that same policy and it was reasonable for the [Source Selection Evaluation Board] to [. . .].” Id. at 43 (quoting AR 4731). Defendant further argues that the Source Selection Evaluation Board’s assignment of a [. . .] weakness for Femme Comp’s failure “[. . .],” left unchallenged by Femme Comp, was sufficient to support a [. . .] rating. Id. Accordingly, defendant characterizes Femme Comp’s argument as a challenge to the scope of the Knowledge and Experience subfactor, an area within the Army’s broad discretion. Id. (citing Maint. Eng’rs, 50 Fed. Cl. at 415). The court agrees with defendant. The Army was well within its discretion to determine the scope of the Knowledge and Experience subfactor. See FAR § 15.304(c). Based upon the language emphasized by Femme Comp, the court finds that the Army was justified in determining that Femme Comp did not provide [. . .]. Thus, there is no contradiction between the strengths and weaknesses documented by the Source Selection Evaluation Board. Femme Comp next argues that the Army’s evaluation of the second subfactor of the Corporate Capability factor contained additional contradictions. First, Femme Comp contends that the Army’s finding that Femme Comp “‘[. . .]’” is clearly contradicted by the Army’s finding that Femme Comp “‘[. . .].’” Femme Comp Mot. 
24 (quoting AR 4734). Femme Comp also contends that the Army’s finding that Femme Comp “[. . .]’” contradicts the Army’s finding that Femme Comp did not provide “‘[. . .]’” of its “‘[. . .].’” Id. at 25-26 (quoting AR 4734). Finally, Femme Comp asserts that the Army’s finding that Femme Comp did not provide “‘[. . .]’” of its “‘[. . .]’” is contradicted by the Army’s finding, under the Knowledge and Experience subfactor, that Femme Comp “‘[. . .]’” and provided “‘[. . .].’” Id. at 24-25 (quoting AR 4731, 4734). In addition to the purported contradictions, Femme Comp contends that the Army’s evaluation of the Performance Capability subfactor was unreasonable because although its proposal was marked down for [. . .], “[t]he solicitation contain[ed] no standard or guidance on what the government would consider to be [. . .].” Id. at 26. In response to Femme Comp’s contentions concerning the Performance Capability subfactor, defendant argues that Femme Comp “[. . .],” Def.’s Mot. 45, but does not address the other two purported contradictions identified by Femme Comp. With respect to Femme Comp’s [. . .] argument, defendant argues that (1) because this argument attacks the terms of the solicitation, Femme Comp should have raised it prior to the close of the procurement process and (2) the Army’s finding on this criteria was appropriate because Femme Comp failed to include in its proposal “[. . .].” Id. at 46. Once again, Femme Comp’s arguments are unfounded. The Army was well within its discretion to determine the scope of the Performance Capability subfactor. See FAR § 15.304(c). Based upon the language quoted by Femme Comp, the court cannot find that the Army lacked a rational basis for determining that Femme Comp [. . .]. Thus, given the absence of required -38- information in Femme Comp’s proposal, there is no contradiction between the strengths and weaknesses documented by the Source Selection Evaluation Board under the Performance Capability subfactor. Further, although Femme Comp is correct that the solicitation did not provide any details concerning the [. . .], its argument on this issue must fail because the Army had a great deal of discretion in evaluating the [. . .], see E.W. Bliss Co., 77 F.3d at 449, Femme Comp has failed to identify any unequal treatment in the Army’s evaluation of the [. . .], and, based on the language quoted by Femme Comp, the Army’s evaluation had a rational basis. Moreover, the Army was under no obligation to consider information contained within the [. . .] section of Femme Comp’s proposal when evaluating the [. . .] section. Offerors were required to both confine information “to the appropriate volume,” AR 347, and “present all information relevant to the factor/subfactor in the appropriate section,” id. at 348. Femme Comp contends, in a status report filed the day after argument, that the solicitation’s requirement that information be presented “in the appropriate section” applies only to final proposal revisions. Status Report Pl. Femme Comp & Def. United States, Aug. 28, 2008, at 1. The court disagrees. Section L of the solicitation–Instructions, Conditions, and Notices to Offerors–described the required content of the proposals. See AR 343. Specific formatting requirements were addressed in subsection six. See id. at 347. Subsection six, in turn, was further broken down into its own subsections concerning proposal submissions (section L.6.1), proposal revisions (section L.6.2), and proposal volume breakout (section L.6.3). See id. at 347-49. 
The requirement that information be presented “in the appropriate section” appears in this last subsection, and not in the subsection concerning proposal revisions. Thus, the requirement applies to all proposals. Femme Comp must bear the responsibility for its failure to present the information that the Army sought in the proper section of its proposal. The Army properly confined its evaluation to the Performance Capability section. Finally, regarding the Technical/Management factor, Femme Comp argues that the Army assigned it a deficiency under the first subfactor for failing to provide its “‘[. . .],’”19 thereby unreasonably ignoring Femme Comp’s description of its [. . .] in the Program Management Plan subsection of its proposal. Femme Comp Mot. 28-29 (quoting AR 4744). Femme Comp explains that the “Questions and Answers” included with the solicitation demonstrate that there was confusion surrounding which subsection of the Technical/Management volume should include the offerors’ [. . .]. Id. at 28 (citing AR 290). Femme Comp also contends that because the information was included within the same volume of its proposal, the Army should have considered it. Id. at 29-30; see also Tr. 14 (arguing that information only has to be in the same volume and not “in a particular section of a volume”). In response, defendant argues that to the extent Femme Comp is challenging the contents of the solicitation, Femme Comp should have raised its challenge prior to the close of the procurement process. Def.’s Mot. 47. Defendant 19 Although the Source Selection Evaluation Board described Femme Comp’s [. . .]. Compare AR 4744, with id. at 5965. Because Femme Comp was excluded from the competitive range and therefore did not receive the relevant Item for Negotiation, the contents of the Item for Negotiation are irrelevant. -39- also argues that because the Army clearly indicated in the solicitation that it would “‘evaluate the Offerors’ overall [. . .]’” under the Understanding Support subfactor, id. (quoting AR 381), and that offerors were required to confine all information “‘to the appropriate volume to facilitate independent evaluation,’” id. (quoting AR 347), “offerors were required to provide [. . .]” [. . .], id.; see also Tr. 181 (quoting AR 348). As the court noted above, defendant’s interpretation of the solicitation is correct. Offerors were required to both confine information “to the appropriate volume,” AR 347, and “present all information relevant to the factor/subfactor in the appropriate section,” id. at 348. Femme Comp must bear the responsibility for its failure to present the information that the Army sought in the proper section of its proposal. The Army was not required by the terms of the solicitation to look beyond the specific section of the proposal that it was evaluating. Accordingly, Femme Comp has failed to show that the Army acted arbitrarily, capriciously, or unlawfully in evaluating the Technical/Management volume of its proposal. 4. Because the Army Properly Excluded Femme Comp’s Proposal From the Competitive Range, Femme Comp Lacks Standing to Protest the Army’s Contract Awards In sum, Femme Comp has failed to establish that the Army improperly excluded its proposal from the competitive range. This conclusion raises the issue of whether Femme Comp had standing to protest the Army’s contract awards in the first instance. As noted above, only “interested parties” may protest a contract award. See 28 U.S.C. § 1491(b)(1). 
The United States Court of Appeals for the Federal Circuit (“Federal Circuit”) has held that the term “interested party” should be construed in accordance with the Competition in Contracting Act of 1984, and that, therefore, “standing under § 1491(b)(1) is limited to actual or prospective bidders or offerors whose direct economic interest would be affected by the award of the contract or by failure to award the contract.” Am. Fed’n of Gov’t Employees, 258 F.3d at 1302 (citing 31 U.S.C. § 3551(2)(A)). Accordingly, Femme Comp must establish that it “(1) is an actual or prospective bidder, and (2) possesses the requisite direct economic interest.” Rex Serv. Corp. v. United States, 448 F.3d 1305, 1307 (Fed. Cir. 2006). To prove that it possesses a “direct economic interest,” Femme Comp must show that it had a “substantial chance” of receiving the contract. Id. In other words, Femme Comp “need only establish that it ‘could compete for the contract’” to have standing to protest.20 Myers Investigative & Sec. Servs., Inc. v. United States, 275 F.3d 1366, 1370 (Fed. Cir. 2002) (quoting Impresa, 238 F.3d at 1334). Because Femme Comp was excluded from the competitive range, it could not compete for a contract and thus had 20 In Rex Service Corp., the Federal Circuit cited Statistica, Inc. v. Christopher, 102 F.3d 1577, 1582 (Fed. Cir. 1996), for the proposition that “the ‘substantial chance’ standard requires the protesting party to ‘establish not only some significant error in the procurement process, but also that there was a substantial chance it would have received the contract but for that error.’” 448 F.3d at 1308. However, Statistica, Inc., addressed the “substantial chance” standard as it related to the merits of the protestor’s request for injunctive relief, and not for the threshold issue of standing. 102 F.3d at 1581-82. -40- no chance to receive a contract. Accordingly, Femme Comp lacks standing to protest the instant contract awards. The court denies Femme Comp’s motion for judgment on the administrative record and dismisses Femme Comp’s complaint for lack of jurisdiction. C. The Army’s Discussions With the Offerors After establishing the competitive range, the Army conducted discussions with the offerors. TAPE, Data Systems, and BearingPoint allege that those discussions were not meaningful. Contracting agencies may opt to conduct oral or written discussions with “all responsible offerors who submit[ted] proposals within the competitive range.” 10 U.S.C. § 2305(b)(4)(A)(i). Discussions “are undertaken with the intent of allowing the offeror to revise its proposal,” with the “primary objective of “maximiz[ing] the Government’s ability to obtain best value.” FAR § 15.306(d)(2). Agencies must conduct discussions “with each offeror within the competitive range.” Id. § 15.306(d)(1). The FAR describes the scope of discussions as follows: At a minimum, the contracting officer must . . . indicate to, or discuss with, each offeror still being considered for award, deficiencies, significant weaknesses, and adverse past performance information to which the offeror has not yet had an opportunity to respond. The contracting officer also is encouraged to discuss other aspects of the offeror’s proposal that could, in the opinion of the contracting officer, be altered or explained to enhance materially the proposal’s potential for award. However, the contracting officer is not required to discuss every area where the proposal could be improved. 
The scope and extent of discussions are a matter of contracting officer judgment. Id. § 15.306(d)(3). For such discussions to be meaningful, they must “‘generally lead offerors into the areas of their proposals requiring amplification or correction, which means that discussions should be as specific as practical considerations permit.’” Advanced Data Concepts, Inc. v. United States, 43 Fed. Cl. 410, 422 (1999) (quoting SRS Techs., B-254425.2, 94-2 C.P.D. ¶ 125 (Sept. 14, 1995)), aff’d, 216 F.3d at 1054. Furthermore, although agencies may not favor “one offeror over another,” FAR § 15.306(e)(1), agencies are not required to conduct identical discussions with each offeror,” WorldTravelService v. United States, 49 Fed. Cl. 431, 440 (2001). Among the topics appropriate for discussions is an offeror’s proposed price. FAR § 15.306(d)(3), (e)(3). Agencies may inform offerors that their prices are either too high or too low and supply the offerors with “the results of the analysis supporting that conclusion.” Id. § 15.306(e)(3); see Banknote Corp. of Am., 56 Fed. Cl. at 385 (noting that “an agency may inform an offeror during discussions that its cost or price is considered to be too high or unrealistic”). However, “the government has no responsibility to inform an offeror that its cost or price is high where the offeror’s cost or price is not considered excessive or unreasonable.” -41- Banknote Corp. of Am., 56 Fed. Cl. at 385. Moreover, agencies may not reveal “an offeror’s price without that offeror’s permission.” FAR § 15.306(e)(3). To provide context for its analysis, the court first describes the facts relevant to the three plaintiffs’ arguments. 1. Discussions With TAPE Volume II of TAPE’s initial proposal contained, as required, information about [. . .]. AR 3154-73. [. . .]. [. . .]. After evaluating TAPE’s initial proposal, the Army transmitted sixteen Items for Negotiation to TAPE for a response, as outlined in the following table:21 Item for Negotiation Number Issue Date Issued [. . .] [. . .] 10/1/07 [. . .] [. . .] 10/1/07 [. . .] [. . .] 11/6/07 [. . .] [. . .] 10/19/07 [. . .] [. . .] 10/5/07 [. . .] [. . .] 10/5/07 [. . .] [. . .] 10/5/07 [. . .] [. . .] 10/3/07 [. . .] [. . .] 11/14/07 [. . .] [. . .] 8/17/07 21 The first letter in the Item for Negotiation number indicates the evaluation team that prepared the Item for Negotiation: “K” for Corporate Capability, “P” for Performance Risk, “T” for Technical/Management, “C” for Price, and “S” for Small Business Participation. The second letter indicates the reason that the Army issued the Item for Negotiation: “D” for a deficiency, “W” for a weakness and “O” for another issue. The third and fourth characters constitute the unique two-character code assigned to the offeror by the Army for evaluation purposes. -42- [. . .] [. . .] 8/28/07 [. . .] [. . .] 12/3/07 [. . .] [. . .] 1/23/08 [. . .] [. . .] 9/4/07 [. . .] [. . .] 9/4/07 [. . .] [. . .] 9/3/07 Id. at 6037 ([. . .]), 6040-41 ([. . .]), 6059 ([. . .]), 6074 ([. . .]), 6084-85 ([. . .]), 6095-96 ([. . .]), 6106-07 ([. . .]), 6109-10 ([. . .]), 6140 ([. . .]), 6150-51 ([. . .]), 6154-55 ([. . .]), 6187 ([. . .]), 6217-18 ([. . .]), 6223-25 ([. . .]), 6230-32 ([. . .]), 6237 ([. . .]). As described in the table, three of TAPE’s Items for Negotiation concerned the [. . .] factor, six concerned the [. . .] factor, four concerned the [. . .] factor, and three concerned the [. . .] 
factor.22 Further, there is no evidence that the Army transmitted any Items for Negotiation to TAPE concerning the [. . .] factor. 22 Defendant filed the administrative record with the court on June 16, 2008; consequently, all of the parties were in possession of the administrative record prior to filing their cross-motions for judgment on the administrative record. Despite having the administrative record in its possession, TAPE inexplicably misrepresented the quantity and subject matter of the Items for Negotiation that it received from the Army in its motion. Then, even after this error was noted and discussed by defendant and Savantage, TAPE reiterated the same misrepresentations in its reply brief. Specifically, in its motion, TAPE contends, without qualification, that “[t]he Army then generated three Items for Negotiation . . . for TAPE.” TAPE Mot. 11 (citing AR 6037, 6040, 6059 (containing [. . .])), 19. But see id. at 13 (referring to a fourth Item for Negotiation: [. . .]). TAPE further contends that “[t]hese [Items for Negotiation] only related to [. . .],” and that “[t]here were no [Items for Negotiation] related to the Army’s concerning [sic] [. . .].” Id. at 13; accord id. at 15 (“There were no [Items for Negotiation] related to the Army’s concerning [sic] [. . .].”), 20 (“[T]he Government never informed TAPE of the deficiencies noted in the Army’s evaluation of [. . .]. No [Items for Negotiation] were generated concerning these items . . . .”). Clearly, one of the three Items for Negotiation cited by TAPE, [. . .], concerned the second subfactor of the Technical/Management factor (i.e., “Factor 3”), as did five other Items for Negotiation left unmentioned by TAPE–[. . .]. See AR 6059, 6074, 6084-85, 6095-96, 6106-07, 6150-51. Both defendant and Savantage noted this fact in their respective cross-motions for judgment on the administrative record. See Def.’s Mot. 92-93 (noting that TAPE received six Items for Negotiation concerning the second subfactor of the Technical/Management factor but erroneously stating that TAPE received a total of nine Items for Negotiation); Def.-Intervenor Savantage’s Cross-Mot. Summ. J. & Opp’n Pl. TAPE’s Mot. Summ. J. (“Savantage Mot.”) 5, -43- TAPE responded to all of the Items for Negotiation and the Army ultimately found all of the responses to be adequate. Id. When TAPE submitted its final proposal to the Army, it left Volume II–Performance Risk–unchanged. See id. at 9284-304. Accordingly, the Source Selection Evaluation Board’s final evaluation of TAPE’s Performance Risk was substantially identical to its initial evaluation. See id. at 4949-51 (initial evaluation), 10307-09 (final evaluation). 2. Discussions With Data Systems In its initial proposal, Data Systems identified [. . .] as its Major Subcontractor. See, e.g., id. at 4260, 4287-95. When it first analyzed Volume II of Data Systems’s proposal, the Source Selection Evaluation Board noted that [. . .] received twenty [. . .] ratings, ten [. . .] ratings, and ten [. . .] ratings on its seven past performance questionnaires. Id. at 4582. The Source Selection Evaluation Board identified these results as a [. . .] strength. Id. at 4583. Although the Source Selection Evaluation Board noted that “[. . .],” it determined that it could not rate Data Systems higher than [. . .] on the Performance Risk factor because “Major Subcontractor, [. . .], received 20 [. . .] ratings out of a total of 40. (20 [. . .], 10 [. . .], 10 [. . .]).” Id.
During discussions, the Army transmitted fourteen Items for Negotiation to Data Systems, as outlined in the following table: 19-20 (same). Undaunted by the facts, TAPE, in its reply brief, steadfastly reiterates its earlier argument that it only received three Items for Negotiation, none of which concerned the [. . .] factor. Reply Supp. Pl.’s Mot. J. Administrative R. & Opp’n Def.’s & Intervenor’s CrossMots. J. Administrative R. 5, 9. It was not until pressed by the court at oral argument that TAPE acknowledged its receipt of thirteen Items for Negotiation. See Tr. 24-25. In an attempt to explain its arguments in the briefs, counsel for TAPE stated: “We did [receive sixteen Items for Negotiation]. We focused on three that we believed were the most important and were related to the evaluation that eventually mattered, the prejudicial aspects of the evaluation. But, yes, we did receive [thirteen] more. We responded to them.” Id. at 25. This “explanation” is clearly contrary to the arguments set forth in TAPE’s briefs and cannot be accepted by the court. TAPE’s arguments concerning its discussions with the Army on the [. . .] factor clearly relied upon its assertion that it received no Items for Negotiation for this factor. Because this assertion is incorrect, the court will disregard TAPE’s arguments on this matter. -44- Item for Negotiation Number Issue Date Issued [. . .] [. . .] 9/27/07 [. . .] [. . .] 9/27/07 [. . .] [. . .] 9/27/07 [. . .] [. . .] 9/26/07 [. . .] [. . .] 9/26/07 [. . .] [. . .] 9/28/07 [. . .] [. . .] 9/28/07 [. . .] [. . .] 9/28/07 [. . .] [. . .] 1/22/08 [. . .] [. . .] 9/28/07 [. . .] [. . .] 8/13/07 [. . .] [. . .] 8/13/07 [. . .] [. . .] 11/29/07 [. . .] [. . .] 11/29/07 Id. at 6702-30. There is no evidence that the Army transmitted any Items for Negotiation to Data Systems concerning the Performance Risk factor. Data Systems responded to all of the Items for Negotiation and the Army found all of the responses to be adequate. Id. Compared with the fourteen Items for Negotiation transmitted to Data Systems, the Army transmitted twenty Items for Negotiation to Booz Allen, eight Items for Negotiation to Systems Research, and nineteen Items for Negotiation to Wyle.23 Id. at Tab 21, passim. 23 Data Systems uses page totals from tab twenty-one of the administrative record as a measure of the scope of discussions conducted by the Army with itself, Systems Research, Wyle, and Booz Allen. Pl. DSA’s Statement Facts Supp. Mot. J. Administrative R. (“Data Systems Facts”) 8; Pl. DSA’s Mot. J. Administrative R. (“Data Systems Mot.”) 4. Specifically, Data Systems (1) calculates the total number of pages in this tab, which includes the Items for Negotiation prepared by the Army and some of the offerors’ responses; (2) expresses the total number of pages associated with an offeror as a percentage of the total number of pages; and (3) asserts that a higher percentage indicates that an offeror was given a greater opportunity to revise -45- When Data Systems submitted its final proposal to the Army, it left Volume II–Performance Risk–unchanged. See id. at 9963-90. Accordingly, the Source Selection Evaluation Board’s final evaluation of Volume II was substantially identical to its initial evaluation. See id. at 4581-84 (initial evaluation), 10155-58 (final evaluation). 3. 
Discussions With BearingPoint During discussions, the Army transmitted eight Items for Negotiation to BearingPoint, seven of which are outlined in the following table:24 Item for Negotiation Number Issue Date Issued [. . .] [. . .] 10/1/07 [. . .] [. . .] 11/7/07 [. . .] [. . .] 11/30/07 its proposal or took the opportunity to rewrite a larger portion of its proposal. Data Systems Facts 8. The approach taken by Data Systems is extremely inaccurate. First, tab twenty-one includes Items for Negotiation prepared for, but not sent to, offerors not in the competitive range. See, e.g., AR 5940-87 (Items for Negotiation prepared for Femme Comp). Thus, these Items for Negotiation lack responses. Second, the revisions submitted in response to the Items for Negotiation of all of the offerors in the competitive range are not included in tab twenty-one. See, e.g., id. at 6702-30 (Items for Negotiation prepared for Data Systems without the revisions it attached in response). Third, many of the proposal pages containing revisions were submitted multiple times because they contained revisions in response to multiple Items for Negotiation. See, e.g., id. at 5084-90, 5095-101, 5104-10, 5113-19, 5122-28, 5131-37, 5142-48 (containing seven identical copies of a revised portion of Booz Allen’s proposal, in response to the following eight Items for Negotiation: [. . .]). Finally, at least one offeror, Wyle, submitted an entire proposal section in response to an Item for Negotiation even though not every page in the section contained a change, and then resubmitted the same section in response to multiple Items for Negotiation. See, e.g., id. at 5336-55, 5359-78, 5381-400 (containing three identical copies of Wyle’s revised [. . .] section, with revisions occurring on only some of the pages, in response to the following three Items for Negotiation: [. . .]). All of these factors wildly skew the calculations performed by Data Systems. Accordingly, the court will disregard those arguments of Data Systems that rely solely upon these calculations. 24 The table includes only seven of the Items for Negotiation because although it appears that Item for Negotiation number [. . .] was transmitted to BearingPoint, see AR 6692, the court was unable to locate it in the administrative record. -46- [. . .] [. . .] 11/30/07 [. . .] [. . .] 8/17/07 [. . .] [. . .] 8/22/07 [. . .] [. . .] 8/29/07 Id. at 6689-701. BearingPoint responded to all of the Items for Negotiation and the Army found all of the responses to be adequate. Id. 4. The Army Conducted Meaningful Discussions a. The Army’s Performance Risk Discussions Were Proper To begin, both TAPE and Data Systems contend that the Army did not conduct meaningful discussions with respect to the Performance Risk factor. See TAPE Mot. 19-20; Data Systems Mot. 9-10. Specifically, TAPE contends that it “[. . .].” TAPE Mot. 20. According to TAPE, due to this failure, the Army prejudicially downgraded its proposal in violation of the terms of the solicitation. Id. For its part, Data Systems argues that the Army “treated the ratings of [its] major subcontractor, [. . .], as a weakness” that prevented it from receiving a higher Performance Risk rating, but failed to raise the purported weakness during discussions. Data Systems Mot. 9-10. Thus, according to Data Systems, it was placed “at a competitive disadvantage.” Id. at 10. 
In response to TAPE’s Performance Risk argument, Savantage contends that because the Army was not required to contact all past performance references, Savantage Mot. 21 (quoting CSE Constr. Co. v. United States, 58 Fed. Cl. 230, 252 (2003)), “it was certainly under no obligation to request from TAPE information regarding secondary points of contact” for the STG, Inc. contract, id. at 22. In addressing Data Systems’ Performance Risk argument, defendant contends that because the Army did not identify any weaknesses or deficiencies, the Army was not required to conduct discussions regarding the Performance Risk factor. Def.’s Mot. 90-91. Neither TAPE nor Data Systems was entitled to discussions concerning the Performance Risk factor. As the solicitation clearly informed the offerors, the Army reserved the right to not “contact all of the sources provided by an Offeror.” See AR 381. [. . .]. Moreover, the scope and extent of discussions are within the Army’s province, FAR § 15.306(d)(3), and TAPE has provided the court with no valid reason to overturn the Army’s determination on the scope of Performance Risk discussions. Nor has Data Systems provided a valid reason. The Army did not assign Data Systems any weaknesses based on [. . .] past performance, but instead assigned a [. . .] strength. Although [. . .] past performance prevented Data Systems from receiving a higher rating, the Army was not required to discuss the issue with Data Systems because [. . .] past performance was not an evaluated weakness. See id. Accordingly, the Army did not act -47- arbitrarily, capriciously, or unlawfully by failing to conduct Performance Risk discussions with TAPE and Data Systems. b. The Army’s Price Discussions Were Proper Next, both Data Systems and BearingPoint argue that the Army did not conduct meaningful discussions with respect to the Price factor. See Data Systems Mot. 5-9; BearingPoint’s Mot. J. Administrative R. (“BearingPoint Mot.”) 48-49. Specifically, Data Systems contends that although the Army did not identify its total price as a weakness or deficiency, “it in fact considered it a material weakness” of the proposal, and was therefore “obligated to address [it] with [Data Systems] because it was a deficiency that could reasonably be addressed so as to materially enhance [Data Systems]’s potential for receiving award.” Data Systems Mot. 5. Further, Data Systems asserts that because it had a technically superior proposal, the Army should have addressed its total price during discussions “in order to satisfy its obligation to negotiate the best value for the Government.” Id. at 6. Moreover, argues Data Systems, the Army treated offerors with lower-rated technical proposals differently from those offerors who proposed higher prices by transmitting Items for Negotiation to the former group for multiple technical weaknesses while the latter group received no indication that their prices were too high. Pl. Data Systems’ Reply Mem. Supp. Its Mot. J. Administrative R. & Opp’n Def.’s & Intervenor’s Mots. J. Administrative R. (“Data Systems Resp.”) 23. According to Data Systems, such disparate treatment meant that the low-priced, technically inferior offerors could beef up their proposals to a level that would allow the Army to justify awarding them a contract. Id. For its part, BearingPoint asserts that “at no time during . . . discussions did the Army inform BearingPoint that [. . .] and that the Army had decided to emphasize price in its award decision.” BearingPoint Mot. 48. 
In response to these arguments, both defendant and Booz Allen generally contend that the Army was not obligated to discuss price with the offerors so long as the price was considered reasonable. Def.’s Mot. 88, 91; Mem. Supp. Intervenor’s Cross-Mot. J. R. & Opp’n Pls.’ Mots. J. R. (“Booz Allen Mot.”) 51-52. Specifically, defendant notes that both Data Systems and BearingPoint submitted reasonable total prices because they certified their proposed labor rates and arrived at a total price that was lower than the Independent Government Estimate. Def.’s Mot. 88, 91. Furthermore, in response to Data Systems’ specific argument, defendant reiterates its assertion that so long as the Army did not identify any weaknesses or deficiencies with an offeror’s total price, it was not required to initiate discussions on that issue. Id. at 89; accord Booz Allen Mot. 53. Booz Allen adds that even if the Army considered Data Systems’ price to be a material weakness, it does not necessarily follow that the price was unreasonable. Booz Allen Mot. 53. Defendant and Booz Allen are correct. Neither Data Systems nor BearingPoint proposed total prices that the Army considered to be unreasonable and the Army did not identify either total price to be a weakness. Further, discussions about the magnitude of the proposed total prices were not required by the solicitation. Thus, the Army was well within its discretion to -48- decline to discuss total prices with these two offerors. See FAR § 15.306(d)(3). Moreover, contrary to Data Systems’ assertion, there was no unequal treatment here by the Army. The fact that the Army opted not to discuss total prices while, at the same time, discussing the four nonprice factors does not constitute unequal treatment. The Army applied the same standards for discussing total price to all of the offerors. Accordingly, the Army did not act arbitrarily, capriciously, or unlawfully by failing to conduct Price discussions with Data Systems and BearingPoint. D. The Army’s Evaluation of the Nonprice Factors As previously noted, contracting agencies must evaluate proposals “based solely on the factors specified in the solicitation.” 10 U.S.C. § 2305(b)(1). “Evaluations may be conducted using any rating method or combination of methods, including color or adjectival ratings, numerical weights, and ordinal rankings.” FAR § 15.305(a). Regardless of the chosen rating method, agencies must document the “relative strengths, deficiencies, significant weaknesses, and risks” of the evaluated proposals. Id. Agencies “must treat all offerors equally, evaluating proposals evenhandedly against common requirements and evaluation criteria.” Banknote Corp. of Am., 56 Fed. Cl. at 383. Within these constraints, however, agencies have great discretion in evaluating the technical factors of a proposal. See E.W. Bliss Co., 77 F.3d at 449. Thus, “a court will not second guess” an agency’s evaluation, id., unless it is irrational or contrary to law, Impresa, 238 F.3d at 1332. A protester’s mere disagreement with an evaluation does not provide an adequate basis to overturn the agency’s decision. See Banknote Corp. of Am., 56 Fed. Cl. at 384; Biospherics, Inc. v. United States, 48 Fed. Cl. 1, 14 (2000). Plaintiffs challenge the Army’s evaluations of the nonprice factors in several ways. The court first addresses TAPE’s arguments that the Army’s evaluation of its technical proposal was defective. 
Next, the court addresses the contentions of L-3 Services that the Army (1) impermissibly considered the strengths of subcontractors in areas where the solicitation only allowed the performance of the offeror to be evaluated; (2) improperly failed to award it a strength for the on-site decision-making authority of its program managers; and (3) improperly failed to perform a crosswalk between the offerors’ proposals and the solicitation. Third, the court addresses the argument advanced by both L-3 Services and BearingPoint that the Army unreasonably credited Systems Research with the uncommitted resources of its parent and affiliate corporations. The court then addresses the argument made by L-3 Services, Data Systems, and BearingPoint that the Army erred in evaluating the Small Business Participation factor with respect to Wyle and Booz Allen. Finally, the court addresses Data Systems’ argument that the Army improperly failed to credit it with a strength for its 2006 small business participation goals. 1. The Army Properly Evaluated TAPE’s Technical Proposal As previously noted, the Army transmitted sixteen Items for Negotiation to TAPE for discussion, ten of which addressed weaknesses with TAPE’s proposal. See supra Part III.C.1. -49- TAPE responded to all of the Items for Negotiation and the Army ultimately found the responses to be adequate. See id. Further, in its final evaluation of TAPE’s proposal, the Source Selection Evaluation Board determined that TAPE’s proposal no longer contained weaknesses. See supra Part I.A.4. In assigning ratings, the Source Selection Evaluation Board did not change TAPE’s ratings for the Corporate Capability factor and its two subfactors. See AR at 4942-98 (initial evaluation), 10300-06 (final evaluation). However, the Source Selection Evaluation Board’s final evaluation of both TAPE’s Program Management Plan and overall Technical/Management volume improved from [. . .] to [. . .]. See id. at 4957-57 (initial evaluation), 10315-17 (final evaluation). TAPE contends that the Army downgraded its proposal despite not having identified any weaknesses with the proposal. TAPE Mot. 17-18. TAPE further argues that the Army’s ratings did not reflect the changes TAPE made to its proposal in response to the issues raised during discussions. Id. at 18. Finally, TAPE asserts that under the Corporate Capability factor, the [. . .] rating it received for the Knowledge and Experience subfactor was inconsistent with the [. . .] rating it received for the Performance Capability subfactor. Id. In response to TAPE’s first contention, both defendant and Savantage argue that the Army’s failure to find any weaknesses in TAPE’s final proposal does not indicate that TAPE was entitled to a higher rating. Def.’s Mot. 95, 99-100 (citing Patriot Contract Servs., B-294777.3, 2005 C.P.D. ¶ 97 (May 11, 2005) (“There is no requirement that an offeror receive the highest possible rating just because its proposal does not contain specific weaknesses.”)); Savantage Mot. 13-14. Defendant further argues that even if “the Army assigned TAPE the incorrect adjectival rating based on the absence of weaknesses in the proposal, the [Source Selection Authority] conducted a well-reasoned analysis of [] TAPE’s advantages under this subfactor as compared to the awardees.” Def.’s Mot. 96. 
Responding to TAPE’s second contention, defendant asserts that the Army considered the revisions TAPE made to its proposal and viewed those changes favorably, as demonstrated by the elimination of TAPE’s prior evaluated weaknesses. Id. at 95. Then, in response to TAPE’s third argument, defendant avers that “[t]he Army is not precluded from assigning two different ratings to two different subfactors” because the subfactors had distinct descriptions and weights. Id. at 94; accord Savantage Mot. 16-17. TAPE’s arguments amount to nothing more than its disagreement with the Army’s evaluation of its proposal. As noted above, dissatisfaction or disagreement with an agency’s evaluation of a proposal is an insufficient ground to challenge the agency’s action. See Banknote Corp. of Am., 56 Fed. Cl. at 384; Biospherics, Inc., 48 Fed. Cl. at 14. The Army’s conclusion that TAPE’s proposal revisions prompted by discussions were sufficient to eliminate weaknesses and improve some ratings, but insufficient to be assigned strengths or improve other ratings, was well within the Army’s discretion in evaluating proposals. Moreover, the court discerns no inconsistency between the ratings of the two Corporate Capability subfactors; after all, they are separate subfactors with separate criteria. Accordingly, the Army did not act arbitrarily, capriciously, or unlawfully in evaluating TAPE’s technical proposal. Because this finding -50- addresses the last of TAPE’s claims, all of which the court has rejected, the court denies TAPE’s motion for judgment on the administrative record and dismisses TAPE’s complaint. 2. The Army Properly Considered the Strengths of Subcontractors Next, L-3 Services argues that the Army impermissibly considered the strengths of the subcontractors of Systems Research and Wyle in areas where the solicitation only allowed the performance of the offeror to be evaluated, i.e., the Corporate Capability and Technical/ Management volumes. Pl.’s Mot. J. Administrative R. & Permanent Injunctive & Declaratory Relief (“L-3 Services Mot.”) 59-63. Specifically, it argues that the language used by the Army in the solicitation makes it clear that the Army intended to distinguish between the performance of offerors alone and the performance of offerors and their subcontractors together. Id. at 59-60. According to L-3 Services, when the Army wanted to consider the performance of the offerors and their subcontractors together, it used language such as “‘based on the quality and relevancy of the Offeror’s past performance, as well as that of its proposed major subcontractor,’” id. at 59 (quoting AR 381), but when the Army wanted to make a clear distinction between offerors and their subcontractors, it used language such as “evaluate the Offeror’s approach to manage and coordinate its subcontractor efforts,” id. (citing AR 382). As additional evidence on this point, L-3 Services notes that the Army responded to a prospective offeror’s question with the following answer: “‘[I]n the statement ‘The Offeror must be capable of performing ALL taskings listed under the nine Functional Areas in the Performance Work Statement,’ the term ‘Offeror’ refers to the combined contractor/subcontractor team.’” Id. at 59-60 (quoting AR 200). L-3 Services asserts that the limiting phrase “in the statement” means that when the Army used the term “offeror” elsewhere in the solicitation, it did not refer to the “the combined contractor/ subcontractor team.” Id. at 60. 
In sum, avers L-3 Services, when the Army uses just the term “offeror,” it is precluding any consideration of the performance of subcontractors unless it otherwise so stated. Id. at 59-60. According to Booz Allen, the argument advanced by L-3 Services is based on a “tortured reading” of the solicitation. Booz Allen Mot. 34. More specifically, both defendant and Booz Allen assert that “[t]he solicitation did not state or indicate in any way that evaluation . . . must exclude consideration of subcontractors.” Def.’s Mot. 81; accord Booz Allen Mot. 36 (citing Sci. & Tech., Inc., B-272748, B-272748.2, B-272748.3, B-272748.4, 97-1 C.P.D. ¶ 121 (Oct. 25, 1996)). Defendant then asserts that L-3 Services must have been aware that the Army intended to consider the performance of subcontractors under the Corporate Capability factor, quoting the same response to a prospective offeror’s question noted by L-3 Services. Def.’s Mot. 81 (citing AR 200). In the same vein, Booz Allen points out that L-3 Services was clearly aware that subcontractors would be evaluated as its “proposal was based on the same ‘team’ approach about which L-3 now complains.” Booz Allen Mot. 36-37 (noting that the very first sentence of L-3 Services’ proposal was “L-3 Communications Titan Corporation Technical and Management Services Division (L-3 TMSD) is leading a strong, diverse team of companies to support PEO EIS and CIO/G-6 on the PMSS2 effort” and that L-3 Services refers to the “L-3 TMSD team” throughout the Corporate Capability and Technical/Management volumes of its proposal). -51- The solicitation interpretation advanced by L-3 Services is contradicted by the express terms of the solicitation and defies common sense. According to the solicitation, an offeror was required to “be capable of performing ALL tasking listed under the nine Functional Areas,” AR 343, and to indicate in its proposal, using a specially designated table, which functional areas it would perform and which functional areas its subcontractor or subcontractors would perform, id. at 350, 363. Thus, the Army was entitled to evaluate the ability of an offeror’s subcontractors to perform work in one or more of the nine functional areas. Further, as noted by defendant and Booz Allen, the solicitation does not prohibit the Army from considering the capabilities or experience of subcontractors. Moreover, L-3 Services itself proposed and received credit for its own subcontractor capabilities and experience. Thus, the court concludes that the Army’s consideration of subcontractors was not arbitrary, capricious, or unlawful. 3. The Army Improperly Evaluated the Offerors’ Program Managers’ On-site Decisionmaking Authority The court next addresses a dispute concerning the authority of the offerors’ program managers. Systems Research, Wyle, Booz Allen, and L-3 Services all described their respective program managers’ on-site decision-making authority in their proposals. Specifically, Systems Research indicated that its program manager was “[. . .].” Id. at 8663. In addition, Wyle indicated that it would provide “[. . .].” Id. at 8824. Further, Booz Allen indicated that its “Officer-in-Charge” had the ability to “[. . .]” and that its program manager would serve as the “[. . .].” Id. at 8488-89. Finally, L-3 Services indicated that its “[. . .],” id. at 9586, and that [. . .] “[. . .],” id. at 9590. 
Yet, with respect to the on-site decision-making authority of the program managers proposed by these four offerors, the Source Selection Evaluation Board, when evaluating the proposals under the Technical/Management factor, assigned a [. . .] strength to Systems Research, id. at 10184, Wyle, id. at 10296, and Booz Allen, id. at 10352, but no strength at all to L-3 Services, id. at 10222-24. L-3 Services argues that the Army’s failure to assign it a strength for the on-site decisionmaking authority of its program managers prevented it from achieving even higher technical superiority over Wyle and Booz Allen. L-3 Services Mot. 47-48. Defendant counters that the Army’s evaluation was reasonable, asserting that L-3 Services did not indicate in its proposal that its “[. . .].” Def.’s Mot. 63; accord Booz Allen Mot. 38. Defendant’s position lacks merit. The court is unable to discern a meaningful difference among the following phraseology: [. . .]. Because each of the four offerors described similar on-site program manager decision-making authority and included its description in the proper section of its proposal, the Army should have assessed strengths of a similar magnitude to each of these proposals. Because the Army evaluated the proposal of L-3 Services unequally, its evaluation lacked a rational basis. -52- 4. The Army Was Not Required to Perform Crosswalks Between the Offerors’ Proposed Labor Categories and the Labor Categories Described in the Solicitation L-3 Services next argues that upon receipt of the proposals, the Army should have analyzed whether the labor categories proposed by each offeror were substantially equivalent to the labor categories that the Army described in the solicitation. L-3 Services Mot. 53. Without such an analysis, it contends, the Army could not have performed a meaningful evaluation of the merits of the offerors’ technical proposals. Id. at 53-54, 59; see also BearingPoint Mot. 43 (noting that the Army made no effort to compare the labor categories proposed by the offerors with the labor category descriptions provided in the solicitation). L-3 Services avers that the requirement to perform such a crosswalk is implicit in the Army’s stated intention to evaluate the offerors’ “‘qualified resources currently available . . . .’” L-3 Services Mot. 57 (quoting AR 380). On the other hand, both defendant and Booz Allen deny that such a crosswalk requirement can be found in the solicitation. Def.’s Mot. 78; Booz Allen Mot. 29-31. Moreover, defendant asserts that the Army was under no obligation “to use any particular evaluation methodology since none was specified in the evaluation scheme.” Def.’s Mot. 78. Further, defendant contends that the Army specifically rejected use of a crosswalk analysis by requiring offerors to certify that their labor categories matched those described in the solicitation. Id. at 78-79. Anticipating this argument, L-3 Services contends that the blanket certifications “‘are generally insufficient where the agency has reason to question the characteristics of the products proposed.’” L-3 Services Mot. 57 (quoting Telemetrics, Inc., B-242957, B-242957.7, 92-2 C.P.D. ¶ 168 (Apr. 3, 1992)). Defendant’s final argument is that to the extent L-3 Services is attacking the terms of the solicitation, its argument is untimely. Id. at 79. The court agrees with defendant and Booz Allen on this issue.
Certainly, at first blush, the wide range of proposed labor rates within each labor category might have signaled to the Army that the offerors’ labor categories were incongruent with the labor categories described in the solicitation. However, upon closer look, there is another valid explanation for the range of proposed labor rates. As defendant highlighted during oral argument, the proposed labor rates could be the result of the offerors making strategic choices informed by the competitive marketplace for the services to be provided. See Tr. 113. Given this valid alternative explanation, the range of proposed labor rates did not inherently mean that some of the offerors were not proposing substantially equivalent labor categories. Furthermore, neither the solicitation nor the Source Selection Plan required the Army to conduct the described crosswalks. Thus, even if the offerors had not been required to certify their labor categories, the Army still would not have been obligated to perform the crosswalks. In sum, the Army did not act arbitrarily, capriciously, or unlawfully by not conducting crosswalks between the offerors’ proposed labor categories and the labor categories described in the solicitation. 5. The Army Properly Evaluated Systems Research’s Parent and Affiliate Corporations The court next considers the contention of L-3 Services and BearingPoint that the Army improperly attributed to Systems Research the corporate capabilities and past performance of its -53- parent and affiliate corporations. L-3 Services Mot. 63-68; BearingPoint Mot. 28-38. The court prefaces its discussion by describing Systems Research’s use of its parent and affiliate corporations in its proposal and the Army’s references to those corporations in its evaluation of Systems Research’s proposal. Systems Research, according to the cover page of its proposal, is “[a] wholly owned subsidiary of” SRA International, Inc. (“SRA Int’l”).25 AR 1057, 8592; see also BearingPoint’s Mot. J. Administrative R. Ex. (“BearingPoint Ex.”) C at 1-2 (listing Systems Research, in a filing with the Securities and Exchange Commission, as a subsidiary of SRA Int’l). The two corporations are distinct entities for tax and procurement purposes. The Taxpayer Identification Numbers for Systems Research and SRA Int’l are 54-1013306 and 54-1360804, respectively. AR 1344. And, the Commercial and Government Entity (“CAGE”) codes for Systems Research and SRA Int’l are 6R517 and 3FUL4, respectively.26 AR 10675 (Systems Research); BearingPoint Ex. A (SRA Int’l). Moreover, Systems Research and SRA Int’l obtain government contracts in their own names. See BearingPoint Ex. B. In its final proposal, Systems Research identified, as the third of its three reference contracts in Volume II, a contract with the [. . .], known as the [. . .]. AR 8625; see also id. at 8603 (noting, in the Knowledge and Experience subsection of Volume I, that the [. . .] constituted “Relevant IV&V Experience”). The prime contractor on that contract was Galaxy Scientific Corporation (“Galaxy”), “a wholly owned subsidiary of” SRA Int’l that was acquired by SRA Int’l on July 1, 2005. Id. at 8625. But see BearingPoint Ex. C at 1-2 (listing Galaxy, in a filing with the Securities and Exchange Commission, as a subsidiary of both SRA Int’l and Systems Research). Galaxy is an entity distinct from both Systems Research and SRA Int’l; it has its own CAGE code, BearingPoint Ex. 
E (identifying Galaxy’s CAGE code as 0JLB2), and enters into government contracts under its own name, BearingPoint Ex. F. In addition to invoking one of Galaxy’s contracts in Volume II of its proposal, Systems Research discussed the experience of two other corporations–Touchstone Consulting Group, Inc. (“Touchstone”) and Spectrum Solutions Group, Inc. (“Spectrum”)–in Volume I of its proposal. See, e.g., AR 8605-06 (containing references to “SRA Touchstone Strategic Consulting [Center of Excellence]” and “SRA Spectrum ERP [Center of Excellence]”). 25 Despite indicating in its award notification letters to the unsuccessful offerors that a contract had been awarded to SRA Int’l, see AR 10575-88, the Army clearly awarded the contract to Systems Research and not SRA Int’l, see id. at 10675. But see id. at 8613 (indicating, in a table from Systems Research’s proposal listing the prime contractor and subcontractors, that “SRA International, Inc.” was the prime contractor), 8712 (implying, in a table describing the contractors involved in the proposal, that SRA Int’l was the prime contractor). 26 “CAGE codes are used to dispositively establish the identity of a legal entity for contractual purposes.” Perini/Jones, Joint Venture, B-285906, 2002 C.P.D. ¶ 68 (Nov. 1, 2000). The Army awarded the contract to Systems Research, identified by a CAGE code of 6R517. AR 10675. -54- Touchstone and Spectrum were acquired by SRA Int’l in April 2005 and November 2005, respectively. BearingPoint Exs. G, J; cf. BearingPoint Ex. C at 1-2 (listing Touchstone and Spectrum, in a filing with the Securities and Exchange Commission, as subsidiaries of both SRA Int’l and Systems Research). Touchstone and Spectrum have their own CAGE codes. BearingPoint Ex. H (identifying Touchstone’s CAGE code as 02DR5); BearingPoint Ex. K (identifying Spectrum’s CAGE code as 3CR73). Moreover, Touchstone enters into government contracts under its own name. BearingPoint Ex. I. Although Systems Research’s proposal refers repeatedly to Galaxy, Touchstone, and Spectrum, none of the three corporations were identified as subcontractors that were committed to perform under an awarded contract. See AR 8613. Nor does Systems Research identify SRA Int’l as a subcontractor ready to perform upon the award of a contract.27 See id. Further, in the Master Labor Rate Table submitted by Systems Research in Volume IV, there are no proposed labor categories or proposed labor rates included for Galaxy, Touchstone, Spectrum, or SRA Int’l. See id. at 8677-707. The Army did not question Systems Research’s references to Touchstone, Spectrum, or SRA Int’l during discussions. However, it did transmit an Item for Negotiation to Systems Research regarding Systems Research’s inclusion of the [. . .] in Volume II of its final proposal. Id. at 5248-49 (containing Item for Negotiation number POP4001). The Army wanted Systems Research to clarify two issues: “Which company’s resources were used in managing and in performing the effort on this referenced contract, the Offeror or Galaxy? Which company’s resources will be used in managing and performing the effort on the PMSS2 contract?” Id. at 5248. Systems Research provided the following response: “The Offeror was responsible for the management and performance of the referenced contract immediately upon the acquisition of Galaxy Scientific Corporation on July 1, 2005. All Galaxy Scientific employees became employees of the Offeror as of that date.” Id. at 5249.
After discussions had concluded and the offerors had submitted their final proposals, the Source Selection Authority summarized, in the Source Selection Decision Document, the strengths of Systems Research in the Knowledge and Experience subfactor of the Corporate Capability factor, noting that Systems Research “[. . .].” Id. at 10479. The Source Selection Authority also summarized Systems Research’s strengths in the Performance Risk factor, noting that Systems Research “[. . .].” Id. at 10504. 27 However, in the table describing the corporate capability resources of the prime contractor and subcontractors, SRA Int’l and not Systems Research is twice identified as the prime contractor. See AR 8613. Aside from the reference to Galaxy as “a wholly owned subsidiary of” SRA Int’l, id. at 8625, the only other specific reference to SRA Int’l in the proposal is in a table identifying the “SRA Team” in the Small Business Participation volume: the logo of SRA Int’l was placed beside a description beginning, “SRA is the prime contractor leading this effort,” id. at 8712. -55- L-3 Services and BearingPoint begin their argument by highlighting their view of the guiding legal principles. First, L-3 Services notes that “‘a parent corporation and a subsidiary are in law separate and distinct entities, and under ordinary circumstances a contract in terms and in name with one corporation cannot be treated as that of both, and a parent corporation will not be liable for the obligations of its subsidiaries.’” L-3 Services Mot. 63 (quoting BLH, Inc. v. United States, 13 Cl. Ct. 265, 272 (1987)). Then, citing decisions of the Comptroller General,28 both L-3 Services and BearingPoint assert that even though offerors are entitled to incorporate parent and affiliated corporations into their proposals, they “must actually commit the companies to meaningful involvement in performance of the contract.” Id. at 63; accord BearingPoint Mot. 36-37. In other words: An agency properly may attribute the experience or past performance of a parent or affiliated company to an offeror where the firm’s proposal demonstrates that the resources of the parent or affiliated company will affect the performance of the offeror. The relevant consideration is whether the resources of the parent or affiliated company–its workforce, management, facilities or other resources–will be provided or relied upon, such that the parent or affiliate will have meaningful involvement in contract performance. Further, where . . . no provision in the solicitation precludes offerors from relying on the resources of their corporate parent or affiliated companies in performing the contract, and an offeror represents in its proposal that resources of a related company will be committed to the contract, the agency properly may consider those resources in evaluating the proposal. Hot Shot Express, Inc., B-290482, 2002 C.P.D. ¶ 139 (Aug. 2, 2002) (citations omitted). Based on these legal propositions, L-3 Services and BearingPoint argue the Army improperly attributed to Systems Research the corporate capabilities and past performance of SRA Int’l, Galaxy, Touchstone, and Spectrum even though Systems Research did not commit the resources of those corporations to the performance of the contract, and as a result, Systems Research received stronger than appropriate evaluations. L-3 Services Mot. 64-68; BearingPoint Mot. 31-38. 
Defendant responds to these arguments by asserting that it is irrelevant whether Systems Research explicitly committed the resources of its parent and affiliate corporations in its proposal. Def.’s Mot. 84. Instead, argues defendant, the critical issue is “whether it was reasonable, given the information available, for the Army to question the commitment of resources to the contract as presented by [Systems Research].” Id. Defendant contends that the Army was “entirely reasonable” in concluding “that the resources of [Systems Research]’s affiliates would be committed to the contract,” id., for four reasons. 28 The decisions of the Comptroller General, while not binding on this court, are “‘expert [o]pinions’” that the court “‘should prudently consider.’” Thompson v. Cherokee Nation of Okla., 334 F.3d 1075, 1084 (Fed. Cir. 2003) (quoting Delta Data Sys. Corp. v. Webster, 744 F.2d 197, 201 (D.C. Cir. 1984)). -56- First, defendant argues that because Systems Research’s references to Spectrum and Touchstone in its proposal used either the “SRA” or “Center of Excellence” modifier, “the Army would have [had] no reason to question whether Touchstone or Spectrum were separate legal entities, much less whether their resources would be committed to performing the contract.” Def.’s Reply Supp. Its Cross-Mot. J. Administrative R. (“Def.’s Reply”) 18. Second, defendant avers that “it is reasonable to infer that the resources of [Systems Research]’s affiliates would be available to perform the contract[] because [Systems Research] expressly cites the contracts held by its affiliates in its proposal.” Def.’s Mot. 84. Third, defendant contends that Systems Research “explicitly addressed this issue during discussions and stated that the employees of its affiliate, Galaxy, had been completely absorbed into [Systems Research].” Id. at 85. Fourth, defendant indicates that “subsequent research into this issue by the Army’s contracting officials revealed that the [. . .] contracts described in [Systems Research]’s proposal have been modified to reflect the fact that the contracting party is [Systems Research], rather than the affiliate companies.” Id. at 86 (citing a declaration prepared by the contracting officer, attached to defendant’s motion, describing research on this issue that occurred after he awarded the contracts). As a threshold matter, the court must address the declaration prepared by the contracting officer and submitted in support of defendant’s motion. As correctly asserted by BearingPoint in its response to defendant’s motion, Opp’n Def.’s Mot. J. Administrative R. & Reply Def.’s Opp’n BearingPoint’s Mot. J. Administrative R. (“BearingPoint Resp.”) 19-20, the contents of the declaration and the exhibits attached to the declaration do not, for the most part, reflect information that was evaluated by the Army during the procurement process. For this reason, the court grants BearingPoint’s contemporaneously filed motion to strike defendant’s Exhibit A, defendant’s Exhibit B (the contracting officer’s declaration), the exhibits attached to the declaration, and the arguments in defendant’s motion that find their sole support in the declaration and exhibits. Moving to the substantive issues, the court first notes that because the Comptroller General has developed a sound and reasonable approach to analyzing an offeror’s use of parent and affiliate corporations in its proposal, see, e.g., Hot Shot Express, Inc., 2002 C.P.D. ¶ 139, the court adopts the approach in this case.
According to the Comptroller General, “[a]n agency properly may attribute the experience or past performance of a parent or affiliated company to an offeror where the firm’s proposal demonstrates that the resources of the parent or affiliated company will affect the performance of the offeror.” Id. Applying this standard first to Spectrum and Touchstone, the court concludes that the Army could have reasonably concluded that Systems Research’s proposal demonstrated that the resources of Spectrum and Touchstone would affect Systems Research’s performance on the contract. Both corporations are included in the narrative portion of Systems Research’s proposal in such a way as to suggest that they would play a role in Systems Research’s contract performance. The fact that the corporations were not expressly listed as subcontractors is immaterial; there is no requirement that an offeror must designate its affiliated corporations as subcontractors in order to officially commit their resources -57- to the performance of a contract. Accordingly, the Army properly considered the experience of Spectrum and Touchstone when evaluating Systems Research’s proposal. The court ultimately reaches the same conclusion with respect to Galaxy. At first glance, Systems Research’s response to the Item for Negotiation about the CECOM Rapid Response contract is troublesome. Systems Research did not clearly respond to the Army’s second inquiry: “Which company’s resources will be used in managing and performing the effort on the PMSS2 contract?” See AR 5248-49. However, Systems Research’s response that all of Galaxy’s employees were its own employees strongly implies that Galaxy was Systems Research’s subsidiary corporation. As such, the Army could have reasonably concluded that Systems Research’s proposal demonstrated that Galaxy’s resources would affect Systems Research’s performance on the contract. Accordingly, the Army properly considered Galaxy’s experience and past performance when evaluating Systems Research’s proposal. Lastly, the court considers Systems Research’s references to SRA Int’l in its proposal. See id. at 8613, 8712. First, Systems Research twice identifies SRA Int’l as the prime contractor. See id. at 8613. Of course, this cannot actually be the case because Systems Research, not SRA Int’l, is the offeror. Thus, the court does not read these two references as an intent to invoke the resources of SRA Int’l. For the same reason, the court sees no problem with Systems Research using the logo of SRA Int’l in the Small Business Participation volume. See id. at 8712. Significantly, there is no evidence in the Army’s evaluations that it considered the experience or past performance of SRA Int’l instead of Systems Research or that it was confused about which corporation was the offeror. Accordingly, the court finds no error in the Army’s evaluations with respect to SRA Int’l. Thus, in sum, the court concludes that the Army properly evaluated Systems Research’s parent and affiliate corporations. 6. The Army Improperly Evaluated the Small Business Participation Factor Next, L-3 Services, Data Systems, and BearingPoint all object to the ratings that the Army assigned to Wyle and Booz Allen for the Small Business Participation factor. L-3 Services Mot. 49-52; Data Systems Mot. 19-21; BearingPoint Mot. 23-27. The following table summarizes the relevant information from the Army’s evaluations: -58- Army Goal Wyle Booz Allen L-3 Services Data Systems BearingPoint Mandatory Small Bus. 20% [. . .] [. . .] [. . .] 
[. . .] [. . .] Disadvant. 6% [. . .] [. . .] [. . .] [. . .] [. . .] Woman-owned 5% [. . .] [. . .] [. . .] [. . .] [. . .] Service-disabled 3% [. . .] [. . .] [. . .] [. . .] [. . .] Veteran-owned 3% [. . .] [. . .] [. . .] [. . .] [. . .] HUBZone 1% [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] Non-mandatory HBCU/MI Rating Pos. Goal [. . .] [. . .] [. . .] [. . .] [. . .] AR 10160 (Data Systems), 10219 (L-3 Services), 10293 (Wyle), 10329 (BearingPoint), 10348 (Booz Allen). Also relevant to plaintiffs’ argument are the following definitions from the Source Selection Plan that apply to the Small Business Participation factor: Color Adjective and Definition Blue Outstanding - Outstanding quality; proposed participation significantly exceeds all solicitation goals; very good probability of success with overall low degree of risk in achieving extent of proposed participation. Green Good - High quality; proposed participation significantly exceeds some solicitation goals, and meets or exceeds all goals; good probability of success with overall low to moderate degree of risk in achieving extent of proposed participation. Yellow Fair - Adequate quality; proposed participation meets or exceeds all solicitation goals; fair probability of success with overall moderate degree of risk in achieving extent of proposed participation. Id. at 401. -59- Based upon the proposed levels of small business participation and the rating definitions included in the Source Selection Plan, plaintiffs argue that because Wyle did not exceed any participation goals, it should have received a [. . .] rating instead of a [. . .] rating. L-3 Services Mot. 49-50; Data Systems Mot. 19; BearingPoint Mot. 25-26. Plaintiffs further argue that because Booz Allen did not exceed all of the participation goals, it should have received a [. . .] rating instead of an [. . .] rating. L-3 Services Mot. 51-52; Data Systems Mot. 20-21; BearingPoint Mot. 26-27. Moreover, L-3 Services and Data Systems contend that because the Army did not assign the correct, lower ratings to Wyle and Booz Allen, the Army’s best value tradeoff was skewed in favor of Wyle and Booz Allen. L-3 Services Mot. 50-53; Data Systems Mot. 19, 21. In response to plaintiffs’ arguments, defendant first contends that the ratings were “based upon a variety of elements in addition to the small business solicitation percentages.” Def.’s Mot. 70; accord id. at 74. As such, defendant avers, the Consensus Evaluation Reports prepared by the Source Selection Evaluation Board and the Source Selection Decision Document prepared by the Source Selection Authority “demonstrate[] that the evaluations were made in accordance with the [solicitation] criteria.” Id. at 75. Defendant also argues that “the assignment of an adjectival rating for a particular factor does not, by itself, dictate a particular outcome in a comparison of proposals” and that the Source Selection Authority made her award decision based upon her “detailed examination of each proposal’s individual strengths” and not the adjectival/color ratings. Id. at 71-72. Building on defendant’s arguments, Booz Allen contends that the plaintiffs are not entitled to rely on the rating criteria set forth in the Source Selection Plan because such internal agency guidance does not grant offerors any enforceable rights. Booz Allen Mot. 40-41 (citing ManTech Telecomms. & Info. Sys. Corp. v. United States, 49 Fed. Cl. 57, 67 & n.15 (2001)).
L-3 Services rejects Booz Allen’s contention, arguing that a more recent decision from the Court of Federal Claims “took the opposite position” and concluded that departures from the Source Selection Plan amounted to “‘unequal treatment of offerors.’” Pl. L3 Services’ Reply Supp. Its Mot. J. Administrative R. & Permanent Injunctive & Declaratory Relief & Resp. Def.’s & Intervenor’s Mot. J. Administrative R. (“L-3 Services Resp.”) 26 n.18 (quoting Beta Analytics Int’l, Inc. v. United States, 67 Fed. Cl. 384, 407 (2005)); accord Data Systems Resp. 26. The threshold question is whether the Source Selection Plan was inconsistent with the solicitation. The solicitation indicated that the Army would evaluate the Small Business Participation volume of the offerors’ proposals by assessing, in relevant part, the following criteria: The Government will evaluate the Offeror’s approach to meeting or exceeding the Government’s minimum mandatory requirement for overall Small Business participation and the non-mandatory objectives . . . for participation by Small Disadvantaged Business, Woman-Owned Small Business, Service-Disabled Veteran-Owned Small Business, Veteran Owned Small Business, HUBZone Small Businesses, and HBCU/MI in the performance of the PMSS2 contract. The -60- amount by which the proposed percentage exceeds the overall small business category percentage and the amount by which the proposed percentages exceed the small business sub-category percentages will be considered. The assessment will include the extent to which such firms are specifically identified in the proposals; the extent of commitment to use such firms (for example, enforceable commitments are more significant than nonenforceable commitments); the extent of participation of such firms in terms of the value of the total acquisition; and, for large businesses only, an assessment of the Offerors’ Small Business goals for FY 2006 for the DoD and other Federal Government agencies and the actual goals achieved. AR 382. Each of the relevant rating definitions in the Source Selection Plan required the Army to consider (1) the “quality” of the proposal, (2) whether and to what extent an offeror’s “proposed participation” met or exceeded “all solicitation goals,” and (3) the “probability of success” and the “risk in achieving extent of proposed participation.” Id. at 401. With regard to an offeror’s proposed small business participation goals, the Source Selection Plan merely translated the solicitation’s indication that the Army would consider “[t]he amount by which the proposed percentage exceeds the overall small business category percentage and the amount by which the proposed percentages exceed the small business sub-category percentages,” id. at 382, into more explicit criteria. The Army logically determined that to receive a higher rating on this factor, an offeror would have to exceed or significantly exceed some or all of the small business participation goals. Moreover, the rating definitions in the Source Selection Plan were broad enough to address the remaining criteria described in the solicitation that the Army was required to consider. Accordingly, the Source Selection Plan was consistent with the solicitation. To find otherwise would lead the court to conclude that none of the rating definitions in the Source Selection Plan were lawful because they deigned to translate the criteria described in the solicitation into more explicit, ratable criteria. 
This result would completely obviate the utility of source selection plans and would be contrary to the FAR. Because the Source Selection Plan did not “depart[] from the requirements of [the] solicitation,” ManTech Telecomms. & Info. Sys. Corp., 49 Fed. Cl. at 67, the Army was required to adhere to its rating scheme. Whether the Army properly applied the rating definitions set forth in the Source Selection Plan is “subject to review” by this court. See Beta Analytics Int’l, Inc., 67 Fed. Cl. at 399. To receive an Outstanding (Blue) rating, an offeror must have proposed participation that significantly exceeded all goals; to receive a Good (Green) rating, an offeror must have proposed participation that significantly exceeded some goals and met or exceeded all goals; and to receive a Fair (Yellow) rating, an offeror must have proposed participation that met or exceeded all goals. AR 401. These are mandatory, objective requirements for a proposal to achieve the designated rating. As stated in Beta Analytics Int’l, Inc., “nothing in the language suggests these are just examples or illustrations. There is no preface with the words ‘may include,’ or use of the disjunctive ‘or,’ or anything indicating that these are not necessary conditions . . . .” 67 Fed. Cl. at 402. Thus, the court concludes that the Army erred in its -61- evaluation of the Small Business Participation volumes of Wyle and Booz Allen; the maximum rating available to Wyle was [. . .] and the maximum rating available to Booz Allen was [. . .]. The Army’s “departure from the Source Selection Plan . . . shows unequal treatment of the offerors.” Id. at 407. Accordingly, the court finds that the Army’s Small Business Participation evaluations lacked a rational basis. 7. The Army Properly Evaluated Data Systems’ 2006 Small Business Participation In another challenge to the Army’s evaluation of the Small Business Participation factor, Data Systems contends that the Army misapplied its evaluation scheme when it evaluated Data Systems’ 2006 small business participation. Data Systems Mot. 21-22. As noted previously, large business offerors were “required to submit the small business goals for fiscal year 2006 and the goals for fiscal year 2006 that were actually achieved” in a table provided in an appendix to the solicitation. AR 356; see also id. at 377 (containing the required table that allowed the offeror to input its “Fiscal Year 2006 Small Business Subcontracting Goal” and “Fiscal Year 2006 Small Business Subcontracting Actually Achieved” for each of the seven small business participation categories). Then, when evaluating the proposals under the Small Business Participation factor, the Army would assess the goals that were proposed and achieved. Id. at 382. Several offerors, including Systems Research, [. . .], [. . .], and Data Systems, indicated in their proposals that they did not have [. . .] for 2006. See id. at 8714 (Systems Research), 9895 ([. . .]), 10101 (Data Systems); see also id. at 10275 (describing the proposal of [. . .]). Thus, each of the four offerors provided a substitute measure by which to compare the small business participation levels it actually achieved in 2006. Data Systems and Systems Research suggested that the Army [. . .]. Id. at 8714, 10101. [. . .] suggested that the Army compare its [. . .]. Id. at 9895. And, [. . .] suggested that the Army compare its [. . .]. Id. at 10275. 
All of the offerors except Data Systems provided data in the required table by inputting their substitute goals under the column titled “Fiscal Year 2006 Small Business Subcontracting Goal.” See id. at 8714 (Systems Research), 9895 ([. . .]); see also id. at 10275 (describing the proposal of [. . .]). Data Systems left that column blank, and instead provided a narrative comparing its actual participation levels to the goals set forth in the instant solicitation. Id. at 10101. In its evaluation of the Small Business Participation factor, the Source Selection Evaluation Board addressed how well each of the four offerors performed in 2006 when compared with their proffered substitute goals. For three of the offerors–Systems Research, [. . .], and [. . .]–it duplicated the offeror’s required table in its Consensus Evaluation Documents, analyzed the contents of the table, and assigned an appropriate strength. See id. at 10180-81 (Systems Research), 10329-30 ([. . .]), 10275-76 ([. . .]). Specifically, with respect to Systems Research, the Source Selection Evaluation Board identified a [. . .] strength for the offeror’s meeting or exceeding all of the percentages listed in the solicitation except those for service- -62- disabled, veteran-owned small business and HUBZone. Id. at 10180-81. [. . .].29 Id. at 1032930. Then, with respect to [. . .], the Source Selection Evaluation Board identified a [. . .] strength for the offeror’s exceeding all of the DoD’s subcontracting goals except that for HBCU/MI.30 Id. at 10275-76. The Source Selection Evaluation Board took a different approach when evaluating Data Systems’ proposal. It did not duplicate Data Systems’ required table in its Consensus Evaluation Document, as it had done with the other three offerors without 2006 small business participation goals. See id. at 10160. Instead, it merely noted that [. . .]. Id. Further, the Source Selection Evaluation Board did not identify any related [. . .]. Id. at 10160-61. Based upon the above-referenced facts, Data Systems contends that “[t]he Army’s scoring and analysis of this requirement was disparate and unreasonable, thus resulting in significant errors . . . .” Data Systems Mot. 21. Specifically, argues Data Systems, “[. . .] was improperly used against [Data Systems] in comparison with other offerors.” Data Systems Resp. 28. Defendant counters that the Army’s “evaluation of [Data Systems]’s [. . .] was commensurate with the information provided by [Data Systems] in its proposal.” Def.’s Mot. 76. Particularly, defendant notes that “of the seven areas for overall small business participation and the small business subcategories, [Data Systems]’s 2006 data [. . .].” Id. Defendant also argues that to the extent Data Systems is attacking the terms of the solicitation, its argument is untimely. Id. at 77. The court finds that the Army treated Data Systems disparately when evaluating its actual 2006 small business participation. The Army should have acknowledged, as it did for Systems Research, [. . .], and [. . .], that Data Systems’ actual 2006 small business participation met or exceeded some of its substitute goals, i.e., the goals set forth in the instant solicitation. Because Data Systems was not required to input its substitute goals into the required table, the Army could not use the [. . .] in the table as an excuse to ignore Data Systems’ actual 2006 [. . .]. Thus, the Army’s evaluation method was flawed. 
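The harmless-error comparison in the next paragraph turns on a simple tally: how many of its substitute goals each offeror's actual 2006 participation met or exceeded. Because the actual figures are redacted, the sketch below uses hypothetical percentages and is offered only to make the counting exercise concrete.

# Hypothetical figures; the actual goals and 2006 participation levels are
# redacted in the opinion. The tally mirrors the counting of how many
# substitute goals each offeror's actual 2006 participation met or exceeded.

def goals_met(actual, substitute_goals):
    """Count the categories in which actual participation met or exceeded the goal."""
    return sum(1 for c in substitute_goals
               if actual.get(c, 0.0) >= substitute_goals[c])


substitute_goals = {  # the seven participation categories; values are hypothetical
    "Small Business": 30.0, "Small Disadvantaged Business": 5.0,
    "Woman-Owned Small Business": 5.0, "Service-Disabled Veteran-Owned": 3.0,
    "Veteran-Owned Small Business": 3.0, "HUBZone": 3.0, "HBCU/MI": 1.0,
}
actual_2006 = {
    "Small Business": 32.0, "Small Disadvantaged Business": 6.0,
    "Woman-Owned Small Business": 4.0, "Service-Disabled Veteran-Owned": 1.0,
    "Veteran-Owned Small Business": 3.5, "HUBZone": 2.0, "HBCU/MI": 1.0,
}
print(goals_met(actual_2006, substitute_goals))  # -> 4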
However, upon examining the proposals of the four offerors, the court concludes that, ultimately, the Army’s flawed evaluation amounts to harmless error. See Metcalf Constr. Co. v. United States, 53 Fed. Cl. 617, 638 (2002) (“[M]inor errors or irregularities, i.e., harmless errors, committed in the course of the procurement process are not sufficient grounds to warrant judicial intrusion to upset a procurement decision.”). Among the four offerors without 2006 [. . .], [. . .] met or exceeded [. . .], Systems Research met or exceeded [. . .], [. . .] met or exceeded [. . .], and Data Systems met or exceeded [. . .]. A comparison of the four offerors on this point reflects that only Data Systems fell below a threshold of those offerors meeting or exceeding [. . .]. Thus, it appears that the Army credited various degrees of strength for those offerors meeting or exceeding a certain number of substitute goals, and by meeting only [. . .], Data Systems fell below the Army’s threshold. The Army’s decision to not assign even a [. . .] strength for meeting or exceeding [. . .] might appear curious, especially in light of its frequent assignment of [. . .] strengths in other areas of evaluation, but it is not improper on its face. It was well within the Army’s broad discretion in evaluating proposals to assign strengths to those offerors that met or exceeded four substitute goals and no one else. See E.W. Bliss Co., 77 F.3d at 449. Accordingly, because the Army’s failure to assign Data Systems a strength for its 2006 small business participation has a rational basis in the administrative record despite the Army’s disparate evaluation process, the court concludes that the Army’s error was harmless and nonprejudicial to Data Systems.

29 The court notes that [. . .]’s actual participation level for HBCU/MI actually [. . .] its goal, in that both the goal and the participation level were [. . .]. See AR 9895, 10329.

30 The court notes that [. . .] actually [. . .] the goal for Service-Disabled Small Business and [. . .]. See AR 10275.

E. The Army’s Evaluation of the Price Factor

In addition to challenging the Army’s evaluation of the nonprice factors, L-3 Services and BearingPoint raise a variety of complaints concerning the Army’s evaluation of the Price factor. Generally, they argue that upon receiving numerous proposals that contained noticeably low proposed labor rates and total prices, the Army should have taken further action to analyze price and ensure that the technical proposals were not deficient. See L-3 Services Mot. 57-58; BearingPoint Mot. 42-43. Relevant to the court’s analysis are the final evaluated total prices of the offerors in the competitive range, which ranged from [. . .] to [. . .] (i.e., a range of [. . .]), or from [. . .] to [. . .] below the Independent Government Estimate. AR 12779. Also relevant are the labor rates proposed by the awardees, L-3 Services, and BearingPoint in their Master Labor Rate Tables, as summarized in the table that follows:31

Labor Rates

Labor Category            Systems     Wyle        Booz        L-3         Bearing
Administrative Support    [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Computer Scientist        [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Database Mgmt. Spec.      [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Functional Analyst        [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Logistic Mgmt. Spec.      [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Program Mgmt. Eng’r       [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Program Manager - Sr.     [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Program Analyst           [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Prog. Integrator/Rep.     [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Senior Analyst            [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Sr. Applications Eng’r    [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Systems Eng’r - Inter.    [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Systems Analyst           [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Technical Director        [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Training Specialist       [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Average Labor Rate        [. . .]     [. . .]     [. . .]     [. . .]     [. . .]
Weighted Average          [. . .]     [. . .]     [. . .]     [. . .]     [. . .]

Id. at 109 (labor hours), 9622-63 (L-3 Services), 9857-88 (BearingPoint), 10666-74 (Booz Allen), 10753-89 (Systems Research), 10867-76 (Wyle).

31 For the purposes of this table, some of the offerors’ names have been abbreviated as follows: Systems Research (“Systems”), Booz Allen (“Booz”), L-3 Services (“L-3”), and BearingPoint (“Bearing”). The court calculated the average labor rate for each offeror by totaling the offeror’s labor rates and dividing by fifteen (i.e., the number of labor categories in the table). The weighted average labor rate is adjusted for the number of labor hours associated with each labor category, as detailed in Part I.A.5, supra.

1. The Army Was Not Required to Take Additional Action to Mitigate the Potential for Buying-in

Beginning with the broadest argument, the court addresses BearingPoint’s contention that the Army, upon receiving proposals with total prices ranging from [. . .] to [. . .], failed to mitigate the possibility that one or more of the offerors was attempting to buy into the contracts. BearingPoint Mot. 38-41. “‘Buying-in’ . . . means submitting an offer below anticipated costs, expecting to–(1) Increase the contract amount after award (e.g., through unnecessary or excessively priced change orders); or (2) Receive follow-on contracts at artificially high prices to recover losses incurred on the buy-in contract.” FAR § 3.501-1. Attempting to buy into a contract is not illegal. See, e.g., Little Susitna, Inc., B-244228, 91-2 C.P.D. ¶ 6 (July 1, 1991) (“An offeror, for various reasons, in its business judgment, may decide to submit a below-cost offer, and such an offer is not inherently invalid. Whether an awardee can perform the contract at the price offered is a matter of responsibility which, as discussed above, we will not review absent a showing of possible fraud or bad faith or that definitive responsibility criteria have not been met.” (citation omitted)). However, contracting agencies are encouraged to “minimize the opportunity for buying-in by seeking a price commitment covering as much of the entire program concerned as is practical,” FAR § 3.501-2(b), and contracting officers “must take appropriate action to ensure buying-in losses are not recovered by the contractor,” id. § 3.501-2(a).

Specifically, BearingPoint argues:

When the Army reviewed [Booz Allen, Systems Research,] and Wyle’s offered labor rates, the Army should have considered whether these offerors might be “buying-in” for the purpose of winning contract awards, with the likely intent to recover losses from these low labor rates by increasing the number of hours proposed to perform the work required in future task orders. Competition for these task orders will not preclude such recovery, because all three awardees appear to have used similar strategies in their proposals and thus all three could likely submit inflated hours estimates for task order work. Thus, . . .
the Army was required to consider whether it would actually pay more for the work to be performed in future task orders because all three large-business offerors will likely need significantly more hours to perform the required tasks than offerors[,] such as BearingPoint[,] who used more realistic labor rates in their proposals.

BearingPoint Mot. 38-39. Moreover, BearingPoint contends that “[t]he temptation to ‘buy-in’ by submitting unrealistically low labor rates was particularly great in this procurement because the offeror proposing the lowest labor rates would obtain the highest score for Price.” Id. at 40. The only party responding to this argument, Booz Allen, rejects the contention that the awardees bought into their contracts, arguing that “the prospect of subsequent task order competition for which the contractors would propose differing skill mixes, labor hours and technical approaches” sufficiently mitigates any buying-in concerns. Booz Allen Mot. 32-34. Moreover, argues Booz Allen, [. . .]. Id. at 33.

There is no indication in the administrative record that upon receiving the initial proposals, the Army became concerned that one or more of the offerors was attempting to buy into the contract.32 Certainly, the Army sought to commit the offerors to specified labor rates, which, given the nature of the procurement, was the maximum commitment it could have received. On the other hand, because the contemplated contracts merely served to limit the number of offerors eligible to bid on subsequently issued Firm Fixed Price and Time and Materials task orders, the protection afforded the Army by guaranteed labor rates is questionable. BearingPoint’s assertion that lower fixed labor rates will not ensure lower-priced task orders is well-taken; because all of the offerors proposed similarly low labor rates, they all may need more labor hours to complete the work described in the task order than the number of hours estimated by the Army. Ultimately, however, the FAR does not prevent offerors from buying into a contract. The Army’s only obligations are forward-looking; it must ensure that the contract awardees do not attempt to improperly recover any losses. Accordingly, the Army was not required to take additional action to mitigate the potential for buying-in.

32 At oral argument, counsel for Booz Allen implied that the Army had no reason to question the offerors’ proposed total prices because it was an expert in determining its own needs and how much it would have to pay to satisfy those needs. See Tr. 126 (“As a result, really no one understands its internal support needs and its operational capability needs better than the Army, itself. And, here, it knew especially well what it was buying. . . . [T]he Army was very capable here[,] as the government buying support for its own internal operations[,] to understand which proposals evaluated in their complete sense provided the best value.”). However, counsel for BearingPoint provided the following apt rebuttal:

Booz Allen basically states during argument that no one understands the government requirements like the government does, yet it’s the government that developed the independent government cost estimate; it’s the government that utilizes the [rates] that they utilize to come up with that estimate. When they start to receive rates . . . significantly different, . . . [t]he fact that there’s no question as to why . . . is a potential problem.

Id. at 177-78.

2. BearingPoint’s Assertion That the Army Was Required to Take Additional Action to Assess Price Reasonableness Is Untimely

Another argument advanced by BearingPoint is that the Army, upon receiving proposals containing proposed labor rates that were “so much lower than those currently being charged for the same work,” was required to assess the reasonableness of the offerors’ prices. BearingPoint Mot. 44-45. A contracting agency must analyze the proposals it receives in response to a solicitation “to ensure that the final agreed-to price is fair and reasonable.” FAR § 15.404-1(a); accord id. § 15.402(a) (requiring agencies to “[p]urchase . . . services from responsible sources at fair and reasonable prices”). Typically, price reasonableness is established by competition, id. § 15.305(a)(1), which occurs in a best value procurement when “[t]wo or more responsible offerors, competing independently, submit priced offers that satisfy the Government’s expressed requirements” and the successful awardee’s price is not unreasonable, id. § 15.403-1(c)(1). However, as a general rule, “[t]he complexity and circumstances of each acquisition should determine the level of detail of the analysis required.” Id. § 15.404-1(a)(1); see also id. § 15.404-1(a)(2) (requiring only a “price analysis,” as described in FAR section 15.404-1(b), when the solicitation does not require offerors to submit cost or pricing data).

Specifically, BearingPoint contends that with such low proposed labor rates, the Army risked not receiving task order proposals or receiving task order proposals that “include[d] far more hours than are currently being charged to perform the work.” BearingPoint Mot. 44. BearingPoint specifically focuses on the Time and Materials task orders contemplated by the solicitation, explaining:

When the awardees submit proposals on individual [Time and Materials] task orders, they have absolutely no risk, because if the work is not completed when the task order awardee has expended the proposed ceiling hours/price, the contractor has no obligation to continue performance. The Army is then left with the choice of either increasing the ceiling price of the task order, or not having the work completed.

Id. Thus, according to BearingPoint, “the Army was obligated to conduct price analysis of the offerors’ proposed labor rates to determine whether they were reasonable” and “whether the offered prices [were] realistic.” Id. at 44-45. As noted by BearingPoint, the FAR describes the two favored price analysis techniques: (1) “[c]omparison of proposed prices in response to the solicitation” and (2) “[c]omparison of previously proposed prices and previous Government and commercial contract prices with current proposed prices for the same or similar items . . . .” FAR § 15.404-1(b)(2), cited in BearingPoint Mot. 45. Because of the task order structure of the procurement, asserts BearingPoint, only the second technique is appropriate in this case. BearingPoint Mot. 45. Accordingly, BearingPoint contends that “the Army should have compared the offerors’ proposed prices with the prices the Government paid for work under previous contracts.” Id. Defendant offers no direct response to BearingPoint’s arguments on this issue.
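The second technique BearingPoint invokes, comparison with prices the Government previously paid, reduces to a rate-by-rate check. The sketch below is illustrative only: the rates and the 20% flag threshold are hypothetical, and nothing in the record suggests that the FAR or the solicitation prescribed this particular computation.

# Illustrative sketch of the second price analysis technique in FAR
# § 15.404-1(b)(2): comparing currently proposed labor rates with rates the
# Government previously paid for the same or similar work. All rates and the
# 20% flag threshold are hypothetical.

def flag_low_rates(proposed, historical, threshold=0.20):
    """Return categories whose proposed rate falls below the historical rate
    by more than the threshold, with the percentage shortfall."""
    flags = {}
    for category, hist_rate in historical.items():
        prop_rate = proposed.get(category)
        if prop_rate is None or hist_rate <= 0:
            continue
        shortfall = (hist_rate - prop_rate) / hist_rate
        if shortfall > threshold:
            flags[category] = round(shortfall * 100, 1)
    return flags


historical = {"Senior Analyst": 120.0, "Training Specialist": 95.0}
proposed = {"Senior Analyst": 90.0, "Training Specialist": 93.0}
print(flag_low_rates(proposed, historical))  # -> {'Senior Analyst': 25.0}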
However, in addressing Femme Comp’s arguments, defendant avers that because “the solicitation was for a firm-fixed price,”33 the Army’s reliance on competition to establish price reasonableness satisfied the price analysis requirement and foreclosed the need to perform a cost analysis. Def.’s Mot. 34; see also Booz Allen Mot. 26-29 (asserting that adequate price competition satisfied the price reasonableness requirement). Specifically, as noted by defendant, “when contracting on a firm-fixed-price or fixed-price with economic price adjustment basis, comparison of the proposed prices will usually satisfy the requirement to perform a price analysis, and a cost analysis need not be performed.” FAR § 15.305(a)(1), cited in Def.’s Mot. 34. However, “[w]hen contracting on a cost-reimbursement basis, evaluations [should] include a cost realism analysis to determine what the Government should realistically expect to pay for the proposed effort, the offeror’s understanding of the work, and the offeror’s ability to perform the contract.” FAR § 15.305(a)(1). Moreover, adds Booz Allen, “[w]here the award of a fixed-rate contract is contemplated, the realism of offerors[’] proposed labor rates is not ordinarily considered since a fixed-rate contract . . . places the risk and responsibility of contract price and resulting profit or loss on the contractor.” Booz Allen Mot. 31 (citing WinStar Fed. Servs., B-284617, B-284617.2, B-284617.3, 2000 C.P.D. ¶ 92 (May 17, 2000)34).

33 During oral argument, defendant acknowledged that this averment was in error and that the solicitation did not contemplate the award of firm-fixed-price contracts. Tr. 114.

34 WinStar Federal Services concerned a fixed-price contract and nowhere in the decision does the Comptroller General discuss fixed-rate contracts. However, the cited proposition can instead be found in another case cited by Booz Allen, PharmChem, Inc., B-291725.3, B-291725.4, B-291725.5, 2003 C.P.D. ¶ 148 (July 22, 2003).

As indicated in the solicitation, the Army intended to award multiple IDIQ contracts. AR 359, 378. An IDIQ contract is “used to acquire supplies and/or services when the exact times and/or exact quantities of future deliveries are not known at the time of contract award,” FAR § 16.501-2(a), and provides for both a maximum quantity of supplies and services and a fixed period of time for performance, id. § 16.504(a). As confirmed by defendant at oral argument, the contemplated contracts were neither firm-fixed-price contracts nor cost-reimbursement contracts. See Tr. 114-15; see also FAR § 16.202-1 (defining a firm-fixed-price contract as one that “provides for a price that is not subject to any adjustment on the basis of the contractor’s cost experience in performing the contract”); id. § 16.301-1 (defining a cost-reimbursement contract as one that “provide[s] for the payment of allowable incurred costs”). Instead, defendant characterized the contracts as “IDIQ contract[s] with fixed labor rates.” Tr. 114. Although this type of contract is not expressly defined in the FAR, defendant indicated that it was similar to a time and materials contract. Id. at 115. The only types of contracts for which the FAR provides specific guidance in evaluating price reasonableness are firm-fixed-price contracts and cost-reimbursement contracts. See FAR § 15.305(a)(1). Because similar specific guidance is lacking for IDIQ contracts and time and materials contracts, the court must look to the FAR’s more general requirements.

As previously noted, although price reasonableness is typically established by competition, id., contracting agencies should consider “[t]he complexity and circumstances” of the procurement when determining the scope and extent of the reasonableness analysis, id. § 15.404-1(a)(1). Here, the Army implied that price reasonableness would be established through competition by incorporating both FAR section 52.216-29, Time-and-Materials/Labor-Hour Proposal Requirements—Non-commercial Item Acquisition With Adequate Price Competition (Feb. 2007), and Defense Federal Acquisition Regulation Supplement section 252.216-7002, Alternate A, Time-and-Materials/Labor-Hour Proposal Requirements–Non-commercial Item Acquisition With Adequate Price Competition, by reference into the solicitation. AR 98, 100. Thus, as reflected by the absence of additional price analysis guidance in the solicitation, the Army must have determined that a more extensive price reasonableness analysis was unnecessary. Challenges to the extent and scope of the Army’s price analysis, such as BearingPoint’s challenges, are challenges to the terms of the solicitation, and should have been dealt with “prior to the close of the bidding process . . . .” Blue & Gold Fleet, L.P., 492 F.3d at 1313. Thus, regardless of the merit of BearingPoint’s objections, it would be improper for the court to revisit the Army’s price analysis method at this stage of the proceedings.

3. The Army Was Not Required to Perform Crosswalks Between the Offerors’ Proposed Labor Rates and Their Technical Volumes

Next, according to L-3 Services and BearingPoint, “the prices and rates proposed by the awardees should have put the Army on notice that a comparison” of the proposed labor rates and the awardees’ Corporate Capability and Technical/Management volumes was necessary. L-3 Services Mot. 57-58; accord BearingPoint Mot. 42. Specifically, they assert that the Army should have performed a crosswalk between each offeror’s technical and price proposals to ensure that the offerors could recruit and retain qualified personnel. L-3 Services Mot. 53-54; BearingPoint Mot. 43; see also L-3 Services Mot. 57 (indicating that the requirement to perform a crosswalk is implicit in the Army’s stated intention to evaluate the offerors’ “‘qualified resources currently available’” (quoting AR 380)); BearingPoint Mot. 43 (same). Absent a crosswalk between each offeror’s technical and price proposals, they argue, the Army could not have performed a meaningful evaluation of the merits of the offerors’ ability to recruit and retain qualified personnel. L-3 Services Mot. 57; BearingPoint Mot. 43. Further, according to L-3 Services, had the Army performed the required crosswalks, it would have been alerted to the [. . .] proposed by the eventual awardees and the fact “that the Offerors both did not share a common understanding of the requirements and were not offering personnel with comparable qualifications.” L-3 Services Mot. 55. Specifically, L-3 Services asserts that the awardees’ prices reflected “(a) labor categories that are less-skilled/less-qualified than required by the [solicitation] and/or (b) compensation that is insufficient to permit the recruitment and retention of personnel with the required qualifications.” Id. at 59. In contrast, argue L-3 Services and BearingPoint, their [. . .]
gave them “an enhanced ability to recruit and retain the qualified personnel required to perform the contract work, and an ability to recruit and retain personnel with better skills and qualifications than those offered by the awardees.” Id. at 54; accord id. at 55, 59; BearingPoint Mot. 43-44. L-3 Services concludes that “because the Army’s failure to conduct basic crosswalk analyses among the [solicitation] and the Offerors’ technical and price proposals precluded rational apples-to-apples price comparisons,” the Source Selection Authority’s “express reliance on the apparent price disparities” was improper. L-3 Services Mot. 58. In response to L-3 Services and BearingPoint, both defendant and Booz Allen reiterate their previous crosswalk-related argument, denying that a crosswalk requirement can be found in the solicitation. Def.’s Mot. 78; Booz Allen Mot. 24, 29; cf. Booz Allen Mot. 25 (noting that “plaintiffs are advocating an evaluation methodology that is not reflected in, or relied upon, in their own proposals”). Indeed, argues defendant, the Army was under no obligation “to use any particular evaluation methodology since none was specified in the evaluation scheme.” Def.’s Mot. 78. Further, defendant and Booz Allen contend that the Army specifically rejected use of a crosswalk analysis by requiring offerors to certify that their labor rates were at or below their commercial list price, GSA Schedule price, or the Schedule B price in a competitively awarded -70- Federal contract.35 Id. at 78-79; Booz Allen Mot. 24. Anticipating this argument, L-3 Services contends that the blanket certifications “‘are generally insufficient where the agency has reason to question the characteristics of the products proposed.’” L-3 Services Mot. 57 (quoting Telemetrics, Inc., 92-2 C.P.D. ¶ 168). However, defendant further argues that the crosswalk argument advanced by L-3 Services and BearingPoint is best characterized as “an argument that the Army should have conducted a price realism analysis of proposed labor rates,” a requirement that cannot be found in the solicitation. Def.’s Mot. 80. As such, argues defendant, to the extent plaintiffs are attacking the terms of the solicitation, their argument is untimely. Id.; accord Booz Allen Mot. 22-24. As it did with the prior crosswalk argument advanced by L-3 Services, the court concludes that the Army was not required to perform crosswalks between the offerors’ proposed labor rates and their technical proposals. Neither the solicitation nor the Source Selection Plan required the Army to conduct the described crosswalks. More specifically, when describing the criteria to be evaluated under the relevant subfactors of the Corporate Capability and Technical/ Management factors, the solicitation did not indicate that the Army would evaluate the offerors’ proposed labor rates to determine the offerors’ available qualified resources or ability to recruit and retain qualified personnel. Moreover, even if the offerors had not been required to certify their proposed labor rates, the Army still would not have been obligated to perform the crosswalks. In sum, the Army did not act arbitrarily, capriciously, or unlawfully by not conducting crosswalks between the proposed labor rates and the offerors’ technical proposals. 4. BearingPoint’s Contention That the Army’s Use of the Highest Proposed Labor Rates to Calculate the Offerors’ Total Prices Is Untimely Expanding upon its argument that its [. . .] 
increased the likelihood that it would have more highly qualified personnel than the [. . .] awardees, BearingPoint Mot. 43-44, BearingPoint contends that the Army’s “simplistic” method of evaluating the proposed total prices “did not allow the Army to properly determine whether offerors were proposing a range of personnel who would be capable of performing the work,” id. at 46. Specifically, BearingPoint objects to the Army’s use of the highest proposed labor rate from all of the labor rates proposed by an offeror and its subcontractors in the Price Model as potentially unrepresentative of the offeror’s staffing plan. Id. The Army clearly indicated in the solicitation that it would calculate the offerors’ 35 Similarly, defendant argues that “because the solicitation provided that the Army’s evaluation would be limited to the proposal volume and section identified, it would have been improper for the evaluation team to consider information from the Price proposal in the evaluation of the Technical/Management factor.” Def.’s Reply 20 (citation omitted). By this logic, the Price evaluation team violated the terms of the solicitation when it “[v]erified that all proposed subcontractors and corresponding rates were provided by cross-referencing the subcontractors identified in the corporate capability proposal with the subcontractors identified in the Master Labor Rate Table.” See AR 5041 (emphasis added). Thus, defendant’s argument is inconsistent with the Army’s actions. -71- proposed total prices using the highest proposed labor rate for a particular labor category. BearingPoint’s argument is essentially an attack on the terms of the solicitation; it is, therefore, untimely. See Blue & Gold Fleet, L.P., 492 F.3d at 1313. F. The Army’s Best Value Tradeoffs Finally, L-3 Services, Data Systems, and BearingPoint all contend that the Army’s best value tradeoffs were flawed. According to the FAR, the competitive acquisition process is “designed to foster an impartial and comprehensive evaluation of offerors’ proposals, leading to selection of the proposal representing the best value to the Government . . . .” FAR § 15.002(b). “‘Best value’ means the expected outcome of an acquisition that, in the Government’s estimation, provides the greatest overall benefit in response to the requirement.” Id. § 2.101. A best value “tradeoff process is appropriate when it may be in the best interest of the Government to consider award to other than the lowest priced offeror or other than the highest technically rated offeror.” Id. § 15.101-1(a). As previously noted, contracting agencies must evaluate proposals “based solely on the factors specified in the solicitation,” 10 U.S.C. § 2305(b)(1), treating “all offerors equally,” Banknote Corp. of Am., 56 Fed. Cl. at 383. Similarly, an agency’s final award decision must “be based on a comparative assessment of proposals against all source selection criteria in the solicitation.” FAR § 15.308. “‘To determine whether and to what extent meaningful differences exist between proposals,’” agencies should consider both adjectival ratings and “‘information on advantages and disadvantages of [the] proposals . . . .’” Metcalf Constr. Co., 53 Fed. Cl. at 64041 (quoting Oceaneering Int’l, Inc., B-287325, 2001 C.P.D. ¶ 95 (June 5, 2001)). Looking beyond the adjectival ratings is necessary because “‘[p]roposals with the same adjectival rating are not necessarily of equal quality . . . .’” Id. at 641 (quoting Oceaneering Int’l, Inc., 2001 C.P.D. ¶ 95). 
An agency must document its final award decision, and its documentation must “include the rationale for any business judgments and tradeoffs made or relied on by the [source selection authority], including benefits associated with additional costs.” FAR § 15.308. “Conclusory statements, devoid of any substantive content, . . . fall short of” this documentation requirement. Serco Inc. v. United States, 81 Fed. Cl. 463, 497 (2008). Further, the “documentation need not quantify the tradeoffs that led to the decision.” FAR § 15.308. However, “the magnitude of the price differential between two offerors” is relevant because “as that magnitude increases, the relative benefits yielded by the higher-priced offer must also increase.” Serco Inc., 81 Fed. Cl. at 497. Ultimately, “[p]rocurement officials have substantial discretion to determine which proposal represents the best value for the government.” E.W. Bliss Co., 77 F.3d at 449. L-3 Services, Data Systems, and BearingPoint generally contend that the Army’s best value tradeoffs were inconsistent with the solicitation’s factor weighting scheme, essentially converting the best value procurement into a technically acceptable, low-price procurement. L-3 Services Mot. 16, 20-23, 26 n.9; Data Systems Mot. 12-13, 16; BearingPoint Mot. 9-12, 18-19, -72- 21. As evidence of the Army’s purported abandonment of the best value analysis, plaintiffs emphasize three details from the administrative record. First, L-3 Services and BearingPoint note that even though the nonprice factors were significantly more important than Price, “all of the awards were made to the Offerors with the lowest prices.” L-3 Services Mot. 20; accord BearingPoint Mot. 9, 18; see also L-3 Services Mot. 20 (noting that only one of the offerors who received an Outstanding (Blue) rating on the most important Corporate Capability factor was selected for award). Second, L-3 Services and BearingPoint assert that the language used by the Source Selection Authority in making the tradeoffs–such as (1) “‘provide[s] value,’” L-3 Services Mot. 22 (quoting AR 10475) (emphasis omitted); (2) “offers ‘an acceptable solution,’” id. at 22 (quoting AR 10536) (emphasis omitted); accord BearingPoint Mot. 19; and (3) “‘proposals were good and low risk, or better,’” BearingPoint Mot. 19 (quoting AR 10475)– reveals her emphasis on technical acceptability over best value. See also L-3 Services Mot. 26 n.9 (citing instances when the Source Selection Authority found an inferior proposal to be “sound”). Third, BearingPoint contends that the Army’s failure to identify any weaknesses in the final proposals of those offerors in the competitive range “suggests that the evaluators were merely going through the motions” because the Source Selection Authority had “signaled that price would be the most important factor in [the] procurement.” BearingPoint Resp. 12. Beyond those basic observations, both L-3 Services and BearingPoint assert that when an “agency chooses the technically inferior, lower-priced proposal,” it must provide “particularly strong” justification for doing so. Id. at 17-18 (citing EPSCO, Inc., B-183816, 75-2 C.P.D. ¶ 338 (Nov. 21, 1975)); accord BearingPoint Mot. 21-22 (same); BearingPoint Resp. 8-9 (citing DLI Eng’g Corp., B-218335, 85-2 C.P.D. ¶ 468 (Oct. 28, 1985); Blue Rock Structures, Inc., B293134, 2004 C.P.D. ¶ 63 (Feb. 6, 2004)). They imply that the Army failed to supply such justification in this case. See L-3 Services Mot. 18; BearingPoint Mot. 22; see also L-3 Services Resp. 
4 (“In determining why these strengths were not worth, for example, the [. . .] price premium between Wyle and L-3, the [Source Selection Authority] should have addressed them individually.”); cf. BearingPoint Resp. 9-10 (explicitly arguing that the Army did not provide the required strong justification). Moreover, L-3 Services argues that the Source Selection Authority’s comparison of awardees with the unsuccessful offerors “obscures whatever analysis was done to actually choose the Awardees” in the first place. L-3 Services Mot. 22.; cf. BearingPoint Mot. 18 (noting that the Source Selection Authority failed to pick “the three best proposals from a Non-Price factor standpoint and compar[e] the other proposals to them, which is what one would expect in a procurement where the Non-Price evaluation factors are significantly more important than Price”). Defendant responds to these general arguments about the Army’s best value tradeoffs with several general rebuttals of its own. First, argues defendant, “[a]lthough the non-price factors [were] ‘significantly more important tha[n] the Price factor’ collectively, that is not equivalent to Price being ‘relatively insignificant’ and ‘minor.’” Def.’s Mot. 54; accord Booz Allen Mot. 44-45. Next, defendant notes that although each of the complaining plaintiffs received some ratings that were better than those received by the awardees on the same factors, -73- “[. . .].”36 Def.’s Mot. 49. Then, noting that “[. . .]–ranging from [. . .] to [. . .],” id., defendant contends that the Source Selection Authority “thoroughly document[ed]” her “painstaking price trade-off analysis” in the Source Selection Decision Document, reaching a conclusion after engaging in “a factor-by-factor and subfactor-by-subfactor comparison of the strengths of each awardee and the unsuccessful offerors,” identifying discriminators, and assessing “comparative superiority.” Id. at 49-50; accord Booz Allen Mot. 43. Indeed, defendant emphasizes that the “sequence of the [Source Selection Authority]’s decision reveals that she did not pre-select the least expensive offerors in making her best value determinations.” Def.’s Reply 4. Specifically: The [Source Selection Authority] began by comparing [Systems Research] against all other offerors, even though it was not the least expensive, but because it had the highest ratings and an attractive price. She concluded that [Systems Research] offered the best value. She then turned to [Booz Allen], because it had relatively strong ratings and the lowest price, and determined that it represented best value. Next, she compared L-3 to Wyle to see which of the two was the better value, despite the fact that Savantage and Binary Group were less expensive. Then, she compared Wyle to the remaining offerors (other than [Systems Research], [Booz Allen], and L-3) and determined that it was the best value. Finally, she compared the small business offerors and selected Savantage and Binary Group as the best value. Id. (citations omitted); accord Tr. 93-94. Fourth, defendant avers that “plaintiffs’ arguments are predicated upon the false assumption that the Army’s adjectival rankings can be compared on a quantitative basis” and the implication that “the award decision was based on a comparison of color ratings and the number of strengths that an offeror earned.” Def.’s Mot. 50-51; accord Booz Allen Mot. 48. 
Instead, defendant asserts, the Source Selection Authority “delved deeply into the relative technical merit of each proposal and considered the nature of each proposal’s strengths and weaknesses.” Def.’s Mot. 51; accord Booz Allen Mot. 51. Moreover, notes defendant, the Army’s failure to identify any weaknesses in the final proposals meant nothing more than the fact that the offerors had cured all of the issues with their proposals that the Army identified during discussions. Def.’s Reply 5-6; accord Booz Allen’s Reply Supp. Its Cross-Mot. J. R. 3. Next, defendant contends that “accepting the plaintiffs’ arguments would require the Government to reject the lowestpriced offers, despite the fact that the proposals all received [. . .] for each of the non-price factors.” Def.’s Mot. 51. Finally, defendant distinguishes EPSCO, Inc. from the instant case because the protesters in EPSCO, Inc. were “rated two levels higher” than the awardee. Id. at 5354. 36 In support of this argument, defendant asserts that “none of the awardees received a rating lower than [. . .] on any non-price technical factor.” Def.’s Mot. 49. While technically true, the court notes that two awardees, Booz Allen and Binary, received [. . .] ratings on the second Corporate Capability subfactor. AR 10448. -74- In addition to these general arguments, each of the three plaintiffs challenging the Army’s best value tradeoffs advances arguments that apply to the specific best value tradeoffs that the Army performed for their respective proposals. Thus, before addressing the parties’ general arguments, it is necessary for the court to describe the best value tradeoffs made by the Army, the specific objections raised by the plaintiffs, and defendant’s responses to those objections. 1. Tradeoffs Concerning L-3 Services L-3 Services contends that the best value tradeoffs conducted by the Source Selection Authority between it and Wyle, and between it and Booz Allen, were contrary to the weighting scheme set forth in the solicitation and reflected disparate treatment of L-3 Services. L-3 Services Mot. 23. The Source Selection Evaluation Board determined the strengths of all three proposals and assigned adjectival/color ratings, with which the Source Selection Authority concurred, AR 10474, as noted in the following table:37 Rating and Strengths Factors and Subfactors L-3 Services Wyle Booz Allen 1: Corporate Capability [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 1.1: Knowledge and Experience [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 1.2: Performance Capability [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 2: Performance Risk [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 3: Technical/ Management [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 3.1: Understanding Support [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 3.2: Program Management Plan [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 5: Small Business Participation [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 37 For the purposes of this table, and two similar tables infra, the court uses the following abbreviations: significant strength (“S”); moderate strength (“M”); and minor strength (“X”). -75- Id. at 10448, 10452, 10456, 10459. In addition, the Source Selection Evaluation Board determined the offerors’ evaluated prices, as provided in the following table: Price Premium of L-3 Services Offeror Price Amount Percentage Booz Allen [. . .] [. . .] [. . .] Systems Research [. . .] [. . .] [. . .] Wyle [. . .] [. . .] [. . .] L-3 Services [. . .] [. . .] [. . .] 
Id. at 10356, 12779. Exercising her “independent judgment” about the identified strengths and assigned ratings, id. at 10475, the Source Selection Authority made the following comparison between Wyle and L-3 Services:38 [. . .]. [. . .],39 [. . .].40 Id. at 10550-51 (footnotes added). Then, comparing Booz Allen with L-3 Services, the Source Selection Authority stated: [. . .]. Id. at 10536-37. a. L-3 Services’ Arguments In light of the above comparisons and tradeoffs, L-3 Services advances a few, general contentions that apply to all of the best value tradeoffs made by the Army between L-3 Services 38 In her comparisons, the Source Selection Authority abbreviated factor as “F” and subfactor as “S.” For readability purposes, the court has not altered these abbreviations. 39 “[. . .]” are the sections of the Source Selection Decision Document that describe the strengths of each of the offerors in each of the five factors. See AR 10477-526. 40 In the last sentence of this quotation, the Source Selection Authority uses the unique two-character code for another unsuccessful offeror, [. . .], instead of the code for L-3 Services, “[. . .].” The court treats the substitution as a typographical error. -76- and the awardees. First, L-3 Services argues that the Army consistently downplayed its technical superiority while emphasizing the lesser technical merits of Wyle and Booz Allen. L-3 Services Mot. 18-20. In so doing, it argues, the Army applied a more rigorous standard to its proposal than to the proposals of the awardees. Id. at 19. L-3 Services also argues that the Source Selection Authority irrationally performed detailed comparisons of the proposals when it would benefit an awardee but not when it would benefit an unsuccessful offeror. Id. at 43-44. For example, L-3 Services notes that the Source Selection Authority, in comparing the proposals of L-3 Services and Booz Allen under the Corporate Capability factor, separately considered each subfactor so as to highlight the positive aspects of Booz Allen’s lower-rated proposal: “‘[. . .].’” Id. at 43 (quoting AR 10537). It then notes that when the tables are turned and L-3 Services was the offeror found to be superior on one subfactor but not on the other subfactor (or the factor overall), the Source Selection Authority did not highlight the positive aspects of its proposal. Id. at 44 (citing AR 10551 (“[. . .].”)). Finally, in a related argument, L-3 Services asserts that “where analysis of the proposals and of the detailed evaluation finding would have led the [Source Selection Authority] to recognize an L-3 advantage, the [Source Selection Authority] simply relied on the ratings themselves in making her decision,” abandoning any “reasoned analysis” or “citation of strengths benefit[t]ing the government.” Id. at 45-46; see also id. at 30 (labeling the one-sentence comparison between L-3 Services and Wyle on the Performance Risk factor as “conclusory”). As examples of this practice, L-3 Services cites the following onesentence conclusions: (1) “‘[. . .],’” id. at 46 (quoting AR 10550); (2) “‘[. . .],’” id. (quoting AR 10528); (3) “‘[. . .],’” id. (quoting AR 10528); (4) “‘[. . .],’” id. (quoting AR 10528); and (5) “‘[. . .],’” id. (quoting AR 10528). Then, applying its general contentions to the comparison made by the Source Selection Authority between it and Wyle, L-3 Services asserts that its “proposal was superior to Wyle’s under any measure.” Id. at 23. It explains: [. . .]. Id. at 23-24 (citations and emphasis omitted). 
L-3 Services then raises specific issues under three of the factors. Under the Corporate Capability factor, L-3 Services contends that while the Source Selection Authority began its comparison by acknowledging its “obvious superiority” over Wyle, she then “denigrat[ed] that superiority at length.” Id. at 26. With respect to the Knowledge and Experience subfactor, L-3 Services asserts that instead of assessing the extent of its superiority over Wyle, the Source Selection Authority “chose to emphasize the technical acceptability of [Wyle’s] lower-priced, lower-rated proposal: ‘[. . .].’” Id. (quoting AR 10550). L-3 Services explains that it had received significant strengths in those same three areas and was even found to be “‘[. . .]’ to Wyle in the DITSCAP/DIACAP area.” Id. (quoting AR 10550). Moreover, according to L-3 Services, in making her comparison, the Source Selection Authority ignored many of the strengths she had already acknowledged earlier in the Source Selection Decision Document. Id. at 26-27. In sum, L-3 Services argues that in finding “[. . .],’” the Source -77- Selection Authority denied it “the huge advantage accruing to” it based upon its greater number of significant strengths ([. . .]) and overall strengths ([. . .]). Id. at 27 (quoting AR 10550). L-3 Services makes a similar argument with respect to the Performance Capability subfactor, noting the “wide and substantive disparity in identified significant strengths,” the similarity between the identified strengths, and the nonconsideration of a previously identified significant strength. Id. at 28. Again, it argues that “[. . .].” Id. at 29 (quoting AR 10550). Under the Technical/Management factor, L-3 Services asserts that the Source Selection Authority made “no effort to compare the advantages to the Army of L-3’s advantage under Subfactor 1 with Wyle’s supposed advantage under Subfactor 2,” even though the two subfactors were equal in weight. Id. at 31 (quoting AR 10551 (“[. . .].”)). But see AR 10551 (“[. . .].”). Moreover, L-3 Services contends that Wyle’s “[. . .]” does not actually exist, arguing that (1) the Source Selection Authority ignored a strength concerning disincentive percentages that she had previously identified in the Source Selection Decision Document that aligned with one of Wyle’s strengths,41 and (2) the Army did not properly evaluate its proposal with respect to the on-site decision-making authority of its program managers, the other strength attributed to Wyle. L-3 Services Mot. 31. Finally, under the Small Business Participation factor, L-3 Services argues that the Source Selection Authority’s conclusion that L-3 Services was [. . .] to Wyle is based on Wyle’s improperly high adjectival/color rating. Id. at 32. L-3 Services emphasizes that while Wyle [. . .] all of the small business participation goals, L-3 Services [. . .] all of the small business participation goals. Id. Thus, in sum, L-3 Services argues that the Source Selection Authority’s tradeoff denigrated its proposal and “significantly inflate[d] the importance of Price.” Id. at 33. L-3 Services next applies its general contentions to the comparison made by the Source Selection Authority between it and Booz Allen, asserting that its proposal was clearly superior to Booz Allen’s proposal. Id. at 35. Specifically, it argues that the Source Selection Authority “improperly awarded a contract to [Booz Allen], and not to L-3, even though [Booz Allen] was [. . .].” Id. at 34. 
Indeed, notes L-3 Services, Booz Allen received an award even though it received a [. . .] rating under the second subfactor of the most important factor–Corporate Capability. Id. at 35. Moreover, L-3 Services stresses the difference in the number of assigned significant strengths ([. . .] for L-3 Services versus [. . .] for Booz Allen). Id. L-3 Services then raises specific issues under all four nonprice factors. 41 For L-3 Services, the Source Selection Authority found: “[. . .].” AR 10213. For Wyle, the Source Selection Authority found: [. . .]. Id. at 10516. -78- Under the Corporate Capability factor, L-3 Services contends that while the Source Selection Authority began its comparison by acknowledging its “obvious advantage” over Booz Allen, she then “immediately turned to minimizing that advantage.” Id. at 36. First, with respect to the Knowledge and Experience subfactor, L-3 Services argues that the Source Selection Authority’s one-sentence conclusion that “‘[. . .]’” improperly lacked an explanation “why the two proposals were equal, notwithstanding L-3’s apparent advantage” based on the greater number of strengths assigned to its proposal. Id. (quoting AR 10536). Second, with respect to the Performance Capability subfactor, L-3 Services contends that despite its [. . .] rating (versus Booz Allen’s [. . .] rating) and [. . .] significant strengths (versus Booz Allen’s [. . .] significant strengths), the Source Selection Authority “attempted to minimize” its superiority. Id. at 37. According to L-3 Services, given its technical superiority under this subfactor, the Source Selection Authority should have explained the “significant benefits to the Army of L-3’s [. . .] additional significant strengths.” Id. Instead, it asserts, the Source Selection Authority “denigrated the extent of L-3’s superiority” by noting that “‘[. . .].’” Id. (quoting AR 10536). L3 Services characterizes the remaining analysis of the Source Selection Authority as a justification based on “low-price, technically acceptable grounds,” especially given the Source Selection Authority’s emphasis that Booz Allen had “‘[. . .]’” and “‘[. . .].’” Id. (quoting AR 10536); see also id. at 38 (arguing that the Source Selection Authority’s use of an “unlimited superiority” standard is “affirmatively prejudicial”). Under the Performance Risk factor, L-3 Services asserts that in finding that the two proposals were essentially equal in value, the Source Selection Authority ignored “negative past performance by [Booz Allen]’s major subcontractor.” Id. at 39. As L-3 Services notes, the Source Selection Authority found that both Booz Allen and L-3 Services “‘[. . .].’” Id. (quoting AR 10536-37). This finding, complains L-3 Services, fails to account for the Source Selection Evaluation Board’s finding that Booz Allen’s “‘Major Subcontractor, [. . .], received [. . .] on its GS-35F-5833H/ GST0704BG0916 (Army ILAP support) contract in the area of [. . .] and [. . .] and relate to [. . .].’” Id. at 39 (quoting AR 10344); see also AR 10508-09 (containing the Source Selection Authority’s analysis of Booz Allen’s Performance Risk volume, which lacks any concern about the negative evaluations the Army received for [. . .]). It bears noting, however, that the Source Selection Evaluation Board followed the sentence quoted by L-3 Services with the following statements: [. . .]. AR 10344. 
Under the Technical/Management factor, L-3 Services asserts that although the two proposals had the same rating and “seemed balanced,” the Source Selection Authority “took the opportunity to downgrade L-3’s proposal.” L-3 Services Mot. 40. Concerning the Understanding Support subfactor, L-3 Services contends that the Source Selection Authority’s “comparison is based on the Army’s faulty evaluation,” which “ignores L-3’s extensive experience as an incumbent.” Id. Moreover, asserts L-3 Services, the Source Selection -79- Authority had previously acknowledged this experience. Id. (citing acknowledged strengths from the Source Selection Decision Document concerning the Corporate Capability, Performance Risk, and Technical/Management factors). Regarding the Program Management Plan subfactor, L-3 Services argues that the Source Selection Authority should have engaged in a more than onesentence analysis, as she did for the first subfactor, and that had she done so, she would have concluded that the additional strength L-3 Services should have received for its program managers’ on-site decision-making authority would have given L-3 Services the edge. Id. at 41. Finally, under the Small Business Participation factor, L-3 Services argues that the Source Selection Authority “did not give any consideration to [its] advantages in the small business subcategory percentages.” Id. Moreover, L-3 Services contends that Booz Allen received an improperly high adjectival/color rating, emphasizing that Booz Allen [. . .]. Id. at 42. Thus, in sum, L-3 Services argues that the Source Selection Authority’s tradeoff denigrated its proposal and overvalued the importance of Booz Allen’s proposed price. Id. at 42. b. Defendant’s Arguments Responding to the general arguments of L-3 Services, defendant contends that the Source Selection Authority “properly applied the best value and weighting criteria set forth in the solicitation.” Def.’s Mot. 52. Defendant emphasizes the Source Selection’s statement that upon concluding “‘[. . .],’” as well as her statement that “‘[. . .].’” Id. (quoting AR 10475) (emphasis omitted). Accordingly, defendant argues that the Source Selection Authority “used her discretion and reasonably determined that L-3’s higher premium was not warranted.” Id. at 53. Moreover, according to defendant, “the disparity in technical ratings between L-3 and Wyle and [Booz Allen] is not nearly as vast as L-3 claims,” asserting that the two awardees had “[. . .].” Id. Further, defendant argues that L-3 Services “misleadingly downplays its higher price.” Id. Finally, defendant dismisses the contention that the Source Selection Authority applied stricter scrutiny in evaluating the proposal of L-3 Services, emphasizing that the Source Selection Authority used comparative phrases such as “essentially equal to,” “somewhat superior,” and “slightly superior” to both unsuccessful offerors and awardees alike. Id. at 55. Then, responding to the more specific arguments of L-3 Services, defendant rejects the contention that the Source Selection Authority’s tradeoffs between L-3 Services and Wyle, and between L-3 Services and Booz Allen, were contrary to the weighting scheme set forth in the solicitation. Id. at 55-66. According to defendant, L-3 Services “seeks to exaggerate the technical superiority of its own proposal, deny the manifest benefits identified in the proposals of the awardees, and deny the right of the Army to give meaningful weight to the Price factor.” Id. at 56. 
Beginning with the Corporate Capability factor, defendant contends that the Source Selection Authority focused on the “comparative strengths” of the proposals without limiting “her consideration to a comparison based upon a simple count of the number of strengths.” Id. at 57. Next, with regard to the Performance Risk category, defendant argues that the Source Selection Authority did not ignore the past performance information of Booz Allen’s major -80- subcontractor. Id. at 59. Turning to the Technical/Management factor, defendant asserts that “[. . .],” id., and that L-3 Services did not adequately describe the [. . .], id. at 63. Moreover, defendant indicates that L-3 Services’ status as an incumbent could not “shore up weaknesses in its technical proposal.” Id. at 65. 2. Tradeoffs Concerning Data Systems Like L-3 Services, Data Systems contends that the best value tradeoffs conducted by the Source Selection Authority between it and Wyle, and between it and Booz Allen, were contrary to the weighting scheme set forth in the solicitation. Data Systems Mot. 12-13, 16. The Source Selection Evaluation Board determined the strengths of all three proposals and assigned adjectival/color ratings, with which the Source Selection Authority concurred, AR 10474, as noted in the following table: Rating and Strengths Factors and Subfactors Data Systems Wyle Booz Allen 1: Corporate Capability [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 1.1: Knowledge and Experience [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 1.2: Performance Capability [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 2: Performance Risk [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 3: Technical/ Management [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 3.1: Understanding Support [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 3.2: Program Management Plan [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] 5: Small Business Participation [. . .] [. . .] [. . .] [. . .] [. . .] [. . .] Id. at 10448, 10449, 10456, 10459. In addition, the Source Selection Evaluation Board determined the offerors’ evaluated prices, as provided in the following table: -81- Price Premium of Data Systems Offeror Price Amount Percentage Booz Allen [. . .] [. . .] [. . .] Systems Research [. . .] [. . .] [. . .] Wyle [. . .] [. . .] [. . .] Data Systems [. . .] [. . .] [. . .] Id. at 10356, 12779. Exercising her “independent judgment” about the identified strengths and assigned ratings, id. at 10475, the Source Selection Authority made the following comparison between Wyle and Data Systems: [. . .]. Id. at 10547-48. Then, comparing Booz Allen with Data Systems, the Source Selection Authority stated: [. . .]. Id. at 10533-34. In light of these comparisons and tradeoffs, Data Systems begins by arguing that its “superior technical merit should have been sufficient to offset the price advantage of Wyle’s proposal,” explaining that [. . .]. Data Systems Mot. 13. According to Data Systems, aside from a “begrudging nod” to its superiority under the Knowledge and Experience subfactor of the Corporate Capability factor, there is nothing in the Source Selection Decision Document to suggest that the [Source Selection Authority] gave any weight at all to [Data Systems]’s rating under this factor or considered all of the advantages to the Government, as necessary in a best value solicitation, offered by [Data Systems]’s superiority under this factor. Id. at 14. 
Instead, contends Data Systems, the Source Selection Authority “devoted more attention in her comparative analysis to building up the inferior rating of Wyle.” Id. (citing AR 10547 (“[. . .].”)). Moreover, with respect to the Performance Capability subfactor, Data Systems avers that the Source Selection Authority “literally gushes over Wyle’s ‘[. . .]’ rating by specifically considering every strength at length and completely ignoring the many significant strengths of [Data Systems].” Id. at 14 (citing AR 10547). In sum, Data Systems argues that the Source Selection Authority effectively treated its [. . .] ratings under the Corporate Capability as merely [. . .] ratings, while treating Wyle’s [. . .] rating as if it were an [. . .] rating. Id. at 15.

Data Systems next contends that in comparing its proposal to that of Booz Allen, the Source Selection Authority did not discuss the fact that the nonprice factors were significantly more important than Price. Id. at 15. Indeed, Data Systems argues that its “superior technical merit should have been sufficient to offset the price advantage of [Booz Allen]’s proposal,” emphasizing the fact that its proposal “[. . .],” and that Booz Allen’s proposal “[. . .].” Id.; accord id. at 16. Moreover, Data Systems asserts that the Source Selection Authority “boosted” Booz Allen’s lesser technical evaluation while at the same time minimizing or ignoring the superiority of its proposal. Id. at 16. As an example, Data Systems asserts that while the Source Selection Authority concluded that its proposal was superior to Booz Allen’s proposal under the Performance Capability subfactor of the Corporate Capability factor, she only briefly described its strengths, quickly turning to a lengthy explanation of why its superiority was “[. . .]” and noting that Booz Allen “[. . .].” Id. at 17-18 (citing AR 10533).

Defendant takes issue with the arguments advanced by Data Systems. First, addressing the Source Selection Authority’s tradeoff between Data Systems and Wyle, defendant rejects the contention that the tradeoff was contrary to the weighting scheme set forth in the solicitation. Def.’s Mot. 66-67. In particular, defendant notes that the superiority of the proposal of Data Systems was “‘not great’” with respect to [. . .]. Id. at 66 (quoting AR 10547). Defendant also rejects the contention that the Source Selection Authority’s tradeoff between Data Systems and Booz Allen was contrary to the weighting scheme set forth in the solicitation. Id. at 67-69. Instead, defendant contends that the Source Selection Authority “considered the recommendations” of the Source Selection Evaluation Board, “delved deeper into the evaluation factors and closely examined the relative merit of both offers,” and “used her independent business judgment to determine that paying a [. . .] price premium for [Data Systems] was not warranted.” Id. at 67; see also id. at 68-69 (“Rather than mechanistically accepting the [Source Selection Evaluation Board]’s ratings or formulaically adding up ‘strengths’ and ‘weaknesses,’ [the Source Selection Authority] compared each proposal by quantitatively analyzing each factor.”).

3. Tradeoffs Concerning BearingPoint

Like L-3 Services and Data Systems, BearingPoint contends that the best value tradeoffs conducted by the Source Selection Authority between it and Wyle, and between it and Booz Allen, were contrary to the weighting scheme set forth in the solicitation. BearingPoint Mot. 12-17.
The Source Selection Evaluation Board determined the strengths of all three proposals and assigned adjectival/color ratings, with which the Source Selection Authority concurred, AR 10474, as noted in the following table:

                                         Rating and Strengths
Factors and Subfactors              BearingPoint         Wyle                 Booz Allen
1: Corporate Capability             [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
1.1: Knowledge and Experience       [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
1.2: Performance Capability         [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
2: Performance Risk                 [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
3: Technical/Management             [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
3.1: Understanding Support          [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
3.2: Program Management Plan        [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]
5: Small Business Participation     [. . .]  [. . .]     [. . .]  [. . .]     [. . .]  [. . .]

Id. at 10448, 10456, 10458-59. In addition, the Source Selection Evaluation Board determined the offerors’ evaluated prices, as provided in the following table:

                                        Price Premium of BearingPoint
Offeror                 Price           Amount          Percentage
Booz Allen              [. . .]         [. . .]         [. . .]
Systems Research        [. . .]         [. . .]         [. . .]
Wyle                    [. . .]         [. . .]         [. . .]
BearingPoint            [. . .]         [. . .]         [. . .]

Id. at 10356, 12779. Exercising her “independent judgment” about the identified strengths and assigned ratings, id. at 10475, the Source Selection Authority made the following comparison between Wyle and BearingPoint:

[. . .].

Id. at 10557-59. In addition, comparing Booz Allen with BearingPoint, the Source Selection Authority stated:

[. . .].

Id. at 10544-45.

In light of these comparisons and tradeoffs, BearingPoint contends that although the Source Selection Authority noted BearingPoint’s technical superiority, she chose to award contracts to Wyle and Booz Allen based solely on their lower prices. BearingPoint Mot. 12, 15. As such, argues BearingPoint, the Source Selection Authority improperly emphasized price in her best value tradeoffs. Id. at 17-18, 21. BearingPoint explains:

The Army correctly indicated in the Solicitation that Price was less significant in the initial contract award than the Non-Price factor[s] because true competition would be for the task orders. In the initial award of contracts[,] the Government is attempting to identify the contractors’ superior Non-Price factors. Higher labor rates do not necessarily mean higher task order pricing . . . . In competing for a [Firm Fixed Price] task order, an offeror may be able to base its proposal on far fewer hours for a labor category because it is using more experienced/competent individuals.

Id. at 17-18. BearingPoint asserts that given the solicitation’s de-emphasis of Price, the Source Selection Authority failed to “offer a detailed explanation as to why BearingPoint’s superiority in the Non-Price evaluation factors was not sufficient to justify award to BearingPoint” or “quantify the added value of the Non-Price factors in BearingPoint’s proposal when making the best value decision.” Id. at 18; accord id. at 13 n.7; cf. id. at 19 (arguing that the Source Selection Authority did not “quantify the value of [its] strength[s]”). Defendant does not address BearingPoint’s specific contentions.

4. The Source Selection Authority’s Best Value Tradeoffs Were Improper

For several reasons, the court agrees with L-3 Services, Data Systems, and BearingPoint that the Source Selection Authority’s best value tradeoffs were arbitrary, capricious, and not in accordance with the law.
As a threshold matter, the Source Selection Authority’s tradeoffs were based upon the Army’s irrational evaluations of the Technical/Management and Small Business Participation factors. Because a reevaluation of these factors would, at a minimum, lower the ratings of two contract awardees, the possibility exists that the Source Selection Authority’s tradeoffs would change. However, even if all of the Army’s evaluations were proper, there is ample evidence in the Source Selection Decision Document that the Source Selection Authority’s tradeoffs were improper.

First, the court finds that the Source Selection Authority did not adequately document her best value tradeoffs in the Source Selection Decision Document. Pursuant to the FAR, the Source Selection Authority was required to document the tradeoffs she made, FAR § 15.308, and conclusory statements are insufficient to meet this requirement, see Serco Inc., 81 Fed. Cl. at 497. Yet, the Source Selection Decision Document was replete with conclusory statements. For example, many of the Source Selection Authority’s “comparisons” between the offerors’ Performance Risk volumes lacked any comparative language and merely indicated that the two offerors received the same adjectival rating. See, e.g., AR 10528 (“[. . .].”), 10550 (“[. . .].”), 10558 (“[. . .].”). The Source Selection Authority did not document any effort that she made to compare the two proposals to determine whether, despite the identical ratings, one was stronger than the other, as she did in other factor comparisons. Given that the Performance Risk factor was the second most important factor, the Source Selection Authority’s brusque nonsubstantive “comparisons” are inadequate. Also inadequate are the instances where the Source Selection Authority noted the proposals’ identical ratings and then, without further explanation, declared that the proposals were “essentially equal.” See, e.g., id. at 10528 (“[. . .].”), 10533 (“[. . .].”), 10534 (“[. . .].”), 10536 (“[. . .].”), 10537 (“[. . .].”), 10544 (“[. . .].”). Although the Source Selection Authority used comparative language, she provided no explanation for why she considered the proposals to be equal.42 It appears that the Source Selection Authority’s conclusions relied upon the Army’s assigned ratings instead of the Army’s underlying evaluations. The FAR requires more detail. See Serco Inc., 81 Fed. Cl. at 498 (“[I]t is conceivable that the [Source Selection Authority], in his own mind, made such . . . comparisons, but merely failed to capture them on paper. But, that too would violate the FAR and its documentation requirements.”). Moreover, the Source Selection Authority’s conclusory statements contradicted her assertions that she [. . .]. See AR 10475. If the Source Selection Authority did, in fact, perform an “[. . .],” the fruits of that effort should be contained within the Source Selection Decision Document. However, the Source Selection Decision Document is bereft of such evidence.

Second, the Source Selection Decision Document clearly reflects the Source Selection Authority’s intent to inflate the technical portions of the low-price offerors’ proposals and downplay the technical superiority of the higher-priced offerors’ proposals. Such intent manifested in two ways.
One manifestation was the Source Selection Authority’s unequal treatment of the proposals when comparing a proposal that was deemed superior in one subfactor but inferior in the companion subfactor to a proposal that was deemed inferior in the former subfactor but superior in the companion subfactor. Often, when a split decision coincided with the higher-priced proposal receiving a higher rating on the overall factor, the Source Selection Authority would emphasize either the positive aspects of the lower-priced offeror’s proposal or the limits of the higher-priced offeror’s proposal in its tradeoff decision. See, e.g., id. at 10536-37 (comparing [. . .]). Yet, when the split decision coincided with the higher-priced proposal receiving a lower rating on the overall factor, the Source Selection Authority often failed to mention that proposal’s positive aspects in her tradeoff decision. See, e.g., id. at 10550-51 (comparing [. . .]).

42 The court notes that the Source Selection Authority made an effort to explain the equality of proposals in some instances. See, e.g., AR 10544 (“[. . .].”). However, all such comparisons need to be documented.

The second manifestation was the appearance that the Source Selection Authority went out of her way to undercut the superior technical evaluations of the higher-priced proposals. Specifically, when the Source Selection Authority concluded that a higher-priced proposal that had received a higher adjectival/color rating was technically superior, she would often note the superiority and then proceed to either downplay it as “slight” or “not unlimited,”43 or highlight the “soundness” or “acceptability” of the technically inferior proposal.44 See, e.g., id. at 10533 (“[. . .].”), 10536 (“[. . .].”), 10544 (“[. . .].”), 10547 (“[. . .].”), 10550 (“[. . .].”), 10557 (“[. . .].”). Yet, very rarely did the Source Selection Authority note the superiority of a lower-priced proposal with a higher adjectival/color rating and then proceed to downplay the superiority as “slight.” See, e.g., id. at 10534 (“[. . .].”), 10537 (“[. . .].”), 10545 (“[. . .].”). Further, with respect to the three examples cited in support of the last proposition, the court has already held that the Army improperly rated Booz Allen on the Small Business Participation factor, rendering these examples meaningless. Thus, from the Source Selection Authority’s statements in the Source Selection Decision Document, it appears that when comparing proposals on a nonprice factor, she strived to praise the merits of lower-rated proposals when they were lower-priced, but steadfastly declined to praise the merits of lower-rated proposals when they were higher-priced. To put it differently, whenever a lower-priced proposal received a better rating for a factor, it was never a close call as to that proposal’s superiority on that factor. But, when a higher-priced proposal received a better rating for a factor, it was often the case that the lower-priced proposal was found to be of almost equal quality. This clearly represents unequal treatment.

43 The Source Selection Authority’s assertions that the superiority of L-3 Services, Data Systems, and BearingPoint was “[. . .]” seem particularly gratuitous and represent glaring examples of the Source Selection Authority’s attempt to downgrade the higher-priced, technically superior proposals. Such gratuitous comments are tantamount to the Source Selection Authority applying a “superiority-lite” rating, which is not described in the solicitation or Source Selection Plan.

44 It bears noting that the first three cited examples compare Data Systems, L-3 Services, and BearingPoint to Booz Allen under the Performance Capability subfactor, part of the most important factor–Corporate Capability. And, the remaining cited examples compare Data Systems, L-3 Services, and BearingPoint to Wyle under the Knowledge and Experience subfactor, also part of the most important factor–Corporate Capability. Yet, despite the importance accorded the Corporate Capability factor in the solicitation, and despite the range of adjectival/color ratings assigned by the Army for the Corporate Capability factor and its subfactors, the Source Selection Authority appears to have merely cut and pasted her comments from one comparison to the next. As with the previously identified conclusory statements, such cutting and pasting undercuts the Source Selection Authority’s assertions that she “carefully assessed the benefits of the strengths identified in the reports” and performed an “exhaustive comparison of the individual strengths of the successful offerors with those of the unsuccessful offerors.” See AR 10475.

Third, it is also apparent that the Source Selection Authority emphasized the nature, quality, and extent of a proposal’s strengths when it benefitted a lower-priced proposal but often ignored the nature, quality, and extent of a proposal’s strengths and relied instead on the adjectival/color ratings when a higher-priced proposal was found to be superior. Compare id. at 10533 (“[. . .].”), 10536 (“[. . .].”), 10544 (“[. . .].”), 10550 (“[. . .].”), 10557 (“[. . .].”), with id. at 10528 (“[. . .].”), 10534 (“[. . .].”). Again, this is improper, unequal treatment.

Many of the above-noted issues reflect a more basic problem with the Source Selection Authority’s best value tradeoffs–she accorded the price factor more weight than the solicitation permitted. As has been repeated throughout this opinion, Price was the fourth most important factor and the four nonprice factors, taken together, were significantly more important than Price. See id. at 379. In other words, according to the terms of the solicitation, the Source Selection Authority was required to give “far more consideration” to the four nonprice factors than to Price. See id. at 380. However, the Source Selection Authority’s consistent efforts to (1) boost the technical merits of the low-priced proposals and (2) denigrate the technical superiority of the higher-priced proposals are evidence of her desire to award contracts to the lowest-priced offerors that met a minimum level of technical acceptability.45

Moreover, there is a clear reason why the solicitation accorded less importance to Price. At this stage of the procurement, offerors were not proposing to provide any particular services at a specified contract price. Instead, offerors were competing for the opportunity to be in the pool of offerors that would, at some future date, submit proposals on yet-to-be-announced task orders. The details of these possible future task orders were not contemplated in the solicitation. Accordingly, the Army could not know for certain exactly how much it would expend pursuant to the contracts awarded pursuant to the solicitation. And because the Army could not know, undoubtedly the offerors could not know.
Indeed, the uncertainty concerning the scope and quantity of future task orders was reflected in the wide range of total prices proposed by the offerors. Both the task order uncertainty and the wide range of proposed total prices demonstrate the unreliability of the offerors’ proposed pricing and, accordingly, the reason that the solicitation downgraded the importance of the Price factor. Thus, the Source Selection Authority’s elevation of the importance of the Price factor is contrary to the terms of the solicitation. Accordingly, the Source Selection Authority’s best value tradeoffs were arbitrary, capricious, and not in accordance with the law.

45 Defendant does not help its case by arguing that “accepting the plaintiffs’ arguments would require the Government to reject the lowest-priced offers, despite the fact that the [. . .].” See Def.’s Mot. 51. Defendant’s argument implies that a minimum level of competence was all that was required in determining best value, which is contrary to the terms of the solicitation and thus lends credence to plaintiffs’ position that the Army treated the instant procurement as a low-cost, technically acceptable procurement.

In reaching this conclusion, the court is compelled to address two of defendant’s arguments. First, defendant’s invocation of the portion of the solicitation providing that “an award may not necessarily be made to the lowest priced Offeror” and that “if the non-Price factors are evaluated as comparatively equal between two or more Offerors, Price may become a determinative factor,” see id. at 379, cited in Tr. 93, to explain the weight accorded to the Price factor by the Source Selection Authority is unpersuasive. While an evaluation based solely on a strict comparison of adjectival/color ratings and counting of strengths is neither contemplated by the solicitation nor described in the Source Selection Plan, the Source Selection Authority cannot just ignore the fact that an Outstanding (Blue) rating is quantitatively stronger than a Good (Green) rating or the fact that an evaluation factor that is assigned seven significant strengths is quantitatively stronger than an evaluation factor that receives four strengths altogether, only two of which are significant strengths. It strains credulity that the Source Selection Authority could find the proposals in this procurement to be comparatively equal given the wide range of adjectival/color ratings and strengths assigned to the proposals.

Second, the court would be remiss if it did not address defendant’s description of the process by which the Source Selection Authority conducted her best value tradeoffs. See Def.’s Reply 4; accord Tr. 93-94. Defendant’s description is wholly unsupported by the administrative record. Defendant begins by stating that the Source Selection Authority began her analysis by “comparing [Systems Research] against all other offerors, [. . .].” Def.’s Reply 4; accord Tr. 93-94. This assertion contains two misstatements: (1) nowhere in the Source Selection Decision Document did the Source Selection Authority indicate that she began her comparisons with Systems Research because Systems Research was [. . .] and (2) the Source Selection Authority did not compare Systems Research to all of the other offerors. Defendant continues by asserting that the Source Selection Authority “concluded that [Systems Research] offered the best value.” Def.’s Reply 4.
To the extent that defendant is implying that the Source Selection Authority determined that Systems Research represented the overall best value, it is incorrect. Next, defendant indicates that the Source Selection Authority “turned to [Booz Allen], because it [. . .].” Id.; accord Tr. 94. Yet, the Source Selection Decision Document contained no explicit rationale by the Source Selection Authority for why she turned to Booz Allen after Systems Research. Defendant then indicates that the Source Selection Authority proceeded to “compare[] L-3 to Wyle to see which of the two was the better value.” Def.’s Reply 4; accord Tr. 94. This statement also contains two errors: (1) the Source Selection Authority never indicated that the final unrestricted award decision came down to a decision between L-3 Services or Wyle and (2) the first offeror to which the Source Selection Authority compared Wyle was Data Systems, not L-3 Services. Given the foregoing errors, it is clear that defendant’s statement reflects its conjecture regarding the process used by the Source Selection Authority, and not the process that was actually described in the Source Selection Decision Document. The court must be guided solely by the contents of the administrative record and reject conjecture. See Bannum, Inc., 404 F.3d at 1356-57.

In sum, for the reasons set forth above, the court concludes that although the Source Selection Authority used the language of a best value analysis, she instead selected the contract awardees based on a technically acceptable, low price analysis. Because almost all of the parties’ extensive arguments on this topic relate, in one way or another, to whether the Source Selection Authority converted the specified best value tradeoffs into a technically acceptable, low price analysis, there was no need for the court to address some of the more specific and nuanced arguments advanced by the parties.

G. Plaintiffs Have Established Prejudice on Behalf of the Unsuccessful Offerors Competing Under the Unrestricted Portion of the Solicitation

As noted previously, in order to prevail, plaintiffs must demonstrate that they were prejudiced by the Army’s errors; in other words, plaintiffs must show that had the Army not erred, they had a substantial chance of being awarded one of the contracts. Banknote Corp. of Am., 365 F.3d at 1350; Data Gen. Corp., 78 F.3d at 1562. The court has concluded that the Army improperly evaluated the proposals within the competitive range and that the Source Selection Authority’s best value tradeoffs were significantly flawed. Based upon these findings, the Army will be required to reevaluate most of the proposals within the competitive range and the Source Selection Authority will be required to redo most of her best value tradeoffs based on the new evaluations and using the factor weighting described in the solicitation.46 Moreover, the new evaluations will likely lead to new adjectival/color ratings and the reweighted (i.e., proper) tradeoffs may well lead to new determinations of best value. Thus, each offeror in the competitive range that remains eligible to receive a contract award pursuant to the unrestricted portion of the solicitation has a substantial chance of receiving a contract and therefore has been prejudiced by the Army’s errors.47

46 Savantage and Binary were awarded contracts under the portion of the solicitation reserved for small businesses. See AR 10573.
Of the five plaintiffs, only TAPE was a small business with standing to challenge the Army’s small business contract awards, but it was unsuccessful in its challenge. See supra Parts III.A, C, D.1. Further, in addition to lacking standing to challenge the Army’s small business contract awards, none of the four large business plaintiffs made such a challenge. Accordingly, the court will not set aside these contract awards. Thus, the court excludes Savantage and Binary from its conclusions that the Army improperly evaluated the proposals within the competitive range and that the Source Selection Authority improperly performed the best value tradeoffs.

47 Although only three of the plaintiffs in this case–L-3 Services, Data Systems, and BearingPoint–successfully challenged the instant procurement and established prejudice, all of the offerors in the competitive range, including unsuccessful plaintiff TAPE, benefit from the court’s ruling. An additional reason for reopening the procurement to all of the offerors in the competitive range–except Savantage and Binary–is that the Army allowed for the possibility that the Army could award more than three contracts pursuant to the unrestricted portion of the solicitation. See AR 378.

H. Award of a Permanent Injunction

The Court of Federal Claims has the authority to award injunctive relief pursuant to 28 U.S.C. § 1491(b)(2) and is guided in making such an award by RCFC 65. In determining whether to award a permanent injunction, a court must consider whether (1) the plaintiff has succeeded on the merits; (2) the plaintiff will suffer irreparable harm if the court withholds injunctive relief; (3) the balance of hardships favors the grant of injunctive relief; and (4) it is in the public interest to grant injunctive relief. PGBA, LLC v. United States, 389 F.3d 1219, 1228-29 (Fed. Cir. 2004).

1. Success on the Merits

The court has concluded that the Army erred in evaluating the proposals within the competitive range in two ways, by (1) improperly evaluating the Small Business Participation factor and (2) failing to consistently evaluate the on-site decision-making authority of the offerors’ program managers. The court has also concluded that the Source Selection Authority accorded the Price factor more weight than permitted by the Army’s solicitation, resulting in best value tradeoffs of the proposals within the competitive range, except for the proposals of Savantage and Binary, that were arbitrary, capricious, and not in accordance with the law. By elevating Price over the more highly rated nonprice factors, whether it be the three more highly rated nonprice factors taken individually or the four nonprice factors taken collectively, the Source Selection Authority violated the express provisions of the solicitation. Moreover, the court concludes that the unsuccessful offerors were prejudiced by the improper evaluations and best value tradeoffs.

2. Irreparable Injury

“When assessing irreparable injury, ‘[t]he relevant inquiry in weighing this factor is whether plaintiff has an adequate remedy in the absence of an injunction.’” Overstreet Elec. Co. v. United States, 47 Fed. Cl. 728, 743 (2000) (quoting Magellan Corp. v. United States, 27 Fed. Cl. 446, 447 (1993)). Offerors in the competitive range, aside from Savantage and Binary, have lost the opportunity to perform under the contract–a loss of business valued at $478,642,912–due to the Army’s failure to properly evaluate their proposals and perform the best value tradeoffs.
In addition, any offeror that should have been awarded a contract, but was not, will be at a disadvantage when competing for future contracts. No adequate remedy exists to make up for this potential loss of business or competitive advantage. Accordingly, the offerors in the competitive range have been irreparably injured.

3. Balance of Hardships

To assess the balance of hardships, the court must examine the harm to the government in issuing a permanent injunction. See id. at 744. Because the Army presently has incumbent contractors performing the work covered by the solicitation at issue here, the balance of harms weighs in favor of plaintiffs.

4. Public Interest

It is unquestioned that “the public interest in honest, open, and fair competition in the procurement process is compromised whenever an agency abuses its discretion in evaluating a contractor’s bid.” Id. Because the Army awarded the contracts at issue in this case based upon improper evaluations and arbitrary, capricious, and unlawful best value tradeoffs, the public interest was compromised.

IV. CONCLUSION

In sum, the court finds that injunctive relief is appropriate in this case. Accordingly, it is ORDERED that:

• The motions for judgment on the administrative record filed by Femme Comp and TAPE are DENIED;

• The motions for judgment on the administrative record filed by L-3 Services, Data Systems, and BearingPoint are GRANTED IN PART AND DENIED IN PART;

• The cross-motions for judgment on the administrative record filed by defendant and Booz Allen are GRANTED IN PART AND DENIED IN PART;

• The cross-motion for judgment on the administrative record filed by Savantage is GRANTED;

• The motion to supplement the administrative record filed by Femme Comp is DENIED AS MOOT;

• The motions to strike filed by BearingPoint and defendant are GRANTED;

• The contracts awarded to Systems Research, Wyle, and Booz Allen pursuant to solicitation number W91QUZ-07-R-0007 shall be SET ASIDE;

• Although the contract awards to Savantage and Binary were not successfully challenged and will not be set aside, the Army is ENJOINED from commencing performance on those contracts until it has re-awarded contracts under the unrestricted portion of the solicitation;

• For all of the proposals within the competitive range, except for the proposals submitted by Savantage and Binary, the Army shall reevaluate, at a minimum, the factors and/or criteria identified in the court’s forthcoming decision, conducting another round of discussions if necessary;

• Upon reevaluating the ten identified proposals, the Army shall render a new Source Selection Decision, ensuring that each factor is accorded the appropriate weight; and

• The court has filed this opinion under seal. The parties shall confer to determine proposed redactions agreeable to all parties. Then, by no later than Friday, September 26, 2008, the parties shall file a joint status report indicating their agreement with the proposed redactions, attaching a complete copy of the court’s opinion with all redactions clearly indicated.

The clerk is directed to enter judgment accordingly.

IT IS SO ORDERED.

s/ Margaret M. Sweeney
MARGARET M. SWEENEY
Judge
