As technologies such as digital signal processors (DSPs), field programmable gate arrays (FPGAs), and general-purpose processors (GPPs) advance, product and feature differentiation has become more difficult. As a result, designers must use rigorous approaches to evaluate technologies available from different military technology providers.
Evaluating Competing Criteria
Developing a system design for government projects typically requires defense contractors to evaluate and make system decisions based on documents such as a request for proposal (RFP), statement of work (SOW), and concept of operations (CONOP) (see Figure 1). From these documents, contractors assess system alternatives while maximizing the customer’s expected system goals, objectives, and capabilities. As there may be competing goals and objectives, a structured method of evaluating alternatives is required to support design decisions.
The analytic hierarchy process (AHP) is a widely used multi-criteria decision analysis (MCDA) method for evaluating design alternatives. AHP is often favored over other MCDA methods for its structured mathematical approach and ease of use.
This article summarizes AHP combined with a weighted sum-of-products scoring approach. As with any MCDA method, the evaluator should understand how the biases inherent in the approach can influence the resulting decisions.
FPGA Trade Study Approach
An FPGA selection trade study typically focuses on mission key performance parameters (KPPs). Establishing traceability of these key performance parameters as they apply to the FPGA functionality is the first step in this process. The example criteria presented may not be applicable to every project, but many can be used to identify which FPGA vendor best meets mission objectives.
Device Availability — Developers must be confident that production devices are available when needed to support critical proposal milestones. Additionally, due to the long support and production lifecycles of military systems, FPGA product lifetimes must also be considered.
Logic Density and Efficiency — FPGA implementation efficiency can vary widely based on device architecture and algorithm design parameters. Logic utilization metrics from previous designs are the best source of data; however, metrics based on prototyping efforts may be necessary for new designs.
Power — Many military applications are portable and power-sensitive. These applications can include battery-operated radios or FPGAs processing SIGINT and radar signals. Advanced tools such as Programmable Power Technology available in Altera’s Stratix IV FPGAs can optimize FPGA power and performance to meet mission needs.
Productivity — Design productivity is difficult to measure when selecting an FPGA vendor. Using incremental compilation and a team-based design approach provides productivity advantages. Development tools such as Quartus II design software provide incremental compilation and team-based design.
Quality and Reliability — Device quality and reliability metrics need careful evaluation, using empirical data if possible. While these requirements are mission-specific, general operational specifications may include commercial, industrial, or military operating environments. Selecting a part to meet mission objectives at the best program value is necessary for competitive procurements.
Past Performance — Past performance is a typical criterion in government source selection. It usually covers production schedule realism, test chip results, delivery history, manufacturing (fabrication and packaging), partner review, and leadership standing.
Design Reuse — Reuse includes algorithms and IP from previous programs or an FPGA vendor’s IP library. Large amounts of design reuse minimize schedule risk and non-recurrent engineering (NRE) costs. FPGA vendors offering system design tools such as SOPC Builder ease integration of IP including soft processors, memory controllers, and DSP functions.
Device Packaging — This criterion determines component size and weight, as well as pin count and thermal properties. These characteristics can be KPPs for certain missions. A device’s performance and technology process node directly affect die size, making smaller packaging viable.
Performance — Processing performance is an important metric for many military applications and can include I/O bandwidth, on- and off-chip memory performance, DSP capabilities, and processing efficiency. Evaluation of these parameters is best if portions of the designs requiring high performance can be prototyped early in the design.
I/O Bandwidth — Advanced systems such as radar, electronic warfare, SIGINT, and ISR applications require high memory bandwidth, typically using 533-MHz DDR3 and high-performance transceivers. Transceiver performance of 11.3 Gbps supporting numerous industry-standard protocols, with superior signal integrity, allows developers to meet KPPs using FPGA technology over expensive ASICs for the first time.
Security and Anti-Tamper Requirements — Security and anti-tamper capabilities of an FPGA are requirements for equipment that may be lost or compromised. In some cases, a separate security trade study may be used to determine the implementation of security functions. In either case, FPGA security should no longer be considered inferior to that of an ASIC. Evaluation of these metrics should be done with the assistance and guidance of appropriate government evaluators for military applications.
Cost — Cost is usually a key decision criterion after other requirements have been met. In some low-volume applications, product cost may not be a significant decision factor while in others, bill of material costs are decision drivers. Development cost, qualification cost, hardware cost, and operations/maintenance costs are significant cost factors that can be offset by using proven FPGA technology.
Switching Costs — Switching refers to moving an existing design from a previous technology or from one FPGA vendor to another. Switching costs are typically an independent variable assessed separately by a vendor trade study and often require vendor support for evaluation.
Trade Study Criteria Evaluation
The AHP is based on pairwise criteria comparisons, making the evaluation of many dissimilar criteria simple because only two criteria are evaluated at once. The AHP pairwise comparison has three steps:
- Build a pairwise comparison table to reflect the number of decision criteria (Table 1).
- Assign the relative weights from each pairwise comparison; values in this table range from 1 to 9. Note: If criteria are not in rank order, scale inversion (reciprocal values) is needed (Table 2).
- Calculate the pairwise comparison consistency ratio (CR).
The CR gives the evaluator feedback on the scoring process. If the CR is high (>10%), the comparisons made are inconsistent. This feedback gives the evaluator a means of identifying inconsistencies and/or biases in the scoring methodology.
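The three steps above can be sketched in code. The snippet below uses a hypothetical 3×3 comparison (power vs. cost vs. performance, not the rankings from Table 3), derives weights with the row geometric-mean approximation, and computes the CR from Saaty's consistency index and random index.

```python
from math import prod

def ahp_weights(matrix):
    """Derive AHP priority weights from a pairwise comparison matrix
    (row geometric-mean approximation) and the consistency ratio CR."""
    n = len(matrix)
    # Step 2: weights are normalized geometric means of each row.
    geo_means = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    weights = [g / total for g in geo_means]
    # Step 3: estimate lambda_max from A*w, then CI and CR.
    weighted_sums = [sum(matrix[i][j] * weights[j] for j in range(n))
                     for i in range(n)]
    lambda_max = sum(ws / w for ws, w in zip(weighted_sums, weights)) / n
    ci = (lambda_max - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return weights, ci / ri              # CR = CI / RI

# Hypothetical comparisons: matrix[i][j] > 1 means criterion i is
# preferred over j; the lower triangle holds reciprocals (scale inversion).
A = [[1.0, 3.0, 0.5],    # power
     [1/3, 1.0, 0.25],   # cost
     [2.0, 4.0, 1.0]]    # performance

weights, cr = ahp_weights(A)
print([round(w, 3) for w in weights])
print(cr < 0.10)  # CR below 10% indicates consistent judgments
```

Real trade studies use the principal eigenvector of the comparison matrix; the geometric-mean method shown here is a common hand-calculable approximation that agrees closely for consistent matrices.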
Table 3 summarizes a sample ranking of device selection criteria. The color of the ranking indicates which factor is preferred when weighed against the others, while the number indicates the strength of preference (i.e. when comparing power against cost, power is preferred).
Trade Study Results
Once a designer determines whether cost and security will be part of the trade study, the evaluator builds a trade table (Table 4). The AHP weights from Table 3 are used to weight the solution scores, which are assigned on a scale of 1 to 10 (Table 2). Trade study results, including a sum-of-products total score, are shown in Table 4. Documentation should include the rationale behind the pairwise rankings, vendor solution scores, and scoring metrics used. This documentation may also include a sensitivity analysis identifying which criteria weightings or raw vendor scores are sensitive to evaluator decisions.
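The weighted sum-of-products scoring reduces to a few lines of code. The weights and vendor raw scores below are illustrative placeholders, not the values from Tables 3 and 4.

```python
criteria_weights = {           # AHP-derived weights (sum to 1.0)
    "power": 0.32,
    "cost": 0.12,
    "performance": 0.56,
}

vendor_scores = {              # raw solution scores on a 1-10 scale
    "Vendor A": {"power": 8, "cost": 6, "performance": 9},
    "Vendor B": {"power": 7, "cost": 9, "performance": 6},
}

def total_score(scores, weights):
    """Sum of products: each raw score times its criterion weight."""
    return sum(weights[c] * scores[c] for c in weights)

totals = {v: total_score(s, criteria_weights)
          for v, s in vendor_scores.items()}
best = max(totals, key=totals.get)
print(totals)
print(best)
```

A simple sensitivity analysis re-runs this calculation with perturbed criteria weights (or raw scores) to see whether the winning alternative changes, flagging judgments that drive the outcome.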
The results of a trade study should provide confidence when selecting an FPGA. More importantly, by using trade study disciplines, engineering organizations can provide requirements traceability down to the device level. Trade studies improve the quality of system documentation offered to government customers, and combine quantitative and qualitative data in a way that is accessible and modifiable by customer and designer alike. Using AHP to perform MCDA gives decision-makers the tools and processes to make critical decisions more easily.
This article was written by Paul Quintana, Senior Technical Marketing Manager for Altera Corporation’s military and aerospace business unit in San Jose, CA.