May 1, 2005 -- Eight years ago, test cost for high-performance devices such as microprocessors was a serious concern. The International Technology Roadmap for Semiconductors (ITRS) published a now-famous diagram predicting that test cost would grow to consume the dominant share of overall manufacturing cost (see Figure 1).
The result of these predictions was a mobilization of the industry to head off a situation where test costs could impede Moore's Law. The challenge brought forth by the ITRS report caused test tool suppliers to take notice. Specifically, they began to see this developing situation as an opportunity for increased utilization of design-for-test (DFT) methodologies. Ultimately, this projected trend never became a reality, and more recent ITRS reports show that test costs remain at a very manageable level. In fact, because of breakthrough technical advances, DFT is now being called upon to take an ever more prominent role in improving overall product quality.
Figure 1: Projection of test cost. Source: ITRS 2001
For high-performance digital devices such as microprocessors and high-speed ASICs, three key technologies have emerged as the primary components of DFT test. Thorough coverage of the logic portion of designs is achieved through scan and automatic test pattern generation (ATPG), which have become the basis of most test strategies. A second powerful technology, scan test pattern compression, emerged nearly four years ago as a method to significantly improve test quality while utilizing existing capital test equipment. Third, for embedded memory arrays and registers, memory built-in self-test (MBIST) has become the standard test method. This discussion focuses on the first two technological advances: ATPG and test pattern compression.
In 1997, there were doubts concerning scan's role in test. At-speed functional testing was an established methodology. To supplant this functional testing, DFT needed to solve several important issues before it could become mainstream. First, it needed to have high test quality to prevent defects from "escaping" from the manufacturing floor. Second, the scan test needed to be applied with minimal impact on the functional design of the circuit. Third, it needed to work with existing automatic test equipment (ATE), avoiding expensive retooling of the manufacturing test floor. Finally, the performance of these DFT tools needed to exhibit continual improvement to keep pace with ever-increasing design sizes.
Higher Quality Results with ATPG
The acceptance of structured scan test circuitry implementation has been the key to improved test, enabling the arrival of a long list of new ATPG improvements over the past eight years. Scan enables test patterns to reach areas of a circuit that are much more difficult, if not impossible, to reach with traditional functional test.
The most common tests generated by ATPG are based on the "stuck-at" fault model, in which a failure is assumed to behave as a logical short to either 1 or 0. Continual improvement of stuck-at fault coverage has been quietly occurring behind the scenes. Pattern compaction now allows high coverage to be achieved with a lower pattern count. Automatic pattern reordering can place the most effective patterns up front to detect faults sooner, or to maximize coverage in the event that a pattern set is truncated due to ATE limitations.
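The stuck-at idea can be made concrete with a toy example. The following sketch (the netlist, gate set, and function names are hypothetical, not any tool's implementation) evaluates a small combinational circuit once fault-free and once with an internal net forced to 0; a pattern detects the fault exactly when the two responses differ at an output.

```python
# Illustrative sketch: detecting a stuck-at fault in a tiny combinational
# netlist. The netlist and names are hypothetical, for illustration only.

def evaluate(netlist, inputs, fault=None):
    """Evaluate a gate-level netlist in topological order; optionally
    force one internal net to a stuck value (net_name, 0_or_1)."""
    values = dict(inputs)
    for net, (op, srcs) in netlist:
        a = [values[s] for s in srcs]
        if op == "AND":
            v = all(a)
        elif op == "OR":
            v = any(a)
        else:  # "NOT"
            v = not a[0]
        values[net] = int(v)
        if fault and fault[0] == net:
            values[net] = fault[1]  # stuck-at value overrides the logic
    return values

# y = (a AND b) OR (NOT c)
netlist = [("n1", ("AND", ["a", "b"])),
           ("n2", ("NOT", ["c"])),
           ("y",  ("OR",  ["n1", "n2"]))]

pattern = {"a": 1, "b": 1, "c": 1}
good = evaluate(netlist, pattern)["y"]            # fault-free response: 1
bad = evaluate(netlist, pattern, ("n1", 0))["y"]  # n1 stuck-at-0: 0
print(good, bad)  # the pattern detects the fault because the outputs differ
```

An ATPG engine effectively searches for input patterns that both excite a fault (drive the faulty net to the opposite value) and propagate the difference to an observable point, as this pattern does for n1 stuck-at-0.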
Time-dependent defects have always been a key concern of any test methodology. The problem has been further magnified by the small feature sizes of designs at 130nm and below. This concern originally helped fuel the pessimism about test in 1997, because functional testers were being required to deliver at-speed patterns at ever-increasing frequencies, causing projected ATE costs to skyrocket. The ATPG answer to these time-dependent defects is transition fault testing, using an on-chip PLL to supply the at-speed clocks. High coverage levels for transition faults are now possible.
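The core of a transition fault test is a two-vector sequence: a launch vector establishes an initial value, a second vector launches the opposite value, and the response is captured one at-speed clock later. A minimal sketch of that idea, with a slow-to-rise defect modeled (purely for illustration) as a 0-to-1 transition that fails to complete within the cycle:

```python
# Illustrative model of a two-vector transition test on a single net.
# A slow-to-rise defect is modeled as the 0->1 transition not settling
# before the at-speed capture clock; this is a conceptual sketch, not a
# timing simulation.

def capture_after_one_cycle(v1, v2, slow_to_rise=False):
    """v1 initializes the net, v2 is launched, and the value present one
    at-speed cycle later is captured."""
    if slow_to_rise and v1 == 0 and v2 == 1:
        return 0  # transition too slow: the old value is still captured
    return v2

good = capture_after_one_cycle(0, 1)                       # captures 1
faulty = capture_after_one_cycle(0, 1, slow_to_rise=True)  # captures 0
print(good, faulty)  # differing captures mean the defect is detectable
```

At-speed launch and capture is why the clock source matters: an on-chip PLL can deliver the two closely spaced clock edges without requiring the ATE itself to run at functional speed.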
One issue with at-speed scan testing is over-testing: failing good devices because of multi-cycle and false paths. These paths are not intended to propagate signals within one clock cycle during normal functional operation (see Figure 2), yet they can be erroneously sensitized during scan pattern generation. To avoid this, ATPG tools can now import SDC (Synopsys Design Constraint) information for multi-cycle and false paths to enable effective pattern set generation. Patterns are not allowed to propagate signals through these illegal paths; however, other legal paths can still propagate signals through the same launch and capture cells.
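The effect of those timing exceptions on pattern generation can be sketched as a simple filter over candidate launch/capture pairs. The path names below are hypothetical, and a real flow would read the exceptions from an SDC file rather than hard-coding them:

```python
# Toy illustration of excluding false and multi-cycle paths from at-speed
# pattern generation. In practice these exceptions come from SDC
# (set_false_path / set_multicycle_path); names here are made up.

false_paths = {("FF_A", "FF_B")}        # declared false path
multicycle = {("FF_C", "FF_D"): 2}      # path allowed two clock cycles

def single_cycle_testable(launch, capture):
    """Only paths required to propagate within one functional clock cycle
    may be used for at-speed launch and capture."""
    if (launch, capture) in false_paths:
        return False
    if multicycle.get((launch, capture), 1) > 1:
        return False
    return True

candidates = [("FF_A", "FF_B"), ("FF_C", "FF_D"), ("FF_A", "FF_E")]
legal = [p for p in candidates if single_cycle_testable(*p)]
print(legal)  # only the pair with no timing exception survives
```

Note that FF_A still appears as a legal launch point: excluding the false path ("FF_A", "FF_B") does not forbid testing other paths through the same cells, which mirrors the behavior described above.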
Figure 2: P1 represents a hypothetical multi-cycle path-one that is illegal in functional operation.
The term design-for-test is interesting in that it implies work by the chip designer to achieve better test. This proved to be an initial problem for DFT. Chip designers are under incredible pressure to develop functional operation that meets high performance and power consumption requirements, and a wide variety of circuit configurations are being developed that challenge test requirements. Design teams have very little latitude to modify the design specifically to improve test, nor are they willing to sacrifice much additional area overhead specifically for test. The presence of X-states during test can cause problems for most logic BIST approaches, and modifying the functional design to get rid of X-states can eat away valuable time from a design schedule. Insertion of "test points" to improve coverage can also hinder the design flow. Fortunately, there are ATPG and test compression methodologies that essentially eliminate these issues. The result is a test development methodology that is rarely in the critical path of the design schedule.
Achieving a high-quality test pattern set can be challenging, even for experienced DFT engineers. There are too many variables within the design for most people to analyze, which leads to confusion about which settings to use. Fortunately, there are tools available that automate this process. One such tool is the "ATPG Expert" feature of Mentor Graphics' FastScan™ and TestKompress® tools. It automatically performs an analysis of the circuit to determine the most effective settings, much as an experienced DFT engineer would. Some of the automated operations include assessment of logic complexity, RAM test possibilities, contention issues, and clocking schemes. The tool will even notice inefficiencies during an ATPG run, momentarily halt it, adjust settings, and continue.
Extending the Life of ATE
Though ATPG advances have greatly improved the quality of test, there is a price to pay-much larger pattern sets. Complete incorporation of new fault models can make test pattern sizes explode, severely impacting test time. Further, test data volume can quickly exceed the memory of existing ATE systems.
One of the most significant developments in DFT technology, test pattern compression, was introduced nearly four years ago with the release of new EDA tools such as TestKompress. It ushered in an era where improvements of 100X could be achieved in both test time and test data volume (see Figure 3). Not only does test pattern compression provide the mechanism for implementing the larger test pattern sets required by new fault models, it also allows chip manufacturers to extend the life of their existing ATE systems. For these reasons, it is not surprising to see the very high adoption rate of this technology across the IC industry.
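The source of that 100X figure is easy to see with back-of-the-envelope arithmetic. An on-chip decompressor lets a few tester channels feed many short internal scan chains, so shift time (and scan data volume) shrinks roughly in proportion to the ratio of internal chains to tester channels. The numbers below are illustrative, not from any specific design:

```python
# Illustrative arithmetic for scan compression: shortening internal chains
# behind an on-chip decompressor cuts per-pattern shift cycles. All
# figures here are hypothetical round numbers.

scan_cells = 1_000_000    # total scan cells in a hypothetical design
tester_channels = 8       # scan channels the ATE can drive

# Without compression: each tester channel drives one long chain.
shift_cycles_plain = scan_cells // tester_channels   # 125,000 cycles

# With compression: the same 8 channels feed 800 short internal chains.
internal_chains = 800
shift_cycles_compressed = scan_cells // internal_chains  # 1,250 cycles

speedup = shift_cycles_plain / shift_cycles_compressed
print(speedup)  # 100.0 -> ~100X fewer shift cycles per pattern
```

Because the dominant cost of a scan pattern is the shift operation, this per-pattern reduction translates almost directly into the test time and data volume improvements cited above.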
There are some key reasons why test pattern compression has quickly become integrated into the mainstream IC development process. First, the implementation of test pattern compression requires little additional effort beyond that of ATPG. Therefore, there are neither severe shifts in methodology, nor a need for extensive retraining. Further, all of the additional advances in ATPG equally apply to test compression.
Another key aspect of test compression is that it avoids several pitfalls of some earlier logic BIST methodologies. Proper handling of scan responses enables continued high coverage even in the presence of X-states, which means there is no need to modify the functional design to rid it of them. The test process is flexible enough to fit into almost any process flow without significant changes. As an example, the TestKompress tool requires only minimal additional test logic, typically 25 gates per scan channel, which is inserted at the RTL level. It even enables complete insertion within logic blocks using only one scan channel for top-level routing.
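The X-tolerance point comes down to treating unknown expected bits as don't-cares when comparing responses, instead of folding them into a signature where one X corrupts everything. A minimal sketch of that comparison (the string encoding is an assumption for illustration):

```python
# Minimal sketch of X-tolerant response comparison: expected bits marked
# "X" are masked rather than compared, so X-states in the functional
# logic do not invalidate the test. The "X"-in-a-string encoding is an
# illustrative convention, not any tool's format.

def responses_match(expected, observed):
    """Compare a captured scan response against the expected response,
    skipping positions where the expected value is unknown."""
    return all(e == "X" or e == o for e, o in zip(expected, observed))

ok = responses_match("10X1", "1001")       # X position ignored -> pass
bad = responses_match("10X1", "1000")      # real mismatch at last bit -> fail
print(ok, bad)
```

Contrast this with a classic MISR-based logic BIST signature, where a single X shifted into the compactor makes the final signature unpredictable; masking at comparison time sidesteps that failure mode.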
Figure 3: Test pattern compression makes the design look smaller in terms of test time and data volume to the tester.
Keeping Pace with Design Complexity
Even beyond test pattern compression, ATPG tools have seen a number of performance improvements over the past few years. As an example, the ATPG engines of the most advanced DFT tools have seen significant gains in nearly every aspect of operation: design flattening, design rule checking (DRC), at-speed pattern runtime and pattern count, and memory requirements.
Another development is the concept of modular test compression. This can be especially valuable in the design of reusable cores. An important aspect of this technology is the ability to complete the insertion of the test compression circuitry into individual logic blocks without any top level logic.
An exciting new development is ATPG process distribution, where a virtually unlimited number of network resources can drastically cut runtime. This system works by a "master" farming out individual ATPG processes to any number of "slaves." It's important that during this operation the end results are consistent and repeatable regardless of how many slave processes are employed. The distribution mechanism must be robust enough to continue operation even if slave resources drop out during the overall ATPG run. This promising technology has shown speed-up improvements approaching 20X while generating consistent pattern sets regardless of the number of slaves deployed.
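The repeatability requirement described above is essentially an ordering guarantee: however the fault list is farmed out, results must merge back in a fixed order. A conceptual sketch, with a placeholder standing in for the real per-fault ATPG work:

```python
# Conceptual sketch of distributed ATPG: a master farms faults out to
# workers and merges results in the original fault-list order, so the
# final pattern set is identical regardless of worker count. The
# generate_pattern function is a stand-in, not a real ATPG engine.

from concurrent.futures import ThreadPoolExecutor

def generate_pattern(fault):
    # placeholder for targeting a single fault with a real ATPG engine
    return f"pattern_for_{fault}"

def distributed_atpg(faults, workers):
    """map() yields results in input order even when workers finish out
    of order, which is what makes the merged output repeatable."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(generate_pattern, faults))

faults = [f"f{i}" for i in range(8)]
# same merged pattern set whether one worker or four are used
print(distributed_atpg(faults, 1) == distributed_atpg(faults, 4))
```

A production system layers fault-tolerance on top of this (reassigning a fault slice if a slave drops out), but the ordering discipline shown here is what keeps the end result independent of how many slaves participated.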
There are still new challenges facing ATPG. Nanometer designs have seen a higher occurrence of bridging, open, and resistive defects. New fault models will need to be employed to uncover these defects. Again, this will put a further strain on ATPG tools, and EDA vendors will have to respond to the new demands.
Yield issues are now a primary focus of IC manufacturers, and test is a critical component of any yield monitoring strategy. Traditionally, effective tests were only needed to provide a "go/no-go" result. Now, failure analysis requires that failing devices be analyzed to determine the cause of failure. There are many difficult aspects to this problem. Often, failure logs from ATE systems are truncated, limiting the information that a scan diagnostics tool can use. Further, in most cases, even an exact logical location does not provide enough resolution to make an accurate assessment of the nature of the failure. A particularly interesting problem is accurately diagnosing a problem with the scan chain itself. New analyses are being developed that will provide accurate failure locations for these types of failure.
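The chain-diagnosis problem can be illustrated with a deliberately simplified model: shift a known "flush" pattern through a chain containing a stuck cell, and the position of the first mismatch in the unloaded data localizes the defect. Real chain defects behave in more varied ways; this sketch only conveys the localization idea:

```python
# Simplified illustration of scan chain diagnosis. A stuck cell is
# modeled as forcing its own position and everything downstream to the
# stuck value during unload; real defect behavior is more complex.

def unload_chain(loaded, stuck_pos=None, stuck_val=0):
    """Model unloading a scan chain; with a stuck cell, positions from
    stuck_pos onward read back the stuck value."""
    out = list(loaded)
    if stuck_pos is not None:
        for i in range(stuck_pos, len(out)):
            out[i] = stuck_val
    return out

expected = [0, 1, 0, 1, 0, 1, 0, 1]  # alternating flush pattern
observed = unload_chain(expected, stuck_pos=5, stuck_val=0)

# the first position where observed differs from expected localizes
# the defective cell in the chain
fail_at = next(i for i, (e, o) in enumerate(zip(expected, observed))
               if e != o)
print(fail_at)
```

An alternating pattern is chosen because it guarantees a transition at every cell boundary, so a stuck value cannot hide behind a run of identical bits.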
It is clear that scan test has met the challenge brought forth in the 1997 ITRS report. ATPG and test compression have now become common techniques that meet the challenges brought forth by nanometer designs. Ironically, the success of scan test has made it a tool being asked to do even more difficult tasks. Continual improvement of ATPG tools will become more difficult as design sizes continue to grow and failure mechanisms become more complex. Now it is even more critical for ATPG tool vendors to work closely with their customers to construct a product development roadmap that matches upcoming technology.
By Mark Chadwick, Product Marketing Manager, Mentor Graphics Corp.
Chadwick has worked in EDA and DFT for over 17 years, most recently at LTX-Credence before joining Mentor Graphics. He has a BSEE from the University of Wisconsin.
Matthias Beck (Infineon Technologies AG), et al., "Logic Design for On-Chip Test Clock Generation - Implementation Details and Impact on Delay Test Quality," Proc. DATE Conference, 2005.
Frank Poehl (Infineon Technologies AG), et al., "Industrial Experience with Adoption of EDT for Low-Cost Test without Concessions," Proc. International Test Conference, pp. 1211-1220, 2003.
All references are also accessible at mentor.com.
Go to the Mentor Graphics Corp. website to learn more.