Reducing ATE Test Time by Voltage and Frequency Scaling
Type of Degree: Dissertation
During wafer sort, the fabricated chips are subjected to tests that verify they meet the design specification. These tests are carried out on automatic test equipment (ATE), and test application time plays a critical role when verifying a large volume of dice in a given period. The time spent on the ATE directly affects the final cost of the device, so it is paramount to reduce test application time while still verifying the device reliably. Power dissipation must also be considered when reducing test time: it is often the trade-off that decides the test frequency and becomes a major limiting factor. One major circuit-design approach to test time reduction is to implement multiple scan chains, which cuts test time drastically compared with the same device implemented using a single scan chain. Other approaches manipulate the test hardware and test patterns, or test many dice in parallel. The objective of this thesis is to obtain an optimal solution to this trade-off and to assess the feasibility of such approaches, which can lead to new test methods in hardware and software. The problem is approached in two ways: (i) by scaling the supply voltage, and (ii) by scaling the test frequency. The two methods can also be combined to reduce test time further, and they can be used in tandem with existing methods to provide additional gains in test time reduction. The proposed methodologies are verified by simulation and through experiments. The experiments were carried out on the Advantest T2000 ATE located at Auburn University, Alabama; the simulations were performed using ISCAS'89 benchmark circuits, and the results show up to a 50% reduction in test time.
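As a rough illustration of the mechanisms the abstract describes, the sketch below models scan test time to first order and caps the shift frequency with a dynamic-power budget. This is not the thesis's method or data: the functions, constants, and the P = a·C·V²·f power model are illustrative assumptions only, showing qualitatively why splitting flip-flops across more scan chains and scaling voltage/frequency can shorten test application time.

```python
# Hypothetical first-order model (all names and numbers are illustrative,
# not taken from the thesis).

def scan_test_time(num_patterns, num_flops, num_chains, shift_freq_hz):
    """Time to apply all scan patterns.

    Each pattern needs as many shift cycles as the longest chain,
    plus one capture cycle.
    """
    chain_len = -(-num_flops // num_chains)  # ceiling division: longest chain dominates
    cycles_per_pattern = chain_len + 1
    return num_patterns * cycles_per_pattern / shift_freq_hz

def max_shift_freq(vdd, power_budget_w, switched_cap_f, activity=0.5):
    """Highest shift frequency a dynamic-power budget allows, assuming
    P = activity * C * Vdd^2 * f (a common first-order approximation)."""
    return power_budget_w / (activity * switched_cap_f * vdd ** 2)

# Baseline: a single scan chain shifted at a fixed 10 MHz clock.
t_single = scan_test_time(num_patterns=1000, num_flops=10_000,
                          num_chains=1, shift_freq_hz=10e6)

# Eight chains, with the shift clock raised to the power-limited maximum
# at a reduced supply voltage (capped at an assumed 40 MHz tester limit).
f_scaled = max_shift_freq(vdd=0.9, power_budget_w=0.5, switched_cap_f=2e-9)
t_multi = scan_test_time(num_patterns=1000, num_flops=10_000,
                         num_chains=8, shift_freq_hz=min(f_scaled, 40e6))

print(f"single chain: {t_single * 1e3:.1f} ms, "
      f"8 chains + scaling: {t_multi * 1e3:.2f} ms")
```

In this toy setup the multi-chain, frequency-scaled configuration finishes well over an order of magnitude faster than the single-chain baseline, mirroring the direction (though not the magnitude) of the improvements the abstract reports.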