Delay Test Scan Flip-flop (DTSFF) Design and Its Applications for Scan Based Delay Testing
Type of Degree: Dissertation
Electrical and Computer Engineering
Scan based delay testing is currently implemented mostly with launch-on-capture (LOC) delay tests. Launch-on-shift (LOS) tests are generally more effective, achieving higher fault coverage with significantly fewer test vectors, but they require a fast scan enable signal, which most designs do not support. A low-cost solution is presented for implementing LOS tests by adding a small amount of logic (six transistors) to each flip-flop to align the slow scan enable signal to the clock edge. This new scan cell design, called the Delay Test Scan Flip-flop (DTSFF), supports both full LOS and LOC testing, achieving an average TDF (Transition Delay Fault) coverage of 95.78% in this combined mode for the ISCAS89 benchmarks. Mixed LOC/LOS tests can further increase coverage for these benchmarks.

In addition, a partial DTSFF scheme, which replaces only 20-40% of carefully chosen scan flip-flops in the scan chain with the new DTSFF, can achieve most of the coverage benefit of a full DTSFF design while minimizing area overhead. This partial scheme for modified scan flip-flops can also be applied to enhanced scan designs, which support high-coverage TDF testing but at significant overhead. A flip-flop selection strategy presented for partial enhanced scan designs shows a very favorable trade-off between coverage and overhead: experimental results using commercial ATPG tools show that 60-90% of the TDF coverage benefit of enhanced scan can be achieved using only 10-30% enhanced flip-flops.

The architectural restrictions of scan further limit the effectiveness of traditional scan based delay tests. It has recently been shown that additional testing for delays on short paths using fast clocks can significantly lower DPM (Defects Per Million). However, accurately obtaining the timing information needed for such tests from simulation is extremely difficult.
The simulations must accurately account not only for the effects of process parameter variations, but also for power supply noise and crosstalk caused by the excessive switching activity of scan tests. We present a methodology for learning signal timing information on silicon to "calibrate" such tests, which can be much more accurate and cost-effective. This approach requires that the outputs of the applied tests be hazard-free, so that a glitch at an output does not cause incorrect timing to be learned. Simulation results presented here indicate that such output hazard-free tests can be obtained with an average coverage only about 10% below the transition delay fault coverage for both launch-on-shift and launch-on-capture modes.
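The scan-enable alignment idea behind the DTSFF can be illustrated with a minimal behavioral sketch: the slow, asynchronously arriving global scan enable is re-sampled inside the cell on each clock edge, so the flip-flop's effective scan enable always switches synchronously. The class and signal names below are illustrative assumptions; the actual DTSFF is a six-transistor circuit addition, not this Python model.

```python
# Hypothetical behavioral model of clock-edge scan-enable alignment.
# The global scan enable (se_global) may arrive late and slowly; the
# cell uses the copy sampled on the PREVIOUS clock edge to select its
# input, so the effective scan enable is always edge-aligned.

class DelayTestScanFF:
    def __init__(self):
        self.q = 0            # flip-flop output
        self.se_aligned = 0   # scan enable, re-timed to the clock edge

    def clock_edge(self, d, si, se_global):
        """One rising clock edge: select D or SI using the previously
        aligned scan enable, then re-sample the (possibly late) global
        scan enable for the next cycle."""
        self.q = si if self.se_aligned else d
        self.se_aligned = se_global
        return self.q
```

In an LOS test, the last shift edge launches the transition and the very next edge captures; because the scan enable is re-sampled at each edge, the global signal is free to change slowly between edges without corrupting either the launch or the capture.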
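The output hazard-free requirement can likewise be made concrete with a small brute-force check: for a combinational function and a vector pair, verify that no settling order of the changing input bits can glitch the output. This exhaustive enumeration is purely an illustration of the property; the dissertation's hazard-free tests are generated with commercial ATPG tools, not this way.

```python
# Illustrative hazard check for a small combinational function f,
# applied to test vectors v1 -> v2 (tuples of 0/1). The output is
# hazard-free if, for every order in which the changing bits settle,
# it changes at most once (and not at all when f(v1) == f(v2)).
from itertools import permutations

def output_hazard_free(f, v1, v2):
    changing = [i for i in range(len(v1)) if v1[i] != v2[i]]
    allowed = 0 if f(tuple(v1)) == f(tuple(v2)) else 1
    for order in permutations(changing):
        v, out, changes = list(v1), f(tuple(v1)), 0
        for i in order:
            v[i] = v2[i]
            new = f(tuple(v))
            if new != out:
                changes, out = changes + 1, new
        if changes > allowed:
            return False   # some settling order produces a glitch
    return True
```

For example, a two-input OR whose inputs swap between (0,1) and (1,0) has the classic static-1 hazard and fails the check, while an AND gate with both inputs rising passes.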