BUFFER DESIGN:
This post examines power dissipation and propagation delay in CMOS buffers driving large capacitive loads, and outlines a CMOS buffer design that improves power dissipation at an optimized propagation delay. The reduction in power dissipation is achieved by minimizing short-circuit power and subthreshold leakage power, which become dominant when the supply voltage (VDD) and threshold voltage (Vth) are scaled for low-voltage applications in the deep sub-micron (DSM) region.
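To see why subthreshold leakage dominates as Vth is scaled down, the following is a minimal first-order sketch in Python. The parameter values (I0, n, the 0.9 V supply) are hypothetical placeholders for illustration, not figures from any particular process; only the exponential dependence of subthreshold current on Vth is the point being shown.

```python
import math

V_T = 0.026  # thermal voltage kT/q at room temperature, in volts

def dynamic_power(C_load, V_dd, f_clk, alpha=1.0):
    """Switching power: alpha * C * VDD^2 * f (first-order)."""
    return alpha * C_load * V_dd**2 * f_clk

def subthreshold_leakage_power(V_dd, V_th, I0=100e-9, n=1.5):
    """Subthreshold leakage grows exponentially as Vth is lowered:
    I_sub ~ I0 * exp(-Vth / (n * V_T)). I0 and n are illustrative values."""
    I_sub = I0 * math.exp(-V_th / (n * V_T))
    return V_dd * I_sub

# Example: scaling Vth down at a fixed (hypothetical) VDD of 0.9 V
for V_th in (0.4, 0.3, 0.2):
    p_leak = subthreshold_leakage_power(0.9, V_th)
    print(f"Vth = {V_th:.1f} V -> leakage power ~ {p_leak*1e6:.2f} uW")
```

Each 100 mV reduction in Vth multiplies the leakage by roughly an order of magnitude in this simple model, which is why Vth scaling for speed comes at a steep static-power cost.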
Large capacitive loads are common in CMOS integrated circuits, and tapered buffers are used to drive them at high speed while keeping the load presented to earlier stages of the signal path small. These buffers appear in the memory access path as word-line drivers, in I/O circuits that drive large off-chip capacitances, and in clock trees where they help meet skew constraints. Deploying them in high-performance systems, however, imposes a power overhead on every instance, regardless of its actual performance.

High-performance VLSI design is attracting much attention because of the growing need for miniaturization, so optimizing the trade-off between power and performance in nanometer-scale integrated circuits is a present necessity. This demands a decrease in both the supply voltage VDD (to keep power dissipation low) and the threshold voltage Vth (to sustain propagation-delay reduction), but lowering Vth increases not only leakage power but also short-circuit power, so in nanoscale technologies the total power dissipation of clock and signal buffers becomes a significant design concern.
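As a sketch of the speed side of this trade-off, the snippet below sizes a classic fixed-taper buffer chain: each inverter is a constant factor larger than the previous one, and (neglecting self-loading) a taper of e minimizes total delay. The input and load capacitance values in the example are assumptions chosen only to illustrate the sizing arithmetic, not parameters from the design discussed in this post.

```python
import math

def tapered_buffer_sizing(C_in, C_load, taper=math.e):
    """Fixed-taper buffer chain: each stage is 'taper' times larger than
    the previous one. Ignoring intrinsic self-loading, taper = e gives
    minimum delay; larger tapers mean fewer stages and less power at some
    cost in speed."""
    n_stages = max(1, round(math.log(C_load / C_in) / math.log(taper)))
    # Effective per-stage taper after rounding to an integer stage count
    f_eff = (C_load / C_in) ** (1.0 / n_stages)
    return n_stages, f_eff

# Example: drive a 10 pF off-chip load from a 10 fF first-stage inverter
n, f = tapered_buffer_sizing(10e-15, 10e-12)
print(f"stages = {n}, per-stage taper ~ {f:.2f}")
```

Choosing a taper somewhat larger than e is a common way to trade a small delay penalty for fewer stages and lower switching and short-circuit power, which is exactly the power/performance balance the buffer design above targets.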