Digitally Assisted Performance Tuning of Analog/RF Circuits with an On-Chip FFT Engine

Marvin Onabajo (Northeastern University, USA), Yong-Bin Kim (Northeastern University, USA), Yongsuk Choi (Northeastern University, USA), Hari Chauhan (Northeastern University, USA), Chun-hsiang Chang (Northeastern University, USA) and In-Seok Jung (Northeastern University, USA)
DOI: 10.4018/978-1-4666-6627-6.ch010

Abstract

A serious drawback associated with system-on-a-chip integration and CMOS technology scaling trends is the increasing susceptibility to manufacturing process variations and aging effects. Consequently, it is critical to improve on-chip measurement and self-calibration capabilities as well as the testability of single-chip systems. This chapter describes a robust design methodology for enhanced reliability of analog front-end circuits in mixed-signal chips. The system under development comprises small blocks placed close to the high-frequency analog circuits under test that down-convert signals to low frequencies so that they can be routed to an on-chip analog-to-digital converter architecture for built-in testing applications. An efficient Fast Fourier Transform (FFT) engine calculates the frequency spectrum of the signal with significantly less chip area than existing FFT engines. The self-contained system provides the measurement results in digital form to support digital calibration approaches, particularly those that involve digitally assisted analog blocks.

Introduction

Overview

Mixed-signal information processing and communication chips continue to be essential in our lives to the point that many vital situations depend on their reliable operation. Consequently, there is an increasing incentive to incorporate self-test and correction features to improve the reliability of integrated circuits. This is especially true in medical and military applications where life-saving information is transmitted and received. Although new technologies allow the design of smaller chips with more functionality, their manufacturing process variability and post-production aging effects pose growing design and test challenges. Therefore, the development of adaptive single-chip systems is essential for high reliability in modern nanometer complementary metal-oxide-semiconductor (CMOS) technology.

Portable computational devices are becoming more complex and are employed in a more diverse range of systems, many of which contain analog circuits to process signals from sensors or antennas. Thus, the trend to integrate analog circuits, mixed-signal interfaces, and digital computational resources on single chips will continue. A critical drawback associated with system-on-a-chip (SoC) integration and CMOS technology scaling trends is the increasing susceptibility to manufacturing process variations and aging effects. One aim of the research discussed in this chapter is the development of a robust design methodology for enhanced reliability of analog front-end circuits in mixed-signal chips. Parameter variability is a fundamental contributor to production yield and reliability problems, particularly for mixed-signal SoCs. As a consequence, designing for optimum performance alone is insufficient. It is critical to improve on-chip measurement and self-calibration capabilities as well as the testability of single-chip systems during high-volume production to increase yields and to lower the cost of manufacturing. Both yield and cost improvement have been identified as needs in the International Technology Roadmap for Semiconductors (International Roadmap Committee, 2011) and provide the incentive for built-in calibration and alternative test strategies.

In digital-intensive systems, design techniques have been introduced that adapt the body bias, supply voltage, and clock frequency. The benefits of these techniques range from yield and speed improvements to more efficient on-chip power and temperature management. On the other hand, built-in testing and calibration approaches for analog systems typically involve adjustments of bias conditions or programmable circuit-level elements within specific analog blocks to increase the production yield, to tune for enhanced performance, and to extend the lifetime of chips with higher reliability. In the analog case, the information for calibration can also be extracted with local on-chip measurement circuitry or, in some cases, from system-level data analysis routines in the digital signal processor. However, the measurement of analog/RF circuit-level variations frequently involves electrical peak or power detectors connected to the signal path, which share few commonalities with process-voltage-temperature (PVT) variation monitors for digital circuits. Furthermore, on-chip calibration schemes for system-level optimizations necessitate the measurement of circuit-level performance parameters rather than just electrical process parameters. For analog blocks, the additional challenge is that the signal gain and linearity characteristics normally have to be monitored directly in the signal path of interest, requiring more specialized on-chip measurement circuitry. Since system-level digital variation monitoring and calibration control is an effective way to optimize SoC performance and reliability, there is an increasing need for more efficient on-chip measurement techniques to evaluate the analog components of mixed-signal SoCs.
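To illustrate the kind of digital post-processing that an on-chip FFT engine makes possible, the sketch below estimates the gain and third-harmonic distortion (HD3) of an analog block from the spectrum of a down-converted, digitized test tone. This is only a software illustration in plain NumPy, not the chapter's hardware FFT engine; the cubic nonlinearity model and all signal parameters (sample rate, tone frequency, gain) are invented for the example.

```python
import numpy as np

# Hypothetical measurement scenario: a bin-aligned test tone is applied
# to the circuit under test, down-converted, digitized, and an FFT of
# the samples is computed to extract performance parameters digitally.
fs = 1024.0            # sample rate after down-conversion (arbitrary units)
n = 1024               # FFT length
f0 = 16.0              # test-tone frequency: exactly 16 cycles in n samples

t = np.arange(n) / fs
x = 0.5 * np.sin(2 * np.pi * f0 * t)        # input test tone (amplitude 0.5)

# Behavioral model of the circuit under test: linear gain of 2
# plus a weak cubic nonlinearity that generates a third harmonic.
y = 2.0 * x + 0.05 * x**3

# Single-sided amplitude spectrum (bin-aligned tone, no windowing needed)
spectrum = np.abs(np.fft.rfft(y)) / (n / 2)
fund = spectrum[16]                          # fundamental at bin f0
hd3 = spectrum[48]                           # third harmonic at bin 3*f0

gain = fund / 0.5                            # measured gain of the block
hd3_db = 20 * np.log10(hd3 / fund)           # HD3 relative to fundamental, dB
print(gain, hd3_db)
```

For this model the measured gain comes out slightly above 2 (the cubic term adds a small component at the fundamental) and the third harmonic sits roughly 56 dB below it, which is the type of digital result that could feed a calibration loop adjusting bias conditions or programmable circuit elements.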
