How Does CDMA Technology Work?


CDMA stands for Code-Division Multiple Access; it refers to any of several protocols used in second-generation (2G) and third-generation (3G) wireless communications. It is a form of multiplexing that allows numerous signals to occupy a single transmission channel, making efficient use of the available bandwidth. The technology is used in ultra-high-frequency (UHF) cellular telephone systems in the 800-MHz and 1.9-GHz bands.

CDMA combines analog-to-digital conversion (ADC) with spread-spectrum technology. Audio input is first digitized into binary elements. The digitized signal is then spread across a much wider bandwidth by combining it with a pseudo-random code, so it can be recovered only by a receiver programmed with the same code, which despreads the signal in step with the transmitter. There are trillions of possible codes, which enhances privacy and makes cloning difficult.
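The code-based spreading and despreading described above can be sketched in a few lines of Python. This is an illustrative toy, not the real IS-95 air interface: it assumes just two users, short 4-chip orthogonal Walsh-style codes, and ideal noiseless reception.

```python
# Toy direct-sequence spreading sketch: two users share one channel,
# each assigned an orthogonal chip sequence (their dot product is 0).
WALSH_A = [1, 1, 1, 1]
WALSH_B = [1, -1, 1, -1]

def spread(bits, code):
    """Multiply each data bit (+/-1) by the user's chip sequence."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate the received signal with a user's code, one bit at a time."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else -1)
    return bits

user_a = [1, -1, 1]    # user A's data bits (+1/-1)
user_b = [-1, -1, 1]   # user B's data bits

# Both spread signals occupy the channel at the same time: they simply add.
channel = [a + b for a, b in zip(spread(user_a, WALSH_A),
                                 spread(user_b, WALSH_B))]

print(despread(channel, WALSH_A))  # [1, -1, 1]  -> user A's bits recovered
print(despread(channel, WALSH_B))  # [-1, -1, 1] -> user B's bits recovered
```

Because the codes are orthogonal, correlating the summed channel with one user's code cancels the other user's contribution, which is exactly why many signals can share the same band without interfering.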

The CDMA channel is nominally 1.23 MHz wide. CDMA networks use a scheme called soft handoff, which minimizes signal breakup as a handset passes from one cell to another. The combination of digital and spread-spectrum modes supports several times as many signals per unit bandwidth as analog modes. CDMA networks also interoperate with other cellular technologies, which allows for nationwide roaming. The original CDMA standard, also known as cdmaOne, offers a transmission speed of only up to 14.4 Kbps in its single-channel form and up to 115 Kbps in an eight-channel form. CDMA2000 and Wideband CDMA deliver data many times faster.
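The soft-handoff idea mentioned above can be sketched as simple set logic: the handset keeps an "active set" of cells it talks to, adding a cell when its pilot signal is strong enough and dropping it only after it fades well below that level. The threshold names below echo real IS-95 parameters (T_ADD, T_DROP), but the values and logic are illustrative assumptions, not the actual standard.

```python
# Toy soft-handoff sketch with hysteresis between add and drop thresholds.
T_ADD = -14.0   # pilot strength (dB) above which a cell joins the active set
T_DROP = -16.0  # pilot strength (dB) below which a cell leaves the active set

def update_active_set(active, pilot_db):
    """Return the new active set given measured pilot strengths per cell."""
    new_active = set(active)
    for cell, strength in pilot_db.items():
        if strength >= T_ADD:
            new_active.add(cell)       # strong enough: add to active set
        elif strength <= T_DROP:
            new_active.discard(cell)   # faded out: drop from active set
    return new_active

# A handset moving from cell A toward cell B talks to both for a while,
# so the call never has a hard break -- the "soft" in soft handoff.
active = {"A"}
active = update_active_set(active, {"A": -12.0, "B": -13.5})  # {'A', 'B'}
active = update_active_set(active, {"A": -17.0, "B": -11.0})  # {'B'}
print(sorted(active))  # ['B']
```

The gap between T_ADD and T_DROP is deliberate: a cell hovering between the two thresholds stays in the active set, which prevents rapid add/drop flapping at cell boundaries.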

The CDMA2000 family of standards includes 1xRTT, EV-DO Rev. 0, EV-DO Rev. A and EV-DO Rev. B; a planned successor, EV-DO Rev. C, was renamed Ultra Mobile Broadband (UMB). People often confuse CDMA2000 (a family of standards supported by carriers such as Verizon and Sprint) with CDMA itself (the physical-layer multiplexing scheme).

Via: [Tech Target]
