A new type of computer chip developed by researchers at MIT, the University of California, Berkeley, and the University of Colorado, Boulder, could lead to faster data transmission and potentially more energy-efficient data centers. The technology is based on using light rather than streams of electrons to transmit data.
The researchers said the chip can transmit 50 to 100 times as much data per second as a traditional electronic chip.
“Data transport across short electrical wires is limited by both bandwidth and power density, which creates a performance bottleneck for semiconductor microchips in modern computer systems — from mobile phones to large-scale data [centers],” the researchers wrote in this week’s issue of the scientific journal Nature.
But attempts to build a light-based chip have been stymied by the difficulty of integrating electronic and photonic components on the same chip. So far, manufacturers have only been able to produce chips with a limited number of optical devices on mostly electronic platforms.
Integrating Photonics with Electronics
Traditionally, researchers have attempted to develop a separate custom process for manufacturing a chip’s photonic devices. However, this approach usually complicates the process of integrating high-end electronic devices on the same chip at scale. The manufacturing process the MIT team developed, on the other hand, uses the same standard microfoundry techniques used to manufacture normal electronic components.
The new approach has allowed them to build an electronic-photonic system on a single chip consisting of more than 70 million transistors and 850 photonic components that work together to provide logic, memory, and interconnect functions.
Using the new process to produce a prototype, the researchers said they achieved a bandwidth density of 300 gigabits per second per square millimeter.
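To put that bandwidth-density figure in perspective, a rough back-of-envelope calculation is sketched below. The 300 Gb/s per square millimeter comes from the article; the die size is an assumed example, not a figure reported by the researchers.

```python
# Back-of-envelope: aggregate throughput implied by the reported
# bandwidth density of 300 Gb/s per square millimeter.
density_gbps_per_mm2 = 300

# Hypothetical 10 mm x 10 mm die -- an assumption for illustration,
# not a dimension from the paper.
chip_area_mm2 = 10 * 10

total_gbps = density_gbps_per_mm2 * chip_area_mm2
print(f"{total_gbps} Gb/s (= {total_gbps / 1000} Tb/s) over the assumed area")
```

On those assumptions, a single chip-scale link budget reaches tens of terabits per second, which is the scale at which off-chip electrical wiring becomes the bottleneck the researchers describe.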
Ready for Market in 2 Years
Some 20 to 30 percent of the electricity used in enterprise data centers comes from transferring data between processors, memory, and networking cards, according to Chen Sun, a researcher at UC Berkeley and a co-author on the Nature paper. By implementing these sorts of electronic-photonic chips, organizations could realize enormous electricity savings in their data centers, he noted.
Data centers in the United States will consume 140 billion kilowatt-hours of electricity a year by 2020, costing $13 billion and emitting 100 million metric tons of carbon, according to an analysis by the Natural Resources Defense Council.
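Combining the two sets of figures above gives a rough sense of the savings Sun describes. The sketch below uses the article's numbers (20 to 30 percent of data-center electricity spent on data transport, $13 billion in projected annual cost); the 50 percent reduction factor is purely an assumption for illustration, not a claim from the paper.

```python
# Back-of-envelope estimate of potential annual savings if photonic
# interconnects cut data-transport energy in data centers.
annual_cost_usd = 13e9          # projected annual US data-center electricity cost (article)
transport_shares = (0.20, 0.30)  # share of electricity spent moving data (article)
assumed_reduction = 0.5          # fraction of transport energy saved -- an assumption

for share in transport_shares:
    saved_usd = annual_cost_usd * share * assumed_reduction
    print(f"{share:.0%} transport share -> ${saved_usd / 1e9:.2f}B saved per year")
```

Even under these rough assumptions, the implied savings run to well over a billion dollars a year, which is why the researchers frame interconnect power as a first-order concern for data-center operators.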
Although the researchers’ paper only describes a prototype, commercializing the process might not be too far off. Sun has already founded a company to develop the technology for commercial applications targeted at large data centers. Sun estimated that products based on the new technology could begin appearing in markets in as little as two years.
“This demonstration could represent the beginning of an era of chip-scale electronic-photonic systems with the potential to transform computing system architectures, enabling more powerful computers, from network infrastructure to data [centers] and supercomputers,” the researchers said.