Optical Computing




With the growth of computing technology, the need for high-performance computers (HPC) has increased significantly. Optics has been used in computing for a number of years, but the main emphasis has been, and continues to be, on linking portions of computers, on communications, or on devices that have some optical application or component (optical pattern recognition, etc.).
Optical computing was a hot research area in the 1980s, but the work tapered off due to material limitations that prevented optical chips from becoming small enough and cheap enough to be more than laboratory curiosities. Now, optical computers are back, with advances in self-assembled conducting organic polymers that promise super-tiny all-optical chips.
Optical computing technology is, in general, developing in two directions. One approach is to build computers that have the same architecture as present-day computers but use optics, that is, electro-optical hybrids. The other approach is to build a completely new kind of computer that can perform all functional operations in the optical domain. In recent years, a number of devices that can ultimately lead us to real optical computers have already been manufactured. These include optical logic gates, optical switches, optical interconnections and optical memory.
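One simple way to picture an all-optical logic gate is intensity thresholding: two beams are combined in a nonlinear medium that transmits light only above a certain intensity. The sketch below is an illustrative toy model of such an AND gate, not a simulation of any real device; the threshold value and the function name are assumptions for illustration only.

```python
# Toy model (assumed, not a real device simulation): an all-optical AND
# gate realized by combining two beams in a nonlinear medium that only
# transmits light above a threshold intensity.

THRESHOLD = 1.5  # hypothetical threshold, in units where one "on" beam = 1.0


def optical_and(beam_a: float, beam_b: float) -> int:
    """Return 1 if the combined beam intensity clears the threshold, else 0."""
    combined = beam_a + beam_b  # beam intensities add in the medium
    return 1 if combined > THRESHOLD else 0


# Truth table: only two "on" beams (1.0 + 1.0 = 2.0) clear the threshold.
for a, b in [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]:
    print(int(a), int(b), "->", optical_and(a, b))
```

With a lower threshold (e.g. 0.5), the same arrangement behaves as an OR gate, which is why threshold tuning is a recurring theme in optical-gate designs.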
Current trends in optical computing emphasize communications; for example, free-space optical interconnects are seen as a potential solution to the bottlenecks experienced in electronic architectures. Optical technology is one of the most promising approaches, and may eventually lead to new computing applications as a consequence of faster processing speed, better connectivity and higher bandwidth.

NEED FOR OPTICAL COMPUTING

The pressing need for optical technology stems from the fact that today's computers are limited by the time response of electronic circuits. A solid transmission medium limits both the speed and volume of signals, and it builds up heat that damages components.
One of the theoretical limits on how fast a computer can function is given by Einstein's principle that a signal cannot propagate faster than the speed of light. So to make computers faster, their components must be made smaller, thereby decreasing the distances between them. This has resulted in the development of very-large-scale integration (VLSI) technology, with smaller device dimensions and greater complexity. The smallest VLSI dimensions nowadays are about 0.08 µm. Despite the incredible progress in the development and refinement of the basic technologies over the past decade, there is growing concern that they may not be capable of solving the computing problems of even the current millennium. The speed of computers has been increased by miniaturizing electronic components to a micron-size scale, but they are limited not only by the speed of electrons in matter, but also by the increasing density of interconnections necessary to link the electronic gates on microchips.
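To make the speed-of-light limit concrete, a back-of-the-envelope calculation helps: even at light speed, a signal needs a finite time to cross a chip, and that delay caps the useful clock rate. The 3 cm path length below is an illustrative assumption, not a figure from the text.

```python
# Back-of-the-envelope: time for a signal to cross a chip at the speed
# of light, and the clock rate that delay implies.

C = 3.0e8          # speed of light in vacuum, m/s
CHIP_SIZE = 0.03   # assumed signal path length: 3 cm, in metres

delay = CHIP_SIZE / C       # one-way propagation time, seconds
max_clock = 1.0 / delay     # clock ceiling if one crossing = one cycle

print(f"propagation delay: {delay * 1e12:.0f} ps")         # 100 ps
print(f"implied clock ceiling: {max_clock / 1e9:.0f} GHz")  # 10 GHz
```

Shrinking the path length is the only way to raise this ceiling, which is exactly the miniaturization pressure the paragraph above describes.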
The optical computer comes as a solution to this miniaturization problem. Optical data processing can perform several operations in parallel, much faster and more easily than electronics can. This parallelism yields staggering computational power. For example, a calculation that would take a conventional electronic computer more than 11 years to complete could be performed by an optical computer in a single hour. In any case, in an optical computer electrons are replaced by photons, the quanta of electromagnetic radiation that make up light.
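The "11 years versus one hour" claim can be restated as a single speedup factor, which makes the scale of the parallelism easier to grasp. A quick arithmetic sketch:

```python
# The "11 years vs. one hour" comparison expressed as a speedup factor.

HOURS_PER_YEAR = 365.25 * 24              # average hours in a year
electronic_hours = 11 * HOURS_PER_YEAR    # electronic computer's runtime
optical_hours = 1                         # optical computer's runtime

speedup = electronic_hours / optical_hours
print(f"speedup: about {speedup:,.0f}x")  # roughly 96,000x
```

In other words, the quoted example corresponds to a speedup on the order of 10^5, the kind of gain attributed to massive optical parallelism.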
