While most computer technology today relies on digital processing, the information around us is often collected in analog form through sensors - such as images from cameras, temperature readings, and sound recordings - and must be converted to digital form before it can be processed. Now imagine a self-driving car that needs to gather data from its surroundings and make split-second decisions: that data must be converted both rapidly and precisely.
But what if there were analog chips specifically designed to offer the precision of digital computing while retaining the energy efficiency and speed of analog computing?
A memristor is a small but powerful circuit component that can efficiently store and process data. In a previous study, researchers fine-tuned a memristor to achieve unprecedented levels of precision. Now, the lab of J. Joshua Yang, professor of Electrical and Computer Engineering at the USC Viterbi School of Engineering, has developed a new circuit and architecture that achieve even greater precision with the same memristors. This could expand the potential uses of the technology beyond its traditional role in low-precision tasks such as neural networks.
In addition, Yang explains that the innovation also applies to other types of memory technology, including the magnetic and phase-change memories found in applications such as magnetic hard disk drives and rewritable compact discs.
In general, it is difficult to program an analog device to a specific target value both quickly and precisely. Yang's lab, however, has developed a circuit architecture and corresponding algorithms that do exactly that, making analog computing with analog devices far more attractive for a wide range of applications.
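The article does not describe the algorithms themselves, but closed-loop "write-verify" programming is a common way to steer an analog memory cell toward a target value: apply a pulse, measure the device, and repeat until the measured value is within tolerance. The sketch below is purely illustrative - the device model, step size, and noise level are assumptions standing in for real memristor behavior, not the method from Yang's lab.

```python
import random

def program_device(target, tolerance=0.01, max_pulses=100, seed=0):
    """Toy write-verify loop: pulse a simulated analog cell until its
    measured value falls within `tolerance` of `target`.
    Returns (final_value, pulses_used)."""
    rng = random.Random(seed)
    value = 0.5  # arbitrary starting state (normalized units)
    for pulse in range(1, max_pulses + 1):
        error = target - value  # "verify" step: read the device
        if abs(error) <= tolerance:
            return value, pulse - 1  # converged before this pulse
        # "Write" step: each pulse nudges the state toward the target;
        # Gaussian noise stands in for analog device variability.
        value += 0.5 * error + rng.gauss(0, 0.002)
        value = min(max(value, 0.0), 1.0)  # clamp to physical range
    return value, max_pulses

final, pulses = program_device(target=0.8)
print(final, pulses)
```

Because each iteration cancels roughly half of the remaining error, the loop converges in a handful of pulses despite the injected noise; in real hardware, the pulse count and achievable tolerance depend on device variability and read accuracy.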
According to Yang, this innovation offers "greater efficiency and speed with the same level of accuracy as digital systems." Such progress is crucial because it can be applied to training neural networks, which are essential to artificial intelligence and machine learning. Until now, these tasks have been achievable only with expensive digital systems. The improved precision will also pave the way for new applications in fields such as scientific computing, including weather forecasting.