A Landmark Move to Mitigate AI Systems’ Ballooning Energy Needs

There is plenty to appreciate about human know-how, but little deserves that appreciation more than our tendency to improve at a consistent clip. That tendency has delivered some huge milestones, with technology standing out as a major member of the group. The reason we hold technology in such high regard is, by and large, its skill-set, which has guided us toward a reality nobody could have imagined otherwise. Look beyond the surface for a moment, though, and it becomes clear that the run was equally inspired by how we applied those skills in real-world environments. That latter component gave the creation a spectrum-wide presence and, as a result, kicked off a full-blown tech revolution. The revolution went on to scale up the human experience through some genuinely unique avenues, and yet technology continues to bring forth the right goods. The trend has only grown more evident in recent times, and if one new discovery achieves its intended impact, it will put that trend on an even higher pedestal moving forward.

Research teams at Northwestern University, Boston College, and the Massachusetts Institute of Technology have developed a new synaptic transistor capable of higher-level thinking in an energy-efficient manner. To understand the significance of the development, we must acknowledge that, while recent advances in artificial intelligence have made it possible to build computers that operate more like the human brain, the limited prowess of current data-processing systems has massively slowed progress with the technology. You see, these systems are built from separate processing and storage units, and as you can guess, such a fragmented setup demands energy in spades, especially when asked to conduct data-intensive tasks. Place that limitation alongside the fact that smart devices today are continuously collecting enormous quantities of data, and the research community faces an urgent need for methods that can process the information without consuming ever more power. Mind you, past studies have yielded concrete results to some extent, but despite all the efforts, the most developed technology for combining processing and memory has been the memory resistor, or “memristor,” system. So, what is the problem with a memristor? Well, even though it can conduct data-intensive tasks, memristor systems still cannot shake off one downside: energy-costly switching.

“For several decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture. Significant progress has been made by simply packing more and more transistors into integrated circuits. You cannot deny the success of that strategy, but it comes at the cost of high power consumption, especially in the current era of big data where digital computing is on track to overwhelm the grid. We have to rethink computing hardware, especially for AI and machine-learning tasks,” said Mark C. Hersam, a researcher at Northwestern University, who co-led the research.

As for how the researchers landed a breakthrough here, the answer lies in how they leveraged new advances in the physics of moiré patterns, a type of geometrical design that arises when two patterns are layered on top of one another. Layering matters because, when two-dimensional materials are stacked, new properties emerge that do not exist in either layer alone, and when those layers are twisted to form a moiré pattern, the result is unprecedented tunability of electronic properties. In this case, the teams combined two different atomically thin materials: bilayer graphene and hexagonal boron nitride. Upon interacting, the materials formed a moiré pattern. The researchers then rotated one layer relative to the other, a move that produced different electronic properties in each graphene layer, even though the layers are separated by only atomic-scale distances. By choosing the right twist, the researchers were able to harness moiré physics for neuromorphic functionality at room temperature.

“The brain has a fundamentally different architecture than a digital computer,” said Hersam. “In a digital computer, data move back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when attempting to perform multiple tasks at the same time. On the other hand, in the brain, memory and information processing are co-located and fully integrated, resulting in orders of magnitude higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain.”

The next step involved training the transistor to recognize similar, but not identical, patterns. The research teams showed the device one pattern: 000 (three zeros in a row). Then, they asked the AI to identify similar patterns, such as 111 or 101. They followed this up with experiments meant to test the device’s learning. Going by the available details, the new transistor effectively recognized similar patterns, providing a strong testament to its associative memory. Making the discovery all the more important, the device displayed similar success rates even when the researchers complicated things a little by giving it incomplete patterns.

“If we trained it to detect 000 and then gave it 111 and 101, it knows 111 is more similar to 000 than 101. 000 and 111 are not exactly the same, but both are three digits in a row. Recognizing that similarity is a higher-level form of cognition known as associative learning,” said Hersam. “Current AI can be easy to confuse, which can cause major problems in certain contexts. Imagine if you are using a self-driving vehicle, and the weather conditions deteriorate. The vehicle might not be able to interpret the more complicated sensor data as well as a human driver could. But even when we gave our transistor imperfect input, it could still identify the correct response.”
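The reasoning in Hersam’s example is worth pausing on: 111 differs from 000 in every digit position, yet it is judged *more* similar than 101, because both 000 and 111 share the higher-level feature of being three identical digits in a row. The device itself is analog hardware, but a minimal software sketch can illustrate why a feature-based similarity ranks the patterns this way. The `uniformity` and `similarity` functions below are hypothetical illustrations, not the researchers’ actual model:

```python
# Toy analogue of the associative-recognition task described above.
# Instead of comparing patterns digit by digit (Hamming-style), we compare
# them on a shared feature: how uniform the pattern is. This is an assumed
# model for illustration only, not the device's actual mechanism.

def uniformity(pattern: str) -> float:
    """Fraction of digits matching the most common digit (1.0 = all same)."""
    return max(pattern.count(d) for d in set(pattern)) / len(pattern)

def similarity(a: str, b: str) -> float:
    """Similarity based on the shared 'repeated digit' feature."""
    return 1.0 - abs(uniformity(a) - uniformity(b))

trained = "000"
for candidate in ("111", "101"):
    print(candidate, round(similarity(trained, candidate), 3))
```

Under this feature, 111 scores a perfect similarity to 000 (both are fully uniform), while 101 scores lower, matching the higher-level judgment Hersam describes, even though a position-by-position comparison would rank them the other way around.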

The new transistor technology comes shortly after Hersam and his team developed a separate nanoelectronic device. That device can already analyze and categorize data in an energy-efficient manner, but it still cannot claim the efficiency of their latest brainchild.

Copyright © 2024. All Rights Reserved. Engineers Outlook.