The rapid progression of technology has led to a huge increase in energy usage to process the massive troves of data generated by devices. But researchers in the Cockrell School of Engineering at The University of Texas at Austin have found a way to make the new generation of smart computers more energy efficient.

Traditionally, silicon chips have formed the building blocks of the infrastructure that powers computers. But this research uses magnetic components instead of silicon, and it reveals how the physics of those magnetic components can cut the energy costs and requirements of training algorithms: neural networks that can think like humans and do things like recognize images and patterns.

"Right now, the methods for training your neural networks are very energy-intensive," said Jean Anne Incorvia, an assistant professor in the Cockrell School's Department of Electrical and Computer Engineering. "What our work can do is help reduce the training effort and energy costs."

The researchers' findings were published this week in IOP Nanotechnology. Incorvia led the study with first author and second-year graduate student Can Cui. Incorvia and Cui discovered that spacing magnetic nanowires, acting as artificial neurons, in certain ways naturally increases the ability of the artificial neurons to compete against each other, with the most activated ones winning out. Achieving this effect, known as "lateral inhibition," traditionally requires extra circuitry within computers, which increases costs and takes more energy and space.

[Image: lateral inhibition in magnetic processing]

Incorvia said their method provides an energy reduction of 20 to 30 times compared with the amount used by a standard back-propagation algorithm performing the same learning tasks.

Just as human brains contain neurons, new-era computers have artificial versions of these integral nerve cells. Lateral inhibition occurs when the neurons firing the fastest are able to prevent slower neurons from firing. In computing, this cuts down on energy use in processing data.
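The winner-take-all behavior described above can be sketched in a few lines of code. This is a minimal illustration of the lateral-inhibition concept, not the paper's magnetic-nanowire model; the function name, threshold, and inhibition rule are illustrative assumptions.

```python
def lateral_inhibition(activations, threshold=1.0):
    """Sketch of lateral inhibition ("winner-take-all").

    The most activated neuron suppresses all others, so only one fires.
    A neuron also needs its activation to clear `threshold` to fire at all.
    (Illustrative only; the paper achieves this physically via nanowire spacing.)
    """
    winner = max(range(len(activations)), key=lambda i: activations[i])
    return [1 if i == winner and activations[i] >= threshold else 0
            for i in range(len(activations))]

# The fastest-firing (most activated) neuron wins; the rest are inhibited.
print(lateral_inhibition([0.4, 2.3, 1.1, 0.9]))  # -> [0, 1, 0, 0]
```

In conventional hardware this suppression step requires dedicated comparison circuitry; the study's point is that properly spaced magnetic nanowires produce the same competition for free through their physical interactions.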

Incorvia explains that the way computers operate is fundamentally changing. A major trend is the concept of neuromorphic computing, which is essentially designing computers to think like human brains. Instead of processing tasks one at a time, these smarter devices are meant to analyze huge amounts of data simultaneously. These innovations have powered the revolution in machine learning and artificial intelligence that has dominated the technology landscape in recent years.

This research focused on interactions between two magnetic neurons, along with initial results on interactions of multiple neurons. The next step involves applying the findings to larger sets of multiple neurons, as well as experimental verification of the results.

The research was funded by a National Science Foundation CAREER Award and Sandia National Laboratories, with computing resources from UT's Texas Advanced Computing Center.