Researchers at Penn State have developed the first artificial multisensory integrated neuron. The team, led by Saptarshi Das, published its work in Nature Communications. The aim of the research was to mimic how the brain combines sensory inputs, and in doing so create a more efficient and effective approach to artificial intelligence (AI). Currently, robots make decisions based on information from individual sensors that do not communicate with each other; a central processing unit must collect and integrate that information before a decision can be made, which consumes extra energy. Allowing sensors to communicate directly with one another makes the process more efficient, especially when the inputs to both sensors are subtle.

The researchers integrated a tactile sensor with a visual sensor so that the output of one could modify the response of the other. They simulated touch input using the triboelectric effect and visual input using a transistor that can remember visual input. The results showed that when the visual and tactile signals were both weak, combining them enhanced the artificial neuron's response beyond what either signal produced alone. The researchers hope that this artificial multisensory neuron will improve the efficiency of sensor technology and pave the way for more energy-efficient AI applications in robots, drones, and self-driving vehicles.
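The enhancement described above can be illustrated with a toy model. The sketch below is not the authors' device model; it simply uses a thresholded sigmoid "neuron" (all weights, threshold, and gain values are illustrative assumptions) to show how two weak inputs, each sub-threshold on its own, can together produce a response larger than the sum of their individual responses, while strong inputs that already saturate the neuron gain little.

```python
import math

def neuron_response(visual, tactile, threshold=1.0, gain=6.0):
    """Sigmoidal response to the summed sensory drive (illustrative values)."""
    drive = visual + tactile
    return 1.0 / (1.0 + math.exp(-gain * (drive - threshold)))

def enhancement(visual, tactile):
    """Combined response relative to the sum of the two unimodal responses."""
    combined = neuron_response(visual, tactile)
    unimodal_sum = neuron_response(visual, 0.0) + neuron_response(0.0, tactile)
    return combined / unimodal_sum

# Weak inputs: each alone barely drives the neuron; together they cross threshold.
print(f"weak inputs (0.5, 0.5): enhancement = {enhancement(0.5, 0.5):.2f}")   # ~5.3x
# Strong inputs: each alone already saturates the neuron, so little is gained.
print(f"strong inputs (2.0, 2.0): enhancement = {enhancement(2.0, 2.0):.2f}") # ~0.5x
```

In this toy model the enhancement ratio is largest precisely when both inputs are weak, which mirrors the behavior the Penn State team reports for their artificial neuron.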




By hassani
