Modern computational models, such as those behind complex, powerful AI applications, push classical digital computers to their limits. New computing architectures that mimic the operating principles of biological neural networks promise faster, more energy-efficient data processing. A team of researchers has developed an event-based architecture that uses photonic processors to transmit and process data with light. Like the brain, it allows the connections within the neural network to adapt continuously, and these changeable connections are the foundation of learning processes.
A neural network in machine learning requires artificial neurons that are activated by external excitatory inputs and connected to other neurons. In biology, the connections between neurons are called synapses. For their study, the researchers used a network of nearly 8,400 optical neurons made of waveguide-coupled phase-change material. They demonstrated that the connection between two of these neurons can indeed become stronger or weaker (synaptic plasticity) and that new connections can be formed or existing ones eliminated (structural plasticity). In contrast to previous investigations, the synapses in this study were not hardware elements but were encoded in the properties of the optical pulses themselves, that is, as a function of each pulse's wavelength and strength. This made it possible to integrate thousands of neurons on a single chip and connect them optically.
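The encoding scheme described above can be illustrated with a toy software model. The sketch below is a hypothetical simplification, not the researchers' implementation: it assumes each synapse is addressed by an optical pulse's wavelength, that the pulse amplitude nudges the synaptic weight (synaptic plasticity), and that connections below a threshold are pruned while unknown wavelengths create new ones (structural plasticity). The class name, update rule, and threshold are illustrative choices.

```python
class WavelengthSynapses:
    """Toy model: synapses keyed by pulse wavelength (illustrative only)."""

    def __init__(self, prune_threshold=0.05):
        self.weights = {}                  # wavelength (nm) -> synaptic weight
        self.prune_threshold = prune_threshold

    def pulse(self, wavelength_nm, amplitude):
        # A pulse addresses the synapse at its wavelength and moves the
        # weight halfway toward the pulse amplitude (synaptic plasticity).
        w = self.weights.get(wavelength_nm, 0.0)   # new wavelength => new synapse
        w += 0.5 * (amplitude - w)
        if w < self.prune_threshold:
            # Structural plasticity: eliminate a connection that is too weak.
            self.weights.pop(wavelength_nm, None)
        else:
            self.weights[wavelength_nm] = w

synapses = WavelengthSynapses()
synapses.pulse(1550.0, 0.8)   # strong pulse creates a connection at 1550 nm
synapses.pulse(1550.0, 0.8)   # repeating it strengthens the connection
synapses.pulse(1551.0, 0.02)  # a very weak pulse falls below the prune threshold
```

In this sketch the wavelength plays the role of an address, so thousands of independent synapses can share one physical channel, loosely mirroring how the optical approach packs many connections onto a single chip.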
Light-based processors offer substantially higher bandwidth than typical electronic processors, enabling complex computational tasks to be executed with lower energy consumption. This novel event-based architecture is still fundamental research; the long-term goal is an optical computing architecture that allows AI applications to be computed quickly and energy-efficiently.
Related Content: Tunable Optical Chip paves the way for New Quantum Devices