Biology inspires perceptive machines

Teaching a machine to sense its environment is one of the most intractable problems in computer science, but one European project looked to nature for help in cracking the conundrum. The project combined streams of sensory data to produce an adaptive, composite impression of its surroundings in near real-time.

The team brought together electronic engineers, computer scientists, neuroscientists, physicists and biologists. It studied basic neural models of perception and then sought to replicate aspects of them in silicon.

"The objective was to study sensory fusion in biological systems and then translate that knowledge into the creation of intelligent computational machines," says Martin McGinnity, Professor of Intelligent Systems Engineering and Director of the Intelligent Systems Engineering Laboratory (ISEL) at the University of Ulster's Magee Campus and coordinator of the Future and Emerging Technologies(FET) initiative-funded SENSEMAKER project of the IST programme.

SENSEMAKER took its inspiration from nature, attempting to replicate aspects of the brain's neural processes, which capture sensory data from the eyes, ears and skin and then combine those streams into a unified picture of the environment. For example, sight can identify a kiwi fruit, but touch can tell whether that kiwi is ripe, unripe or over-ripe.

What's more, if one sense is damaged, or if a sensory function is degraded by environmental factors, say because the eyes cannot see in the dark, the brain switches more resources to the remaining senses, such as hearing, touch and smell. Those faculties become comparatively hypersensitive, extracting the maximum possible data from the environment.
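The article does not describe SENSEMAKER's fusion algorithm in detail, but one common way to think about the principle is reliability-weighted fusion: each sensory stream is weighted by how trustworthy it currently is, so a degraded sense automatically contributes less. The sketch below is purely illustrative; the function name, numbers and inverse-variance rule are assumptions, not project code.

    import numpy as np

    def fuse(estimates, variances):
        """Inverse-variance weighted fusion of independent sensory estimates.

        A noisier (less reliable) sense automatically receives a smaller
        weight, loosely mimicking how the brain shifts resources between
        senses. Illustrative only; not the SENSEMAKER model.
        """
        weights = 1.0 / np.asarray(variances, dtype=float)
        weights /= weights.sum()
        return float(np.dot(weights, estimates)), weights

    # Daylight: vision is precise, touch is coarse, so vision dominates.
    value, w = fuse(estimates=[1.0, 1.4], variances=[0.01, 0.25])
    print(f"daylight: fused={value:.2f}, weights={w.round(2)}")

    # Darkness: vision becomes unreliable, so touch dominates instead.
    value, w = fuse(estimates=[1.0, 1.4], variances=[4.0, 0.25])
    print(f"darkness: fused={value:.2f}, weights={w.round(2)}")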

Modelling human perception

To explore these aspects of biological perception SENSEMAKER first developed a model of human perception, based on the best available data from the biological and neurological sciences.

Biological neurons send information as short, sudden increases in voltage. These signals are known as action potentials, spikes or pulses, and artificial networks built on the same signalling principle are called spiking neural networks. Classical artificial neural networks use a simpler model. "The traditional model of an artificial neural network is quite removed from biological neurons, while the spiking neural networks we used are more faithful to what happens in the real biological brain," says Professor McGinnity.
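The article does not say which spiking model SENSEMAKER implemented, but a standard textbook simplification is the leaky integrate-and-fire neuron: the membrane voltage leaks away over time, integrates incoming current, and emits a spike when it crosses a threshold. A minimal sketch, with all parameter values chosen arbitrarily for illustration:

    import numpy as np

    def lif_simulate(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
        """Leaky integrate-and-fire neuron: the membrane voltage leaks
        toward rest, integrates input current, and fires a spike when it
        crosses threshold. A textbook simplification, not project code."""
        v = v_reset
        spike_times = []
        for t, i_in in enumerate(input_current):
            v += dt * (-v + i_in) / tau   # leak plus input integration
            if v >= v_thresh:             # threshold crossing -> spike
                spike_times.append(t)
                v = v_reset               # reset after firing
        return spike_times

    # A constant supra-threshold input yields a regular spike train.
    print(lif_simulate(np.full(200, 1.5)))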

Adaptation, known in biology as plasticity, is another aspect of the biological model: data flows through new routes in the brain, adding further resources to data capture. Repeated over time, this plasticity becomes learning, as well-travelled routes through the brain become established and reinforce the information that passes along them.
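The article does not specify SENSEMAKER's learning rule. In spiking networks, a widely used form of plasticity is spike-timing-dependent plasticity (STDP), in which a synapse is strengthened when the presynaptic spike shortly precedes the postsynaptic one and weakened otherwise, so repeatedly used pathways become reinforced. A bare-bones sketch under that assumption:

    import numpy as np

    def stdp_update(w, t_pre, t_post, a_plus=0.05, a_minus=0.055, tau=20.0):
        """STDP for a single pre/post spike pair: pre-before-post
        potentiates the synapse, post-before-pre depresses it. Parameter
        values are illustrative, not taken from the project."""
        dt = t_post - t_pre
        if dt > 0:
            w += a_plus * np.exp(-dt / tau)    # potentiation
        else:
            w -= a_minus * np.exp(dt / tau)    # depression
        return float(np.clip(w, 0.0, 1.0))     # keep the weight bounded

    w = 0.5
    for _ in range(10):                         # repeated causal pairings
        w = stdp_update(w, t_pre=10.0, t_post=15.0)
    print(f"reinforced weight: {w:.3f}")        # the 'well-travelled route'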

As the model took shape, the team developed hardware demonstrators to implement and test components of the overall sensory fusion system. One project partner, the Ruprecht Karl Universitaet in Heidelberg, focused on implementations based on classical neural networks, essentially large arrays of simple threshold devices. In parallel, the ISEL group used Field Programmable Gate Arrays (FPGAs) to implement large arrays of spiking neural networks, emulating several components of the sensory system, particularly the visual processing element.

"FPGAs are hardware computing platforms that can be dynamically reconfigured and as such, are ideal for exploring artificial representations of biological neurons, since their ability to reconfigure can be exploited, to some extent to mimic the plasticity of biological networks of neurons," says Professor McGinnity.

Biologically-plausible hardware implementations

Spiking neurons are more biologically plausible than classical neural networks, such as the McCulloch-Pitts threshold neuron, because the time between spikes and their cumulative effect determine when a neuron fires. Using an advanced FPGA computing platform, ISEL were able to implement large networks of spiking neurons and synapses and to test the biological approaches to sensory fusion. The FPGA approach allows flexibility, both in rapid prototyping and in the ease with which different neuron models can be implemented and tested.
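For contrast, the McCulloch-Pitts neuron mentioned above has no notion of time at all: it simply thresholds a weighted sum of its inputs in a single step. A minimal sketch, with arbitrary illustrative weights and threshold:

    def mcculloch_pitts(inputs, weights, threshold):
        """Classical threshold neuron: fires if and only if the weighted
        input sum reaches the threshold. Unlike a spiking neuron, the
        timing of inputs plays no role, only their weighted sum."""
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

    # Two active inputs of weight 0.6 clear a threshold of 1.0; one does not.
    print(mcculloch_pitts([1, 1], [0.6, 0.6], 1.0))  # -> 1
    print(mcculloch_pitts([1, 0], [0.6, 0.6], 1.0))  # -> 0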

However, dedicated analogue or mixed analogue-digital circuitry allows greater integration and lower-power operation. To exploit these properties, the Heidelberg group developed a spiking-neuron Application Specific Integrated Circuit (ASIC) capable of emulating larger constituent components of biological sensory systems. A prototype device had been submitted for fabrication when the project ended; once fabricated, it will be exploited in a follow-up European project.

These circuits process data in a similar manner to the biological brain, focusing resources on the most data-rich sensory stream. A user interface on a PC lets researchers engage with the system.

The team concentrated on two senses: sight and touch. The experimental touch-sensor system, developed in Heidelberg and used by SENSEMAKER partner Trinity College Dublin, is itself quite novel, featuring an array of small, moveable, spring-loaded pins. It enabled psychophysical experiments on touch and vision to be conducted on humans and proved a valuable tool for exploring human responses to sensory integration. The results of these experiments helped to inform the sensory fusion model.

Modelling sensory fusion

The project created a sophisticated, biologically inspired model of sensory fusion for the tactile and visual senses. Perhaps its greatest achievement is a framework that allows extensive experimentation in sensory integration. The work can readily be extended to other sensory modalities; the project partners are currently planning to extend it to hearing. The hardware implementations of the model, which allow extremely rapid learning compared with biological timescales, will be exploited in follow-up projects.

"Using these systems we were able to show that the merging of tactile and visual information, or sensory fusion, improved overall performance," says Professor McGinnity. The ultimate outcome of this type of research is to implement perception capabilities in computer systems, with applications in a wide range of areas including robotics.

But a greater understanding of biological sensory fusion, and of how to implement it in artificial systems, could potentially deliver much more.

"This type of research teaches us a lot about how biological systems work, and it could lead to new ways of treating people with sensory-related disabilities, though that kind of outcome will take a long time," says Professor McGinnity.

He says intelligent systems need to adapt to their environment without reprogramming, and to react autonomously in a manner that humans would describe as intelligent; for that, they need a perception system that makes them aware of their surroundings.

Two other projects will carry aspects of this work further. The FACETS project, also funded by FET, will continue to explore machine perception, focusing on vision. Meanwhile, ISEL at Magee Campus is actively engaged in a major proposal to create a Centre of Excellence in Intelligent Systems. The Centre will pursue a range of research problems related to the creation of intelligent systems, including sensory fusion, learning, adaptation, self-organisation, the implementation of large-scale biological neural sub-systems in hardware, and distributed computational intelligence.

The SENSEMAKER project is an excellent example of the integration of a multidisciplinary team, combining basic science with very advanced engineering. Projects involving multidisciplinary teams are challenging, but when they work well they can be extremely rewarding. Each discipline brings its own expertise, experimental approach, technical language, viewpoint and problem-solving technique to the discussions. This rich mix of knowledge and methodologies allows for exhilarating technical discussions and leads to approaches that would not otherwise have been considered.

In the case of SENSEMAKER this was certainly true. As well as the concrete technical benefits, the project gave the team's biologists and neuroscientists a greater understanding of the engineering approach to problem solving and system design; conversely, the engineers gained a vastly improved insight into the world of biological system modelling. Overall, the project has contributed to an improved understanding of how biological systems merge multimodal sensory information, one of the most difficult problems in science today. The project's results are being disseminated in high-quality international journals, reflecting the fact that the research is at the state of the art. Both biological and neurological science on the one hand, and machine intelligence and computer science on the other, have benefited from its successful conclusion.

Source: IST Results

