reseize

Who is what? What is where? Where am I? Are you there?

You have hit the other collection, a newslog designed for the curious.


    Sunday, February 12, 2006

    Teaching a machine to sense its environment

    Teaching a machine to sense its environment is one of the most intractable problems of computer science, but one European project is looking to nature for help in cracking the conundrum. The project combined streams of sensory data to produce an adaptive, composite impression of its surroundings in near real time.
    SENSEMAKER took its inspiration from nature by trying to replicate aspects of the brain's neural processes, which capture sensory data from the eyes, ears and skin and then combine these senses into a whole picture of the scene or environment. For example, sight can identify a kiwi, but touch can tell whether that kiwi is ripe, unripe or over-ripe. What's more, if one sense is damaged, or if a sensory function is lost to environmental factors, say because the eyes cannot see in the dark, the brain switches more resources to the other senses, such as hearing, touch and smell. Those faculties become comparatively hypersensitive, extracting the maximum possible data from the environment.
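The kind of adaptive sensory fusion described above can be illustrated with a toy sketch. This is not the project's code: the senses, readings and reliability weights below are all hypothetical, chosen only to show how shifting weight toward an undamaged sense changes the fused estimate.

```python
# Hypothetical sketch: fusing noisy "sight" and "touch" estimates of ripeness,
# shifting weight toward the reliable sense when the other is degraded.

def fuse(readings, reliabilities):
    """Weighted average of sensor readings; weights are normalised reliabilities."""
    total = sum(reliabilities.values())
    return sum(readings[s] * reliabilities[s] / total for s in readings)

readings = {"sight": 0.4, "touch": 0.8}   # each sense's ripeness estimate (0..1)
daylight = {"sight": 0.7, "touch": 0.3}   # in daylight: trust sight more
darkness = {"sight": 0.05, "touch": 0.95} # in the dark: sight is near useless

print(round(fuse(readings, daylight), 2))  # fused estimate dominated by sight
print(round(fuse(readings, darkness), 2))  # fused estimate dominated by touch
```

With the same raw readings, the composite impression changes as reliability shifts, mirroring how the brain reallocates resources between senses.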
    To explore these aspects of biological perception, SENSEMAKER first developed a model of human perception based on the best available data from the biological and neurological sciences.
    Biological neurons send information using short, sudden increases in voltage, more commonly known as action potentials, spikes or pulses. Artificial networks built on this principle are called spiking neural networks, in contrast with the simpler model used by classical artificial neural networks. "The traditional model of an artificial neural network is quite removed from biological neurons, while the spiking neural networks we used are more faithful to what happens in the real biological brain," says Professor McGinnity.
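One of the simplest spiking-neuron models is the leaky integrate-and-fire neuron. The sketch below is purely illustrative (not SENSEMAKER's implementation, and the `leak`, `weight` and `threshold` values are arbitrary assumptions): incoming spikes charge a membrane potential that leaks away over time, and the neuron fires when the potential crosses a threshold.

```python
# Illustrative leaky integrate-and-fire neuron: the simplest common
# spiking-neuron model. Parameters here are arbitrary, for demonstration only.

def lif_run(input_spikes, leak=0.9, weight=0.5, threshold=1.0):
    """Return the time steps at which the neuron emits an output spike."""
    v, out = 0.0, []
    for t, spike in enumerate(input_spikes):
        v = v * leak + weight * spike   # potential leaks, then integrates input
        if v >= threshold:              # cumulative effect crosses threshold
            out.append(t)
            v = 0.0                     # reset after firing
    return out

# Closely spaced input spikes accumulate and trigger an output spike;
# widely spaced ones leak away before reaching threshold.
print(lif_run([1, 1, 1, 0, 0, 1, 0, 0, 0, 1]))
```

Because the potential leaks between inputs, *when* spikes arrive matters as much as *how many* arrive, which is exactly the property the classical model lacks.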
    The model also captures adaptation, an aspect of biology known as plasticity, in which data flows through new routes in the brain to add further resources to data capture.
    Repeated over time, this plasticity becomes learning: well-travelled routes through the brain become established and reinforce the information that passes along them.
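This "well-travelled routes get stronger" idea can be sketched with a toy Hebbian-style update rule. Everything here is hypothetical: the route names, the learning `rate`, and the rule itself are illustrative stand-ins, not the project's plasticity model.

```python
# Toy sketch of Hebbian-style plasticity: a connection's weight grows each
# time activity travels along it, so well-used routes are reinforced.

def reinforce(weights, route, rate=0.1):
    """Strengthen every connection on a used route (hypothetical rule)."""
    for edge in zip(route, route[1:]):
        weights[edge] = weights.get(edge, 0.0) + rate
    return weights

w = {}
for _ in range(5):                      # a route used repeatedly...
    reinforce(w, ["eye", "V1", "fusion"])
reinforce(w, ["ear", "A1", "fusion"])   # ...versus one used only once

print(round(w[("eye", "V1")], 2), round(w[("ear", "A1")], 2))
```

After repeated use the eye-to-V1 connection carries five times the weight of the once-used ear-to-A1 connection: repetition has turned plasticity into learning.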
    As the model was being established, the team developed hardware demonstrators to implement and test components of the overall sensory-fusion system. One project partner, the Ruprecht Karl Universitaet in Heidelberg, focused on implementations based on classical neural networks – essentially large arrays of simple threshold devices. In parallel, the ISEL group used Field Programmable Gate Arrays (FPGAs) to implement large arrays of spiking neural networks, emulating a number of components of the sensory system, particularly the visual processing element. "FPGAs are hardware computing platforms that can be dynamically reconfigured and, as such, are ideal for exploring artificial representations of biological neurons, since their ability to reconfigure can be exploited, to some extent, to mimic the plasticity of biological networks of neurons," says Professor McGinnity.

    Spiking neurons are more biologically faithful than classical models such as the McCulloch-Pitts threshold neuron, because the time between spikes and their cumulative effect determine when the neuron fires. Using an advanced FPGA computing platform, ISEL were able to implement large networks of spiking neurons and synapses and test the biological approaches to sensory fusion. The FPGA approach allows for flexibility, both in rapid prototyping and in the ease with which different neuron models can be implemented and tested.
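For contrast, here is the classical McCulloch-Pitts threshold neuron the article mentions, in a minimal sketch (weights and threshold are arbitrary illustrative values). It is time-free: only the weighted sum of the current inputs matters, so spike timing and spacing, which drive a spiking neuron, play no role at all.

```python
# Classical McCulloch-Pitts threshold neuron: fires (1) iff the weighted sum
# of its binary inputs reaches the threshold. No notion of time or spacing.

def mcculloch_pitts(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# The same inputs in any order, at any spacing, always give the same answer.
print(mcculloch_pitts([1, 1, 1], [0.5, 0.5, 0.5], 1.0))  # fires
print(mcculloch_pitts([1, 0, 1], [0.5, 0.5, 0.5], 1.0))  # fires
print(mcculloch_pitts([1, 0, 0], [0.5, 0.5, 0.5], 1.0))  # does not fire
```

A spiking neuron given the same three input spikes could fire or stay silent depending on how closely they arrive, which is the extra expressiveness the project exploited.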

    Read more about the project here and at Roland Piquepaille's Emerging Technology Trends
