Projects
Neural Network Visual Binding
- Projects
- Last Updated: Monday, 03 April 2017 23:45
Visual binding in animal vision is the process by which visual signals are separated and grouped according to the object in the visual field that generates them. How biology accomplishes this is not well understood, though many elaborate models have been proposed. With this work we demonstrate one method by which this process can be accomplished in a neural network, using biologically inspired algorithms to compute wide-field motion, color, and orientation percepts. This research is ongoing, and we hope to expand this simple network to recognize and separate more complex visual stimuli.
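To illustrate the grouping idea only (this is not the published network), the sketch below assumes per-pixel motion, color, and orientation maps and binds spatially connected pixels whose coarse feature codes agree; the binning scheme and the `bind_by_features` helper are hypothetical.

```python
import numpy as np
from scipy import ndimage  # used only for connected-component labeling

def bind_by_features(motion, color, orientation, n_bins=8):
    """Group pixels into 'objects' when motion, color, and orientation agree.

    motion, color, orientation: 2-D arrays of per-pixel feature estimates.
    Returns an integer label map; pixels sharing a label are bound together.
    """
    def quantize(x):
        x = np.asarray(x, dtype=float)
        x = (x - x.min()) / (x.max() - x.min() + 1e-9)
        return np.minimum((x * n_bins).astype(int), n_bins - 1)

    # Combine the three coarse feature codes into one per-pixel code.
    code = (quantize(motion) * n_bins + quantize(color)) * n_bins + quantize(orientation)

    # Spatially connected pixels with identical codes form one bound object.
    labels = np.zeros(code.shape, dtype=int)
    next_label = 0
    for c in np.unique(code):
        comp, n = ndimage.label(code == c)
        labels[comp > 0] = comp[comp > 0] + next_label
        next_label += n
    return labels
```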
This work has so far led to two publications (the first and the second, back-to-back in the same journal).
Chronic monitoring of human sleep
- Projects
- Last Updated: Tuesday, 06 September 2016 19:29
Biomimetic Visual Navigation
- Projects
- Last Updated: Monday, 29 February 2016 18:55
Visual navigation is something that almost all animals do exceedingly well and that robots are exceedingly poor at replicating. Using principles of biologically inspired engineering, we are applying what we have learned about how organisms use visual information to perceive and navigate the world around them to develop a system that will enable robots to do the same.
Our current test platform is the highly modified radio-controlled car shown at top left, but we are hoping to test some of our high-speed algorithms on a flying platform in the near future. Stay tuned!
Rat "cognition"
- Projects
- Last Updated: Sunday, 21 June 2015 22:57
How does "thinking" work? Perhaps by studying how rats "think", we can understand how thinking works in humans. We are focusing on an area of the mammalian brain called the hippocampus, which has homologs in insects as well. By modeling this structure in a robotic framework, we hope to approach a practical understanding of machine intelligence.
Neuromorphic VLSI design ("Vision chips")
- Projects
- Last Updated: Sunday, 04 November 2012 19:08
You can find a list of relevant articles below.
Rule-based learning
- Projects
- Last Updated: Sunday, 04 November 2012 19:04
Parallel computing
- Projects
- Last Updated: Sunday, 04 November 2012 18:55
Biomedical Projects
- Projects
- Last Updated: Wednesday, 31 October 2012 00:47
Check out the NROS 415 laboratory!
- Projects
- Last Updated: Tuesday, 30 October 2012 05:56
Honeybee Speed Estimation
- Projects
- Last Updated: Saturday, 27 October 2012 20:07
The goal of this research is to mathematically describe how the brain processes and uses sensory information to generate appropriate behavioral responses. These mathematical models can then be used as a basis to understand higher-level behaviors or to design more intelligent robotic systems. The human brain contains around 10 billion neurons (the functional cells of the brain), making it a dauntingly large and complex structure to study. Because of this complexity, we study the honeybee, an organism with a much smaller brain (with around 1 million neurons) that still exhibits a variety of complex social, visual, and navigational behaviors. Of particular interest to us is the "waggle dance", in which a foraging honeybee communicates the location of a distant food source to other honeybees in the hive. Specifically, our research looks at how honeybees estimate the distance they have traveled based solely on a visual estimate of their flight speed. To accomplish this goal we combine information from multiple levels of analysis, from biophysics to neuroanatomy, to create a mathematical model of early visual processing. We then refine the model by studying the responses of tethered honeybees in a virtual flight arena. The model can then be programmed into a robotic system.
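As a rough illustration of the visual-odometer idea described above, the sketch below simply integrates a wide-field image-speed estimate over time to accumulate a "visual distance" signal; the gain constant, sampling interval, and function name are illustrative assumptions and not the lab's actual model.

```python
import numpy as np

def integrate_optic_flow(image_speeds_deg_per_s, dt=0.01, gain=1.0):
    """Accumulate angular image speed (deg/s) into a visual odometry signal.

    image_speeds_deg_per_s: sequence of per-frame wide-field image-speed estimates.
    dt: time between frames, in seconds.
    gain: assumed scale factor relating angular image speed to ground speed.
    Returns the total accumulated visual distance (arbitrary units).
    """
    speeds = np.asarray(image_speeds_deg_per_s, dtype=float)
    return float(np.sum(speeds) * dt * gain)

# Example: flying twice as long at the same perceived image speed doubles the
# accumulated visual distance, regardless of the true ground speed.
print(integrate_optic_flow(np.full(200, 300.0), dt=0.01))  # -> 600.0
print(integrate_optic_flow(np.full(100, 300.0), dt=0.01))  # -> 300.0
```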
Dipteran Elementary Motion Detection
- Projects
- Last Updated: Saturday, 27 October 2012 20:07
In a collaborative project with the Strausfeld laboratory, we have developed a novel computational neuronal model of elementary motion detection, based on anatomical, physiological, and behavioral observations of flies, that serves as a working functional hypothesis for how the underlying neuronal machinery may be organized. We are currently augmenting the existing EMD model with two important stages, working towards a more realistic model of the insect visual system. In the optics stage, light information is collected by each facet of the simulated compound eye. The collected light is then focused onto photoreceptors, which further process it. A mathematical model of the photoreceptor stage is being used to simulate contrast adaptation under steady-state and dynamic conditions.
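For readers unfamiliar with EMDs, the sketch below implements the classic Hassenstein-Reichardt correlator, a standard textbook formulation of elementary motion detection. It is shown only to illustrate the idea; it is not the lab's fly model and omits the optics and adaptive photoreceptor stages described above.

```python
import numpy as np

def reichardt_emd(left, right, tau=0.03, dt=0.001):
    """Correlate two neighboring photoreceptor signals to report local motion.

    left, right: 1-D time series from two adjacent photoreceptors.
    tau: time constant of the delay (low-pass) filter, in seconds.
    dt: sampling interval, in seconds.
    Positive mean output indicates motion from `left` toward `right`.
    """
    alpha = dt / (tau + dt)  # first-order low-pass acts as the delay element

    def lowpass(x):
        y = np.zeros_like(x, dtype=float)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    # Each half-detector multiplies a delayed signal from one photoreceptor with
    # the undelayed signal from its neighbor; the two halves are subtracted.
    return lowpass(left) * right - lowpass(right) * left

# Example: a drifting sinusoid yields opposite-signed responses for the two directions.
t = np.arange(0, 1, 0.001)
stim = lambda phase: np.sin(2 * np.pi * 4 * t + phase)
print(reichardt_emd(stim(0.0), stim(0.5)).mean())  # one direction
print(reichardt_emd(stim(0.5), stim(0.0)).mean())  # opposite sign
```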
The Mothbot
- Projects
- Last Updated: Thursday, 25 October 2012 01:28
The field of neuroscience is moving toward understanding how sensory systems compute under closed-loop control. It is important to step away from open-loop experiments, i.e. experiments in which an animal cannot interact with its sensory inputs, because in the real world sensory neurons are passengers on a moving body whose sensory inputs are intimately related to its behavior. The challenge with performing these experiments under natural conditions is that conventional electrophysiology equipment is too bulky to be placed on a freely behaving animal. To solve this problem, we have designed a robotic electrophysiology instrument whose velocity is determined by bioelectrical signals from an animal, in our case hawk moths and flies (model organisms for visual motion detection, olfaction, and insect flight). This robotic instrument allows us to perform electrophysiological experiments while a moth is onboard and controlling the robot, which, in engineering terms, closes the loop. With this instrument we will characterize visual motion detection neurons and investigate the use of these neurons as biosensors for robots.
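To make the closed-loop idea concrete, here is a hypothetical sketch in which the robot's speed command is derived from the firing rate of a recorded neuron; the spike threshold, gain, window length, and function name are assumptions for illustration, not the instrument's actual control law.

```python
import numpy as np

def spikes_to_velocity(voltage_trace, dt=1e-4, threshold=0.5, gain=0.02, window=0.1):
    """Convert an extracellular voltage trace into a robot velocity command.

    voltage_trace: recent samples of the recorded signal (volts).
    dt: sample period, in seconds.
    threshold: spike-detection threshold in volts (an assumed value).
    gain: meters/second of robot velocity per spike/second of firing rate.
    window: duration of trace used to estimate the firing rate, in seconds.
    """
    v = np.asarray(voltage_trace[-int(window / dt):], dtype=float)
    # Count rising threshold crossings as spikes.
    spikes = np.sum((v[1:] >= threshold) & (v[:-1] < threshold))
    firing_rate = spikes / window      # spikes per second
    return gain * firing_rate          # commanded robot speed (m/s)

# In the closed loop, the robot's resulting motion changes the visual scene the
# moth sees, which in turn changes the neuron's firing rate on the next cycle.
```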