Research

My lab uses non-verbal, associative learning procedures and three species (rats, pigeons, and humans) to explore similarities and differences in learning and memory. Regarding the content of memory, my research has found evidence to support the acquisition of spatial (where), temporal (when), and causal (how) information during associative procedures. Play the video below for a look at some of the research conducted in the TCU comparative cognition lab.

Selective interference effects in working memory. We are using a change detection task with pigeons, rats, and humans to make cross-species comparisons regarding the nature of working memory. We have recently developed an iPad App for behavioral investigations with non-human animals and validated the use of the iPad with rats.
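The logic of a change detection trial can be sketched as follows. This is an illustrative sketch only, not the lab's actual task code; the item count, color set, and change probability are assumed values.

```python
import random

def make_trial(n_items=4, colors=("red", "green", "blue", "yellow"), p_change=0.5):
    """Build one change-detection trial: a sample display, a test display,
    and a flag for whether the test differs from the sample.
    (Illustrative sketch; parameters are assumptions, not the lab's values.)"""
    sample = [random.choice(colors) for _ in range(n_items)]
    test = list(sample)
    changed = random.random() < p_change
    if changed:
        i = random.randrange(n_items)
        # Swap one item for a different color so exactly one element changes.
        test[i] = random.choice([c for c in colors if c != sample[i]])
    return sample, test, changed

def correct(response_same, changed):
    """A 'same' report is correct iff nothing changed between displays."""
    return response_same != changed
```

The same trial structure can be presented to pigeons, rats, and humans, which is what makes the cross-species comparison possible.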

Touchscreen Behavioral Evaluation System (TBES)

TBES has two components: an iPad app distributed through the iTunes Store (Apple, Cupertino, California) and a server program written in Microsoft Visual Basic. TBES is designed to serve as a visual interface for presenting commonly used cognitive and behavioral tasks on an iPad.
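In an app-plus-server design like this, the tablet typically reports touch events to the server program, which scores them against the current task. The sketch below illustrates that idea with a hypothetical message format and an illustrative target radius; the actual TBES wire format and scoring rules are not documented here, and the real server is written in Visual Basic rather than Python.

```python
def parse_touch_event(message):
    """Parse a hypothetical touch-event message from the tablet app.
    Format assumed here: 'TOUCH,<x>,<y>,<timestamp_ms>' -- the actual
    TBES message format is not specified in this description."""
    kind, x, y, t = message.strip().split(",")
    if kind != "TOUCH":
        raise ValueError("unexpected message type: " + kind)
    return {"x": float(x), "y": float(y), "t_ms": int(t)}

def in_target(event, target_center, radius=50.0):
    """Score a touch as a target response if it lands within `radius`
    pixels of the target center (radius is an illustrative value)."""
    dx = event["x"] - target_center[0]
    dy = event["y"] - target_center[1]
    return (dx * dx + dy * dy) ** 0.5 <= radius
```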

For more information, click on the TBES icon below.


Operant Chambers


Open Field

Hierarchical control of search behavior. We are interested in testing hierarchical vs. configural theories of occasion setting within a spatial-search task. For example, pigeons learn to find food in one location relative to a landmark (to the left) if the background color of a computer display is red and at another location relative to the same landmark (to the right) if the background color is blue. The color of the display may modulate the spatial relationship between the landmark and food (hierarchical theory), or each background color may be represented as a unique configuration with the landmark (configural theory). This project uses nearly identical methods to test pigeons and humans: pigeons peck at a touchscreen-equipped monitor, and humans use a blaster (via a sensor bar) to select the location of a hidden reward.

Touchscreen


ARENA
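The trained contingency in the spatial-search task above can be written out directly: the background color determines which side of the landmark the food is on. This is a sketch of the task logic only, not the lab's software, and the pixel offset is an illustrative value.

```python
def reward_side(background):
    """Trained contingency from the search task: food is to the left of
    the landmark on red backgrounds and to the right on blue backgrounds."""
    return {"red": "left", "blue": "right"}[background]

def reward_location(landmark_x, background, offset=100):
    """Predicted search x-coordinate relative to the landmark.
    `offset` (in pixels) is an illustrative value, not from the study."""
    if reward_side(background) == "left":
        return landmark_x - offset
    return landmark_x + offset
```

Both the hierarchical and configural accounts predict this trained behavior; the theories are distinguished by transfer tests, where a hierarchical modulator should generalize its control to new landmark-food relations while a memorized configuration should not.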

Relational learning in children with autism. This project investigates how children with autism process visual information. We are currently using a modified match-to-sample task. The task may be solved by representing each display of sample and comparison items as a unique configuration or by learning to respond based on the relationship between the individual items.

Touchscreen
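The two solution strategies for the modified match-to-sample task above can be contrasted in a small sketch. This is an illustrative caricature of the two accounts, not the lab's task code: the relational strategy compares items, while the configural strategy can only look up whole displays it has already memorized.

```python
def relational_choice(sample, comparisons):
    """Relational strategy: pick the comparison that matches the sample,
    regardless of whether this exact display was seen before."""
    for i, comparison in enumerate(comparisons):
        if comparison == sample:
            return i
    return None

def configural_choice(display, learned):
    """Configural strategy: treat the whole sample+comparison display as
    one unit and answer only from memorized displays. `learned` maps a
    display (as a tuple) to its reinforced choice; novel displays yield
    no prediction."""
    return learned.get(display)
```

The strategies diverge on novel displays: the relational learner still matches, while the configural learner has nothing to retrieve.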

Remote Response Apparatus