Partnerships for International Research and Education

Aaron Seitz
Professor of Psychology, University of California at Riverside

Biography:

A central issue in neuroscience is how the brain selectively adapts to important environmental changes. While the brain needs to adapt to new environments, its architecture must also be protected from modification by the continual bombardment of irrelevant information. Clarifying how the brain solves this so-called stability-plasticity dilemma in its sensory areas is the primary goal of my research.

Acquisition of Learning

A basic starting question is: how do we know what to learn? That is, how does a neural system know which information is behaviorally relevant and which is not? This question was most famously addressed in Pavlov's studies of conditioning. He discovered that the repeated temporal coincidence of an unconditioned stimulus, such as food, with a conditioned stimulus, such as a tone, can result in learning: an association is formed between the tone and a reaction, such as salivation, that is normally elicited only by the food.

While conditioning is an influential model of stimulus-response learning, improvements of perceptual abilities in adults (perceptual learning) are classically thought either to arise from attentional selection of behaviorally relevant stimulus properties or to result from mere stimulus exposure. To test the hypothesis that the temporal coincidence of a stimulus and a reward is sufficient for learning of that stimulus, we designed a new study in which a perceptually invisible motion stimulus (i.e., one not attended by the subject) was paired with the letter target of a rapid serial visual presentation (RSVP) task. We found that learning effects previously attributed either to attention or to mere stimulus exposure actually result from a reinforcement process similar to that found in conditioning (Seitz and Watanabe 2003, Nature). A related study shows that this learning fails during the attentional blink of a target, suggesting that successful recognition of a target is necessary and may serve as an internal reward (Seitz et al., 2005, Current Biology). Additionally, when low-contrast motion stimuli are used, subjects develop a perceptual bias (i.e., a conditioned visual response): they report seeing the paired direction even when no stimulus is presented at test (Seitz et al., 2005a, PNAS). This research led to a model of perceptual learning in which learning results from timely interactions between diffusive, reward-related learning signals and bottom-up stimulus signals (see Seitz and Watanabe, 2005, TICS).
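
To make the proposed mechanism concrete, below is a minimal Python/NumPy sketch of a reward-gated Hebbian update in which plasticity occurs only when a diffusive reinforcement signal coincides in time with bottom-up stimulus activity. The channel layout, learning rate, and trial structure are illustrative assumptions, not the published model.

    import numpy as np

    rng = np.random.default_rng(0)

    n_channels = 8             # hypothetical feature channels (e.g. motion directions)
    w = np.zeros(n_channels)   # plastic weights onto a perceptual readout
    eta = 0.05                 # assumed learning rate
    paired = 3                 # channel presented whenever the task target occurs

    for trial in range(400):
        target_present = rng.random() < 0.5       # targets on roughly half the trials
        x = 0.1 * rng.random(n_channels)          # weak, task-irrelevant background activity
        if target_present:
            x[paired] += 0.4                      # the paired, subthreshold feature
        else:
            x[rng.integers(n_channels)] += 0.4    # an equally strong but unrewarded feature

        reward = 1.0 if target_present else 0.0   # diffusive signal from target recognition

        # Reward-gated Hebbian update: no reward signal, no plasticity
        w += eta * reward * x

    print("Strongest weight is on the paired channel:", int(np.argmax(w)) == paired)

After a few hundred trials the weight on the paired channel dominates, mirroring the idea that a task-irrelevant feature can be learned simply because it reliably coincides with the reinforcement signal triggered by target recognition.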

Consolidation of Learning

In addition to investigating the neural mechanisms that allow for learning, we have conducted research on the mechanisms involved in the stabilization and consolidation of learning. In this series of studies we investigate whether perceptual learning of a given stimulus feature can be disrupted by subsequent training with a different visual feature. We have so far demonstrated that perceptual learning of a hyperacuity stimulus can be disrupted when a second hyperacuity stimulus is learned, although a delay of an hour or more between the training sessions ameliorates this disruption. The interference is highly specific to particular features of the second hyperacuity stimulus and occurs only when the retinotopic location and stimulus orientation of the two trained stimuli are matched (Seitz et al., 2005b, PNAS).

Multisensory Learning

Another important line of research concerns how multisensory interactions can be learned and the role they play in learning. This work is motivated by accumulating reports of crossmodal interactions in a wide range of settings, which show that interactions between modalities are the rule rather than the exception in human processing of sensory information. Indeed, our first investigation with a multisensory learning paradigm revealed that the presence of auditory features facilitated the learning of visual features (Seitz, Kim & Shams, 2006, Current Biology). We are following up on this early investigation to better understand which types of multisensory interactions are most conducive to learning and how new multisensory associations are formed. To investigate multisensory associations, we use techniques of statistical learning, a fast learning paradigm in which new associations develop after only a few minutes of exposure. Initial results indicate that unisensory auditory or visual associations can develop in parallel with, and independently of, multisensory associations (Seitz et al., 2007, Perception).
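
As a rough illustration of the kind of exposure-based association that statistical learning paradigms tap into, the Python sketch below counts how often each auditory-visual pair co-occurs in a brief exposure stream and converts the counts into co-occurrence probabilities. The stimulus labels, pairing probability, and stream length are hypothetical choices made for illustration; they do not describe the design of the published experiments.

    import numpy as np
    from collections import Counter
    from itertools import product

    rng = np.random.default_rng(1)

    visual = ["shape_A", "shape_B", "shape_C"]   # hypothetical visual items
    audio = ["tone_1", "tone_2", "tone_3"]       # hypothetical auditory items

    # Each visual item is paired with one "preferred" tone 80% of the time,
    # and with a random tone otherwise.
    preferred = dict(zip(visual, audio))
    stream = []
    for _ in range(300):                         # a few minutes of exposure
        v = visual[rng.integers(len(visual))]
        a = preferred[v] if rng.random() < 0.8 else audio[rng.integers(len(audio))]
        stream.append((v, a))

    counts = Counter(stream)
    total = sum(counts.values())

    # Association strength estimated as the simple co-occurrence probability P(v, a);
    # preferred pairings end up clearly more probable than accidental ones.
    for v, a in product(visual, audio):
        marker = "  <-- preferred pairing" if preferred[v] == a else ""
        print(f"P({v}, {a}) = {counts[(v, a)] / total:.3f}{marker}")

A learner sensitive to these co-occurrence statistics would treat the high-probability pairs as associated after only brief exposure, which is the sort of rapid, exposure-driven learning such paradigms are designed to measure.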

Selected Publications

  • Shams and Seitz (2008). "Benefits of multisensory learning", Trends in Cognitive Sciences, 12(11), 411-417.
  • Seitz, Pilly, Pack (2008). "Interactions between contrast and spatial displacement in visual motion processing", Current Biology, 18(19), R904-R906.
  • Tsushima, Seitz, Watanabe (2008). "Task-irrelevant learning occurs only when the irrelevant feature is weak", Current Biology, 18(12), R516-R517.
  • Seitz, Kim, Van Wassenhove, and Shams (2007). "Simultaneous and Independent Acquisition of Multisensory and Unisensory Associations", Perception, 36, 1445-1453.
  • Seitz and Dinse (2007). "A common framework for perceptual learning", Current Opinion in Neurobiology, 17(2), 148-153.
  • Seitz, Kim, Shams (2006). "Sound Facilitates Visual Learning", Current Biology, 16(14), 1422-1427.
  • Seitz and Watanabe (2005). "A unified model for perceptual learning", Trends in Cognitive Sciences, 9(7), 329-334.
  • Seitz, Yamagishi, Werner, Goda, Kawato, Watanabe (2005). "Task specific disruption of perceptual learning", PNAS, doi:10.1073/pnas.0505765102.
  • Seitz, Lefebvre, Watanabe, Jolicoeur (2005). "The requirement of high-level processing in subliminal learning", Current Biology, 15(18), R753-R755.
  • Seitz, Nanez, Holloway, Koyama, Watanabe (2005). "Seeing what is not there shows the costs of perceptual learning", PNAS, 102(25), 9080-9085.
  • Seitz and Watanabe (2003). "Is subliminal learning really passive?", Nature, 422(6927), 36.