Learning and Cognition Lab Facilities
The Learning and Cognition Program operates five research laboratories: the Laboratory on Learning and Cognition, the Eye-Tracking Laboratory, the Psychophysiology Laboratory, the Multimedia and Instructional Design (MIND) Lab, and the Advanced Instructional Systems and Technologies Laboratory. These laboratories are overseen by department faculty and are used by both students and faculty to explore theoretical and applied research questions.
The Laboratory on Learning & Cognition
The Eye-Tracking Laboratory
The Psychophysiology Laboratory
The Multimedia and Instructional Design (MIND) Lab
The Advanced Instructional Systems and Technologies Laboratory
The Laboratory on Learning & Cognition:
The Laboratory on Learning and Cognition is directed by professors Dan J. Woltz and Michael K. Gardner. The lab features eight IBM PC-compatible data collection computers, as well as office space for graduate students and research assistants. Reaction time experiments are programmed using the E-Prime experimental authoring system. Research subjects (for studies approved through the University of Utah's Institutional Review Board) are recruited from the Department's undergraduate educational psychology courses (see Subject Pool). Research conducted in the Laboratory on Learning and Cognition has involved the acquisition of cognitive skills, undetected errors in cognitive skills, and priming processes in memory.
The Eye-Tracking Laboratory:
The Eye-Tracking Laboratory (sometimes called the Reading Laboratory) is directed by professors Anne E. Cook and Douglas J. Hacker. The lab currently has an Applied Sciences Laboratory (ASL) Model 501 head-mounted eye tracker with high-speed optics. The eye tracker is interfaced and controlled via a Hewlett-Packard 1.8 GHz computer. Another HP 1.8 GHz computer, with a 22" monitor (20" viewable), runs the experiments. Participants' heads and eyes move freely, and their head movements and orientation are recorded using the ASL EyeHead Integration system. The eye tracker is accurate to within one-half to one degree of visual angle. Students receive extensive training before using the eye tracker for research projects. Research conducted in the Eye-Tracking Laboratory has involved the psychology of reading, the psychology of writing, and the detection of deception.
The Psychophysiology Lab:
The Psychophysiology Laboratory is directed by John C. Kircher. The laboratory features a Biopac MP 100 polygraph that allows collection of up to sixteen channels of psychophysiological data. Software developed by Dr. Kircher extracts features from the psychophysiological data streams. Dr. Kircher is the nation's leading expert on the computerized detection of deception. Research conducted in the Psychophysiology Laboratory has involved the detection of deception, childhood psychopathology, gambling behavior, and client/counselor interactions.
The Multimedia and Instructional Design (MIND) Lab:
The MIND Lab is directed by Kirsten R. Butcher. The MIND Lab features seven PCs, four Wacom Intuos drawing tablets, four iPads (Air2), Yeti recording microphones, and webcams. The MIND Lab is equipped with usability software (Morae Studio) capable of capturing a picture-in-picture view of learners and their actions on the computer screen. Research conducted in the MIND Lab is focused on understanding when, how, and why learning processes and outcomes are impacted by different forms of multimedia and interactive visualizations. Current and recent studies include research on: video learning with automatic educational supports, self-regulated learning environments, augmented reality support for hands-on learning, comprehension of medical information with and without visual supports, and strategies for online learning. Graduate students who wish to conduct research in the MIND Lab should contact Dr. Butcher for more information.
The Advanced Instructional Systems and Technologies Laboratory:
The Advanced Instructional Systems and Technologies (ASSIST) laboratory is directed by Assistant Professor Eric Poitras. The lab includes two Cybertron Hellion PCs equipped with low-cost sensors and SDK/API support for developing learner modeling solutions that leverage sensor data from multiple modalities, including self-report measures, physiological responses (NeuLog HR and GSR), biomechanics (Kinect V2 and Xtion Pro Live), facial expressions (Logitech C920), eye fixation and movement patterns (Eye Tribe Tracker), audio signal processing (CAD U37 Studio Mics), as well as mouse cursor movements and user interaction data (Camtasia screen recordings and log-file databases). Research conducted in the ASSIST lab is focused on the development and evaluation of adaptive computer-based learning environments. Current and past studies include research on intelligent tutoring systems in medicine and history, an intelligent web browser for teacher professional development, a content management system with eLearning modules, and a visualization dashboard that helps teachers find online resources. Undergraduate and graduate students are welcome to contact Dr. Poitras for more information on ongoing opportunities to work in the lab.