EAGER/Collaborative Research: A New Science of Visual Experience

Information

  • NSF Award
  • 1548394
Owner
  • Award Id
    1548394
  • Award Effective Date
    9/1/2015
  • Award Expiration Date
    8/31/2017
  • Award Amount
    $195,845.00
  • Award Instrument
    Standard Grant


The essence of human experience is interacting with natural and man-made environments through the five human senses, and through vision in particular. The objective of this EArly-concept Grant for Exploratory Research (EAGER) project is to build analytical foundations for a new science of visual experience that will bridge basic approaches from cognitive science and systems engineering. Specifically, the research will build mathematical and computational models of a human navigating a three-dimensional space such as a factory, museum, or retail store. The idea is to gain insight into the limits on observability and controllability in human-technology systems, and to improve the user's situational awareness. If the research is successful, researchers will be able to describe situations in terms of possibilities for action and access to information. Such quantitative tools will allow engineers to design environments to achieve outcomes such as increased focus, improved safety, better wayfinding, and improved experience. In time, it might be possible to engineer interactive environments that adapt to the personal attributes and identities of the humans who inhabit them. The results of the research have the potential to be used by many disciplines, including engineering, business, architecture, psychology, cognitive science, and human factors.

The multidisciplinary research team, consisting of academics (engineering, psychology, and computer science) and two industry members, proposes to move the visual experience of a three-dimensional environment from the realm of intuition and experience to analytical science. The specific focus of this grant is developing a general set of analytical models that emerge from the analysis of a variety of context-specific human-environment interactions (e.g., a nurse in a CCU, a shopper in a retail store). With a solid grounding in the basic psychology of perception-action, the analytical models will integrate three-dimensional spatial relationships with the human eye's field of vision and the physical attributes of a human. Most significantly, the research will consider the complex dynamics resulting from human movement through the space, with all the attendant changes in visual angles and the appearance and disappearance of visual obstacles. Human performance will be examined empirically in a virtual environment in order to validate the analytical metrics of visual experience.
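The kind of analytical model described above can be illustrated with a minimal geometric sketch. The functions, parameters, and the 2-D plan-view simplification below are illustrative assumptions, not the project's actual models: they compute the visual angle a target subtends, test whether it falls within a moving observer's field of view, and check for occlusion by obstacle edges.

```python
import math

def visual_angle(observer, target, target_width):
    """Angle (radians) subtended by a flat target of the given width,
    viewed from the observer position (2-D plan view; illustrative)."""
    d = math.dist(observer, target)
    return 2 * math.atan2(target_width / 2, d)

def in_field_of_view(observer, heading, target, fov=math.radians(120)):
    """True if the target direction lies within the observer's horizontal
    field of view, centred on `heading` (radians). The 120-degree default
    is an assumed value, not a project parameter."""
    dx, dy = target[0] - observer[0], target[1] - observer[1]
    bearing = math.atan2(dy, dx)
    # Wrap the angular difference into [-pi, pi] before comparing.
    diff = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= fov / 2

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 strictly crosses segment p3-p4
    (standard orientation test, used here for occlusion)."""
    def ccw(a, b, c):
        return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])
    return (ccw(p1, p3, p4) != ccw(p2, p3, p4)
            and ccw(p1, p2, p3) != ccw(p1, p2, p4))

def visible(observer, heading, target, obstacle_edges,
            fov=math.radians(120)):
    """A target is seen only if it is inside the field of view and no
    obstacle edge blocks the line of sight."""
    if not in_field_of_view(observer, heading, target, fov):
        return False
    return not any(segments_intersect(observer, target, a, b)
                   for a, b in obstacle_edges)
```

Evaluating `visible` along a walking path, frame by frame, is one simple way such a model could capture the appearance and disappearance of visual obstacles as the observer moves.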

  • Program Officer
    Georgia-Ann Klutke
  • Min Amd Letter Date
    9/4/2015
  • Max Amd Letter Date
    9/13/2015
  • ARRA Amount

Institutions

  • Name
    Wright State University
  • City
    Dayton
  • State
    OH
  • Country
    United States
  • Address
    3640 Colonel Glenn Highway
  • Postal Code
    45435-0001
  • Phone Number
    (937) 775-2425

Investigators

  • First Name
    John
  • Last Name
    Flach
  • Email Address
    john.flach@wright.edu
  • Start Date
    9/11/2015
  • First Name
    James
  • Last Name
    Munch
  • Email Address
    james.munch@wright.edu
  • Start Date
    9/4/2015
  • End Date
    09/11/2015
  • First Name
    Thomas
  • Last Name
    Wischgoll
  • Email Address
    thomas.wischgoll@wright.edu
  • Start Date
    9/4/2015
  • First Name
    Pratik
  • Last Name
    Parikh
  • Email Address
    pratik.parikh@wright.edu
  • Start Date
    9/4/2015
  • First Name
    Jennie
  • Last Name
    Gallimore
  • Email Address
    Jennie.Gallimore@wright.edu
  • Start Date
    9/4/2015
  • End Date
    09/11/2015

Program Element

  • Text
    EFRI RESEARCH PROJECTS
  • Code
    7633

Program Reference

  • Text
    CONTROL SYSTEMS
  • Text
    MECHATRONICS
  • Text
    SENSORS AND ACTUATORS
  • Text
    Smart and responsive structures
  • Text
    Dynamical systems
  • Text
    EAGER
  • Code
    7916
  • Text
    Advanced Materials Processing
  • Code
    8025