The subject matter disclosed herein relates to kinematics analysis of movement-oriented biometric data and more particularly relates to methods, systems, and apparatus for real-time determination of user intention based on kinematics analysis of movement-oriented biometric data.
A user viewing a computer interface does not always have a cursor indicating a present browsing location. For example, a user reading a digital book must locate the last read words when returning to the computer interface after looking away.
Additionally, when working in an environment with multiple windows and/or displays, users may encounter difficulty remembering where they left off on each window or display. This difficulty results in a loss of productivity as users determine the last browsing location for a window or display. Switching between different windows and/or displays may take several seconds of overhead time, and such adjustment time can add up over the course of a working day.
Apparatus for providing a last known browsing location cue using movement-oriented biometric data are disclosed. The apparatus for providing a last known browsing location cue using movement-oriented biometric data includes a biometric data module that receives movement-oriented biometric data, a judgment module that detects user distraction, and a location cue module that provides a visual cue indicating a last known browsing location. A method and computer program product also perform the functions of the apparatus.
A more particular description of the embodiments briefly described above will be rendered by reference to specific embodiments that are illustrated in the appended drawings. Understanding that these drawings depict only some embodiments and are not therefore to be considered to be limiting of scope, the embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings, in which:
As will be appreciated by one skilled in the art, aspects of the embodiments may be embodied as a system, method or program product. Accordingly, embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, embodiments may take the form of a program product embodied in one or more computer readable storage devices storing machine readable code, computer readable code, and/or program code, referred to hereafter as code. The storage devices may be tangible, non-transitory, and/or non-transmission. The storage devices may not embody signals. In a certain embodiment, the storage devices only employ signals for accessing code.
Many of the functional units described in this specification have been labeled as modules, in order to more particularly emphasize their implementation independence. For example, a module may be implemented as a hardware circuit comprising custom VLSI circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices or the like.
Modules may also be implemented in code and/or software for execution by various types of processors. An identified module of code may, for instance, comprise one or more physical or logical blocks of executable code which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the module and achieve the stated purpose for the module.
Indeed, a module of code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different computer readable storage devices. Where a module or portions of a module are implemented in software, the software portions are stored on one or more computer readable storage devices.
Any combination of one or more computer readable media may be utilized. The computer readable medium may be a computer readable storage medium. The computer readable storage medium may be a storage device storing the code. The storage device may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, holographic, micromechanical, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
More specific examples (a non-exhaustive list) of the storage device would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Code for carrying out operations for embodiments may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment, but mean “one or more but not all embodiments” unless expressly specified otherwise. The terms “including,” “comprising,” “having,” and variations thereof mean “including but not limited to,” unless expressly specified otherwise. An enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms “a,” “an,” and “the” also refer to “one or more” unless expressly specified otherwise.
Furthermore, the described features, structures, or characteristics of the embodiments may be combined in any suitable manner. In the following description, numerous specific details are provided, such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that embodiments may be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of an embodiment.
Aspects of the embodiments are described below with reference to schematic flowchart diagrams and/or schematic block diagrams of methods, apparatuses, systems, and program products according to embodiments. It will be understood that each block of the schematic flowchart diagrams and/or schematic block diagrams, and combinations of blocks in the schematic flowchart diagrams and/or schematic block diagrams, can be implemented by code. This code may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the schematic flowchart diagrams and/or schematic block diagrams block or blocks.
The code may also be stored in a storage device that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the storage device produce an article of manufacture including instructions which implement the function/act specified in the schematic flowchart diagrams and/or schematic block diagrams block or blocks.
The code may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the code which executes on the computer or other programmable apparatus provides processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The schematic flowchart diagrams and/or schematic block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, systems, methods and program products according to various embodiments. In this regard, each block in the schematic flowchart diagrams and/or schematic block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions of the code for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. Other steps and methods may be conceived that are equivalent in function, logic, or effect to one or more blocks, or portions thereof, of the illustrated Figures.
Although various arrow types and line types may be employed in the flowchart and/or block diagrams, they are understood not to limit the scope of the corresponding embodiments. Indeed, some arrows or other connectors may be used to indicate only the logical flow of the depicted embodiment. For instance, an arrow may indicate a waiting or monitoring period of unspecified duration between enumerated steps of the depicted embodiment. It will also be noted that each block of the block diagrams and/or flowchart diagrams, and combinations of blocks in the block diagrams and/or flowchart diagrams, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and code.
The description of elements in each figure may refer to elements of preceding figures. Like numbers refer to like elements in all figures, including alternate embodiments of like elements.
Generally, the methods, systems, apparatus, and computer program products perform real-time kinematics analysis of movement-oriented biometric data. In some embodiments, the kinematic analysis is used to interpret a user's intention. For example, the kinematics analysis may be used to interpret whether a short-range movement, such as a short edge-swipe, is intended or whether a long-range movement, such as a long edge-swipe, is intended by the user's movement.
In some embodiments, the kinematics analysis is used to interpret whether a user is paying attention to a computer interface, or whether the user has become distracted from the computer interface. The computer interface may be a display, a window, or any sub-element of a display or window. The nature of the computer interface may depend on the type of electronic device and the nature of applications being executed on the electronic device. For example, the computer interface may be a windowed browser on a laptop, desktop, or tablet computer. As another example, the computer interface may be the entire display of an electronic reader or a handheld device executing a reader application.
In some embodiments, the movement-oriented biometric data is used to determine movement and/or position values. In some embodiments, the movement and/or position values may be compared to a plurality of thresholds to interpret a user's intention. For example, where an acceleration threshold is exceeded and a jerk (also known as jolt) threshold is exceeded, a user's movement may be interpreted as a distraction movement. In some embodiments, the movement and/or position values may be compared to a plurality of profiles to interpret a user's intention. For example, where velocity values match a bell curve, a user's movement may be interpreted as a short-range movement. In some embodiments, the movement and/or position values may be compared to thresholds and profiles to interpret a user's intention. For example, where velocity values match a bell curve and an acceleration value exceeds a threshold, a user's movement may be interpreted as a long-range movement. In some embodiments, an action is performed in response to determining the user's intention, the action selected based on the user's intention.
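By way of illustration only, the following sketch (in Python) shows one way such threshold and profile comparisons could be combined; the threshold values, the bell-curve correlation test, and the function names are assumptions made for illustration and are not part of the disclosed apparatus.

    import numpy as np

    # Illustrative threshold values; in practice they would come from calibration
    # or from the user profile 118 (units are assumed).
    ACCEL_THRESHOLD = 2000.0    # e.g., pixels per second squared
    JERK_THRESHOLD = 50000.0    # e.g., pixels per second cubed

    def matches_bell_curve(velocity, min_correlation=0.9):
        """Return True when the velocity samples roughly follow a bell-shaped profile."""
        v = np.asarray(velocity, dtype=float)
        t = np.linspace(-1.0, 1.0, v.size)
        template = np.exp(-0.5 * (t / 0.4) ** 2)   # reference bell curve
        v = (v - v.mean()) / (v.std() + 1e-9)
        template = (template - template.mean()) / (template.std() + 1e-9)
        return float(np.mean(v * template)) >= min_correlation

    def interpret_intention(velocity, acceleration, jerk):
        """Map movement values to an intention label using thresholds and profiles."""
        acceleration = np.asarray(acceleration, dtype=float)
        jerk = np.asarray(jerk, dtype=float)
        if acceleration.max() > ACCEL_THRESHOLD and jerk.max() > JERK_THRESHOLD:
            return "distraction"
        if matches_bell_curve(velocity) and acceleration.max() > ACCEL_THRESHOLD:
            return "long-range"
        if matches_bell_curve(velocity):
            return "short-range"
        return "unknown"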
In some embodiments, the kinematics analysis is used to determine where a user is looking in relation to a computer interface. For example, the biometric data may be analyzed to determine a browsing location on a computer interface. Further analysis may determine, in real-time, when a user becomes distracted from the computer interface. After determining user distraction, a browsing location corresponding to the moment of distraction may be stored as a last browsing location. A visual cue may be provided at the last browsing location to aid the user in quickly identifying the last browsing location. For example, by highlighting words on a computer interface corresponding to the last browsing location, a user reading text on the computer interface will quickly identify the last-read words and be able to resume reading.
The processor 102 may comprise any known controller capable of executing computer-readable instructions and/or capable of performing logical operations on the biometric data. For example, the processor 102 may be a microcontroller, a microprocessor, a central processing unit (CPU), a graphics processing unit (GPU), an auxiliary processing unit, an FPGA, or similar programmable controller. In some embodiments, the processor 102 executes instructions stored in the memory 114 to perform the methods and routines described herein.
The display 104 may comprise any known electronic display capable of outputting visual data to a user. For example, the display 104 may be an LCD display, an LED display, an OLED display, a projector, or similar display device capable of outputting images, text, or the like to a user. The display 104 may receive image data for display from the processor 102, the user intention analysis device 110, and/or the browsing location cue device 112.
The input device 106 may comprise any known computer input device. For example, the input device 106 may be a touch panel, a button, a key, or the like. In some embodiments, the input device 106 may be integrated with the display 104, such as a touchscreen or similar touch-sensitive display. In some embodiments, movement-oriented biometric data may be generated by the input device 106. For example, biometric data relating to finger position may be received from the input device 106.
The biometric sensor 108 is a sensor that gathers movement-oriented biometric data. In some embodiments, the biometric sensor 108 is a camera system capable of tracking user gestures. In some embodiments, the biometric sensor 108 is a camera system capable of gathering eye gazing data and/or eye tracking data. Both eye gazing data and eye tracking data are examples of movement-oriented biometric data used to determine where a user's eyes are looking.
As used herein, eye gazing data refers to movement-oriented biometric data that tracks eye movement by identifying the orientation of facial features in relation to a computer interface. Eye gazing data provides rough orientation information by using head position, neck orientation, nose orientation, and other facial features. However, eye gazing data does not provide the precise location where the eye is looking. In contrast, eye tracking data refers to movement-oriented biometric data that tracks eye movement by identifying eye features, such as pupil location or retina location. Eye tracking data provides precise eye orientation information and is able to more precisely determine where the user is looking.
The user intention analysis device 110 operates on movement-oriented biometric data to interpret user intention from movement. The user intention analysis device 110 may be comprised of computer hardware and/or computer software. For example, the user intention analysis device 110 may be circuitry or a processor configured to interpret user intention of a detected movement using the movement-oriented biometric data. In some embodiments, the user intention analysis device 110 comprises software code that allows the processor 102 to interpret user intention from the movement-oriented biometric data. The user intention analysis device 110 is discussed in further detail below.
The browsing location cue device 112 operates on movement-oriented biometric data to provide a last known browsing location cue. The browsing location cue device 112 may be comprised of computer hardware and/or computer software. For example, the browsing location cue device 112 may be circuitry or a processor configured to provide a last known browsing location cue from the movement-oriented biometric data. As another example, the browsing location cue device 112 may comprise software code that allows the processor 102 to provide a last known browsing location cue from the movement-oriented biometric data. The browsing location cue device 112 is discussed in further detail below.
The memory 114 may be implemented as a computer readable storage medium. In the depicted embodiment, the memory 114 contains the stored biometric data 116 and the user profile 118.
The stored biometric data 116 may comprise time values and one or more of: corresponding position values, corresponding velocity values, corresponding acceleration values, and corresponding jerk (also known as jolt) values. The types of values stored in the stored biometric data 116 may depend on the types of data gathered by the input device 106, the biometric sensor 108 and/or the biometric data acquisition device 120. Alternatively, the data gathered by the input device 106, the biometric sensor 108 and/or the biometric data acquisition device 120 may be parsed or augmented to form the stored biometric data 116. The user profile 118 comprises user-specific parameters and preferences. The parameters and preferences of the user profile 118 may be defined by a user or by an automated process, e.g., a calibration routine. In some embodiments, a separate profile is stored for each user of the electronic device 101.
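Purely as an illustrative sketch, one possible in-memory record for the stored biometric data 116 is shown below; the field layout merely mirrors the value types listed above and is not a required structure.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BiometricSample:
        """One record of the stored biometric data 116 (illustrative layout only)."""
        time: float                            # time value of the sample
        position: Optional[float] = None       # corresponding position value, if any
        velocity: Optional[float] = None       # corresponding velocity value, if any
        acceleration: Optional[float] = None   # corresponding acceleration value, if any
        jerk: Optional[float] = None           # corresponding jerk (jolt) value, if any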
The biometric data acquisition device 120 is communicatively coupled to the electronic device 101 and gathers movement-oriented biometric data. The biometric data acquisition device 120 may communicate movement-oriented biometric data with the electronic device 101 via a wired or wireless interface. The external biometric sensor 122 may be similar to the biometric sensor 108 described above. In some embodiments, the biometric data acquisition device 120 is external to, but physically coupled to the electronic device 101. For example, the biometric data acquisition device 120 may be an accessory, including a case or cover, which attaches to the electronic device 101.
In some embodiments, the biometric data module 202 identifies the latest biometric data, for example the last N samples of biometric data, where N is a positive integer. The biometric data module 202 may limit the number of biometric data values to a predefined window size, the window size corresponding to a user reaction time. A window size significantly above the user reaction time can improve reliability as it ensures that the detected movement is a conscious movement (i.e., a reaction) and not an artifact or false positive due to noise, involuntary movements, etc.
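A minimal sketch of such a sliding window follows, assuming an illustrative sample rate and reaction time; in practice both values would come from calibration or the user profile 118.

    from collections import deque

    SAMPLE_RATE_HZ = 60          # assumed sensor sampling rate
    USER_REACTION_TIME_S = 0.25  # assumed user reaction time

    # Keep roughly twice the reaction time so that detected movements are conscious ones.
    WINDOW_SIZE = int(2 * USER_REACTION_TIME_S * SAMPLE_RATE_HZ)

    samples = deque(maxlen=WINDOW_SIZE)   # oldest samples fall out automatically

    def add_sample(timestamp, position):
        """Append the newest biometric sample; the deque trims itself to the window size."""
        samples.append((timestamp, position))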
The movement module 204 determines movement values from the movement-oriented biometric data. In some embodiments, the movement module 204 determines acceleration values from the movement-oriented biometric data. For example, where the biometric data comprises position values and time values, the movement module 204 may derive acceleration values corresponding to the time values. In some embodiments, the movement module 204 determines position, velocity, and/or jerk values from the biometric data. The movement module 204 may include circuitry for calculating integrals and/or derivatives to obtain movement values from the biometric data. For example, the movement module 204 may include circuitry for calculating second-derivatives of location data.
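As an illustrative sketch, position and time samples could be converted to velocity, acceleration, and jerk values with successive numerical derivatives; the helper below is an assumption about one software realization, not the required circuitry.

    import numpy as np

    def movement_values(times, positions):
        """Derive velocity, acceleration, and jerk from sampled positions.

        times and positions are equal-length 1-D sequences of time values and
        position values; successive numerical derivatives stand in for the
        integral/derivative circuitry described above.
        """
        times = np.asarray(times, dtype=float)
        positions = np.asarray(positions, dtype=float)
        velocity = np.gradient(positions, times)      # first derivative of position
        acceleration = np.gradient(velocity, times)   # second derivative of position
        jerk = np.gradient(acceleration, times)       # third derivative of position
        return velocity, acceleration, jerk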
The evaluation module 206 interprets a user intention for a movement based on the movement values determined by the movement module 204. For example, the evaluation module 206 may determine if the user intends to perform a short-range action or a long-range action. In some embodiments, acceleration, velocity, position, and/or jerk values may be compared to a threshold and/or profile to interpret the user intention. For example, the evaluation module 206 may interpret a user's intention to be a distraction movement where an acceleration threshold is exceeded and a jerk threshold is exceeded. As another example, the evaluation module 206 may determine that a user intends to make a short-range movement where velocity values match a bell curve profile. In some embodiments, movement values (i.e., acceleration, velocity, position, and/or jerk values) may be compared to a combination of thresholds and profiles to interpret a user's intention. For example, where velocity values match a bell curve and an acceleration value exceeds a threshold, a user's movement may be interpreted as a long-range movement.
The evaluation module 206 may determine that a user intends to make a short-range (i.e., intra-interface) movement when the velocity value is at (or near) zero and the acceleration value is negative at the edge (or boundary) of the computer interface. On the other hand, the evaluation module 206 may determine that the user intends to make a long-range (i.e., extra-interface) movement when the velocity value is above zero at the edge (or boundary) of the computer interface or when the acceleration value is positive at the edge (or boundary) of the computer interface. The computer interface may be a windowed browser on a laptop, desktop, or tablet computer. As another example, the computer interface may be the entire display of an electronic reader or a handheld device executing a reader application. In the former, the boundaries of the computer interface correspond to the boundaries of the window in question, while in the latter, the boundaries of the computer interface correspond to the boundaries of the display itself.
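One illustrative way to express the boundary test described above is sketched below; the zero tolerance is an assumed value.

    def classify_edge_movement(velocity_at_edge, acceleration_at_edge, zero_tol=1e-3):
        """Classify a movement as it reaches the interface boundary.

        Velocity near zero with negative acceleration suggests the movement is
        stopping inside the interface (short range); positive velocity or positive
        acceleration at the boundary suggests it will continue past it (long range).
        """
        if abs(velocity_at_edge) <= zero_tol and acceleration_at_edge < 0:
            return "short-range"
        if velocity_at_edge > zero_tol or acceleration_at_edge > 0:
            return "long-range"
        return "indeterminate"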
The evaluation module 206 may determine that a user is reading when a velocity value matches a reading speed profile. The evaluation module 206 may determine user inattention when the velocity value drops below the reading speed profile for a certain amount of time. Additionally, the evaluation module 206 may determine user distraction when the velocity value is above the reading speed and the jerk value exceeds a jerk threshold. Additionally, or alternatively, user distraction may be determined when velocity values match a distraction profile. Profiles and thresholds specific to a user may be stored in the user profile 118.
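By way of example only, the reading, inattention, and distraction determinations described above might be sketched as follows; the reading speed range, jerk threshold, and sample count are placeholders for values the user profile 118 would supply.

    def classify_attention(velocity, jerk, reading_speed_range=(50.0, 400.0),
                           jerk_threshold=50000.0, inattention_samples=30):
        """Label the current data window as reading, inattentive, distracted, or normal."""
        low, high = reading_speed_range
        if all(low <= v <= high for v in velocity):
            return "reading"                # velocity matches the reading speed profile
        below = [v < low for v in velocity]
        if len(below) >= inattention_samples and all(below[-inattention_samples:]):
            return "inattentive"            # velocity below reading speed for a while
        if max(velocity) > high and max(jerk) > jerk_threshold:
            return "distracted"             # fast, jerky movement away from the interface
        return "normal"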
The location module 302 identifies location or position values from the movement-oriented biometric data. In some embodiments, the location module 302 may store one or more position thresholds relating to the computer interface. For example, the location module 302 may store position thresholds corresponding to boundaries of the computer interface. As another example, the location module 302 may store position thresholds corresponding to specific regions of the computer interface, such as edges, input fields, and the like. In some embodiments, the stored position thresholds are used by the evaluation module 206 to determine user intention. In some embodiments, the location module 302 itself compares the position values to the position thresholds and outputs the results to the evaluation module 206.
The location module 302 may be an independent module or may be a sub-module of the movement module 204 and/or the evaluation module 206. In some embodiments, the location module 302 may store one or more position profiles used to categorize user movements. For example, the location module 302 may store a position profile corresponding to a short-range movement within the computer interface.
The velocity module 304 identifies velocity or speed values from the movement-oriented biometric data. In some embodiments, the velocity module 304 may store one or more velocity thresholds relating to the computer interface. For example, the velocity module 304 may store velocity thresholds corresponding to boundaries of the computer interface. In some embodiments, the velocity thresholds are general thresholds. In some embodiments, the stored velocity thresholds are used by the evaluation module 206 to determine user intention. In some embodiments, the velocity module 304 itself compares the velocity values to the velocity thresholds and outputs the results to the evaluation module 206.
The velocity module 304 may be an independent module or may be a sub-module of the movement module 204 and/or the evaluation module 206. In some embodiments, the velocity module 304 may store one or more velocity profiles used to categorize user movements. For example, the velocity module 304 may store a velocity profile corresponding to a short-range movement within the computer interface.
The jerk module 306 identifies jerk or jolt values from the movement-oriented biometric data. In some embodiments, the jerk module 306 may store one or more jerk thresholds relating to the computer interface. For example, the jerk module 306 may store jerk thresholds corresponding to specific regions of the computer interface, such as boundaries, edges, and the like. In some embodiments, the jerk thresholds are general thresholds. In some embodiments, the stored jerk thresholds are used by the evaluation module 206 to determine user intention. In some embodiments, the jerk module 306 itself compares the jerk values to the jerk thresholds and outputs the results to the evaluation module 206.
The jerk module 306 may be an independent module or may be a sub-module of the movement module 204 and/or the evaluation module 206. In some embodiments, the jerk module 306 may store one or more jerk profiles used to categorize user movements. For example, the jerk module 306 may store a jerk profile corresponding to a short-range movement within the computer interface.
The adaptation module 308 dynamically adjusts the thresholds and/or profiles used by the user intention analysis device 110 responsive to changes in the computer interface. The adaptation module 308 may modify thresholds and/or profiles relating to position, velocity, acceleration, and/or jerk values of the movement-oriented biometric data. In some embodiments, the adaptation module 308 may adjust the thresholds and/or profiles in response to a change in the dimensions of the computer interface. For example, where the computer interface corresponds to a window, changes to the window size may cause the adaptation module 308 to adjust thresholds and/or profiles relating to boundaries or edges of the computer interface. As another example, changes to a window size may also cause the adaptation module 308 to adjust velocity, acceleration, and/or jerk thresholds to account for the new dimensions of the window.
In some embodiments, the adaptation module 308 may adjust the thresholds and/or profiles when a distance between the user and the computer interface changes. For example, where the electronic device 101 is a handheld electronic device (e.g., a smartphone or tablet computer), the adaptation module 308 may adjust the thresholds and/or profiles when the user moves the handheld electronic device closer to the user's face. The adjustments may take into account the change in visual angle between the user and the computer interface, as the dimensions of the computer interface appear different to the user even though, pixel-wise, they have not changed.
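A simple linear scaling is one plausible adaptation rule; the sketch below assumes thresholds grow with the interface width and shrink as the viewing distance decreases, which is an illustrative assumption rather than required behavior.

    def adapt_thresholds(base_thresholds, base_width, new_width,
                         base_distance, new_distance):
        """Scale movement thresholds when the window size or viewing distance changes.

        Thresholds grow with the interface width and shrink as the user moves
        closer (the interface subtends a larger visual angle).
        """
        scale = (new_width / base_width) * (base_distance / new_distance)
        return {name: value * scale for name, value in base_thresholds.items()}

    # Example: window resized from 800 to 1200 pixels, device moved from 50 cm to 30 cm.
    adapted = adapt_thresholds({"velocity": 300.0, "acceleration": 2000.0},
                               base_width=800, new_width=1200,
                               base_distance=50, new_distance=30)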
In some embodiments, the calibration module 310 is used to measure a user's performance of a movement and to set initial thresholds and/or profiles used by the evaluation module 206 to interpret the movement-oriented biometric data. Calibration may occur the first time the user intention analysis device 110 is initialized, every time the user intention analysis device 110 is initialized, or it may be manually selected by the user. Calibration may be user-specific and may be stored in the user profile 118. The calibration module 310 allows for more accurate interpretation of movement-oriented biometric data as comparisons may be based on accurate models of user movement. In some embodiments, a user reaction time is calibrated by the calibration module 310. The user reaction time may be used to determine a sample size sufficiently large to distinguish reactive and voluntary movements from involuntary movements so as to more accurately interpret user movement.
In some embodiments, the movement-oriented biometric data may be used to determine if movement by the user's finger 402 is intended to initiate a short-range movement, for example a short edge-swipe, or a long-range movement, for example a long edge-swipe. The electronic device 101 may interpret the user's intention by comparing the movement-oriented biometric data, including location 404, to one or more thresholds and/or profiles, as discussed above.
The method 500 proceeds with identifying 504 acceleration values from the movement-oriented biometric data. The acceleration values may be identified via a movement module 204 of a user intention analysis device 110. In some embodiments, an evaluation module 206 interprets 506 a user intention based on the acceleration values. The user intention may be a short-range movement, a long-range movement, and/or a distraction movement. The user intention may be interpreted 506 through comparing the acceleration values to one or more acceleration thresholds and/or profiles. The thresholds and/or profiles may be specific to the user, to the computer interface, and/or to the electronic device 101.
The determined movement values may be examined in determining 608 whether one or more triggers have been met in the current window. The triggers may be based on position, pressure, velocity, and/or acceleration and indicate to the user intention analysis device 110 that a movement in need of interpretation has occurred. Additionally, a trigger may be received from another program or module that uses a user intention analysis device 110 to interpret intentions of user movement. One or more triggers may need to be met to result in a positive determination 608.
Once the trigger(s) is met, the movement values of the current window are interpreted 610 to determine a user's intention. In some instances, the movement values indicate a short-range movement. In some instances, the movement values indicate a long-range movement. In some instances, the movement values indicate a distraction or inattention movement. Other movements and/or gestures may be interpreted as known in the art.
In some embodiments, the method 600 continues with performing 612 an action corresponding to the user intention. For example, an action corresponding to a swipe command (e.g., a close action, a menu action, a switching action) may be performed after interpreting the user intention. In some embodiments, a data value is returned to a calling program or stored in memory in response to interpreting the user intention.
The method 700 may identify 708 a short-range movement, and return an indicator of such, if the velocity threshold is not exceeded. Otherwise, if the velocity threshold is exceeded, the method continues to 710 where the movement values are compared to at least one jerk threshold. If the jerk threshold is exceeded, the method may identify 712 the movement as a distraction movement and return an indicator of such; otherwise the movement may be identified 714 as a long-range movement and an indicator of such returned. The thresholds may be selected according to the nature of the biometric data (e.g., eye gazing data or finger position data) and according to the results of other comparisons. Additionally, or alternatively, the movement values may be compared to one or more profiles in each of the comparison steps of the method 700.
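The comparison chain of the method 700 might be sketched as follows; the default threshold values are illustrative only.

    def method_700(velocity, jerk, velocity_threshold=300.0, jerk_threshold=50000.0):
        """Comparison chain: short-range, then distraction, then long-range."""
        if max(velocity) <= velocity_threshold:
            return "short-range"     # 708: velocity threshold not exceeded
        if max(jerk) > jerk_threshold:
            return "distraction"     # 712: jerk threshold exceeded
        return "long-range"          # 714: fast but smooth movement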
The method 800 may determine 810 a normal (i.e., attentive) movement, and return an indicator of such, if any of the thresholds is unmet. The thresholds may be selected according to the nature of the biometric data (e.g., eye gazing data or finger position data). Additionally, or alternatively, the movement values may be compared to one or more profiles in each of the comparison steps of the method 800.
At 904, an acceleration value corresponding to the determined moment in time is compared to an acceleration threshold. For example, an acceleration value at the computer interface boundary may be compared to the acceleration threshold. If the acceleration threshold is met, further comparisons are performed, otherwise the movement is identified 910 as a short-range movement. In some embodiments, the acceleration threshold is near zero.
At 906, a velocity value corresponding to the determined moment in time is compared to a velocity threshold. For example, a velocity value at the computer interface boundary may be compared to the velocity threshold. If the velocity threshold is met, the movement is identified 908 as a long-range movement. Otherwise, the movement is identified 910 as a short-range movement.
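The boundary-time comparisons of steps 904 and 906 might be sketched as follows, assuming the near-zero acceleration threshold mentioned above.

    def method_900(accel_at_boundary, velocity_at_boundary,
                   accel_threshold=0.0, velocity_threshold=0.0):
        """Boundary-time comparisons of steps 904 and 906 (illustrative defaults)."""
        if accel_at_boundary <= accel_threshold:       # 904: acceleration threshold not met
            return "short-range"                       # 910
        if velocity_at_boundary > velocity_threshold:  # 906: still moving at the boundary
            return "long-range"                        # 908
        return "short-range"                           # 910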
The biometric data module 1002 receives movement-oriented biometric data, for example from the input device 106, the biometric sensor 108, the memory 114, or the biometric data acquisition device 120. In some embodiments, the biometric data module 1002 identifies the latest biometric data, for example the last N samples of biometric data, where N is a positive integer. The biometric data module 1002 may limit the number of biometric data values to a predefined window size, the window size corresponding to a user reaction time. A window size significantly above the user reaction time can improve reliability as it ensures that the detected movement is a conscious movement (i.e., a reaction) and not an artifact or false positive due to noise, involuntary movements, etc. The biometric data module 1002 may be similar to the biometric data module 202 discussed above.
The attention judgment module 1004 detects user distraction based on the biometric data. In some embodiments, the attention judgment module 1004 determines movement values from the biometric data. For example, the attention judgment module 1004 may determine position values, velocity values, acceleration values, jerk values, or other movement-related values from the movement-oriented biometric data. The attention judgment module 1004 may include circuitry for calculating integrals and/or derivatives to obtain movement values from the biometric data. For example, the attention judgment module 1004 may include circuitry for calculating second-derivatives of location data.
In some embodiments, the attention judgment module 1004 receives movement values from another device or module. For example, the attention judgment module 1004 may receive movement values from one or more of the input device 106, the biometric sensor 108, the user intention analysis device 110, the stored biometric data 116, the biometric data acquisition device 120, and/or the movement module 204.
In some embodiments, the attention judgment module 1004 analyzes the movement values to detect user distraction. In some embodiments, movement values (i.e., acceleration, velocity, position, and/or jerk values) may be compared to a threshold and/or profile to detect user distraction. For example, the attention judgment module 1004 may interpret a user's intention to be a distraction movement where an acceleration threshold is exceeded and a jerk threshold is exceeded. In some embodiments, movement values may be compared to a combination of thresholds and profiles to interpret a user's intention. In some embodiments, movement values at an edge or boundary of a computer interface may be analyzed to detect user distraction.
The computer interface may be a windowed browser on a laptop, desktop, or tablet computer. As another example, the computer interface may be the entire display of an electronic reader or a handheld device executing a reader application. In some embodiments, the attention judgment module 1004 receives an indication of user distraction from another module or device, such as the evaluation module 206.
In some embodiments, the attention judgment module 1004 may determine that a user is reading when a velocity value matches a reading speed profile. The attention judgment module 1004 may determine user distraction when the velocity value is above the reading speed and the jerk value exceeds a jerk threshold. Additionally, or alternatively, user distraction may be determined when velocity values match a distraction profile. Profiles and thresholds specific to a user may be stored in the user profile 118.
In some embodiments, the attention judgment module 1004 identifies a moment in time when the user is first distracted. The attention judgment module 1004 may store a value representing this moment in the memory 114 or may output this value to another module or device.
The location cue module 1006 provides a visual cue in the computer interface responsive to the attention judgment module 1004 determining that the user has become distracted. The visual cue may be any indicator suitable for indicating a last known browsing location, for example, a highlight, an underline, an icon, or the like. In some embodiments, the last known browsing location corresponds to a location on the computer interface where the user was looking just before becoming distracted. In some embodiments, the location cue module 1006 determines the last known browsing location from the biometric data. In other embodiments, the location cue module 1006 receives the last known browsing location from another module or device.
The location cue module 1006 may provide the visual cue immediately after receiving an indication that the user is distracted, or may present the visual cue in response to receiving additional triggers, such as the expiration of a timer. Additionally, in some embodiments, the location cue module 1006 may remove the visual cue after a predetermined amount of time or in response to receiving another trigger, such as an indication that the user is again attentive to the computer interface.
The browsing location module 1102 identifies a browsing location on a computer interface based on the movement-oriented biometric data. In some embodiments, the browsing location module 1102 identifies position values from the movement-oriented biometric data and correlates the position values to determine a location on the computer interface where the user is looking; the location being a browsing location. In some embodiments, the browsing location module 1102 uses eye tracking or eye gazing algorithms to determine the browsing location.
In some embodiments, the browsing location module 1102 receives a position value determined from the movement-oriented biometric data from another device or module, such as the input device 106, the biometric sensor 108, the user intention analysis device 110, the stored biometric data 116, the biometric data acquisition device 120, the movement module 204, and/or the attention judgment module 1004, and interpolates a browsing location from the position value.
In some embodiments, the browsing location module 1102 stores a number of recent browsing locations. The recent browsing locations may be stored in the memory 114 or in the browsing location module 1102 itself. The number of recent browsing locations may be fixed or variable. In some embodiments, the number of recent browsing locations corresponds to a data window size used by the biometric data module 1002. In some embodiments, the browsing location module 1102 provides the recent browsing locations to the location cue module 1006. In some embodiments, the browsing location module 1102 determines a last known browsing location corresponding to a moment of distraction and provides the last known browsing location to the location cue module 1006.
The last location module 1104 identifies an inattention time corresponding to the detected user distraction. In some embodiments, the last location module 1104 receives an indication of user distraction from the attention judgment module 1004 and identifies a moment in time when the user is first distracted. The last location module 1104 may store a value representing this moment in the memory 114 or may output this value to another module or device, such as the location cue module 1006 or the browsing location module 1102, for use in determining a last known browsing location. In some embodiments, the last location module 1104 sends the inattention time to the location cue module 1006 for use in providing the last known browsing location.
The cue timer module 1106 initiates a marking timer in response to detecting user distraction. The marking timer counts down (or up according to implementation) a predetermined amount of time before sending a signal to another device or module. In some embodiments, the marking timer is adjustable and the amount of time is user specific. For example, a user may specify a marking timer amount. As another example, the cue timer module 1106 may automatically determine a marking timer amount based on data in the user profile 118. Upon expiration, the cue timer module 1106 sends a signal to the location cue module 1006 indicating that the visual cue should be displayed.
The cue dismissal module 1108 initiates a removal timer in response to detecting user distraction. The removal timer counts down (or up according to implementation) a predetermined amount of time before sending a signal to another device or module. In some embodiments, the removal timer is adjustable and the amount of time is user specific. For example, a user may specify a removal timer amount. As another example, the cue dismissal module 1108 may automatically determine a removal timer amount based on data in the user profile 118. In some embodiments, the cue dismissal module 1108 removes the visual cue in response to expiration of the removal timer. In other embodiments, the cue dismissal module 1108 sends a signal to the location cue module 1006 upon expiration of the removal timer indicating that the visual cue should be removed.
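By way of illustration, the marking and removal timers might be realized as follows; show_cue and hide_cue are assumed callbacks supplied by the location cue module 1006, and the delay values stand in for user-specific amounts from the user profile 118.

    import threading

    class CueTimers:
        """Marking and removal timers for the visual cue (illustrative sketch)."""

        def __init__(self, show_cue, hide_cue, marking_delay_s=2.0, removal_delay_s=10.0):
            self.show_cue = show_cue            # assumed callback: display the visual cue
            self.hide_cue = hide_cue            # assumed callback: remove the visual cue
            self.marking_delay_s = marking_delay_s
            self.removal_delay_s = removal_delay_s

        def on_distraction(self):
            """Start the marking timer; when it expires, the cue is displayed."""
            threading.Timer(self.marking_delay_s, self._mark).start()

        def _mark(self):
            self.show_cue()
            # Start the removal timer so the cue does not persist indefinitely.
            threading.Timer(self.removal_delay_s, self.hide_cue).start()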
The attention renewal module 1110 detects whether user attention has returned to the computer interface subsequent to the user distraction. In some embodiments, the attention renewal module 1110 operates on the movement-oriented biometric data to determine that the user is again paying attention to the computer interface. In some embodiments, movement values (i.e., acceleration, velocity, position, and/or jerk values) may be compared to a threshold and/or profile to detect user attention. For example, the attention renewal module 1110 may determine that a user is attentive to the computer interface when a velocity value matches a reading speed profile. As another example, the attention renewal module 1110 may determine that a user is attentive to the computer interface when acceleration values are below an acceleration threshold for a window of time and a browsing location corresponds to a location within the computer interface.
Upon detecting that the user's attention has returned to the computer interface, the attention renewal module 1110 signals the location cue module 1006 indicating that the visual cue should be provided. In some embodiments, the attention renewal module 1110 receives an indication of user attention from another device or module, such as the evaluation module 206, the movement threshold module 1112, or the movement profile module 1114, and signals the location cue module 1006 that the visual cue should be provided.
The movement threshold module 1112 compares the movement-oriented biometric data to at least one threshold to determine whether the user is attentive to the computer interface. The threshold may be a position threshold, a velocity threshold, an acceleration threshold, and/or a jerk threshold. For example, the movement threshold module 1112 may determine that a user is attentive to the computer interface when acceleration values are below an acceleration threshold for a window of time and a browsing location corresponds to a location within the computer interface. In some embodiments, the movement threshold module 1112 operates in conjunction with the attention judgment module 1004 to determine whether a user is distracted. In some embodiments, the movement threshold module 1112 operates in conjunction with the location cue module 1006 to determine when to provide the visual cue.
The movement profile module 1114 compares the movement-oriented biometric data to at least one profile to determine whether the user is attentive to the computer interface. The profile may be an eye speed profile, an eye acceleration profile, and/or an eye jolt profile. For example, the movement profile module 1114 may determine that a user is attentive to the computer interface when a velocity value matches a reading speed profile. In some embodiments, the movement profile module 1114 operates in conjunction with the attention judgment module 1004 to determine whether a user is distracted. In some embodiments, the movement profile module 1114 operates in conjunction with the location cue module 1006 to determine when to provide the visual cue.
The visual cue 1208 may be any indicator suitable for indicating the last known browsing location. For example, the visual cue 1208 may be a highlight (e.g., highlighted text), an underline, a foreground mark, a background mark (e.g., a watermark), an icon, and the like. In some embodiments, the visual cue 1208 comprises animated text or color-differentiated text (i.e., text of a different color). In some embodiments, the visual cue 1208 may comprise bold or bright colors that attract the eye. In some embodiments, the visual cue is provided by fading text, images, or other display data except in the area surrounding the last known browsing location. For example, a word located at a last known browsing position and one or more nearby context words may be displayed in black lettering while all other words in the computer interface may be displayed in lighter shades of gray. As another example, a sentence located at a last known browsing position may be displayed in black lettering while all other words in the computer interface may be displayed in lighter shades.
In some embodiments, a trace may be provided that underlines or highlights words or locations on the computer interface 1206 corresponding to a current browsing location 1204 and fades to transparency with time or with progress (e.g., a word at the current browsing location is underlined with 0% transparency while the previous M words are underlined with increasing amounts of transparency). When user distraction is detected, the trace stops fading so that the underline or highlight indicates the last known browsing location.
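As an illustrative sketch, the per-word transparency of such a trace could be computed as follows; the fade rate is an assumed parameter.

    def trace_transparency(word_index, anchor_index, fade_per_word=10):
        """Transparency (percent) of the trace underline for a given word.

        anchor_index is the current browsing location while the user reads, or the
        last known browsing location once distraction is detected (freezing the trace).
        """
        words_behind = max(0, anchor_index - word_index)
        return min(100, words_behind * fade_per_word)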
Receiving 1302 movement-oriented biometric data may include receiving only the last N samples of biometric data, where N is a positive integer corresponding to a measurement window for biometric data. The measurement window may be user specific, and the value of N may be prompted for, automatically determined, retrieved from a user profile 118, and/or adjusted depending on the nature of the computer interface. In some embodiments, the movement-oriented biometric data is received in real-time and comprises a plurality of viewing position values and a plurality of timestamps, each timestamp corresponding to one of the plurality of viewing position values. In some embodiments, the movement-oriented biometric data is eye gazing data. In some embodiments, the movement-oriented biometric data is eye tracking data.
The method 1300 proceeds with detecting 1304 user distraction from a computer interface based on the movement-oriented biometric data. In some embodiments, movement values are identified via a judgment module 1004 of a browsing location cue device 112. The movement values may be compared to various thresholds and/or profiles to detect that a user has become distracted. In some embodiments, step 1304 comprises identifying a moment in time when the user is first distracted.
The method continues with providing 1306 a visual cue in the computer interface indicating a last known browsing location. The visual cue may be any indicator suitable for indicating a last known browsing location. The last known browsing location is a location on the computer interface where the user was looking just before becoming distracted. In some embodiments, the last known browsing location is determined from the biometric data. In other embodiments, the last known browsing location is received from another module or device. The visual cue may be presented immediately after detecting 1304 that the user is distracted, or may be presented in response to receiving additional triggers, such as the expiration of a timer. Additionally, in some embodiments, the visual cue may be removed after a predetermined amount of time or in response to receiving an indication that the user is again attentive to the computer interface.
In some embodiments, step 1402 comprises identifying position values from the movement-oriented biometric data and correlating the position values to locations on the computer interface to determine where the user is looking. In some embodiments, step 1402 comprises using eye tracking or eye gazing algorithms to determine the browsing location. In some embodiments, step 1402 comprises receiving a position value from another device or module, such as the input device 106, the biometric sensor 108, the user intention analysis device 110, the stored biometric data 116, the biometric data acquisition device 120, the movement module 204 and/or the attention judgment module 1004, and interpolating a browsing location from the position value.
Step 1406 involves determining whether user distraction has been detected. User distraction may be detected by comparing the biometric data to thresholds and/or profiles as discussed above. If user distraction is not detected, the method 1400 loops and step 1406 repeats. If user distraction is detected, an inattention time is identified 1408 corresponding to the detected user distraction. The inattention time is used to identify and assign 1410 a browsing location as the last known browsing location.
Step 1412 involves initiating a marking timer. The marking timer counts down a predetermined amount of time. The marking timer may be adjustable and may be user specific. Upon expiration of the marking timer, a visual cue is presented 1414 at the last known browsing location.
Step 1416 involves initiating a removal timer. In some embodiments, the removal timer is initiated upon detecting that the user is again attentive to the user interface. In some embodiments, the removal timer is initiated responsive to providing the visual cue. Upon expiration of the removal timer, the visual cue is removed 1418 from the computer interface.
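Purely for illustration, the overall timer-driven flow of the method 1400 might be sketched as follows; all of the callbacks are assumed interfaces to the modules described above, and the polling loop and delay values are placeholders.

    import time

    def method_1400(get_browsing_location, distraction_detected, show_cue, hide_cue,
                    marking_delay_s=2.0, removal_delay_s=10.0, poll_s=0.1):
        """Track the browsing location, then mark and later remove the visual cue."""
        last_known_location = None
        while not distraction_detected():
            last_known_location = get_browsing_location()  # most recent browsing location
            time.sleep(poll_s)
        time.sleep(marking_delay_s)          # 1412: marking timer
        show_cue(last_known_location)        # 1414: present the visual cue
        time.sleep(removal_delay_s)          # 1416: removal timer
        hide_cue()                           # 1418: remove the visual cue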
In some embodiments, step 1502 comprises identifying position values from the movement-oriented biometric data and correlating the position values to locations on the computer interface to determine where the user is looking. In some embodiments, step 1502 comprises using eye tracking or eye gazing algorithms to determine the browsing location. In some embodiments, step 1502 comprises receiving a position value from another device or module, such as the input device 106, the biometric sensor 108, the user intention analysis device 110, the stored biometric data 116, the biometric data acquisition device 120, the movement module 204 and/or the attention judgment module 1004, and interpolating a browsing location from the position value.
Step 1506 involves determining whether user distraction has been detected. User distraction may be detected by comparing the biometric data to thresholds and/or profiles as discussed above. If user distraction is not detected, the method 1500 loops and step 1506 repeats. If user distraction is detected, an inattention time is identified 1508 corresponding to the detected user distraction. The inattention time is used to identify and assign 1510 a browsing location as the last known browsing location.
Step 1512 involves detecting user attention. The movement-oriented biometric data may be analyzed to detect that the user is again attentive to the computer display. In some embodiments, the analysis involves comparing the movement-oriented biometric data to thresholds and/or profiles as discussed above. Upon detecting user attention, a visual cue is presented 1514 at the last known browsing location.
Step 1516 involves initiating a removal timer. In some embodiments, the removal timer is initiated responsive to detecting that the user is again attentive to the user interface. In some embodiments, the removal timer is initiated responsive to providing the visual cue. Upon expiration of the removal timer, the visual cue is removed 1518 from the computer interface.
Embodiments may be practiced in other specific forms. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
20130170755 | Dalton et al. | Jul 2013 | A1 |
20130176208 | Tanaka et al. | Jul 2013 | A1 |
20130198056 | Aldrey et al. | Aug 2013 | A1 |
20130201305 | Sibecas et al. | Aug 2013 | A1 |
20130208014 | Fleck et al. | Aug 2013 | A1 |
20130246663 | Raveendran et al. | Sep 2013 | A1 |
20130254716 | Mishra | Sep 2013 | A1 |
20130260360 | Baurmann et al. | Oct 2013 | A1 |
20130307771 | Parker et al. | Nov 2013 | A1 |
20130321265 | Bychkov et al. | Dec 2013 | A1 |
20130340005 | Kwan | Dec 2013 | A1 |
20130340006 | Kwan | Dec 2013 | A1 |
20140002352 | Jacob et al. | Jan 2014 | A1 |
20140038137 | Hill | Feb 2014 | A1 |
20140071163 | Kinnebrew et al. | Mar 2014 | A1 |
20140104197 | Khosravy et al. | Apr 2014 | A1 |
20140108309 | Frank et al. | Apr 2014 | A1 |
20140129987 | Feit et al. | May 2014 | A1 |
20140168054 | Yang et al. | Jun 2014 | A1 |
20140168056 | Swaminathan et al. | Jun 2014 | A1 |
20140168399 | Plummer et al. | Jun 2014 | A1 |
20140172467 | He et al. | Jun 2014 | A1 |
20140176813 | Conness et al. | Jun 2014 | A1 |
20140195918 | Friedlander | Jul 2014 | A1 |
20140204029 | Lopez et al. | Jul 2014 | A1 |
20140237366 | Poulos et al. | Aug 2014 | A1 |
20140247232 | George-Svahn et al. | Sep 2014 | A1 |
20140247286 | Chi | Sep 2014 | A1 |
20140266702 | Forster-Knight | Sep 2014 | A1 |
20140267034 | Krulce et al. | Sep 2014 | A1 |
20140267094 | Hwang et al. | Sep 2014 | A1 |
20140267400 | Mabbutt et al. | Sep 2014 | A1 |
20140268054 | Olsson et al. | Sep 2014 | A1 |
20140270407 | Balakrishnan et al. | Sep 2014 | A1 |
20140272810 | Fields et al. | Sep 2014 | A1 |
20140292665 | Lathrop et al. | Oct 2014 | A1 |
20140298257 | Grandhi | Oct 2014 | A1 |
20140306826 | Ricci | Oct 2014 | A1 |
20140310256 | Olsson et al. | Oct 2014 | A1 |
20140313120 | Kamhi | Oct 2014 | A1 |
20140315531 | Joong et al. | Oct 2014 | A1 |
20140317524 | VanBlon et al. | Oct 2014 | A1 |
20140333566 | Lee et al. | Nov 2014 | A1 |
20140344012 | Kamhi et al. | Nov 2014 | A1 |
20140354533 | Swaminathan | Dec 2014 | A1 |
20140361971 | Sala et al. | Dec 2014 | A1 |
20140364212 | Osman et al. | Dec 2014 | A1 |
20150042552 | Tsoref | Feb 2015 | A1 |
20150049012 | Liu et al. | Feb 2015 | A1 |
20150066980 | Kim | Mar 2015 | A1 |
20150070481 | S et al. | Mar 2015 | A1 |
20150074602 | VanBlon et al. | Mar 2015 | A1 |
20150084864 | Geiss et al. | Mar 2015 | A1 |
20150092056 | Rau et al. | Apr 2015 | A1 |
20150094118 | Rodolico | Apr 2015 | A1 |
20150113454 | McLaughlin | Apr 2015 | A1 |
20150139508 | Ye | May 2015 | A1 |
20150153571 | Ballard et al. | Jun 2015 | A1 |
20150154001 | Knox et al. | Jun 2015 | A1 |
20150154134 | Beaumont et al. | Jun 2015 | A1 |
20150154983 | VanBlon et al. | Jun 2015 | A1 |
20150160461 | Starner et al. | Jun 2015 | A1 |
20150169048 | Peterson et al. | Jun 2015 | A1 |
20150178555 | Feng | Jun 2015 | A1 |
20150205350 | VanBlon et al. | Jul 2015 | A1 |
20150205577 | VanBlon et al. | Jul 2015 | A1 |
20160041406 | Flores et al. | Feb 2016 | A1 |
20160048223 | Taguchi et al. | Feb 2016 | A1 |
20160147297 | Rose et al. | May 2016 | A1 |
20160148342 | Waltermann et al. | May 2016 | A1 |
20160154555 | Perrin et al. | Jun 2016 | A1 |
Number | Date | Country |
---|---|---|
101566920 | Oct 2009 | CN |
101945612 | Jan 2011 | CN |
102802502 | Nov 2012 | CN |
10310794 | Sep 2004 | DE |
69937592 | Oct 2008 | DE |
0880090 | Nov 1998 | EP |
08278134 | Oct 1996 | JP |
1091378 | Apr 1998 | JP |
11110120 | Apr 1999 | JP |
2004051392 | Jun 2004 | WO |
Entry |
---|
Beamforming, Wikipedia, Definition, https://en.wikipedia.org/wiki/Beamforming, Printed from website Jan. 22, 2015. |
Electromyography, Wikipedia, Definition, https://en.wikipedia.org/wiki/Electromyography, Printed from website Jan. 27, 2015. |
Extended Display Identification Data, Wikipedia, Definition, https://en.wikipedia.org/wiki/Extended_Display_Identification_Data, Printed from website Oct. 10, 2014. |
Microphone, Wikipedia, Definition, https://en.wikipedia.org/wiki/Microphone, printed from website Jan. 22, 2015. |
Microphone Array, Wikipedia, Definition, https://en.wikipedia.org/wiki/Microphone_array, printed from website Jan. 22, 2015. |
MYO-Tech Specs, https://www.thalmic.com/en/myo/techspecs, printed from website Jan. 27, 2015. |
Arthur Davis et al., “Optical Design using Fresnel Lenses Basic Principles and some Practical Examples”, Optik & Photonik, Dec. 2007, No. 4, pp. 52-55. |
Darren Quick, “PixelOptics to launch ‘worlds first electronic focusing eyewear’”, http://www.gizmag.com/pixeloptics-empower-electronic-focusing-glasses/17569/, Jan. 2011. |
“Electronic-lens company PixelOptics is bankrupt”, Insight News, http://www.insightnews.com.au/blog/NEWS_NOW!/post/electronic-lens-company-pixeloptics-is-bankrupt/, Dec. 2013. |
Polarizer, Wikipedia, Definition, https://en.wikipedia.org/wiki/Polarizer, printed from website Jan. 14, 2015. |
“Raise to Speak Makes Siri Wonderfully Useful (Once You Know How to Use It)”, iSource, http://isource.com/2012/10/01/raise-to-speak-makes-siri-wonderfully-useful-once-you-know-how-to=use-it/, Oct. 2012. |
“Relationship Between Inches, Picas, Points, Pitch, and Twips”, Article ID 76388, http://wupport2.microsoft.com/KR/76388, printed Oct. 10, 2014. |
Smart glass, Wikipedia, Definition, https://en.wikipedia.org/wiki/Smart_glass, printed from website Jan. 14, 2015. |
“Taking Touch Screen Interfaces Into a New Dimension”, Tactus Technology White Paper, 2012, pp. 1-13. |
“Understanding EDID—Extended Display Identification Data”, Digital Connection, ExtroNews 20.3, 2009, Extron www.extron.com. |
“Understanding & Using Directional Microphones”, SOS Sound on Sound, Sep. 2000, http://www.soundonsound.com/sos/sep00/articles/direction.htm. |
“See the World in Superfocus Revolutionary Eyeglasses Give You the Power to Focus Your Entire View at Any Distance”, Superfocus, http://superfocus.com/eye-care-practitioners, printed from website Jun. 24, 2014. |
U.S. Appl. No. 14/132,663, Office Action Summary, dated Sep. 22, 2015. |
Qvarfordt, et al., “Conversing with the User Based on Eye-Gaze Patterns”, CHI 2005, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, Apr. 2-7, 2005, pp. 221-230. |
Jacob, et al., “Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises”, The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research, Hyönä, Radach & Deubel (eds.), Oxford, England, 2003. |
Kern, et al., “Making Use of Drivers' Glances onto the Screen for Explicit Gaze-Based Interaction”, Proceedings of the Second International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI 2010), Nov. 11-12, 2010, Pittsburgh, Pennsylvania, USA, pp. 110-113. |
U.S. Appl. No. 14/132,663, Office Action Summary, dated Jan. 13, 2016. |
U.S. Appl. No. 14/548,938, Office Action Summary, dated Dec. 2, 2015. |
U.S. Appl. No. 14/548,938 Office Action Summary, dated Mar. 10, 2016. |
U.S. Appl. No. 14/137,472, Office Action Summary, dated Mar. 11, 2016. |
U.S. Appl. No. 14/132,663, Office Action, dated Dec. 5, 2015. |
U.S. Appl. No. 14/132,663, Office Action Summary, dated Jun. 29, 2016. |
U.S. Appl. No. 14/137,472, Office Action Summary, dated Aug. 11, 2016. |
U.S. Appl. No. 14/132,663, Office Action Summary, dated May 10, 2017. |
U.S. Appl. No. 14/639,263, Office Action Summary, dated May 12, 2017. |
Number | Date | Country |
---|---|---|
20150177830 A1 | Jun 2015 | US |