Input devices supply data and/or control signals to computers, television sets, game consoles, and other types of electronic devices. Over the years, input devices have evolved considerably from the early days of computers. For example, early computers relied on readers for punched cards or punched paper tape to receive data. As a result, generating even a simple input was quite burdensome. More recently, mice, touchpads, joysticks, motion-sensing game controllers, and other types of “modern” input devices have been developed with improved input efficiency.
Even though input devices have evolved considerably, conventional input devices still do not provide a natural mechanism for operating electronic devices. For example, mice are widely used as pointing devices for operating computers. However, a user must mentally translate planar two-dimensional movements of a mouse into those of a cursor on a computer display. Touchpads on laptop computers can be even more difficult to operate than mice because of variations in touch sensitivity and/or limited operating surfaces. In addition, operating conventional input devices typically requires rigid postures that can cause discomfort or even illness in users.
Various embodiments of electronic systems, devices, and associated methods of operation are described below. The term “marker” is used throughout to refer to a component useful for indicating, identifying, and/or otherwise distinguishing at least a portion of an object that carries or is otherwise associated with the marker. The term “detector” is used throughout to refer to a component useful for monitoring, identifying, and/or otherwise recognizing a marker. Examples of markers and detectors are described below with particular configurations, components, and/or functions for illustration purposes. The term “temporal trajectory” generally refers to a spatial trajectory of an object over time; the spatial trajectory can be in a two- or three-dimensional space. Other embodiments of markers and/or detectors in accordance with the present technology may also have other suitable configurations, components, and/or functions. A person skilled in the relevant art will also understand that the technology may have additional embodiments, and that the technology may be practiced without several of the details of the embodiments described below with reference to the figures.
The input device 102 can be configured to be touch free from the output device 106. For example, in the illustrated embodiment, the input device 102 is configured as a ring wearable on an index finger of a user 101. In other examples, the input device 102 may be configured as a ring wearable on other fingers of the user 101. In further examples, the input device 102 may be configured as an open ring, a finger probe, a finger glove, a hand glove, and/or other suitable item for a finger, a hand, and/or other parts of the user 101. Even though only one input device 102 is shown in
The input device 102 can include at least one marker 103 (only one is shown in
In other embodiments, the marker 103 can include a non-powered (i.e., passive) component. For example, the marker 103 can include a reflective material that emits the signal 110 by reflecting at least a portion of the illumination 114 from the optional illumination source 112. The reflective material can include aluminum foils, mirrors, and/or other suitable materials with sufficient reflectivity. In further embodiments, the input device 102 may include a combination of powered and passive components. In any of the foregoing embodiments, one or more markers 103 may be configured to emit the signal 110 with a generally circular, triangular, rectangular, and/or other suitable pattern.
The detector 104 is configured to monitor and capture the signal 110 emitted from the marker 103 of the input device 102. In the following description, a camera (e.g., a Webcam C500 provided by Logitech of Fremont, Calif.) for capturing an image and/or video of the input device 102 is used as an example of the detector 104 for illustration purposes. In other embodiments, the detector 104 can also include an IR camera, a laser detector, a radio receiver, an ultrasonic transducer, and/or other suitable types of radio, image, and/or sound capturing components. Even though only one detector 104 is shown in
The output device 106 can be configured to provide textual, graphical, sound, and/or other suitable type of feedback to the user 101. For example, as shown in
The controller 118 can include a processor 120 coupled to a memory 122 and an input/output interface 124. The processor 120 can include a microprocessor, a field-programmable gate array, and/or other suitable logic processing component. The memory 122 can include volatile and/or nonvolatile computer readable media (e.g., ROM, RAM, magnetic disk storage media, optical storage media, flash memory devices, EEPROM, and/or other suitable non-transitory storage media) configured to store data received from, as well as instructions for, the processor 120. In one embodiment, both the data and instructions are stored in one computer readable medium. In other embodiments, the data may be stored in one medium (e.g., RAM), and the instructions may be stored in a different medium (e.g., EEPROM). The input/output interface 124 can include a driver for interfacing with a camera, display, touch screen, keyboard, track ball, gauge or dial, and/or other suitable types of input/output devices.
In certain embodiments, the controller 118 can be operatively coupled to the other components of the electronic system 100 via a hardwire communication link (e.g., a USB link, an Ethernet link, an RS232 link, etc.). In other embodiments, the controller 118 can be operatively coupled to the other components of the electronic system 100 via a wireless connection (e.g., a WIFI link, a Bluetooth link, etc.). In further embodiments, the controller 118 can be configured as an application specific integrated circuit, system-on-chip circuit, programmable logic controller, and/or other suitable computing framework.
In certain embodiments, the detector 104, the output device 106, and the controller 118 may be configured as a desktop computer, a laptop computer, a tablet computer, a smart phone, an electronic whiteboard, and/or other suitable types of computing devices. In other embodiments, the output device 106 may be at least a part of a television set. The detector 104 and/or the controller 118 may be integrated into or separate from the television set. In further embodiments, the controller 118 and the detector 104 may be configured as a unitary component (e.g., a game console, a camera, or a projector), and the output device 106 may include a television screen and/or other suitable displays. In additional embodiments, the input device 102, a computer storage medium storing instructions for the processor 120, and associated operational instructions may be configured as a kit. In yet further embodiments, the input device 102, the detector 104, the output device 106, and/or the controller 118 may be independent from one another or may have other suitable configurations.
The user 101 can operate the controller 118 in a touch free fashion by, for example, swinging, gesturing, and/or otherwise moving his/her finger with the input device 102. The electronic system 100 can monitor the user's finger movements and correlate the movements with computing commands from the user 101. The electronic system 100 can then execute the computing commands by, for example, moving the computer cursor 108 from a first position 109a to a second position 109b. One of ordinary skill in the art will understand that the discussion below is for illustration purposes only. The electronic system 100 can be configured to perform other operations in addition to or in lieu of the operations discussed below.
In operation, the controller 118 can instruct the detector 104 to start monitoring the marker 103 of the input device 102 for commands based on certain preset conditions. For example, in one embodiment, the controller 118 can instruct the detector 104 to start monitoring the signal 110 when the signal 110 emitted from the marker 103 is detected. In another example, the controller 118 can instruct the detector 104 to start monitoring the signal 110 when the controller 118 determines that the signal 110 is relatively stationary for a preset period of time (e.g., 0.1 second). In further examples, the controller 118 can instruct the detector 104 to start monitoring the signal 110 based on other suitable conditions.
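As an illustration of the stationarity condition, the following sketch (in Python) checks whether the marker centroid has stayed put for the preset period before monitoring begins; the sample format, hold time, and pixel tolerance are assumptions for illustration and are not part of the disclosure.

# Hypothetical sketch: begin monitoring only after the marker signal has been
# roughly stationary for a preset period (0.1 second in the example above).
def is_stationary(samples, hold_time=0.1, tolerance=2.0):
    """samples: list of (timestamp_seconds, x_pixel, y_pixel) for the marker centroid."""
    if not samples:
        return False
    t_last, x_last, y_last = samples[-1]
    # Look back over the most recent hold_time seconds.
    recent = [(t, x, y) for t, x, y in samples if t_last - t <= hold_time]
    if not recent or t_last - recent[0][0] < hold_time:
        return False  # not enough history yet
    # Stationary if every recent centroid stays within `tolerance` pixels of the latest one.
    return all(abs(x - x_last) <= tolerance and abs(y - y_last) <= tolerance
               for _, x, y in recent)

# Example: samples taken every 0.02 s with the marker essentially still.
history = [(0.02 * i, 320.0, 240.0) for i in range(10)]
print(is_stationary(history))  # True -> the controller could instruct the detector to start monitoring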
After the detector 104 starts to monitor the markers 103 on the input device 102, the processor 120 samples a captured image of the input device 102 from the detector 104 via the input/output interface 124. The processor 120 then performs image segmentation by identifying pixels and/or image segments in the captured image corresponding to the emitted signal 110. The identification may be based on pixel intensity and/or other suitable parameters.
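A minimal sketch of such intensity-based segmentation is given below; the grayscale frame layout and the threshold value are assumptions chosen only for illustration.

# Minimal sketch (assumed grayscale frame as a NumPy array): keep only pixels whose
# intensity exceeds a preset threshold as candidates for the emitted signal 110.
import numpy as np

def segment_by_intensity(frame, threshold=200):
    """Return the (row, col) coordinates of pixels brighter than `threshold`."""
    mask = frame >= threshold                 # True where the pixel is bright enough
    rows, cols = np.nonzero(mask)             # coordinates of candidate marker pixels
    return np.column_stack((rows, cols))      # shape (N, 2)

# Example with a synthetic 480x640 frame containing one bright spot.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:104, 200:204] = 255                 # simulated marker projection
print(segment_by_intensity(frame).shape)      # (16, 2)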
The processor 120 then identifies certain characteristics of the segmented image of the input device 102. For example, in one embodiment, the processor 120 can identify a number of observed markers 103 based on the segmented image. The processor 120 can also calculate a distance between individual pairs of markers 103 in the segmented image. In other examples, the processor 120 may also perform shape (e.g., a circle or oval) fitting based on the segmented image and the known configuration of the markers 103. In further examples, the processor 120 may perform other suitable analysis on the segmented image.
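The following sketch illustrates two of these characteristics, assuming the marker centroids have already been extracted from the segmented image; the data layout is hypothetical.

# Illustrative sketch: report the observed marker count and the distance between
# each pair of marker centroids in the segmented image.
from itertools import combinations
from math import dist

def image_characteristics(centroids):
    """centroids: list of (x, y) pixel positions, one per observed marker 103."""
    pair_distances = {
        (i, j): dist(a, b)
        for (i, a), (j, b) in combinations(enumerate(centroids), 2)
    }
    return {"marker_count": len(centroids), "pair_distances": pair_distances}

print(image_characteristics([(100.0, 50.0), (130.0, 90.0), (160.0, 50.0)]))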
The processor 120 then retrieves a predetermined pattern of the input device 102 from the memory 122. The predetermined pattern may include orientation and/or position parameters of the input device 102 calculated based on analytical models. For example, the predetermined pattern may include a number of observable markers 103, a distance between individual pairs of markers 103, and/or other parameters based on a known planar angle between the input device 102 and the detector 104. By comparing the identified characteristics of the segmented image with the retrieved predetermined pattern, the processor 120 can determine at least one of a possible orientation of the input device 102 and a current distance of the input device 102 from the detector 104.
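One way such a comparison could be organized is sketched below; the pattern table, the orientation angles, and the matching criterion are assumed for illustration and are not taken from the disclosure.

# Hypothetical sketch of the pattern comparison: predetermined patterns (assumed to be
# precomputed from an analytical model and stored in memory 122) map a candidate
# orientation to the marker count and inter-marker distance expected at that orientation.
PREDETERMINED_PATTERNS = {               # orientation (degrees) -> expected pattern
    0:  {"marker_count": 3, "pair_distance": 60.0},
    30: {"marker_count": 3, "pair_distance": 52.0},
    60: {"marker_count": 2, "pair_distance": 30.0},
}

def estimate_orientation(observed_count, observed_distance):
    candidates = [
        (abs(p["pair_distance"] - observed_distance), angle)
        for angle, p in PREDETERMINED_PATTERNS.items()
        if p["marker_count"] == observed_count     # marker count must match exactly
    ]
    return min(candidates)[1] if candidates else None

print(estimate_orientation(3, 53.5))   # -> 30 (closest matching pattern)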
The processor 120 then repeats the foregoing operations for a period of time (e.g., 0.5 seconds) and accumulates the determined orientation and/or distance in a buffer or other suitable computer memory. Based on the accumulated orientation and/or distance at multiple time points, the processor 120 can then construct a temporal trajectory of the input device 102 between those time points. The processor 120 then compares the constructed temporal trajectory to a trajectory action model (
Once the user action is determined, the processor 120 can map the determined user action to a control and/or other suitable type of operation. For example, in the illustrated embodiment, the processor 120 may map the generally linear swing of the index finger to a generally linear movement of the computer cursor 108. As a result, the processor 120 outputs a command to the output device 106 to move the computer cursor 108 from the first position 109a to the second position 109b.
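A simple sketch of such a mapping is shown below; the proportional gain and the coordinate conventions are assumptions chosen only to illustrate translating a finger swing into a cursor displacement.

# Minimal sketch (gain value assumed): map a generally linear finger swing, expressed as a
# displacement of the input device in detector coordinates, to a proportional movement of
# the computer cursor 108.
def map_swing_to_cursor(cursor_pos, device_displacement, gain=3.0):
    """cursor_pos, device_displacement: (x, y) tuples; returns the new cursor position."""
    dx, dy = device_displacement
    return (cursor_pos[0] + gain * dx, cursor_pos[1] + gain * dy)

first_position = (400, 300)                       # e.g., position 109a
second_position = map_swing_to_cursor(first_position, (25, -10))
print(second_position)                            # e.g., position 109b -> (475.0, 270.0)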
Several embodiments of the electronic system 100 can be more intuitive or natural to use than conventional input devices by recognizing and incorporating commonly accepted gestures. For example, a left or right shift of the computer cursor 108 can be produced by a left or right shift of the index finger of the user 101. Also, several embodiments of the electronic system 100 do not require rigid postures of the user 101 when operating the electronic system 100. Instead, the user 101 may operate the electronic system 100 in any posture comfortable to him/her with the input device 102 on his/her finger. In addition, several embodiments of the electronic system 100 can be more mobile than certain conventional input devices because operating the input device 102 does not require a hard surface or any other support.
In other embodiments, the input device 102 may include more or fewer markers 103 with other suitable arrangements, as shown in
In operation, the input module 132 can accept data input 150 (e.g., images from the detector 104 in
The process module 136 analyzes data input 150 from the input module 132 and/or other data sources, and the output module 138 generates output signals 152 based on the analyzed data input 150. The processor 120 may include the display module 140 for displaying, printing, or downloading the data input 150, the output signals 152, and/or other information via the output device 106 (
The sensing module 160 is configured to receive the data input 150 and identify the marker 103 (
In one embodiment, the sensing module 160 includes a comparison routine that compares light intensity values of the individual pixels with a preset threshold. If a light intensity is above the preset threshold, the sensing module 160 can indicate that the pixel corresponds to one of the markers 103. In another embodiment, the sensing module 160 may include a shape determining routine configured to approximate or identify a shape of the segmented pixels in the still image. If the approximated or identified shape matches a preset shape of the markers 103, the sensing module 160 can indicate that the pixels correspond to the markers 103.
In yet another embodiment, the sensing module 160 can include a filtering routine configured to identify pixels with a particular color index, peak frequency, average frequency, and/or other suitable spectral characteristics. If the filtered spectral characteristics match a preset value of the markers 103, the sensing module 160 can indicate that the pixels correspond to the markers 103. In further embodiments, the sensing module 160 may include a combination of at least some of the comparison routine, the shape determining routine, the filtering routine, and/or other suitable routines.
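The sketch below illustrates, under assumed thresholds, two of the routines named above: a rough circularity test standing in for the shape determining routine and an average-hue test standing in for the filtering routine.

# Illustrative sketch (thresholds and hue encoding assumed): a shape check that tests
# whether segmented pixels fill a roughly circular region, and a color filter that tests
# whether their average hue matches the preset hue of the markers 103.
import numpy as np

def looks_circular(coords, min_fill=0.6):
    """coords: (N, 2) array of segmented pixel coordinates for one candidate marker."""
    span = coords.max(axis=0) - coords.min(axis=0) + 1
    radius = max(span) / 2.0
    fill = len(coords) / (np.pi * radius ** 2)   # rough fraction of the bounding circle covered
    return fill >= min_fill

def matches_marker_color(hues, target_hue=0.33, tol=0.05):
    """hues: per-pixel hue values in [0, 1]; target_hue and tol are preset for the marker."""
    return abs(float(np.mean(hues)) - target_hue) <= tol

coords = np.argwhere(np.ones((9, 9), dtype=bool))   # a solid 9x9 blob, roughly circular
print(looks_circular(coords), matches_marker_color(np.full(81, 0.34)))   # True True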
The calculation module 166 may include routines configured to perform various types of calculations to facilitate operation of other modules. For example, the calculation module 166 can include a sampling routine configured to sample the data input 150 at regular time intervals along preset directions. In certain embodiments, the sampling routine can include linear or non-linear interpolation, extrapolation, and/or other suitable subroutines configured to generate a set of data, images, and/or frames from the detector 104 (
The calculation module 166 can also include a modeling routine configured to determine an orientation of the input device 102 relative to the detector 104. In certain embodiments, the modeling routine can include subroutines configured to determine and/or calculate parameters of the segmented image. For example, the modeling routine may include subroutines to determine a quantity of markers 103 in the segmented image. In another example, the modeling routine may also include subroutines that calculate a distance between individual pairs of the markers 103.
In another example, the calculation module 166 can also include a trajectory routine configured to form a temporal trajectory of the input device 102. In one embodiment, the calculation module 166 is configured to calculate a vector representing a movement of the input device 102 from a first position/orientation at a first time point to a second position/orientation at a second time point. In another embodiment, the calculation module 166 is configured to calculate a vector array or plot a trajectory of the input device 102 based on multiple positions/orientations at various time points. In other embodiments, the calculation module 166 can include linear regression, polynomial regression, interpolation, extrapolation, and/or other suitable subroutines to derive a formula and/or other suitable representation of movements of the input device 102. In yet other embodiments, the calculation module 166 can include routines to compute a travel distance, travel direction, velocity profile, and/or other suitable characteristics of the temporal trajectory. In further embodiments, the calculation module 166 can also include counters, timers, and/or other suitable routines to facilitate operation of other modules.
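A compact sketch of such a trajectory routine is given below; it assumes two-dimensional, timestamped position samples and derives per-step vectors, the total travel distance, and a simple velocity profile.

# Minimal sketch of the trajectory routine (2-D positions assumed).
from math import hypot

def temporal_trajectory(samples):
    """samples: list of (t, x, y) for the input device at successive time points."""
    steps = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dx, dy, dt = x1 - x0, y1 - y0, t1 - t0
        steps.append({"vector": (dx, dy), "speed": hypot(dx, dy) / dt})
    return {
        "vectors": [s["vector"] for s in steps],
        "travel_distance": sum(hypot(*s["vector"]) for s in steps),
        "velocity_profile": [s["speed"] for s in steps],
    }

print(temporal_trajectory([(0.0, 0, 0), (0.1, 3, 4), (0.2, 6, 8)]))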
The analysis module 162 can be configured to analyze the calculated temporal trajectory of the input device 102 to determine a corresponding user action or gesture. In certain embodiments, the analysis module 162 analyzes characteristics of the calculated temporal trajectory and compares the characteristics to the action model 142. For example, in one embodiment, the analysis module 162 can compare a travel distance, travel direction, velocity profile, and/or other suitable characteristics of the temporal trajectory to known actions or gestures in the action model 142. If a match is found, the analysis module 162 is configured to indicate the particular user action or gesture identified.
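For illustration, the comparison against the action model 142 might be organized as sketched below; the gesture entries, direction encoding, and matching thresholds are assumptions rather than details of the disclosure.

# Hypothetical sketch of matching trajectory characteristics against an action model:
# each known gesture specifies a dominant travel direction and a minimum travel distance.
ACTION_MODEL = [
    {"gesture": "swipe_right", "direction": (1, 0),  "min_distance": 40.0},
    {"gesture": "swipe_left",  "direction": (-1, 0), "min_distance": 40.0},
    {"gesture": "swipe_up",    "direction": (0, -1), "min_distance": 40.0},
]

def match_gesture(net_displacement, travel_distance):
    dx, dy = net_displacement
    for entry in ACTION_MODEL:
        ex, ey = entry["direction"]
        aligned = (dx * ex + dy * ey) > 0.8 * travel_distance   # mostly along the expected direction
        if aligned and travel_distance >= entry["min_distance"]:
            return entry["gesture"]
    return None   # no match found

print(match_gesture((55.0, 3.0), 56.0))   # -> "swipe_right"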
The analysis module 162 can also be configured to correlate the identified user action or gesture to a control action based on the action-command map 144. For example, if the identified user action is a lateral move from left to right, the analysis module 162 may correlate the action to a lateral cursor shift from left to right, as shown in
The control module 164 may be configured to control the operation of the controller 118 (
As shown in
Another stage 204 of the method 200 includes processing the acquired input data to identify a temporal trajectory of the input device 102. In one embodiment, the identified temporal trajectory includes a vector representing a movement of the input device 102. In other embodiments, the identified temporal trajectory includes a vector array that describes positions and orientations of the input device 102 at different time points. In further embodiments, the identified temporal trajectory can include other suitable representations of movements of the input device 102. Certain embodiments of processing the acquired input data are described in more detail below with reference to
The method 200 then includes a decision stage 206 to determine if sufficient data are available. In one embodiment, sufficient data are indicated if the processed input data exceed a preset threshold. In another embodiment, sufficient data are indicated after a preset period of time (e.g., 0.5 seconds) has elapsed. In further embodiments, sufficient data may be indicated based on other suitable criteria. If sufficient data are not indicated, the process reverts to acquiring the detector signal at stage 202; otherwise, the process proceeds to interpreting user action based on the identified temporal trajectory of the input device 102 at stage 208.
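The loop formed by stages 202 through 208 might be organized as sketched below; the callable names and the specific thresholds are placeholders, not part of the method 200 as disclosed.

# Illustrative sketch of the acquire/process/decide loop: keep acquiring and processing
# detector data until either enough trajectory points or a preset period has accumulated.
import time

def run_until_sufficient(acquire, process, min_points=10, max_period=0.5):
    trajectory, start = [], time.monotonic()
    while True:
        trajectory.append(process(acquire()))                 # stages 202 and 204
        enough_points = len(trajectory) >= min_points         # data-threshold criterion
        enough_time = time.monotonic() - start >= max_period  # elapsed-time criterion
        if enough_points or enough_time:                      # decision stage 206
            return trajectory                                  # ready for interpretation at stage 208

# Example with stand-in acquire/process callables.
samples = run_until_sufficient(acquire=lambda: (0.0, 0.0), process=lambda frame: frame)
print(len(samples))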
In certain embodiments, interpreting user action includes analyzing and comparing characteristics of the temporal trajectory with known user actions. For example, a position, position change, lateral movement, vertical movement, movement velocity, and/or other characteristics of the temporal trajectory may be calculated and compared with a predetermined action model. Based on the comparison, a user action may be indicated if characteristics of the temporal trajectory match those in the action model. An example of interpreting user action is described in more detail below with reference to
The method 200 further includes another stage 210 in which the identified user action is mapped to a command. The method 200 then includes a decision stage 212 to determine if the process should continue. In one embodiment, the process is continued if further movement of the input device 102 is detected. In other embodiments, the process may be continued based on other suitable criteria. If the process is continued, the process reverts to acquiring the detector signal at stage 202; otherwise, the process ends.
Another stage 221 of the method 204 includes modeling the segmented image to determine at least one of an orientation and position of the input device 102 (
Optionally, the process can also include signal sampling at stage 222. In one embodiment, the models (e.g., position and/or orientation) of the input device 102 generated based on the acquired input data are sampled at regular time intervals along the x-, y-, and/or z-directions by applying linear interpolation, extrapolation, and/or other suitable techniques. In other embodiments, the image model of the acquired detector signal is sampled at other suitable time intervals. In further embodiments, the signal sampling stage 222 may be omitted. After the optional signal sampling, the process returns to the method 200 of
The acquired image of the input device 102 at time ti is then segmented to identify pixels or image segments Pti
As described above with reference to
B=D*bi/di
where bi is an observed distance between two marker projections; D is the predetermined distance between the center of the input device 102 and the detector 104; and di is a predetermined distance between two marker projections.
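A short numeric sketch applying the relation exactly as stated above is given below; the example values are arbitrary and for illustration only.

# Worked sketch of the relation above, as stated: B = D * bi / di.
def estimate_B(D, bi, di):
    """D: predetermined distance between the center of the input device and the detector;
    bi: observed distance between two marker projections;
    di: predetermined distance between two marker projections."""
    return D * bi / di

print(estimate_B(D=100.0, bi=24.0, di=30.0))   # -> 80.0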
The foregoing operations can be repeated to form a temporal trajectory that can be interpreted as certain command and/or data input.
In one embodiment, if all of the first, second, and third characteristics of the temporal trajectory are identified, the user action may be recognized as a click, a selection, a double click, and/or other suitable commands. In other embodiments, only some of the first, second, and third characteristics may be used to correlate to a command. In further embodiments, at least one of these characteristics may be used in combination with other suitable characteristics to correlate to a command.
Even though the electronic system 100 in
The individual input devices 102 may operate independently from one another or may be used in combination to provide commands to the electronic system 100. For example, in one embodiment, the electronic system 100 may recognize that the first and second input devices 102a and 102b are joined together in a closing gesture. In response, the electronic system 100 may correlate the closing gesture to a command to close a program, to a click, or to other suitable operations. In other embodiments, the individual input devices 102 may have corresponding designated functions. For example, the electronic system 100 may recognize movements of only the second input device 102b as a cursor shift. In further embodiments, the input devices 102 may operate in other suitable fashions. In yet further embodiments, the user 101 (
From the foregoing, it will be appreciated that specific embodiments of the disclosure have been described herein for purposes of illustration, but that various modifications may be made without deviating from the disclosure. In addition, many of the elements of one embodiment may be combined with other embodiments in addition to or in lieu of the elements of the other embodiments. Accordingly, the technology is not limited except as by the appended claims.
This application claims priority to U.S. Provisional Application No. 61/517,159, filed on Apr. 15, 2011.