The subject disclosure relates to user interaction with computing devices, and more particularly, to body-based user interaction with mobile devices.
Mobile devices, such as cell phones, portable media players, personal digital assistants (PDAs), messaging devices, and portable game players, are ubiquitous. Many users carry multiple mobile devices with them. Input devices for these mobile devices, however, are usually inadequate for a number of reasons.
With traditional keypads, there is a tradeoff between the size and the number of buttons. For example, in today's cell phones, at one end of the spectrum, the desire for a palm-sized device with a full QWERTY keypad drives the size of buttons to the limit of smallness, whereas at the other end of the spectrum, character groupings for buttons (e.g., ABC->1) save space and allow for bigger buttons, but may require tedious multiple presses of a button to input a single character precisely.
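The multiple-press burden of grouped keypads described above can be sketched in code. The following is an illustrative sketch only (not part of the specification); the key groupings follow the common telephone letter layout, and the function name is a hypothetical choice.

```python
# Hypothetical sketch of multi-tap text entry on a grouped keypad:
# each extra press of the same key cycles to the next letter in the
# group, which is why grouped keypads need repeated presses to reach
# a single character. Groupings follow the common phone letter layout.

KEYPAD = {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}

def decode_multitap(presses):
    """Decode a sequence of (key, press_count) pairs into characters."""
    out = []
    for key, count in presses:
        letters = KEYPAD[key]
        out.append(letters[(count - 1) % len(letters)])
    return "".join(out)
```

For example, entering "HI" requires five presses in total (two presses of "4", then three presses of "4"), illustrating the input-precision cost the text describes.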
Moreover, since keypads are often integrated into the devices, user input often cannot be personalized for a particular user of the device. Therefore, people with limited eyesight, for instance, often have problems finding devices with large buttons to meet their needs. In addition, buttons on the embedded keypad often cannot be reassigned by the user to perform user-defined functions.
Audio input and commands also suffer from disadvantages relating to precision of input, and may not be intuitive across different languages or dialects. Furthermore, audio input suffers from a lack of privacy and the potential to control other nearby mobile devices within earshot. Thus, audio input is inappropriate in a number of different environments (e.g., a public bus, office cubicles).
In addition, today's mobile devices usually do not share their user input devices with other computing devices, such as the user's other mobile devices. For example, a user's portable media player is often unaware that a user is making or receiving a call on the user's cell phone. Thus, improved ways to interact with mobile devices are desirable.
The above-described deficiencies of interacting with mobile devices are merely intended to provide an overview of some of the problems of interacting with today's mobile devices, and are not intended to be exhaustive. Other problems with the state of the art may become further apparent upon review of the description of various non-limiting embodiments that follows.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is intended to neither identify key or critical elements of the invention nor delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented later.
Body movements or other body information are captured as input to one or more mobile devices, such as a cell phone. This type of interaction can be used instead of, or as a supplement to, traditional keypad and audio interaction with a mobile device. The sensors can include orientation sensors that track movement of various parts of the body, which movement can be used to make input to the one or more mobile devices.
In various embodiments, sensors are placed on or proximate to one or more places of the body (e.g., fingers, legs, torso, head, etc.), capturing some or all of a user's body movements and conditions. The body movements sensed by the sensors can be used as input to one or more mobile devices for a variety of tasks, such as making finger graffiti in the air to dial phone calls, playing games such as boxing that sense when a punch is thrown, receiving sports instruction (e.g., how to improve a golf swing or a bowling ball roll), and a variety of other tasks and services as well.
The sensors are advantageously wirelessly coupled to one or more mobile devices, although some or all of the sensors can be connected by wire to the mobile device. The sensors can also advantageously be included as part of jewelry (e.g., rings, watches, bracelets, necklaces, pins, cuff links, etc.) or other fashion accessories (e.g., shoes, belts, foot bands, socks, pocket squares, head bands, hats, etc.).
In one embodiment, as mobile devices become connected to other autonomous computing devices and networks, certain gestures by the user can also initiate a series of actions and/or communications that cause an external response. For instance, one might drive up to their garage door, and through a body movement gesture captured by body sensors communicatively coupled to the mobile device, initiate a sequence of operations that opens the garage door.
In one embodiment, one or more ring sensors for sensing finger movement are particularly advantageous because many people wear rings today, and the ring sensors can be used to determine what input is being conveyed for a virtual input device. For example, the ring sensors can determine which keys of a virtual “air” keyboard a user is pressing or where to move the cursor for a virtual mouse or trackball.
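One way the ring-sensor-to-virtual-key mapping could work is sketched below. This is an assumption-laden illustration, not the specification's method: it assumes the ring sensors resolve a fingertip position in the plane of the virtual keyboard, and that the keyboard is a uniform grid with an assumed key pitch.

```python
# Illustrative sketch (not from the specification): mapping a fingertip
# position reported by a ring sensor onto a key of a virtual "air"
# keyboard laid out as a uniform grid. The key pitch is an assumption.

KEY_ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
KEY_SIZE = 0.02  # assumed key pitch in meters

def key_at(x, y):
    """Return the virtual key under fingertip position (x, y), with
    (0, 0) at the keyboard's top-left corner, or None if off-keyboard."""
    row = int(y // KEY_SIZE)
    col = int(x // KEY_SIZE)
    if 0 <= row < len(KEY_ROWS) and 0 <= col < len(KEY_ROWS[row]):
        return KEY_ROWS[row][col]
    return None
```

A real implementation would additionally need a press gesture (e.g., a downward fingertip motion) to distinguish hovering over a key from pressing it.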
In at least some embodiments, the sensor array can be turned on/off, such as automatically depending on the environment, for privacy reasons and/or to prevent undesired actions from being performed. For example, the array can be automatically turned off when the user enters a bathroom, if a sensor is available to detect the user's presence in a bathroom. Similarly, body-based movement interaction can be turned off when it is detected that the user is performing an activity that involves body movement unrelated to input to the mobile device (e.g., driving, dancing, playing an instrument, etc.).
To the accomplishment of the foregoing and related ends, certain illustrative aspects of the invention are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the invention may be employed and the present invention is intended to include all such aspects and their equivalents. Other advantages and novel features of the invention may become apparent from the following detailed description of the invention when considered in conjunction with the drawings.
The present invention is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the present invention.
As discussed in the background, today's input devices for cell phones are inadequate due to the technological and engineering tradeoffs involved. In consideration of these limitations on input techniques for mobile devices, the invention captures body movements or other body information as input to a mobile device, such as a cell phone. In various embodiments, sensors are placed on one or more places of the body (e.g., fingers, legs, torso, head, etc.), capturing some or all of a user's body movements and conditions. The body movement sensed by the sensors can be used as input to a mobile device for a variety of tasks, such as making finger graffiti in the air to dial phone calls, playing games such as boxing that sense when a punch is thrown, receiving sports instruction (e.g., how to improve a golf swing), or communicating a need to drink more fluids, and a variety of other tasks and services as well.
Turning to
With reference to
Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile as well as removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
The system memory 130 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) and/or random access memory (RAM). A basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, may be stored in memory 130. Memory 130 typically also contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, memory 130 may also include an operating system, application programs, other program modules, and program data.
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. For example, computer 110 could include a flash memory that reads from or writes to non-removable, nonvolatile media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk, such as a CD-ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, digital versatile disks, digital video tape, solid state RAM, solid state ROM and the like.
A user may enter commands and information into the computer 110 through input devices. Input devices are often connected to the processing unit 120 through user input 140 and associated interface(s) that are coupled to the system bus 121, but may be connected by other interface and bus structures, in a wired or wireless manner, such as a parallel port, game port, universal serial bus (USB), wireless USB, or Bluetooth. A graphics subsystem may also be connected to the system bus 121. One or more remote sensors, including orientation sensors 145, are also connected to system bus 121 via input interface 140. At least one of the sensors is attached or proximate to the user's body, and each sensor is communicatively coupled to the computer via wired or wireless means. A monitor or other type of remote output device 155 may also be connected to the system bus 121 via an interface, such as output interface 150, which may in turn communicate with video memory. In addition to a monitor, computer 110 may also include other peripheral output devices, which may be connected through output interface 150.
The computer 110 may operate in a networked or distributed environment using logical connections to one or more other remote computers, such as remote computer 170, which may in turn have capabilities different from device 110. The logical connections depicted in
When used in a PAN networking environment, the computer 110 is connected to the PAN through a network interface or adapter, such as a Bluetooth or Wireless USB adapter. When used in a LAN networking environment, the computer 110 is connected to the LAN through a network interface or adapter. When used in a WAN networking environment, the computer 110 typically includes a communications component, such as a modem, or other means for establishing communications over the WAN, such as the Internet. A communications component, such as a network interface card, which may be internal or external, wired or wireless, may be connected to the system bus 121 via the user input interface of input 140, or other appropriate mechanism. In a networked environment, program modules, or portions thereof, may be stored in a remote memory storage device.
Otherwise, device 210 continues to monitor the input from the sensors S1 to S7. In one embodiment, the sensors include orientation sensors that track movement of various parts of the body, which movement can be used to make input to the mobile device. The device 210 can be a wearable computer or a more traditional handheld mobile device that is carried in a purse, pocket, or belt holder/holster.
In at least one embodiment, one or more body-movement sensors are shared between multiple mobile devices. Advantageously, this allows each mobile device to respond appropriately based on the user's movement. For example, assume that a user has both a cell phone and a portable video player. When body movements indicate the user is answering the cell phone, the portable video player can be paused. Once the user hangs up the cell phone, the portable video player can resume playback. One will appreciate that this functionality can be used in other scenarios and with different device pairs, such as a portable media player and a portable game player, a cell phone and a portable game player, etc.
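The call-aware pause/resume behavior described above can be sketched as a small state machine. This is a minimal illustration under assumed names; the class, the event strings, and the idea that body-movement recognition emits discrete events like "answer_call" are all assumptions, not details from the specification.

```python
# Hedged sketch of the shared-sensor scenario: when body movements
# indicate the user is answering a call, the media player pauses;
# when the user hangs up, playback resumes. Event names are assumed.

class MediaPlayer:
    def __init__(self):
        self.playing = False
        self._paused_for_call = False  # remember why we paused

    def play(self):
        self.playing = True

    def on_body_event(self, event):
        if event == "answer_call" and self.playing:
            self.playing = False
            self._paused_for_call = True
        elif event == "hang_up" and self._paused_for_call:
            # only auto-resume if we were the ones who auto-paused
            self.playing = True
            self._paused_for_call = False
```

Tracking *why* playback paused (`_paused_for_call`) matters: if the user had already stopped playback manually, hanging up the phone should not spuriously restart it.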
Sensors that sense body position or other user information can advantageously be placed in or attached to jewelry, such as rings, watches, or necklaces; fashion accessories, such as belts, headbands, or hats; or garments, such as shoes. Accordingly, there may be no need to look for or hold a special device when body-based movement is used for input.
One will also appreciate that one or more sensors can be embedded or attached to traditional input/output devices for mobile devices that are worn by the user. For example, one or more sensors can be embedded into a Bluetooth or other wired/wireless headset or wireless/wired headphones.
As mobile devices become connected to other autonomous computing devices and networks, certain gestures by the user can also initiate a series of actions and/or communications that cause an external response. For instance, one might drive up to their garage door, and through a body movement gesture captured by body sensors communicatively coupled to the mobile device, initiate a sequence of operations that opens the garage door. As a second example, one might operate remote-controlled devices (e.g., television, DVR, DVD, VCR, stereo, set-top box, etc.) through a body movement gesture captured by body sensors communicatively coupled to the mobile device and initiate a sequence of operations on a nearby universal remote that operates a remote-controlled device.
This aspect is illustrated in
For example, the mobile device can interpret pointing at an appliance or non-portable computing device as a desire to control the appliance or device. Combining the orientation of a finger/hand with a map (e.g., a 3D map) of nearby objects, one can activate/operate the appliance or computing device with a natural pointing gesture. For example, one can point to an air conditioner thermostat at a distance to turn the air conditioning on/off in an indoor environment even without a specific remote controller at hand. This communication can include the command to perform and may also include identifying information for the mobile device 510.
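The pointing interaction described above — combining finger/hand orientation with a map of nearby objects — amounts to a ray-versus-object-map query. The sketch below is an illustrative assumption of how that query might look: it casts a ray from the hand along the finger direction and selects the mapped appliance closest to the ray. The appliance names and coordinates are invented for illustration.

```python
# Illustrative sketch of selecting an appliance by pointing: cast a
# ray from the hand position along the finger's orientation and pick
# the mapped appliance whose position lies closest to that ray.
# The appliance map (names, 3D coordinates) is an assumed example.

import math

APPLIANCE_MAP = {
    "thermostat": (2.0, 1.5, 0.0),
    "television": (0.0, 1.0, 3.0),
}

def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _norm(a): return math.sqrt(_dot(a, a))

def pointed_appliance(hand_pos, finger_dir):
    """Return the name of the appliance closest to the pointing ray,
    ignoring appliances behind the hand."""
    best, best_dist = None, float("inf")
    for name, pos in APPLIANCE_MAP.items():
        to_obj = _sub(pos, hand_pos)
        # parameter of the closest point on the ray to the appliance
        t = _dot(to_obj, finger_dir) / _dot(finger_dir, finger_dir)
        if t < 0:  # appliance is behind the hand
            continue
        closest = tuple(h + t * d for h, d in zip(hand_pos, finger_dir))
        dist = _norm(_sub(pos, closest))
        if dist < best_dist:
            best, best_dist = name, dist
    return best
```

A practical system would also threshold `best_dist` so that pointing at empty space selects nothing, and would obtain the hand position and finger direction from the body sensors' orientation data.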
The communicative coupling of the sensors to the mobile device can be different than the communicative coupling of the mobile device 510 to either the appliance or the non-portable computing device. For example, the sensors can be connected to the mobile device via Bluetooth and the mobile device to the appliance via wireless Ethernet.
Finger or arm movement sensors communicatively coupled to a mobile device are conducive to recognizing such unique hand gestures. In this manner, multiple sensors for different body locations can together form a single input to a mobile device. As a simple example, doing “jumping jacks” requires movement of both the arms and legs in a certain harmonic manner, which could be detected as a unique gesture input to the mobile device by positioning sensors on both the arms and legs. In this fashion, unique gestures may be defined by a user of the mobile device for performing common actions with the mobile device, such as “Call my friend Lee.”
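The "jumping jacks" example above — fusing sensors at different body locations into a single gesture — could be sketched as follows. This is a simplified assumption: it supposes each limb's sensor stream has already been reduced to an oscillation frequency, and the frequency band and tolerance are illustrative values, not figures from the specification.

```python
# Hedged sketch of multi-sensor gesture fusion: a "jumping jack" is
# recognized only when arm and leg sensors both report periodic
# motion in a plausible band and at roughly the same frequency
# (the "certain harmonic manner" described in the text). All
# thresholds are illustrative assumptions.

def detect_jumping_jacks(arm_freq_hz, leg_freq_hz,
                         min_hz=0.5, max_hz=3.0, tol_hz=0.3):
    """Return True when both limbs oscillate in the assumed
    jumping-jack band and their frequencies roughly agree."""
    in_band = (min_hz <= arm_freq_hz <= max_hz and
               min_hz <= leg_freq_hz <= max_hz)
    harmonic = abs(arm_freq_hz - leg_freq_hz) <= tol_hz
    return in_band and harmonic
```

The key design point, matching the text, is that neither sensor alone suffices: arm motion without matching leg motion (or vice versa) does not register as the gesture.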
One will appreciate however, that other body gestures can be used as well. For example, actions can be initiated based on a number of taps of fingers, feet, or hands. Some body gestures may involve the use of two different body areas, such as the “jumping jacks” movement above.
In addition, the gesture-based input language can be customized to the user's needs. For example, a person lacking one or more fingers may choose to use leg gestures instead of hand or finger gestures for gesture-based input. As a second example, a handicapped person may use a standardized sign language as a gesture-based input language.
One or more ring sensors for sensing finger movement are particularly advantageous because many people wear rings today, and the ring sensors can be used to determine, for example, which keys of a virtual “air” keyboard a user is pressing. Key input of a mobile device is thus advantageously complemented or replaced by the supplemental mobile device input sensors of the invention.
Although
One will also appreciate that although a virtual input device is described as being displayed to the user on an output display device, in other embodiments, the virtual input device can be constructed from a paper printout of an input device if the layout is previously known to the mobile device or can be automatically determined (e.g., via picture recognition or via a barcode or other machine-readable tag). Accordingly, virtual keypads of any size, shape, or button configuration can be created.
Turning briefly to
At 1110, the user's current environment is determined based on one or more sensors, such as global positioning sensors or sensors that read location-specific identifiers (e.g., RFID tags or wireless network gateway MAC addresses, etc.). This can be performed periodically, such as once every minute, or be determined based on the amount or nature of body movements. At 1120, it is determined if the current user environment is appropriate for body-based movement. At 1130, depending on the current state of body-based movement input and the determination made at 1120, body-based movement is turned on/off as appropriate.
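The flow of steps 1110 through 1130 can be sketched in code. This is a minimal illustration under stated assumptions: the set of environments where body-based input is inappropriate is an invented example, and environment detection itself (step 1110) is assumed to have already produced a label from the location sensors.

```python
# Illustrative sketch of steps 1120 and 1130: given the environment
# label determined at step 1110 (e.g., from GPS or RFID/location
# sensors), decide whether body-based input is appropriate and
# toggle it accordingly. The blocked set is an assumed example.

BLOCKED_ENVIRONMENTS = {"bathroom", "driving", "dancing"}

def update_body_input(environment, currently_enabled):
    """Return the new enabled/disabled state of body-based input
    for the detected environment."""
    appropriate = environment not in BLOCKED_ENVIRONMENTS  # step 1120
    if appropriate != currently_enabled:
        return appropriate  # step 1130: toggle on/off as needed
    return currently_enabled  # already in the right state
```

As the text notes, this check could run periodically (e.g., once a minute) or be triggered by the amount or nature of observed body movement.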
The present invention has been described herein by way of examples. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, for the avoidance of doubt, such terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
Various implementations of the invention described herein may have aspects that are wholly in hardware, partly in hardware and partly in software, as well as in software. As used herein, the terms “component,” “system” and the like are likewise intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on computer and the computer can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Furthermore, the invention may be described in the general context of computer-executable instructions, such as program modules, executed by one or more components. Generally, program modules include routines, programs, objects, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments. Furthermore, as will be appreciated, various portions of the disclosed systems above and methods below may include or consist of artificial intelligence or knowledge- or rule-based components, sub-components, processes, means, methodologies, or mechanisms (e.g., support vector machines, neural networks, expert systems, Bayesian belief networks, fuzzy logic, data fusion engines, classifiers, etc.). Such components, inter alia, can automate certain mechanisms or processes performed thereby to make portions of the systems and methods more adaptive as well as efficient and intelligent.
Additionally, the disclosed subject matter may be implemented at least partially as a system, method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer or processor based device to implement aspects detailed herein. The terms “article of manufacture,” “computer program product” or similar terms, where used herein, are intended to encompass a computer program accessible from any computer-readable device, carrier, or media. For example, computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick). Additionally, it is known that a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN).
The aforementioned systems have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and/or additional components, according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components, e.g., according to a hierarchical arrangement. Additionally, it should be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality. Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.
This non-provisional application claims benefit under 35 U.S.C. §119(e) of U.S. provisional Application No. 60/910,109, filed Apr. 4, 2007.