The present invention relates to electronic devices and, more particularly, to user interfaces for electronic devices, and methods and computer program products for providing user interfaces for electronic devices.
Many electronic devices, such as wireless communication terminals (e.g., cellular telephones), personal digital assistants (PDAs), palmtop computers, and the like, include monochrome and/or color display screens that may be used to display webpages, images and videos, among other things. Portable electronic devices may also include Internet browser software that is configured to access and display Internet content. Thus, these devices can access a wide range of information content, including information content stored locally and/or information content accessible over a network such as the Internet.
As with conventional desktop and laptop computers, portable electronic devices have been provided with graphical user interfaces that allow users to manipulate programs and files using graphical objects, such as screen icons. Selection of graphical objects on a display screen of a portable electronic device can be cumbersome and difficult, however. Early devices with graphical user interfaces typically used directional keys and a selection key that allowed users to highlight and select a desired object. Such interfaces can be slow and cumbersome to use, as several button presses may be required to highlight and select a desired object.
More recent devices have employed touch sensitive screens that permit a user to select a desired object by pressing the location on the screen at which the object is displayed. However, such devices have certain drawbacks in practice. For example, the digitizer of a touch screen can “drift” over time, so that the touch screen can misinterpret the location at which the screen was touched. Thus, touch screens may have to be recalibrated on a regular basis to ensure that the digitizer is properly interpreting the location of touches.
Furthermore, while the spatial resolution of a touch screen can be relatively high, users typically want to interact with a touch screen by touching it with a fingertip. Thus, the size of a user's fingertip limits the actual available resolution of the touch screen, which means that it can be difficult to manipulate small objects or icons on the screen, particularly for users with large hands. Furthermore, when using a touch screen, the user's finger can undesirably block all or part of the display in the area being touched. System designers are faced with the task of designing interfaces that can be used by a large number of people, and thus may design interfaces with icons larger than necessary for most users. Better touch resolution can be obtained by using a stylus instead of a fingertip. However, users may not want to have to use a separate instrument, such as a stylus, to interact with their device.
Some attempts have been made to provide alternate means of interacting with display screens. For example, attempts have been made to use cameras to image hand gestures that are interpreted as commands. One such approach uses a camera to recognize when a thumb and forefinger have been joined together, thus creating a new “object” (i.e. the oval region bounded by the user's thumb, forefinger and hand) in the display field. However, in this approach, unless a new “object” is created in the display field, no recognition or control occurs.
Yet another approach uses a touch pad on the back side of a device, opposite the display screen, which a user can touch to select icons on the display screen. A camera positioned away from an electronic device images the user's fingers on the touch pad. The image of the user's fingers is superimposed onto the display screen. However, the resolution of such a system is still limited by the size of the user's fingertip.
An electronic device according to some embodiments includes a display screen, a controller that is coupled to the display screen and that is configured to display an object on the display screen and to superimpose a moving picture of a pointing object that may be external to the electronic device onto the display screen, and a user input management unit that is coupled to the controller and that is configured to interpret a plurality of features of the pointing object as selection pointers so that a movement of the pointing object relative to the display screen is interpreted by the user input management unit as movement of a plurality of selection pointers.
The user input management unit may be configured to interpret movement of the plurality of selection pointers relative to one another as a selection command.
The user input management unit may be configured to interpret movement of two of the plurality of selection pointers into contact with each other as a selection command. In the case where a user's fingers are used as selection pointers, more than one finger can be used to generate a selection command. For example, a circle formed by the user's index finger and thumb can be used as a selection object. Furthermore, multiple fingertips can be interpreted as defining a selection area. The distance of a selection pointer from the camera (i.e. along the z-axis) can also be used to interpret a selection command. For example, a “button push” selection command can be recognized when the selection pointer is moved toward or away from the camera. It will be further appreciated that a “selection command” can be an intermediate command. For example, a selection command can open a pop-up menu or selection window that permits the user to make a further selection.
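By way of a non-limiting illustration, the pinch-style selection command described above can be reduced to a distance test between tracked fingertip coordinates. The following minimal Python sketch assumes that fingertip positions in screen coordinates are already produced by an upstream image-tracking step; names such as PINCH_THRESHOLD_PX and on_selection_command are illustrative assumptions rather than elements of any embodiment.

```python
import math

# Illustrative threshold (in pixels) below which two selection
# pointers are considered to have moved into contact.
PINCH_THRESHOLD_PX = 12.0

def is_pinch(pointer_a, pointer_b, threshold=PINCH_THRESHOLD_PX):
    """Return True when two selection pointers have moved into contact."""
    return math.dist(pointer_a, pointer_b) <= threshold

def on_selection_command(position):
    # Placeholder action: a real handler might open a pop-up menu or
    # selection window, as described above.
    print(f"selection command at {position}")

# Example: fingertip coordinates for two successive frames.
frames = [((100.0, 80.0), (160.0, 90.0)),   # fingers apart: no command
          ((118.0, 96.0), (124.0, 99.0))]   # fingers pinched: command

for index_tip, thumb_tip in frames:
    if is_pinch(index_tip, thumb_tip):
        midpoint = ((index_tip[0] + thumb_tip[0]) / 2,
                    (index_tip[1] + thumb_tip[1]) / 2)
        on_selection_command(midpoint)
```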
The user input management unit may be configured to interpret magnification of the pointing object as a zoom command, and to increase a magnification of an image on the display screen in response to magnification of the pointing object.
The controller may be configured to magnify a portion of the object on the display screen in response to a selection region between the selection pointers being moved over the object on the display screen. A magnification level of the portion of the object on the display screen may be determined in response to a spacing between two of the selection pointers.
Portions of the display screen around the object may be magnified, and a level of magnification of portions of the display screen around the object may be proportional to a distance from the object.
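As a rough, hypothetical parameterization of such a magnifier, the sketch below derives a magnification level from the spacing between two selection pointers and applies a falloff so that the magnification of surrounding portions of the screen varies with their distance from the object. All constants and mapping functions here are assumptions chosen for illustration.

```python
import math

def magnification_from_spacing(spacing_px, min_zoom=1.0, max_zoom=4.0,
                               full_zoom_spacing=200.0):
    """Map fingertip spacing to a zoom level: wider spacing -> more zoom."""
    t = min(spacing_px / full_zoom_spacing, 1.0)
    return min_zoom + t * (max_zoom - min_zoom)

def local_magnification(distance_from_object, peak_zoom, falloff_px=80.0):
    """Fisheye-style falloff: magnification decays toward 1.0 with distance."""
    decay = math.exp(-distance_from_object / falloff_px)
    return 1.0 + (peak_zoom - 1.0) * decay

peak = magnification_from_spacing(spacing_px=120.0)
for d in (0.0, 40.0, 80.0, 160.0):
    print(f"{d:6.1f} px from object -> {local_magnification(d, peak):.2f}x")
```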
The user input management unit may include a software object implemented by the controller.
The electronic device may include a housing including a front side and a reverse side opposite the front side, and a camera configured to generate the moving picture of the pointing object. The display screen may be positioned on the front side, and the camera may include a lens that is positioned on the reverse side at a point corresponding to a center of the display screen. In some embodiments, the lens may be positioned at a location that is offset from the center of the display screen.
Some embodiments provide methods of operating an electronic device including a user input device and a display screen. The methods include superimposing a moving picture of a pointing object that is external to the electronic device onto the display screen, and interpreting a plurality of features of the pointing object as selection pointers so that a movement of the pointing object relative to the display screen may be interpreted as movement of a plurality of selection pointers.
The methods may further include interpreting movement of the plurality of selection pointers relative to one another as a selection command and/or interpreting movement of two of the plurality of selection pointers into contact with each other as a selection command. Furthermore, gestures formed by one or more pointer movements can be interpreted as commands. For example, a gesture tracing a circle could be interpreted as a command.
The methods may further include interpreting magnification of the pointing object as a zoom command, and increasing a magnification of an image on the display screen in response to magnification of the pointing object.
The methods may further include magnifying a portion of the object on the display screen in response to a selection region between the selection pointers being moved over the object on the display screen. A magnification level of the object on the display screen may be determined in response to a spacing between the selection pointers.
The methods may further include magnifying portions of the display screen around the object. A level of magnification of portions of the display screen around the object may be proportional to a distance from the object.
A computer program product for operating a portable electronic device including a user input device and a display screen according to some embodiments includes a computer readable storage medium having computer readable program code embodied in the medium. The computer readable program code includes computer readable program code configured to superimpose a moving picture of a pointing object that is external to the electronic device onto the display screen, and computer readable program code configured to interpret a plurality of features of the pointing object as selection pointers so that a movement of the pointing object relative to the display screen may be interpreted as movement of a plurality of selection pointers.
Other systems, methods, and/or computer program products according to embodiments of the invention will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional systems, methods, and/or computer program products be included within this description, be within the scope of the present invention, and be protected by the accompanying claims.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate certain embodiments of the invention.
The present invention now will be described more fully with reference to the accompanying drawings, in which embodiments of the invention are shown. However, this invention should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
As used herein, the term “comprising” or “comprises” is open-ended, and includes one or more stated features, integers, elements, steps, components or functions but does not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. If used herein, the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this disclosure and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
It will be understood that when an element is referred to as being “coupled” or “connected” to another element, it can be directly coupled or connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly coupled” or “directly connected” to another element, there are no intervening elements present. Furthermore, “coupled” or “connected” as used herein may include wirelessly coupled or connected.
The present invention may be embodied as methods, electronic devices, and/or computer program products. Accordingly, the present invention may be embodied in hardware (e.g. a controller circuit or instruction execution system) and/or in software (including firmware, resident software, micro-code, etc.), which may be generally referred to herein as a “circuit” or “module”. Furthermore, the present invention may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. In the context of this document, a computer-usable or computer-readable medium may be any medium that can electronically/magnetically/optically retain the program for use by or in connection with the instruction execution system, apparatus, controller or device.
Embodiments according to the present invention are described with reference to block diagrams and/or operational illustrations of methods and communication terminals. In this regard, each block may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It is to be understood that each block of the block diagrams and/or operational illustrations, and combinations of blocks in the block diagrams and/or operational illustrations, can be implemented by radio frequency, analog and/or digital hardware, and/or program instructions. These program instructions may be provided to a controller, which may include one or more general purpose processors, special purpose processors, ASICs, and/or other programmable data processing apparatus, such that the instructions, which execute via the controller and/or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block diagrams and/or operational block or blocks. In some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
These computer program instructions may also be stored in a computer-usable or computer-readable memory that may direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer usable or computer-readable memory produce an article of manufacture including instructions that implement the function specified in the flowchart and/or block diagram block or blocks.
The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device. More specific examples (a nonexhaustive list) of the computer-readable medium include the following: hard disks, optical storage devices, magnetic storage devices, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), and a compact disc read-only memory (CD-ROM).
An electronic device can function as a communication terminal that is configured to receive/transmit communication signals via a wireline connection, such as via a public-switched telephone network (PSTN), digital subscriber line (DSL), digital cable, or another data connection/network, and/or via a wireless interface with, for example, a cellular network, a satellite network, a wireless local area network (WLAN), and/or another communication terminal.
An electronic device that is configured to communicate over a wireless interface can be referred to as a “wireless communication terminal” or a “wireless terminal.” Examples of wireless terminals include, but are not limited to, a cellular telephone, personal data assistant (PDA), pager, and/or a computer that is configured to communicate data over a wireless communication interface that can include a cellular telephone interface, a Bluetooth interface, a wireless local area network interface (e.g., 802.11), another RF communication interface, and/or an optical/infra-red communication interface.
A portable electronic device may be portable, transportable, installed in a vehicle (aeronautical, maritime, or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space.
Some embodiments of the present invention will now be described below with respect to the accompanying drawings.
Although the portable electronic device 10 is illustrated as having a separate keypad 60, it will be appreciated that the keypad 60 could be implemented as soft keys on a touch-sensitive display screen 20.
As illustrated in the drawings, the electronic device 10 includes a camera 27 having a lens 27A positioned on a reverse side of the housing 12, opposite the display screen 20.
It will be appreciated that while the camera 27 is shown as integrated within the housing 12, the camera 27 can be separate from the housing 12 and can communicate with the electronic device 10 wirelessly and/or over a wired interface.
The term “superimpose” is used herein to denote that the image captured by the camera 27 is displayed on the display screen 20 at the same time as an object, such as an icon or other image, is displayed on the display screen 20. The image that is superimposed on the display screen 20 can appear to be over or under the displayed object, and one or both of the image and the displayed object can be at least partially transparent, so that both the image and the object can be visible at the same location on the display screen 20. It will be appreciated, however, that the image does not have to be a superimposed image. In this user input mode, the background may be completely removed, i.e. rendered transparent. Furthermore, the image that the camera records does not have to be used in its original form. For example, it may be transformed into pointers only, into a stylized version of the fingers with only a shadow indicating where they are, or even into a 3D rendering of them.
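The superimposition described above can be implemented with ordinary alpha compositing of a camera layer and a user-interface layer. The following is a minimal sketch, assuming both layers are available as same-sized RGB arrays; the particular alpha value is arbitrary.

```python
import numpy as np

def superimpose(ui_layer, camera_layer, alpha=0.4):
    """Blend the camera image over the UI so both remain visible.

    alpha is the opacity of the camera layer: 0.0 leaves only the UI
    visible, 1.0 leaves only the camera image visible.
    """
    blended = (alpha * camera_layer.astype(np.float32)
               + (1.0 - alpha) * ui_layer.astype(np.float32))
    return blended.astype(np.uint8)

# Example with small synthetic layers (height x width x RGB).
ui = np.full((4, 4, 3), 200, dtype=np.uint8)   # light UI background
cam = np.zeros((4, 4, 3), dtype=np.uint8)      # dark camera frame
print(superimpose(ui, cam)[0, 0])              # -> [120 120 120]
```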
According to some embodiments, the electronic device 10 further includes a user input management unit 40 that is configured to interpret images of a pointing object captured by the camera 27 as user input commands.
As shown in the drawings, a user can position a pointing object, such as a hand 62, behind the electronic device 10 within the field of view 27B of the camera 27, so that an image of the hand 62 can be superimposed onto the display screen 20.
To facilitate recognition of the user's hand, it may be desirable for the camera to be configured with a relatively short focal length and a relatively short depth of field (DOF) while operating in a control mode, so that objects in the background appear out of focus, while an object, such as the user's hand, that is held at arm's length or closer to the lens 27A, can remain in focus. Furthermore, the device 10 can be configured to automatically set the DOF to a desired level when entering the control mode. It will be appreciated that DOF can be affected by a number of aspects of camera design and configuration, including aperture size, focal length and magnification. Configuration of a camera to have a desired DOF at a desired focal distance is within the ordinary skill of a camera designer. In some embodiments, the device 10 can recognize the presence of fingertips in the camera view, and can adjust the camera settings as desired to facilitate image recognition.
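For readers who wish to connect the qualitative discussion above to numbers, the conventional thin-lens depth-of-field approximations can be used; the sketch below computes the near and far limits of acceptable focus from a focal length, f-number, circle of confusion, and subject distance. The input values are illustrative only and are not taken from any embodiment.

```python
def depth_of_field(focal_mm, f_number, coc_mm, subject_mm):
    """Near/far limits of acceptable focus (thin-lens approximation)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = (subject_mm * (hyperfocal - focal_mm)
            / (hyperfocal + subject_mm - 2 * focal_mm))
    if subject_mm >= hyperfocal:
        far = float("inf")  # everything beyond the near limit stays sharp
    else:
        far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return near, far

# Illustrative: a short-focal-length module focused at arm's length (500 mm).
near, far = depth_of_field(focal_mm=4.0, f_number=2.8,
                           coc_mm=0.005, subject_mm=500.0)
print(f"in focus from {near:.0f} mm to {far:.0f} mm")
```

With these illustrative values, a hand held at arm's length remains in focus while background objects roughly a meter or more from the lens fall outside the region of acceptable sharpness.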
In some embodiments, the camera 27 can be configured to image infrared heat signals, so that the heat signal from a user's hand can be used to generate a thermal image that can be easily distinguished from background heat noise.
Furthermore, object recognition techniques are well known to those skilled in the art and can be used to recognize the presence of a user's hand within the field of view 27B of the camera 27 and to track the motion of the user's hand 62 and fingertips 62A, 62B within the field of view 27B. One way of making the fingers easier to recognize is to mark them with colored markers, stickers or special gloves.
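As a sketch of the color-marker approach, the fragment below segments pixels falling in an assumed marker hue band and returns their centroid as a fingertip position. It uses only the Python standard library and a tiny synthetic frame; a practical implementation would operate on real camera frames with more robust features.

```python
import colorsys

def marker_centroid(frame_rgb, hue_lo=0.55, hue_hi=0.70, min_sat=0.5):
    """Locate a colored fingertip marker (here: a blue-ish hue band).

    frame_rgb is a list of rows of (r, g, b) tuples with components in
    0..255. Returns the (x, y) centroid of matching pixels, or None.
    """
    xs, ys = [], []
    for y, row in enumerate(frame_rgb):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            if hue_lo <= h <= hue_hi and s >= min_sat:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Tiny synthetic frame: gray background with a blue marker at (2, 1).
GRAY, BLUE = (128, 128, 128), (20, 40, 230)
frame = [[GRAY] * 4 for _ in range(3)]
frame[1][2] = BLUE
print(marker_centroid(frame))   # -> (2.0, 1.0)
```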
In some embodiments, the user input management unit 40 can interpret fingers differently depending on how they are held relative to the camera. For example, in some embodiments, when the back of the user's hand is held toward the camera with the user's fingernails showing, a gesture such as a pinching motion over an icon could be interpreted as a command to invoke an object, such as a program or file, associated with the icon. However, when the front of the user's hand is held toward the camera, a similar gesture could be interpreted as a “grab” or “select and hold” command, so that the icon itself could then be moved around the screen.
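One simple way to realize this dual interpretation is a lookup keyed on hand orientation and gesture, as in the hypothetical sketch below; detecting the orientation itself (e.g., whether fingernails are visible) is assumed to be handled by the image-recognition step.

```python
# Illustrative mapping from (hand orientation, gesture) to a command.
COMMAND_TABLE = {
    ("back_of_hand", "pinch"): "invoke",           # open program/file
    ("front_of_hand", "pinch"): "select_and_hold"  # grab icon to move it
}

def interpret_gesture(orientation, gesture):
    """Return the command for a gesture, or None if it has no meaning."""
    return COMMAND_TABLE.get((orientation, gesture))

print(interpret_gesture("back_of_hand", "pinch"))   # -> invoke
print(interpret_gesture("front_of_hand", "pinch"))  # -> select_and_hold
```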
Accordingly, the user input management unit 40 can be configured to recognize the presence of a pointing object, such as a user's hand 62, within the field of view 27B of the camera 27. The user input management unit 40 can “clip” the pointing object 62 from the image captured by the camera 27 and superimpose the clipped pointing object 62 onto the display screen 20. Alternatively, the user input management unit 40 can display the entire image from the camera 27 on the display screen 20 without clipping. In further embodiments, the user input management unit 40 can superimpose an object representative of the imaged pointing object onto the display screen. For example, the user input management unit 40 can display a hand-shaped object that is representative of the imaged pointing object. It will be appreciated that when the image of the pointing object 62 is superimposed onto the display screen 20, it can be displayed above or below icons or objects displayed on the display screen 20 from the perspective of a user looking at the display screen 20.
Finger interaction does not have to be limited to a pointer integrated in the user interface (UI). The finger may be a UI object in itself that can interact with the UI much as objects interact in the physical world; e.g., when the user points at an icon, the icon can move along with the finger, with similar physical properties of weight and friction. Furthermore, the z-axis position relative to the camera may relate to the z-axis position in a 3D UI; e.g., moving the fingers farther away moves the hand lower in the window stack, thus graying out top windows and highlighting the ones below. This effect can provide 3D navigation in a 3D menu (or any 3D application, e.g. a map application).
Furthermore, where the image of the pointing object 62 (or image representative of the pointing object 62) is superimposed over an object, icon, or other image displayed on the display screen, the pointing object 62 can be displayed with a desired level of transparency, so that the object, icon, or other image beneath the pointing object 62 can remain at least partially visible beneath the pointing object 62. In this manner, the objects, icons, and/or other images displayed on the display screen 20 may not be blocked from view by the image of the pointing object 62. It will be appreciated that the image of the pointing object 62 can be treated as a layer that can be provided with a selected level of transparency and inserted above or below other layers of images displayed on the display screen 20. Hereafter, the pointing object 62 will be assumed to be a user's hand. However, as discussed above, it is understood that other types of pointing objects could be used.
While the portable electronic device 10 is illustrated in the drawings as a particular type of device, it will be appreciated that embodiments of the invention are not limited to the illustrated configuration.
Some operations that can be performed using a device 10 as described above are illustrated in the drawings.
As shown in the drawings, when the selection pointers are positioned over an icon 54A on the display screen 20, the icon 54A can be highlighted to indicate that it is available for selection.
To select the highlighted icon 54A, the user can make a selection gesture, such as pinching his/her fingertips together. That is, the gesture of pinching the fingertips 62A, 62B together over the icon 54A can be interpreted by the user input management unit 40 as a selection command. When a selection command is interpreted by the user input management unit 40, a selection indication can be displayed. The selection indication can take many different forms.
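Sketched in code, the highlight-then-select flow described above amounts to hit-testing the midpoint of the fingertips against icon bounds and applying the pinch test while an icon is highlighted. The icon records, field names, and thresholds below are hypothetical.

```python
import math

PINCH_THRESHOLD_PX = 12.0

def hit_test(point, icons):
    """Return the icon whose bounding box contains the point, if any."""
    x, y = point
    for icon in icons:
        left, top, width, height = icon["bounds"]
        if left <= x <= left + width and top <= y <= top + height:
            return icon
    return None

def process_frame(index_tip, thumb_tip, icons):
    """One frame of the highlight-then-select flow described above."""
    midpoint = ((index_tip[0] + thumb_tip[0]) / 2,
                (index_tip[1] + thumb_tip[1]) / 2)
    icon = hit_test(midpoint, icons)
    pinched = math.dist(index_tip, thumb_tip) <= PINCH_THRESHOLD_PX
    if icon is None:
        return "no-op"
    return f"select {icon['name']}" if pinched else f"highlight {icon['name']}"

icons = [{"name": "mail", "bounds": (100, 100, 48, 48)}]
print(process_frame((110, 110), (150, 130), icons))  # -> highlight mail
print(process_frame((120, 118), (126, 122), icons))  # -> select mail
```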
As illustrated in the drawings, a map image 58 can be displayed on the display screen 20. A user's hand 62 is superimposed onto the map image 58 shown on the display screen 20.
The fingertips of the user's hand 62 are interpreted by the user input management unit 40 as selection pointers, so that the map can be manipulated using the locations of the fingertips as anchor points. For example, the map image 58 can be rotated about a point defined by the user's fingertips as the user's hand 62 is rotated.
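A minimal sketch of such rotation handling follows: from the fingertip positions in two successive frames it derives an anchor point (the fingertip midpoint) and the change in the angle of the line joining the fingertips, by which the map image could then be rotated about the anchor. For brevity the sketch ignores angle wraparound across ±π.

```python
import math

def rotation_update(prev_tips, curr_tips):
    """Rotation implied by two fingertips moving between frames.

    Returns (anchor, angle_radians): the current midpoint of the two
    fingertips and the change in the angle of the line joining them.
    """
    (ax0, ay0), (bx0, by0) = prev_tips
    (ax1, ay1), (bx1, by1) = curr_tips
    angle0 = math.atan2(by0 - ay0, bx0 - ax0)
    angle1 = math.atan2(by1 - ay1, bx1 - ax1)
    anchor = ((ax1 + bx1) / 2, (ay1 + by1) / 2)
    return anchor, angle1 - angle0

# Example: the hand rotates 90 degrees about the point (100, 100).
prev = ((80.0, 100.0), (120.0, 100.0))   # horizontal finger line
curr = ((100.0, 80.0), (100.0, 120.0))   # vertical finger line
anchor, angle = rotation_update(prev, curr)
print(anchor, math.degrees(angle))       # -> (100.0, 100.0) 90.0
```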
Many different kinds of user gestures can be interpreted as various commands by the user input management unit 40. For example, the pointing object 62 could be interpreted as a stylus for text and/or drawing entry in a manner that emulates drawing on a physical surface. For example, when the pointing object 62 is held at a first distance from the camera 27 (i.e., the “pen is up”), motion of the pointing object 62 is not interpreted as a draw command. When the pointing object 62 is farther away (i.e., the “pen is down”), motion of the pointing object 62 is interpreted as a draw command, allowing the user to write or draw.
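Assuming the distance of the hand from the camera is inferred from its apparent size (a hand farther from the lens appears smaller), the pen-up/pen-down gating described above might be sketched as follows; the calibration threshold is an arbitrary illustrative value.

```python
# Illustrative "pen up / pen down" gating based on apparent size: the
# farther the hand is from the camera, the smaller it appears, so a
# small apparent width is treated as "pen down" per the text above.
PEN_DOWN_MAX_WIDTH_PX = 60.0   # assumed calibration value

def update_stroke(stroke, tip_position, hand_width_px):
    """Append the fingertip position to the stroke only while drawing."""
    pen_down = hand_width_px <= PEN_DOWN_MAX_WIDTH_PX
    if pen_down:
        stroke.append(tip_position)
    return pen_down

stroke = []
samples = [((10, 10), 90.0),   # pen up: hand near the camera, appears large
           ((12, 14), 55.0),   # pen down: hand farther away, appears smaller
           ((15, 18), 52.0)]
for tip, width in samples:
    update_stroke(stroke, tip, width)
print(stroke)   # -> [(12, 14), (15, 18)]
```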
As a further example, a mobile device configured according to some embodiments can act as a wireless mouse that controls a remote device. For example, the device 10 can track the motion of the pointing object 62 with the camera 27 and translate movements of the pointing object 62 into mouse movements and/or mouse commands; but instead of displaying the pointing object 62 on the screen 20, the commands and the mouse coordinates corresponding to the location and/or movement of the pointing object 62 can be sent to the remote device.
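One hypothetical realization is sketched below: tracked fingertip positions are normalized to the camera frame and sent to the remote device as JSON datagrams. The transport, address, and message format are assumptions made for illustration; an actual implementation might instead use a Bluetooth HID profile.

```python
import json
import socket

# Illustrative remote endpoint; a real system would discover/pair it.
REMOTE_ADDR = ("127.0.0.1", 9999)

def send_pointer_update(sock, x_px, y_px, frame_w, frame_h, clicked):
    """Translate a tracked fingertip position into a mouse message."""
    message = {
        "x": x_px / frame_w,   # normalized horizontal position (0..1)
        "y": y_px / frame_h,   # normalized vertical position (0..1)
        "click": clicked,      # e.g., derived from a pinch gesture
    }
    sock.sendto(json.dumps(message).encode("utf-8"), REMOTE_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pointer_update(sock, x_px=320, y_px=240, frame_w=640, frame_h=480,
                    clicked=False)
```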
Other possible applications of embodiments of the invention include controlling a menu on a television set, sorting pictures on a server using a television monitor as a display, etc.
Operations according to some embodiments are illustrated in the drawings. A moving image of a pointing object that is external to the electronic device is captured and superimposed onto the display screen, and features of the pointing object are interpreted as selection pointers.
Further operations detect movement or a change in size of the pointing object (Block 74). Movement of the features can be recognized as a selection command (Block 76). For example, as explained above, movement of the features together in a pinching motion over an icon can be interpreted as a command to select the icon. A change in size of the pointing object can be interpreted as a zoom command (Block 78). For example, in embodiments where the pointing object is imaged by a camera, movement of the pointing object toward or away from the camera results in an apparent change of size of the pointing object. In response, an image displayed on the display screen along with the image of the pointing object can be zoomed in or out to a different scale. Furthermore, if the back side of the hand is turned toward the camera, the hand may in some instances be regarded as invisible or inactive.
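Since apparent area grows with the square of apparent width, the zoom implied by a change in the pointing object's size can be taken as the square root of the area ratio, as in this small sketch; the example values are arbitrary.

```python
import math

def zoom_factor(prev_area_px, curr_area_px):
    """Zoom implied by a change in the pointing object's apparent size.

    Apparent area grows with the square of apparent width, so the linear
    scale change is the square root of the area ratio.
    """
    return math.sqrt(curr_area_px / prev_area_px)

# Example: the hand moves toward the camera and its bounding-box area
# grows from 10,000 to 14,400 square pixels -> zoom in by 1.2x.
print(round(zoom_factor(10_000.0, 14_400.0), 2))   # -> 1.2
```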
Furthermore, an area of the display screen between features of the pointing object can be magnified (Block 80). In some embodiments, a region around the area between the features can also be magnified by an amount that is proportional to distance from the area.
The portable electronic device 10 may be a mobile radiotelephone forming a part of a radiotelephone communication system 2, as illustrated in the drawings. The radiotelephone communication system 2 can include one or more base transceiver stations 3 of a wireless network 5.
The portable electronic device 10 in the illustrated embodiments includes a portable housing assembly 12, a controller circuit 30 (“controller”), a communication module 32, and a memory 34. The portable electronic device 10 further includes a user interface 22 (i.e., a man-machine interface) including a display screen 20 and a camera 27. The user interface 22 can further include a speaker 24 and one or more input devices 26. The input device 26 may include a keyboard, which may be a numerical keyboard including keys that each correspond to a digit as well as to one or more characters, such as may be found in a conventional wireless telephone. In some embodiments, the input device 26 may include a full QWERTY keyboard that may be operated, for example, using thumbs. More than one input device 26 may be included.
The camera 27 can include a digital camera having a CCD (charge-coupled device), CMOS (complementary metal-oxide semiconductor) or other type of image sensor, and can be configured to record still images and/or moving images and convert the images into a format suitable for display and/or manipulation.
The display screen 20 may be any suitable display screen assembly. For example, the display screen 20 may be a liquid crystal display (LCD) with or without auxiliary lighting (e.g., a lighting panel). In some cases the portable electronic device 10 may be capable of playing video content of a particular quality. For example, a portable electronic device 10 may be configured to display a video stream having a particular aspect ratio, such as 16:9 or 4:3. A number of standard video formats have been proposed for mobile terminals, including Quarter VGA (QVGA, 320×240 pixels), Common Intermediate Format (CIF, 352×288 pixels) and Quarter Common Intermediate Format (QCIF, 176×144 pixels). Moreover, some mobile terminals may have multiple display screens having different display capabilities. Thus, a portable electronic device 10 may be capable of displaying video in one or more different display formats.
The display screen 20 can include a touch-sensitive display screen that is configured to detect touches and convert the detected touches into positional information that can be processed by the controller 30.
The user interface 22 may include any suitable input device(s) including, for example, a touch activated or touch sensitive device (e.g., a touch screen), a joystick, a keyboard/keypad, a dial, a directional key or keys, and/or a pointing device (such as a mouse, trackball, touch pad, etc.). The speaker 24 generates sound responsive to an input audio signal. The user interface 22 can also include a microphone 25.
The controller 30 may support various functions of the portable electronic device 10, and can be any commercially available or custom microprocessor. In use, the controller 30 of the portable electronic device 10 may generate and display an image on the display screen 20. In some embodiments, however, a separate signal processor and/or video chip (not shown) may be provided in the portable electronic device 10 and may be configured to generate a display image on the display screen 20. Accordingly, the functionality of the controller 30 can be distributed across multiple chips/devices in the portable electronic device 10.
The memory 34 is configured to store digital information signals and data, such as digital multimedia files (e.g., digital audio, image and/or video files).
The communication module 32 is configured to communicate data over one or more wireless interfaces to another remote wireless terminal as discussed herein. The communication module 32 can include a cellular communication module, a direct point-to-point connection module, and/or a WLAN module.
The portable electronic device 10 can include a cellular communication module that allows the device 10 to communicate via the base transceiver station(s) 3 of the network 5 using one or more cellular communication protocols such as, for example, Advanced Mobile Phone Service (AMPS), ANSI-136, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), enhanced data rates for GSM evolution (EDGE), code division multiple access (CDMA), wideband-CDMA, CDMA2000, and Universal Mobile Telecommunications System (UMTS). The cellular base stations may be connected to a Mobile Telephone Switching Office (MTSO), which, in turn, can be connected to a PSTN and/or another network.
A direct point-to-point connection module may include a direct RF communication module or a direct IR communication module. The direct RF communication module may include a Bluetooth module. With a Bluetooth module, the portable electronic device 10 can communicate via an ad-hoc network through a direct point-to-point interface.
With a WLAN module, the wireless terminal 10 can communicate through a WLAN using a communication protocol that may include, but is not limited to, 802.11a, 802.11b, 802.11e, 802.11g, and/or 802.11i.
The communication module 32 can include a transceiver typically having a transmitter circuit and a receiver circuit, which respectively transmit outgoing radio frequency signals (e.g., to the network 5, a router or directly to another terminal) and receive incoming radio frequency signals (e.g., from the network 5, a router or directly to another terminal), such as voice and data signals, via an antenna. The communication module 32 may include a short range transmitter and receiver, such as a Bluetooth transmitter and receiver. The antenna may be an embedded antenna, a retractable antenna or any antenna known to those having skill in the art without departing from the scope of the present invention. The radio frequency signals transmitted between the portable electronic device 10 and the network 5, router or other terminal may include both traffic and control signals (e.g., paging signals/messages for incoming calls), which are used to establish and maintain communication with another party or destination. The radio frequency signals may also include packet data information, such as, for example, cellular digital packet data (CDPD) information. In addition, the transceiver may include an infrared (IR) transceiver configured to transmit/receive infrared signals to/from other electronic devices via an IR port.
The portable electronic device 10 may also be configured to electrically communicate with another terminal via a wireline or cable for the transmission of digital communication signals therebetween.
It will also be appreciated that elements, such as the camera 27, that are shown as integral to the device 10 can be separated from the device 10, with a communication path provided therebetween.
Many different applications/variations will be apparent to a skilled person having knowledge of the present disclosure. In the drawings and specification, there have been disclosed typical embodiments of the invention and, although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation, the scope of the invention being set forth in the following claims.