Field of the Invention
This invention is in the field of improved user interfaces for computerized devices.
Description of the Related Art
Users with limited vision who wish to read small text and other printed indicia often use magnifying lenses to enlarge the text. As computer technology has advanced, and more users with limited vision attempt to read display screens, various types of assistive technology have been developed that allow users to apply software implemented virtual magnifiers to various parts of the computer display screen.
In a related development, as smartphones and other handheld computerized devices with small but high resolution touch screens have developed, the problem of how best to view and interact with small sized text and touch sensitive icons, using normal sized fingers, has also become apparent. This is often referred to as the "fat finger" problem. Prior art attempts to solve the "fat finger" problem have been less than satisfactory. Generally, solutions have focused on autocorrect mechanisms to correct typing errors, and the like.
Previous work in this area includes Kocienda et al., U.S. Pat. No. 8,661,362. Other work includes Zaman, U.S. Pat. No. 8,176,438; and Foulk, U.S. patent publication 2009/0027334. Various types of magnifiers have also been implemented in popular operating systems, such as the Microsoft Windows Magnifier present in Windows 7 and other versions of Windows.
The invention is based, in part, on the insight that, at least with regard to touchscreens, prior art computerized magnifier technology tended to suffer from the problem that if the user's hand was used to designate the portion of the touchscreen to be magnified, then this hand would tend both to obscure the touchscreen and to make it difficult to select narrowly separated interactive icons on the screen for further interaction.
This invention is also based, in part, on the insight that the inventor's earlier teaching of computerized devices that have both touchscreens and additional touchpads mounted in locations away from the touchscreen (such as on the rear of a handheld computerized device, behind a front mounted touchscreen) can be used to provide an improved interactive magnifier system for the user.
As a simple example, consider the situation where the user is attempting to interact with a smartphone, such as the popular Apple iPhone 4-7 series of smartphones, and magnify a portion of the screen so as to make it easier to type on a virtual keypad (with many tiny keys). Such prior art smartphones typically only have front mounted touchscreens. However if the smartphone is also equipped with a rear mounted touchpad, such as taught by applicant in his previous U.S. Pat. No. 9,310,905, the entire contents of which are incorporated herein by reference, then with appropriate software, this rear mounted touchpad may be used to help provide an improved interactive magnifier for the user.
The concept is not limited to detachable touchpads, however. Alternatively, computerized devices with built-in touchpads, or even remote touchpads, may also be used according to the present disclosure.
In some embodiments, the user may use the touchpad to indicate a position of interest (provide magnifier target data) pertaining to a location on the front mounted touchscreen that the user wishes to have magnified. The computerized device's processor can receive this magnifier target data, optionally display a magnification target marker in this location, and magnify that portion of the screen proximate (around) the location designated by this magnifier target data, optionally along with any associated touch sensitive control areas. The system can then produce a magnified portion of the screen around this location, and place the magnified version in a different location on the display screen (e.g. often displaced from the location designated by the magnifier target data).
In a preferred embodiment, the system will not only magnify a portion of the screen image, but also magnify and reproduce any touch sensitive control areas associated with this image so that the user, perhaps wishing to activate a key or other icon within the magnified image, can interact directly with the magnified image rather than the unmagnified original target, and have the results used by the system as if the user had correctly touched the corresponding unmagnified portion of the screen.
Thus in some embodiments, the invention may be an improved interactive software magnifier for computerized devices equipped with both touch sensitive display screens and touchpads located separately from the display screens. In one embodiment, touch location information from the touchpad can be used to select an area of the screen to magnify, and any touch sensitive control areas from this area of the screen, which might normally be located too close together for easy selection, can also be magnified. The user can then use the touch sensitive display screen to select more conveniently between these close together control regions by instead interacting with their magnified versions in the magnified region of the screen.
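The overall interaction can be summarized as a simple event loop. The following is a minimal pseudocode-style sketch in Python over a hypothetical device interface; names such as read_touchpad, map_pad_to_screen, magnify_region_around, and hit_test are illustrative assumptions, not part of any disclosed implementation.

```python
# Pseudocode-style sketch of the magnifier interaction loop; `device` is a
# hypothetical hardware/OS abstraction, and all method names are assumptions.

def magnifier_loop(device):
    while device.magnifier_enabled:
        # 1. A rear-touchpad touch becomes magnifier target data
        #    expressed in front-screen coordinates.
        pad_touch = device.read_touchpad()
        target = device.map_pad_to_screen(pad_touch)

        # 2. Magnify the screen region (images plus any touch sensitive
        #    control areas) proximate to the target, and display the result
        #    at a displaced location on the screen.
        lens = device.magnify_region_around(target)
        device.display_lens(lens, displaced_from=target)

        # 3. A front-screen touch inside the magnified portion is magnifier
        #    selection data; it selects the corresponding original object.
        touch = device.read_touchscreen()
        if lens.contains(touch):
            selected = lens.hit_test(touch)
            if selected is not None:
                selected.activate()
```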
In this disclosure, the term “touchscreen” is considered to be the main viewing screen, often located on the front of a computerized device, by which the user receives visual information from the computerized device. This touchscreen typically will also be configured to receive touch information from the user and to report on the screen coordinates where this touch information was detected.
The term “touchpad” is considered to be a supplemental touch input system, often located on the back panel of the computerized device, also configured to receive touch information from the user and to report on the touchpad coordinates where this touch information was detected. Note that in some embodiments, the touchpad may also have display capability, and in these embodiments, the system may be considered to have multiple touchscreens. In these embodiments, the secondary touchscreen used to input the magnifier target location (to be described) will be considered to be functioning as the “touchpad”.
As shown in
The computerized device may either have, or be retrofitted to have, at least one touchpad (101), at least one display screen (often a touch sensitive display screen 102), at least one processor, memory, and software. An example of the circuitry of this type of device is shown in
Further assume that the front mounted touchscreen (102) is exactly in front of the rear mounted touchpad (101), and that any bezel surrounding the front mounted touchscreen is not shown. Here the user is gripping the computerized device with a left hand, and one of the user's fingers (110) is touching the rear mounted touchpad at a contact point (110a).
According to some embodiments of the invention, when this happens, the system processor may recognize that an initialization touch sequence designed to implement magnifier functionality is now present. In other embodiments, alternative methods to implement magnifier functionality, such as specific tap sequences, icon presses, physical key input, sound input (e.g. voice recognition), or other magnifier trigger methods may be used.
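As one illustration of such a trigger method, the sketch below recognizes a double tap on the rear touchpad as the initialization touch sequence. This is only an assumed example; the 0.4 second window and the class and method names are hypothetical.

```python
# Hypothetical sketch: treat a double tap on the rear touchpad as the
# magnifier initialization touch sequence. The 0.4 s window is an assumption.

DOUBLE_TAP_WINDOW_S = 0.4

class MagnifierTrigger:
    def __init__(self) -> None:
        self._last_tap_time: float | None = None

    def on_touchpad_tap(self, timestamp: float) -> bool:
        """Return True when this tap completes the initialization sequence."""
        last = self._last_tap_time
        if last is not None and timestamp - last <= DOUBLE_TAP_WINDOW_S:
            self._last_tap_time = None
            return True          # processor should now enable the magnifier
        self._last_tap_time = timestamp
        return False
```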
Here the target area of interest (magnifier target data 120) is located on at least one selectable object (122), which is here a small virtual keyboard composed of a plurality of selectable objects (here the objects are the individual, touch sensitive, virtual keys). The magnifier target data (120) is designating a target area between the “d” and “f” virtual keys. In this example, the virtual keys are shown being placed too closely together to allow for easy finger typing on the unmagnified section of the display screen, which is a common problem for smartphones and other small handheld computerized devices.
The user's finger (110) is not shown. The dashed lines (130) show the region proximate (close to) the magnification target and marker (124), which is being magnified by the device processor and shown in the magnified portion of the display screen (126).
In
In other embodiments, the touchpad can be completely separate from the computerized device, and may communicate touchpad information to the computerized device using a wired or wireless connection.
Expressing the invention in methods terminology, in some embodiments, the computerized device may use its at least one processor (e.g. computer processor with at least one core, often based on the popular ARM, MIPS, x86, or other series of microprocessors) to display at least one object (122) on the display screen (102), (702) without magnification (e.g. at whatever level of magnification the user has selected as being preferred for that screen at that particular time).
This object or objects (122) may be any of text, images, icons, and the like. These will usually be displayed on top of an underlying screen background image. This background image will often also be magnified by the invention's magnifier, thus preserving the overall appearance of the target region to be magnified, but otherwise this background image will not be discussed further, since our interest is focused on the specific objects (and in particular specific selectable objects) on top of the background image. When an object has touch sensitive areas associated with it that activate a software function when the system receives touchscreen input in these areas, the object is termed a "selectable object".
The display screen (preferably a touchscreen) will usually be a bit mapped display, often an electronic paper, LCD, or OLED display, configured to obtain display information from the system processor(s) or memory, and often capable of outputting images at a resolution of 72 dots per inch or more, even hundreds of dots per inch. These images may be black and white images, gray scale images, or color images, depending upon the type of display and output mode chosen.
In a preferred embodiment, at least some of the objects may be touch sensitive "selectable" objects that, in response to user touch input, will be recognized by the system processor as instructing the processor to perform a user selected function. Typically, the device processor and software recognize that when the system touchscreen is touched in an area that corresponds to the displayed object, the user may be intending to activate a software function associated with that particular object. Often the extent of this touch sensitive object activation area (here termed "corresponding touch sensitive areas") matches the size and shape of the displayed object exactly. However, this need not always be the case, and occasionally the size and shape of the touch sensitive object activation area may be somewhat larger or smaller than its corresponding image. Indeed, in some cases there need not be any corresponding image at all (i.e. hidden touch sensitive areas). For the purposes of this discussion, it does not matter. It should suffice that the system software that produces a magnified image of a given object of interest (as well as any background areas) also produces a magnified version of the corresponding touch sensitive area associated with that object (or background area), so that both the appearance and the touch activation functionality of the target region are accurately reproduced, in magnified and location displaced form, on the magnified portion of the screen.
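A minimal sketch of this idea follows, assuming a simple rectangle representation: each selectable object carries both a displayed rectangle and a (possibly differently sized) touch activation rectangle, and magnification scales and displaces both together so that appearance and functionality stay in register. All type and function names are illustrative.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def scaled_about(self, cx: float, cy: float, s: float,
                     dx: float, dy: float) -> "Rect":
        """Scale this rectangle by s about (cx, cy), then shift by (dx, dy)."""
        return Rect((self.x - cx) * s + cx + dx,
                    (self.y - cy) * s + cy + dy,
                    self.w * s,
                    self.h * s)

@dataclass
class SelectableObject:
    image_rect: Rect                 # where the object is drawn
    touch_rect: Rect                 # its activation area; may differ in size
    action: Callable[[], None]       # software function fired on activation

def magnify_object(obj: SelectableObject, tx: float, ty: float,
                   s: float, dx: float, dy: float) -> SelectableObject:
    # Reproduce both the appearance and the touch functionality of the
    # object in the magnified, location-displaced portion of the screen.
    return SelectableObject(
        image_rect=obj.image_rect.scaled_about(tx, ty, s, dx, dy),
        touch_rect=obj.touch_rect.scaled_about(tx, ty, s, dx, dy),
        action=obj.action)
```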
In this disclosure, we will often use a virtual keypad as an example of "at least one selectable object", but this particular example is not intended to be limiting. This virtual keyboard is also an example of a plurality of selectable objects, because each virtual key on the virtual keyboard can be considered to be an individual selectable object. However, non-selectable objects (e.g. objects not responsive to touchpad input (101) or touch sensitive display input (102)) will often also be magnified, along with background regions as previously discussed.
The system will typically obtain the target location that the user wishes to magnify (e.g. the magnifier target data 120) directly or indirectly from the system touchpad(s) (101), which as shown in
Thus the user's finger (110) may press a touchpad (101) at location (110a), and the system processor will convert this signal to the corresponding magnifier target data (120). In effect, the coordinates of the user's touch (110a) on the touchpad (101) are mapped by the processor onto the coordinates of the device's touchscreen (display screen), producing the magnifier target data (120).
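A minimal sketch of this mapping follows, assuming the touchpad sits directly behind the screen: because the rear touchpad faces away from the user, its horizontal axis is mirrored relative to the front touchscreen. The function name and the mirroring assumption are illustrative.

```python
def pad_to_screen(pad_x: float, pad_y: float,
                  pad_w: float, pad_h: float,
                  screen_w: float, screen_h: float) -> tuple[float, float]:
    """Map a rear-touchpad contact point to front-screen coordinates
    (the magnifier target data)."""
    u = 1.0 - (pad_x / pad_w)   # normalize and mirror horizontally
    v = pad_y / pad_h           # normalize vertically
    return u * screen_w, v * screen_h
```

With this normalization, a touch at the center of the touchpad maps to the center of the screen regardless of the two components' resolutions or physical sizes.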
The system will typically send touchpad data (110a) or (120), (704, 706) to its processor(s), and will use the processor(s) to generate the magnifier target data (120), often along with a visible marker of the magnification target (124, 708) (visible target marker) corresponding to a portion of the display screen that in turn corresponds to the magnifier target data (120, 704, 706).
Put alternatively, in some embodiments, the location of the magnification target (124) will also be made visible (708) on the display screen by a marker (e.g. a small circle, dot, cross, or other indicia) as is shown in
The system will then use its processor(s) to generate (710) a magnified portion (126) of the region of the display screen proximate (e.g. near, often surrounding) this target. See
By “proximate”, this can, for example, be that portion of the display screen that is within a given radius (132) of the location of the magnification target (124), in which case the magnified portion (126) may be a circular magnified portion (shown in
Other magnification portion shapes (e.g. virtual magnifying lens shapes) are also contemplated. Just as real magnifying lenses can be rectangles and other shapes, so the magnifying portion shapes can also be various types of rectangles as well as other shapes. These other shapes, as well as other factors such as the amount of magnification, the area being magnified (130, 132), and the location of the magnified portion (See
Although typically the magnification factor will be positive (e.g. the magnified image will be larger than the target image), this is not intended to be limiting. Just as some types of real lenses can shrink the appearance of the target image in the magnifier, so too, in some embodiments, the software magnification factor may be negative. This alternate embodiment may be useful for generating overviews of large images that are too big to otherwise fit on the device's display screen all at once.
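These user-adjustable factors can be gathered into a settings structure. The sketch below is illustrative only; the field names and default values are assumptions, and a scale value below 1.0 corresponds to the shrinking "overview" mode just described.

```python
from dataclasses import dataclass
from enum import Enum

class LensShape(Enum):
    CIRCLE = "circle"
    RECTANGLE = "rectangle"

@dataclass
class MagnifierConfig:
    shape: LensShape = LensShape.CIRCLE
    radius: float = 120.0    # extent of the "proximate" region, in pixels
    scale: float = 2.0       # magnification factor; below 1.0 gives an overview
    offset_x: float = 0.0    # displacement of the magnified portion
    offset_y: float = -200.0 # e.g. draw the magnified portion above the target
```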
The system will then typically display (716) this magnified portion (126) on a touch sensitive display screen (102), suitably matched with the magnified corresponding touch sensitive areas. As previously discussed, this magnified portion (126) will often be displayed at a location on the display screen that is different (e.g. offset, see
Although in some embodiments, the system may merely show the magnified portion of the screen (126) without also replicating and magnifying any touch sensitive control areas associated with the magnification target (124), in a preferred embodiment, the display screen's original touch sensitive control areas (
Here the system will often operate by obtaining (
Differences Between Magnifier Target Data and Magnifier Selection Data
The magnifier target data lets the system know what portion or area of the display screen to magnify. By contrast, the magnifier selection data represents user touches on the magnified portion as displayed on the touchscreen, usually for the purpose of selecting a selectable object that has been magnified by the system's virtual magnifier.
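Continuing the Rect and SelectableObject sketch above, magnifier selection data can be handled as a hit test over the magnified touch rectangles, with each magnified copy routed back to the original object's action; the names remain illustrative.

```python
def handle_magnifier_selection(touch_x: float, touch_y: float,
                               magnified: list[SelectableObject]) -> bool:
    """Treat a front-screen touch inside the magnified portion as
    magnifier selection data; activating a magnified copy has the same
    effect as touching the original, unmagnified control area."""
    for copy in magnified:
        r = copy.touch_rect
        if r.x <= touch_x <= r.x + r.w and r.y <= touch_y <= r.y + r.h:
            copy.action()    # same callback as the original selectable object
            return True
    return False
```

Because magnify_object above carries the original action callback into each magnified copy, selecting the magnified "d" key fires exactly the software function of the unmagnified "d" key.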
So to summarize, the magnifier target data (
By contrast, the magnifier selection data (128) obtained from the touch sensitive display screen (102) is typically obtained using a different portion of the user's hand (such as the thumb) (
According to this terminology, the magnifier selection data (see
This is shown in more detail in
As previously discussed, this magnifier selection data (128) will typically be associated with at least one alternative contact point of the user's hand or hands (e.g. the thumb 108) that falls within the magnified portion (126). This is shown in more detail in
As one example, consider
Here the user may use a “finger one” portion of the user's hand (the finger 110 at touchpad contact point 110a) to, via the system processor and
The user may also use an alternative contact point of the user's hand(s) (here the thumb 108) to select the desired control area (e.g. one of the keys “d”, “f” in
Using this sort of scheme, the system can then use this magnifier selection data (128) to select the at least one selectable object (122) (here one of the keys) that is displayed in higher magnification in the magnified portion (126) on the display screen (102). See also
Put alternatively, as is shown in
The user can further select at least one selectable object (here represented by the various keys in the virtual keypad 122 in
As previously discussed, in some embodiments, the magnified portion of the display screen (126) produced by magnifying images (and often touch sensitive control areas such as
As shown in
Typically, the system will be configured so that the user can switch the magnifier functionality (and the corresponding magnified portion of the display screen 126) on or off by appropriate software selection steps. In a preferred embodiment, the location of the magnified portion (such as the displacement (
Often, the location of the magnified portion (126) may be selected to be convenient to at least one thumb (108) or other user finger that the user favors for touch control purposes while the user is holding the computerized device (100).
Various software and hardware techniques may be used to generate the images and selectable areas on both the display screen and the magnified portion of the display screen (126). In some embodiments, the system may use its processor(s) to display at least one selectable object (122),
Here, as shown in
The device processor(s) can then overlay a portion of the data used to generate the display screen with the data used to generate the magnified portion of the display screen (126). This creates composite display screen data on the touchscreen (102), containing the unmagnified display screen images and touch sensitive regions in the background (130, 136, 138), overlaid by the magnified portion of the display screen (126, 136a, 138a).
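A compositing sketch of this overlay step follows. Surface and its copy, crop, scaled, and blit operations are hypothetical drawing primitives standing in for whatever graphics layer the device actually uses; only the order of operations matters here.

```python
def compose_frame(screen, target_x, target_y, cfg):
    """Overlay the magnified portion on top of the unmagnified screen.

    `screen` is a hypothetical Surface holding the unmagnified frame;
    `cfg` is the MagnifierConfig sketched earlier.
    """
    frame = screen.copy()                          # unmagnified background
    region = screen.crop(center=(target_x, target_y),
                         radius=cfg.radius)        # area proximate the target
    lens = region.scaled(cfg.scale)                # magnified image
    frame.blit(lens, (target_x + cfg.offset_x,
                      target_y + cfg.offset_y))    # composite the overlay
    return frame
```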
This application is a continuation-in-part of U.S. patent application Ser. No. 15/280,796, filed Sep. 29, 2016; application Ser. No. 15/280,796 claims the priority benefit of U.S. provisional application 62/234,497, filed Sep. 29, 2015. This application is also a continuation-in-part of U.S. patent application Ser. No. 15/239,652, filed Aug. 17, 2016; application Ser. No. 15/239,652 claims the priority benefit of U.S. provisional application 62/205,891, filed Aug. 17, 2015. Application Ser. No. 15/280,796 is also a continuation-in-part of U.S. patent application Ser. No. 14/341,326, filed Jul. 25, 2014; application Ser. No. 14/341,326 claims the priority benefit of U.S. provisional application 61/858,223, filed Jul. 25, 2013. Application Ser. No. 14/341,326 is also a continuation-in-part of U.S. patent application Ser. No. 14/289,260, filed May 28, 2014; application Ser. No. 14/289,260 claims the priority benefit of U.S. provisional application 61/828,683, filed May 30, 2013. This application is also a continuation-in-part of U.S. patent application Ser. No. 14/284,068, filed May 21, 2014; application Ser. No. 14/284,068 claims the priority benefit of U.S. provisional application 61/825,621, filed May 21, 2013. This application is also a continuation-in-part of U.S. patent application Ser. No. 14/282,331, filed May 20, 2014. This application is also a continuation-in-part of U.S. patent application Ser. No. 14/268,926, filed May 2, 2014; application Ser. No. 14/268,926 claims the priority benefit of U.S. provisional application 61/819,615, filed May 5, 2013. Application Ser. No. 14/268,926 is also a continuation-in-part of U.S. patent application Ser. No. 14/260,195, filed Apr. 23, 2014, now U.S. Pat. No. 9,430,147; application Ser. No. 14/268,926 also claims the priority benefit of U.S. provisional application 61/815,058, filed Apr. 23, 2013. Application Ser. No. 14/286,926 is also a continuation-in-part of U.S. patent application Ser. No. 13/770,791, filed Feb. 19, 2013, now U.S. Pat. No. 9,311,724; application Ser. No. 13/770,791 is a continuation-in-part of application Ser. No. 12/773,075, filed May 4, 2010, now U.S. Pat. No. 8,384,683; application Ser. No. 12/773,075 claims the priority benefit of U.S. provisional application 61/327,102, filed Apr. 23, 2010. Application Ser. No. 13/770,791 is also a continuation-in-part of U.S. patent application Ser. No. 13/223,836, filed Sep. 1, 2011, now U.S. Pat. No. 9,310,905. The contents of all of these applications are incorporated herein by reference in their entirety.
Provisional Applications:

Number | Date | Country
---|---|---
61858223 | Jul 2013 | US
61828683 | May 2013 | US
61825621 | May 2013 | US
61819615 | May 2013 | US
61815068 | Apr 2013 | US
61327102 | Apr 2010 | US
62234497 | Sep 2015 | US
62205891 | Aug 2015 | US
Parent/Child Application Data:

Relation | Number | Date | Country
---|---|---|---
Parent | 15280796 | Sep 2016 | US
Child | 15365807 | | US
Parent | 14341326 | Jul 2014 | US
Child | 15280796 | | US
Parent | 14289260 | May 2014 | US
Child | 14341326 | | US
Parent | 14284068 | May 2014 | US
Child | 14289260 | | US
Parent | 14282331 | May 2014 | US
Child | 14284068 | | US
Parent | 14286926 | May 2014 | US
Child | 14282331 | | US
Parent | 14260195 | Apr 2014 | US
Child | 14286926 | | US
Parent | 13770791 | Feb 2013 | US
Child | 14289260 | | US
Parent | 12773075 | May 2010 | US
Child | 13770791 | | US
Parent | 13223836 | Sep 2011 | US
Child | 13770791 | | US
Parent | 15239652 | Aug 2016 | US
Child | 13223836 | | US