The present invention relates to touch panels and, more particularly, to touch panels that may have dynamic zooming capabilities and that may have a low profile bezel.
Electronic devices such as cellular telephones, cameras, and computers often use touch panels as user input devices. Various technologies have been employed for identifying the position of a user's touch on a touch panel. These technologies include optical imaging methods, in which image sensors located along edges of a touch panel detect the presence of a user's touch on the touch panel. The optical components required in implementing conventional optical imaging technologies in a touch panel protrude above the surface of the panel, necessitating an elevated bezel around the perimeter of the touch panel. As a result, conventional touch panels have undesirably elevated bezels. In addition, conventional optical imaging technologies are incapable of providing high positional accuracy (i.e., resolution) together with a fast refresh rate (e.g., imager frame rate), especially when the conventional optical imaging technologies are used in relatively large (e.g., greater than 14 inch) touch panels.
Touch panels are widely used in electronic devices. An electronic device with a touch panel is shown in
Device 6 may include input-output devices 7 such as a touch pad, a touch screen display, a touch panel (e.g., a touch screen that may or may not have display functionality), a camera (e.g., an imager), etc. Input-output devices 7 may include devices such as projectors, keypads, and input-output ports. Touch panel 7 may sense touch events using optical technology (e.g., touch panel 7 may include one or more image sensors and associated lenses used in detecting touch events). Pixels in the image sensors include photosensitive elements that convert incoming light into digital data. Image sensors may have any number of pixels (e.g., hundreds, thousands, millions, or more).
Device 6 may include storage and processing circuitry 8, some of which may be associated with touch panel 7. Storage and processing circuitry 8 may include display processing circuitry, touch panel processing circuitry, image processing circuitry, microprocessors, microcontrollers, digital signal processors, application specific integrated circuits, volatile and nonvolatile memory (e.g., random-access memory, flash memory, hard drives, solid state drives, etc.), and other circuitry. Still and video image data from camera sensors in touch panel 7 (e.g., camera sensors used in detecting touch events) may be provided to storage and processing circuitry 8. Image processing circuitry in storage and processing circuitry 8 may be used to perform processing functions such as detecting touch events, adjusting exposure, etc.
Electronic device 6 typically provides a user with numerous high level functions. In a touch panel display, for example, a user may be provided with displayed images and video and with the ability to provide user input via touch (and near-touch) inputs.
An illustrative touch panel 7 is shown in
If desired, touch panel 7 may include cover member 20 over display 10. Cover member 20 may, as examples, be formed from glass, plastic, or other suitably transparent materials. Cover member 20 may sometimes be referred to herein as cover glass 20.
With some suitable arrangements, touch panel 7 may include light blocking material 60 in the peripheral regions of cover glass 20. Light blocking material 60 may be, as examples, a layer of opaque material formed above, within, or below cover glass 20. Light blocking material 60 may improve the aesthetics of touch panel 7 by blocking unsightly components underlying cover glass 20 from the view of users of touch panel 7. As an example, light blocking material 60 may partially or completely hide one or more cameras 100 and associated components from the view of users of touch panel 7. While
As shown in
Each camera 100 may include an image sensor array 30, one or more lenses 40, and a turning element 50. Each camera 100 may capture images of object 80 using light rays 72 (e.g., scattered infrared light rays 72). Light rays 72 may pass through cover glass 20 towards turning element 50 (sometimes referred to herein as turning plate 50). Light rays 72 may be redirected, by turning plate 50, towards imaging lens 40. Imaging lens 40 may focus light rays 72 and thereby form an image on image sensor 30. Light blocking mask 60 may serve to prevent unwanted light rays (e.g., light rays from objects not above the touch sensitive surface of touch panel 7, such as objects outside the periphery of display 10) from reaching cameras 100.
By including light turning element 50 in touch cameras 100, touch panel 7 may have a low height bezel (e.g., a low or zero profile bezel). With one suitable arrangement, the bezel of touch panel 7 (e.g., the peripheral regions of cover glass 20, such as the regions that include light blocking material 60) may be flush with the central (e.g., active) region of touch panel 7. In other words, touch panel 7 may have a zero height bezel that does not protrude above the plane of the active region of touch panel 7.
A top view of touch panel 7 of
If desired, each camera 100 may have a horizontal field of view (e.g., a field of view parallel to the surface of touch panel 7) sufficient to detect touch events on the entire active surface of touch panel 7 (e.g., over the active surface of display 10). Field of view 120A may correspond to touch camera 100A, while field of view 120B may correspond to touch camera 100B. The fields of view 120A and 120B of touch cameras 100A and 100B may each be approximately 90 degrees.
The outputs of touch cameras 100 may be provided to image processor 110 (e.g., an image processor in circuitry 8 of
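One way in which image processor 110 may compute a touch location from the outputs of the two corner cameras is by triangulation: each camera reports the angle at which it sees object 80, and the two angular measurements define two rays whose intersection is the touch location. The following is a minimal sketch of that computation, assuming cameras 100A and 100B sit at the two top corners of the panel separated by a known width and that each camera's angle is measured from the panel edge joining the cameras; the function name and geometry are illustrative assumptions, not taken from the text above.

```python
import math

def triangulate(theta_a, theta_b, width):
    """Intersect the rays from two corner cameras to locate a touch.

    theta_a: angle (radians) of object 80 as seen by camera 100A,
             measured from the panel edge joining the two cameras.
    theta_b: the corresponding angle seen by camera 100B.
    width:   separation of the two cameras along that edge.
    """
    # Ray from camera A: y = x * tan(theta_a)
    # Ray from camera B: y = (width - x) * tan(theta_b)
    ta, tb = math.tan(theta_a), math.tan(theta_b)
    x = width * tb / (ta + tb)
    return x, x * ta

# Example: an object seen at 45 degrees by both cameras lies midway
# between them, half the camera separation down into the panel.
print(triangulate(math.radians(45), math.radians(45), 43.0))
```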
An example of a touch camera 100 that may be used in providing touch panel 7 with optical touch sensing capabilities is shown in
Turning film 54 may, if desired, be attached to a structural support such as glass plate 52. The side of turning film 54 opposite glass plate 52 may, if desired, have a lenticular prism array. The lenticular prism array may be formed from extruded plastic, as an example. The side of turning film 54 facing glass plate 52 may have a flat surface (attached to plate 52). Any desired turning film may be used for film 54. As an example, the commercially available right angle film Vikuiti™, produced by the 3M Corporation of St. Paul, Minn., may be used.
Turning film 54 may have a symmetrically shaped prism structure with a pitch (point-to-point separation) of 50 microns; may be made out of modified acrylic resin applied to a polyester substrate; may have a nominal thickness of 155 microns; may accept light rays within approximately 20 degrees of approximately 71 degrees (from both sides) from the normal of film 54 (e.g., may accept light from both sides in the window of approximately 90 degrees to approximately 51 degrees from the normal of the surface of substrate 52); and may output rays within an output range of approximately 0 to 20 degrees from the normal of film 54. The presence of light blocking material 60 may prevent unwanted light rays (e.g., light rays coming from the opposite direction as rays 74, 75, and 76) from entering camera 100.
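To make the stated geometry concrete, a ray is turned toward the camera only if its angle from the film normal lies within approximately 71 ± 20 degrees (i.e., within the window of approximately 51 to 90 degrees noted above). The following minimal sketch simply encodes that acceptance test; the function name is illustrative.

```python
# Acceptance window of turning film 54, per the figures given above:
# rays within ~20 degrees of ~71 degrees from the film normal are turned.
ACCEPT_CENTER_DEG = 71.0
ACCEPT_HALF_WIDTH_DEG = 20.0

def is_accepted(angle_from_normal_deg):
    """True if a ray at this angle from the film normal is turned toward lens 40."""
    return abs(angle_from_normal_deg - ACCEPT_CENTER_DEG) <= ACCEPT_HALF_WIDTH_DEG

print(is_accepted(75.0))  # near-grazing ray through cover glass 20: accepted
print(is_accepted(30.0))  # steep ray: rejected
```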
The use of light turning film 54 can, in some arrangements, introduce chromatic aberrations that present imaging difficulties. In particular, light passing through turning film 54 may be separated into its constituent wavelengths (e.g., light rays along a particular path, such as the path of rays 75, may be turned by a different amount depending on their color). These effects may be diminished or eliminated by including one or more filters in camera 100. As an example, camera 100 may include one or more filters 42 on the back of imaging lens 40. Filter 42 may, if desired, be located in other locations such as on the front of lens 40, as part of turning element 50, on the front or rear surfaces of substrate 52, or in any other desired location. With some suitable arrangements, filter 42 may be an infrared band filter (e.g., a filter that blocks light at frequencies above or below desired infrared frequencies). If desired, display 10 may emit infrared light at frequencies passed by filter 42 (e.g., display 10 may include one or more narrow-band infrared light sources). As one example, filter 42 may transmit light only between 800 and 900 nanometers. Since only a narrow set of wavelengths is allowed to reach light sensor 30, the chromatic aberration effect produced by light turning film 54 is minimized or eliminated.
As shown in
Lens 40 may focus light rays (e.g., rays 74, 75, and 76) onto an imaging sensor such as sensor 30. Lens 40 may have a sufficiently large aperture and sensor 30 may be sufficiently sensitive such that camera 100 is capable of achieving frame rates on the order of 200 frames per second (the IR illumination provided by display 10 may also affect the frame rate of camera 100). Image sensor 30 may be a CMOS imaging sensor such as Aptina Imaging Corporation's MI-0380 CMOS imaging sensor. Image sensor 30 may have a maximum (i.e., native) VGA resolution of 640 by 480 pixels and the individual pixels of image sensor 30 may be approximately 2.2 microns in size, as examples.
As described above, display panel 10 may emit light that scatters off of objects 80 that are touching, or nearly touching, cover glass 20 (and the light may be received by cameras 100, which detect touch events). In at least some arrangements, display panel 10 may emit light in wavelengths outside the visible spectrum, such as infrared light, which may illuminate objects 80 for cameras 100. By providing light in wavelengths outside the visible spectrum, display 10 is able to provide a constant level of brightness in the light wavelengths used by cameras 100 for detecting touch events (e.g., the light level in the wavelengths used by cameras 100 may be independent of the visible image displayed on display 10).
An example of a display 10 based on LCD technology is shown in
Display 10 may include a broadband illuminating light source (not shown) behind the pixels of
If desired, cameras 100 may be configured to detect light in one or more visible wavelengths, rather than in infrared wavelengths. In such arrangements, infrared pixels 110 could be omitted from display 10. While cameras 100 would face lighting levels that vary with the image displayed by display 10, such arrangements could still be implemented.
With some suitable arrangements, touch panel 7 may implement dynamic zooming techniques. Illustrative dynamic zooming techniques are illustrated in
When implementing a dynamic zooming technique, touch panel 7 may detect touch events (e.g., touches and near-touches) using the following processes. As a first step, the IR pixels of a first section may be opened, illuminating any objects 80 that are touching, or nearly touching, display 10 above that section. For clarity, the first section is identified as section 122 of view 172. The infrared pixels in all of the other sections may be closed at this time. At subsequent steps, the IR pixels of subsequent sections, such as sections 124, 126, 128, and 130, are sequentially opened one-by-one (such that only the IR pixels of a single section are open at a time). This process may continue until each of the sections of view 172 has had its IR pixels open (or may stop as soon as cameras 100 detect an object in one of the sections, if touch panel 7 is configured to support only single touch events).
As the preceding process progresses, cameras 100 may capture one or more images during the period in which each respective section's IR pixels are illuminated (e.g., cameras 100 may capture at least one image while the IR pixels of section 122 are illuminated, at least one image while the IR pixels of section 124 are illuminated, etc.). If desired, the frame rate of cameras 100 may be synchronized with the segmental illumination of the sections of display 10. With some suitable arrangements, the image sensing pixels of cameras 100 may be binned with an 8×8 binning, as an example, to reduce the resolution, but significantly increase sensitivity and minimize data sizes. If desired, the binning applied to sensors 30 may be reduced in subsequent rounds. Cameras 100 may be able to quickly determine whether an object is touching, or nearly touching, an illuminated section, but may not be able to (at this time) determine exactly where within the illuminated section the object is touching.
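A minimal sketch of this coarse scan is shown below. The helpers illuminate_section() and capture_binned_frame() are hypothetical stand-ins for opening one section's IR pixels and reading an 8×8-binned frame from a touch camera, and the grid size and detection threshold are illustrative assumptions.

```python
import numpy as np

SECTIONS = [(row, col) for row in range(3) for col in range(3)]  # e.g., nine sections
THRESHOLD = 50.0  # example signal level taken to indicate a touch or near-touch

def coarse_scan(illuminate_section, capture_binned_frame):
    """Return the sections whose illumination produced a strong image signal."""
    hits = []
    for section in SECTIONS:
        illuminate_section(section)      # open only this section's IR pixels
        frame = capture_binned_frame()   # low-resolution, high-sensitivity frame
        if frame.max() > THRESHOLD:      # object 80 scattered IR toward the camera
            hits.append(section)
    return hits

# Example with stand-in hardware in which no section produces a signal.
print(coarse_scan(lambda section: None, lambda: np.zeros((60, 80))))
```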
If cameras 100 detect an object touching, or nearly touching, one or more illuminated sections, the touch panel 7 may dynamically zoom into the illuminated sections (e.g., when one or more illuminated sections produces a stronger image signal during segmental IR illumination). As shown in the
Once cameras 100 determine which of the sub-sections of section 130 provide the strongest image signals (e.g., which sub-sections most closely correspond to the location at which the object is touching, or nearly touching, display 10), the touch panel 7 may be further dynamically zoomed into those sub-sections. This process may repeat in any desired number of rounds, until the exact location of the object is determined. As shown in the
In order to detect multiple simultaneous touch events (e.g., to provide multitouch capabilities), touch panel 7 may check each section at the current level (rather than proceeding to the next level upon detection of a strong signal). In addition, if touch panel 7 detects multiple sections at the current level with strong signals (e.g., image signals above a given threshold level for the current level), touch panel 7 may check the sub-sections of each of the multiple sections in the following round. With some suitable arrangements, touch panel 7 will redo the first round (e.g., check the entirety of touch panel 7 with a low resolution scan) every few rounds (e.g., every 4th round), to check for new touch events.
With dynamic zooming, it is possible to obtain accurate location information on touch events without relying upon image sensors 30 resolving the location of the touch events. In some arrangements, image sensors 30 could even be replaced by sensors having as little as a single light sensor (e.g., an image sensor binned into a single pixel). In such arrangements, touch panel 7 determines the location of touch events by determining the sections that, when illuminated, create the largest signals in sensors 30 (e.g., detailed information from the sensors 30 “resolving” the touching objects is not needed). The touch resolution of touch panel 7 may be determined by the size of the smallest illuminated section of the dynamic zooming process, rather than by the resolution of sensors 30.
A flowchart of illustrative steps involved in using dynamic zooming techniques for optical touch detection in a touch panel such as touch panel 7 is shown in
In step 182, the entirety of touch panel 7 may be selected as a region to search. Step 182 may sometimes be referred to herein as an initialization step.
In step 184, touch panel 7 may divide each of the current regions to search (whether there is a single region to search or multiple regions to search) into any desired number of sections and may then search for touch events in each of those sections. As an example, when there are at least two regions to search, touch panel 7 may respectively divide each of the regions to search into any desired number of sections. If desired, each region to search may be divided into a three-by-three grid (e.g., nine sections). When step 184 is performed after step 182, touch panel 7 may divide the entirety of touch panel 7 into a suitable number of sections (e.g., nine sections). When step 184 is performed after step 198 (e.g., as part of a zoomed-in search), touch panel 7 may divide each section of the previously searched region(s) in which at least one touch event was detected into a suitable number of sections (such as nine sections). Touch panel 7 may look for near-touch events, in which objects 80 are in close proximity to, but not actually touching, cover glass 20, as well as touch events in which objects 80 are touching cover glass 20.
In step 186, if no touch events are detected in the divided sections of the one or more regions searched in step 184, touch panel 7 may reinitialize dynamic zooming (e.g., zoom back out to once again search the entirety of touch panel 7) in step 182.
In step 188, if at least one touch event is detected in the divided sections of the one or more regions searched in step 184, touch panel 7 may perform the operations of step 190.
In step 190, touch panel 7 may determine if the current region(s) of interest are further divisible into subregions (e.g., whether or not the current region(s) are the last level of dynamic zooming available). If the current region(s) of interest are not further divisible into subregions (e.g., if the current region(s) of interest are the last level of dynamic zooming available) and as illustrated by step 192, then touch panel 7 may perform the operations of step 194. If the current region(s) of interest are further divisible into subregions (e.g., if the current region(s) of interest are not the last level of dynamic zooming available) and as illustrated by step 196, then touch panel 7 may perform the operations of step 198.
In step 194, touch panel 7 may process the one or more touch events detected in the current region(s) of interest. As an example, touch panel 7 may communicate with processing circuitry to inform the processing circuitry of the number, type (e.g., touch or near-touch), and location of each detected touch event.
In step 198, touch panel 7 may dynamically zoom into the one or more sections of the region(s) that were searched in step 184 and in which touch events were found. As an example, touch panel 7 may select the one or more sections of the region(s) searched in step 184 in which touch events were found in step 184 as the new region(s) of interest to search (in a subsequent iteration of step 184).
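The flow of steps 182 through 198 can be sketched in a few lines of Python. In this sketch, a region is an (x, y, width, height) tuple, search_section() is a hypothetical stand-in for illuminating a section's IR pixels and asking cameras 100 whether a touch event occurred there, and the three-by-three grid and the roughly one millimeter stopping size are illustrative assumptions.

```python
GRID = 3        # divide each region into a three-by-three grid (nine sections)
MIN_SIZE = 1.0  # stop subdividing once sections reach the desired accuracy

def divide(region, grid=GRID):
    """Divide a region (x, y, w, h) into grid x grid equal sections (step 184)."""
    x, y, w, h = region
    return [(x + i * w / grid, y + j * h / grid, w / grid, h / grid)
            for i in range(grid) for j in range(grid)]

def dynamic_zoom(panel_region, search_section):
    regions = [panel_region]                 # step 182: search the whole panel
    while True:
        sections = [s for r in regions for s in divide(r)]
        hits = [s for s in sections if search_section(s)]  # keeps every hit (multitouch)
        if not hits:
            return []                        # step 186: reinitialize at step 182
        if hits[0][2] <= MIN_SIZE:           # steps 190 and 192: last zoom level
            return hits                      # step 194: report the touch events
        regions = hits                       # steps 196 and 198: zoom into the hits

# Example: one object at (12.3, 45.6) on a 640 x 480 unit panel.
obj = (12.3, 45.6)
touched = lambda s: s[0] <= obj[0] < s[0] + s[2] and s[1] <= obj[1] < s[1] + s[3]
print(dynamic_zoom((0.0, 0.0, 640.0, 480.0), touched))
```

Because the list of hit sections is carried forward whole at each level, the same loop naturally supports multiple simultaneous touch events, as described above.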
CMOS imager 200 is operated by a timing and control circuit 206, which controls decoders 203 and 205 for selecting the appropriate row and column lines for pixel readout, and row and column driver circuitry 202 and 204, which apply driving voltages to the drive transistors of the selected row and column lines. The pixel signals, which typically include a pixel reset signal Vrst and a pixel image signal Vsig for each pixel, are sampled by sample and hold circuitry 207 associated with column driver 204. A differential signal Vrst−Vsig is produced for each pixel, amplified by amplifier 208, and digitized by analog-to-digital converter 209, which converts the analog pixel signals to digital signals that are fed to image processor 210, which forms a digital image.
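The differential readout described above (a form of correlated double sampling) can be illustrated with a short sketch; the gain, ADC reference level, and bit depth used here are illustrative assumptions rather than values from the text.

```python
import numpy as np

def readout(vrst, vsig, gain=2.0, vref=1.0, bits=10):
    """Difference, amplify, and digitize sampled reset/signal levels.

    vrst, vsig: arrays of the sampled Vrst and Vsig levels per pixel.
    """
    diff = gain * (vrst - vsig)             # differential signal from amplifier 208
    codes = np.clip(diff / vref, 0.0, 1.0)  # normalize to the ADC input range
    return np.round(codes * (2 ** bits - 1)).astype(np.uint16)  # converter 209

# Example: a 2 x 2 block of pixels; brighter pixels give larger Vrst - Vsig.
vrst = np.array([[0.80, 0.80], [0.80, 0.80]])
vsig = np.array([[0.75, 0.30], [0.60, 0.80]])
print(readout(vrst, vsig))
```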
Various embodiments have been described illustrating touch panels with dynamic zooming capabilities and low profile bezels.
An electronic device may have a touch panel. The touch panel may detect touch events using one or more cameras. The touch panel may include infrared light emitting elements that illuminate, with infrared light, objects that may touch the touch panel. The cameras may include infrared filters so that the cameras only receive infrared light.
The touch panel may include a planar exterior surface, such as a cover glass, that extends over an active region of a display and an inactive peripheral region. The cameras may be located underneath the inactive peripheral region. The cameras may include a light turning element to allow the cameras to detect touch events, without being raised above the exterior surface of the active region of the display (e.g., without having a raised profile).
The touch panel may detect touch events using dynamic zooming techniques. As an example, the touch panel may divide the active region into sections and search for touch events in each section. When the touch panel detects, using cameras, one or more objects touching (or nearly touching) one or more of the sections, the touch panel may dynamically zoom into those sections. In subsequent rounds, the touch panel may divide sections in which touch events were detected and search for touch events in each divided portion. The touch panel may continue in any desired number of rounds of dynamic zooming until the location of the touch events is determined with a desired level of accuracy (e.g., an accuracy of less than approximately 1.0 millimeters).
The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 61/551,358, filed Oct. 25, 2011, which is hereby incorporated by reference herein in its entirety.