Optical touch sensor

Information

  • Patent Grant
    12147630
  • Patent Number
    12,147,630
  • Date Filed
    Thursday, June 1, 2023
  • Date Issued
    Tuesday, November 19, 2024
Abstract
A method for detecting locations of objects in a plane, including emitting light beams, one at a time, at locations along a first edge of a detection area, refracting each light beam into multiple divergent light beams, directing each of the divergent beams to arrive at a respective pair of focusing lenses mounted along a second edge of the detection area, opposite the first edge, wherein an intensity profile of each divergent beam has maximum intensity along the center of the beam, for each of the focusing lenses, measuring an intensity profile of that portion of the divergent beam that enters the focusing lens, for each pair of focusing lenses receiving a single divergent beam, comparing the measured intensity profiles for light arriving at each lens, and determining a location of an object that partially blocks at least one of the divergent beams, based on the comparisons.
Description
FIELD OF THE INVENTION

The field of the present invention is touchscreens, particularly optical touchscreens.


BACKGROUND OF THE INVENTION

Reference is made to FIG. 1, which is a simplified illustration of a first prior art optical touchscreen, described in U.S. Pat. No. 9,471,170, entitled LIGHT-BASED TOUCH SCREEN WITH SHIFT-ALIGNED EMITTER AND RECEIVER LENSES, and assigned to the assignee of the present invention. This touchscreen 900 is surrounded by lens arrays 300-303. Emitters 100-121 are arranged along two adjacent edges of the touchscreen, and detectors 200-223 are arranged along the remaining two edges. The emitters along edges of the touchscreen are shift-aligned with respect to the detectors along the opposite edges of the touchscreen. Lenses 300 and 301 are configured to spread light from each emitter to arrive at two opposite detectors, and lenses 302 and 303 are configured such that each detector receives light from two opposite emitters.



FIG. 1 shows horizontal light beams 400-407 and vertical light beams 408-421, each beam originating at a respective emitter and reaching two detectors. Each emitter is synchronously activated with its corresponding detectors by processor 901. Processor 901 also stores the detector outputs and calculates touch locations based on these outputs.


Reference is made to FIG. 2, which is a simplified illustration of a second prior art optical touchscreen, described in U.S. Pat. No. 9,063,614, entitled OPTICAL TOUCH SCREENS, and assigned to the assignee of the present invention. This touchscreen 902 is surrounded by lens arrays 304-307. Emitters 122-174 are arranged along two adjacent edges of the touchscreen, and detectors 224-278 are arranged along the remaining two edges. In contrast to lenses 300 and 301 in FIG. 1, lenses 305 and 306 are configured to spread light from each emitter to arrive at numerous detectors. Similarly, lenses 304 and 307 are configured such that each detector receives light from numerous emitters, e.g., 8 channels per diode. FIG. 2 illustrates the dense mesh of detected light beams 422-474. Each emitter is synchronously activated with its corresponding detectors by processor 903. Processor 903 also stores the detector outputs and calculates touch locations based on these outputs.


The touchscreen system illustrated in FIG. 1 has relatively narrow light channels, which provide good signal levels for touch detection. Furthermore, each light beam is shaped to provide signal gradients across the width of the beam. These signal gradients enable identifying precisely where within the beam the blocking object is located by comparing the amounts of blocked light detected by different emitter-detector pairs. Thus, this system provides excellent information from each light channel. In addition, this system works well with a large pitch (denoted p) between components (e.g., p is between 15 mm and 18 mm). However, this system is not optimal for suppressing ghost touches, as the vertical light beams are nearly parallel, as are the horizontal light beams.


Reference is made to FIG. 3, which is a simplified illustration of the ghost touch problem. Blocked beams (detections) 725-728 create ambiguity as to which of locations 910-913 contain a touch object. In this case, the actual objects are located at locations 910 and 911.


The ghost touch problem exists in the touchscreen of FIG. 1. In contrast, the touchscreen system illustrated in FIG. 2 provides a range of different angles for the horizontal light beams and for the vertical light beams, e.g., ±20°. This enables the system to suppress ghost touches well. However, this system works best with a smaller pitch, e.g., p≈12.5 mm, and therefore slightly more components, making it more expensive. The wide angle of the emitter beams provides lower signal levels for each emitter-detector pair, and furthermore, each emitter-detector light channel does not feature the signal gradients across the width of the beam as in the system of FIG. 1, making touch location calculation less exact. The touchscreen system of FIG. 2 also features many more light channels and therefore is more costly in terms of activating all of the emitter-detector channels and storing and processing the detection signals.


The following table summarizes the features of the systems illustrated in FIGS. 1 and 2.









TABLE I
Features of touchscreen systems in FIGS. 1 and 2

FIG. 1                                    FIG. 2
good gradients                            no gradients
good signal levels                        low signal levels
few components                            intermediate number of components
few channels                              many channels
bad angles for ghost touch suppression    excellent angles for ghost touch suppression









Publication No. US 2012/0188206 A1 (the "'206 publication"), entitled OPTICAL TOUCH SCREEN WITH TRI-DIRECTIONAL MICRO-LENSES, is a publication of U.S. patent application Ser. No. 13/424,472, which is assigned to the assignee of the present invention. The '206 publication discusses an optical touchscreen in which each emitter beam is split into three separate beams, particularly with reference to FIGS. 83, 89, 90, 98 and 99 in the '206 publication. The motivation for splitting each emitter beam into three separate beams was as follows. More information is needed in order to solve the ghost touch problem in the optical touchscreen of FIG. 1, namely, the problem of trying to uniquely identify multiple touches based on insufficient positional data. The solution proposed in the '206 publication is to provide more detection channels, i.e., more grids of light beams. The more different the grids are, the better the information content. Thus, an added grid skewed at 45 degrees to the existing grid gives the most information without adding many channels.


However, the additional 45-degree grid requires that the detector photodiodes (PDs) be placed along all four edges of the screen, and thus, the configuration of FIG. 1 with LEDs on two edges of the touchscreen and PDs along the opposite edges of the touchscreen cannot be used. This leads to a configuration of alternating LEDs and PDs along the edges of the screen.


Then there is the choice of whether to use half-lenses on two of the four edges of the screen, or only standard lenses on all sides; namely, should the lenses along opposite edges of the screen be aligned, or shift-aligned as in FIG. 1. There are advantages to a shift-aligned configuration, as discussed inter alia in U.S. Pat. No. 9,471,170. However, a shift-aligned configuration is complicated in view of the additional 45-degree channels. Thus, an aligned configuration is required to accommodate the additional 45-degree channels.


The aligned configuration of lenses on opposite edges of the screen means that the LED and PD components on opposite edges of the screen are aligned as well. However, it had to be determined whether each LED is opposite another LED or opposite a PD. Assume the central beam from each LED expands as it crosses the screen and therefore reaches three components along the opposite edge of the screen. If each LED is opposite another LED, then the central beam from one LED reaches two PDs; however, this causes trouble with bad information in the center, since the middle of this central beam is directed at an LED, not a PD. On the other hand, if each LED is opposite a PD, the channel is straight across from LED to PD, but the central beam spanning three opposite components arrives at only one PD (and the two LEDs on either side of that PD). In this case, there would be no overlap of channels and the possibilities of using interpolation of several signals (channels) would be severely limited. The benefits of interpolating overlapping channels are discussed inter alia in U.S. Pat. No. 9,471,170. The configuration aligning each LED opposite a PD and not featuring overlapping channels can be used for relatively large objects, as is indicated in the '206 publication, paragraph [0332].


The way to provide overlapping channels is discussed in the '206 publication at paragraph [0333]; namely, interleaving different light channels using small facets to get an even distribution on the tri-directional pattern. The many small facets dilute the signal by spreading light in several directions and also by interleaving neighboring beams. Moreover, the system discussed in the '206 publication at paragraph [0333] is less flexible than the system discussed in U.S. Pat. No. 9,471,170 in terms of pitch width, as the '206 publication requires an even number of pitches on both sides of the screen. The signals are shaped by the focal length and the pitch, and although they provide good information, they cannot be tailored much; the resulting signals resemble those shown in FIGS. 94 and 95 of the '206 publication.


The present invention addresses the shortcomings of the prior art. Other advantages of the present invention will become apparent from the description below.


SUMMARY

There is thus provided in accordance with an embodiment of the present invention an optical sensor for detecting locations of objects, including a plurality of lenses arranged along two opposite edges of a rectangular detection area, a circuit board mounted underneath the lenses, a plurality of light emitters mounted on the circuit board along a specific one of the two opposite edges of the rectangular detection area, each light emitter operable when activated to project light beams through a respective one of the lenses, wherein the lenses are configured to split the light beam projected from each light emitter into a plurality of divergent light beams directed across the rectangular detection area to respective pluralities of the lenses that are arranged along the edge of the rectangular detection area that is opposite the specific edge, wherein a light intensity of each directed beam is maximized along the center of the directed beam and a distribution of light intensity within each thus directed beam is known, a plurality of light detectors mounted on the circuit board along the edge of the rectangular detection area that is opposite the specific edge, each detector receiving the light beams directed across the rectangular detection area through a respective one of the lenses that are arranged along the edge of the rectangular detection area opposite that specific edge, and a processor receiving outputs from the light detectors, and calculating a location of an object in the rectangular detection area based on the known distribution of light intensity within each directed beam, and the received outputs.


According to further features in embodiments of the invention, the plurality of light emitters is shift-aligned with respect to the plurality of light detectors.


According to further features in embodiments of the invention, the lenses are designed such that light beams of different widths are directed by the lenses across the rectangular detection area.


According to further features in embodiments of the invention, the lenses are designed such that those of the lenses that are arranged near corners of the rectangular detection area direct light beams across the rectangular detection area that are narrower than the light beams directed across the rectangular detection area by the others of said lenses.


According to further features in embodiments of the invention, those of the lenses that are arranged near corners of the rectangular detection area are smaller than the others of the lenses.


According to further features in embodiments of the invention, those of the lenses that are arranged near corners of the rectangular detection area are designed to split the light beams from respective ones of the light emitters into fewer divergent light beams than the others of the lenses.


According to further features in embodiments of the invention, the lenses spread the pluralities of divergent light beams in fan-like shapes, each fan having an apex angle, wherein those of the lenses that are arranged near corners of the rectangular detection area generate fans of light beams having apex angles that are smaller than the apex angles of the fans of light beams generated by the others of the lenses.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:



FIG. 1 is a simplified illustration of a first prior art optical touchscreen;



FIG. 2 is a simplified illustration of a second prior art optical touchscreen;



FIG. 3 is a simplified illustration of ghost touches;



FIG. 4 is a simplified illustration of the principles of calculating a touch location based on optical signals, in the optical touchscreen of FIG. 1, according to the present invention;



FIG. 5 is a simplified illustration of light from an emitter being split into three optical channels, in accordance with an embodiment of the present invention;



FIG. 6 is a simplified illustration of light from an emitter being split into two optical channels, in accordance with an embodiment of the present invention;



FIG. 7 is a simplified illustration of distinguishing an actual touch from a ghost touch, in accordance with embodiments of the present invention;



FIG. 8 is a simplified illustration of five different beams that may be created by a lens, in accordance with embodiments of the present invention;



FIGS. 9-12 are illustrations of lenses used in an optical sensor, in accordance with an embodiment of the present invention;



FIG. 13 is a simplified illustration of an optical touchscreen detection area, in accordance with an embodiment of the present invention;



FIGS. 14-16 are simplified illustrations of partially overlapping central light beams in an optical touchscreen, in accordance with an embodiment of the present invention;



FIGS. 17-19 are simplified illustrations of partially overlapping left-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention;



FIGS. 20-22 are simplified illustrations of partially overlapping right-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention;



FIGS. 23 and 24 are simplified illustrations of the number of overlapping light beams provided for touch detection in different portions of a touchscreen that features the beams of FIGS. 14-22, in accordance with an embodiment of the present invention;



FIGS. 25-27 are simplified illustrations of partially overlapping far left-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention;



FIGS. 28 and 29 are simplified illustrations of partially overlapping far right-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention;



FIGS. 30 and 31 are simplified illustrations of the number of overlapping light beams provided for touch detection in different portions of a touchscreen, featuring the beams of FIGS. 4, 14, 15 and 25-29, in accordance with an embodiment of the present invention;



FIG. 32 is a simplified illustration of an optical touchscreen having different optics for outer and inner beams crossing the screen, in accordance with an embodiment of the present invention;



FIG. 33A is a map showing different screen sections in an optical touchscreen, the sections having different numbers of overlapping detection channels, in accordance with an embodiment of the present invention;



FIG. 33B illustrates the different screen sections of FIG. 33A using text and arrows, in accordance with an embodiment of the present invention; and



FIG. 34 is a simplified illustration of the number of overlapping light beams provided for touch detection in different portions of a touchscreen whose lenses are configured to split each emitter beam into two diverging beams, not three, in accordance with an embodiment of the present invention.





In the disclosure and figures, the following numbering scheme is used. Like numbered elements are similar but not necessarily identical.









TABLE II
Elements of Figures

Type of element     Numbering range     FIGS.
light emitters      100-199             1, 2, 4-6, 8, 13-32, 34
light detectors     200-299             1, 2, 4-6, 13-32, 34
lenses              300-399             1, 2, 4-6, 9-32, 34
light beams         400-699             1, 2, 4-6, 8, 14, 15, 17-23, 25-30
light detection     700-799             3-6
other items         900-999             1-5, 7-9, 13-32, 34










DETAILED DESCRIPTION

The following table summarizes certain features of a touchscreen according to the present invention, in relation to features of the prior art touchscreens illustrated in FIGS. 1 and 2.









TABLE III
Features of the Present Invention

FIG. 1                                    Present invention                               FIG. 2
good gradients                            good gradients                                  no gradients
good signal levels                        intermediate signal levels                      low signal levels
few components                            few components                                  intermediate number of components
few channels                              intermediate no. channels                       many channels
bad angles for ghost touch suppression    excellent angles for ghost touch suppression    excellent angles for ghost touch suppression









Reference is made to FIG. 4, which is a simplified illustration of the principles of calculating a touch location based on optical signals, in the optical touchscreen of FIG. 1, according to the present invention. FIG. 4 illustrates signal gradients in a touchscreen embodiment. Emitter 109 projects vertical light beam 409 across detection area 900, detected by neighboring detectors 221 and 222. Specifically, beam portion 475 is detected by detector 222 and beam portion 476 is detected by detector 221. Beam portion 475 is shown as a parallelogram that extends from the lens at emitter 109 to the lens at detector 222, and beam portion 476 is shown as a parallelogram that extends from the lens at emitter 109 to the lens at detector 221; the dashed lines in beam portions 475 and 476 indicate the center of each beam portion. Lenses 301 shape beam 409 such that maximum intensity is along the center of the beam and the intensity is reduced toward the edges of the beam, as represented by detections 709 and 708. Lenses 303 are similar to lenses 301 and collect the incoming light onto the detectors. When an object blocks part of beam portion 475 from reaching detector 222 and part of beam portion 476 from reaching detector 221, the location of the object along the width of beam 409 is determined by comparing the detections at detectors 221 and 222: if the detections are equal, the object is along the beam's central axis, and if the detections are unequal, their ratio indicates how far the object is shifted to one side of the beam's center.
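
To make this comparison concrete, the following Python sketch estimates where, across the width of a single beam, a blocking object lies, given baseline and blocked signal levels at the two detectors that receive the beam. It is a minimal illustration of the principle described above, not the patent's algorithm; the linear weighting and the coordinate convention (detector-center positions expressed in lens pitches relative to the beam's central axis) are assumptions.

```python
def lateral_position(baseline_a, blocked_a, baseline_b, blocked_b,
                     center_a=-0.5, center_b=0.5):
    """Estimate the lateral position of a blocking object within one beam.

    baseline_* : detector readings with no object present
    blocked_*  : detector readings while the object partially blocks the beam
    center_*   : lateral coordinates (in lens pitches) of the two beam-portion
                 centers, relative to the beam's central axis

    Returns None if nothing is blocked; otherwise a coordinate between
    center_a and center_b, where 0.0 means the object sits on the beam's
    central axis.
    """
    # How much light each detector lost, i.e., how much of its beam portion
    # the object blocks.
    loss_a = max(baseline_a - blocked_a, 0.0)
    loss_b = max(baseline_b - blocked_b, 0.0)
    total = loss_a + loss_b
    if total == 0.0:
        return None
    # Equal losses place the object on the central axis; unequal losses shift
    # the estimate toward the detector that lost more light.
    return (loss_a * center_a + loss_b * center_b) / total

# Equal losses at the two detectors -> object on the beam's central axis.
print(lateral_position(100, 70, 100, 70))   # 0.0
# Loss concentrated at one detector -> object near that portion's center.
print(lateral_position(100, 40, 100, 95))   # approximately -0.42
```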



FIG. 4 also shows detector 219 receiving light from two emitters 111 and 112, specifically, portion 477 of the beam from emitter 111 and portion 478 of the beam from emitter 112 arrive at detector 219. As beam portions 477 and 478 overlap, the system determines where along the width of the overlap an object is positioned by comparing the detections of emitter-detector pair 111-219 with the detections of emitter-detector pair 112-219.



FIG. 4 shows vertical beams 417-419 from emitters 117-119 to detectors 211-214. Overlapping detected portions 479 of these beams illustrate that an object placed in detection area 900 will be detected by at least two emitter-detector pairs, i.e., {(e,d), (e,d+1)} or {(e,d), (e+1,d)}, where e and e+1 are two neighboring emitters and d and d+1 are two neighboring detectors. As explained above, the overlapping detections and the shaped light beams provide signal gradients that enable precise calculations of locations of objects touching the screen, or otherwise entering the plane of these light beams, by comparing the magnitude of detections of a single object by several emitter-detector pairs. One method of comparing these detections is by interpolation. The synchronized activation of emitter-detector pairs is controlled by processor 903. Processor 903 also stores detector outputs and calculates a location of a detected object based on those outputs.



FIG. 4 shows that the emitter lenses along two edges of detection area 900 are shift-aligned with the detector lenses along the opposite edges of detection area 900, and the beam from each emitter detected by two opposite detectors can be expressed as being detected by detectors that are offset +/−0.5 lens pitch from opposite the emitter. According to the present invention, light from each emitter is split into several beams detected by additional pairs of detectors to provide additional detection channels and more angles for ghost touch suppression.


Thus, in certain embodiments of the invention, a first additional beam is directed towards detectors, along the opposite edge of the detection area, that are offset 1.5 and 2.5 lens pitches from opposite the emitter, and a second additional beam is directed towards detectors, along the opposite edge of the detection area, that are offset −1.5 and −2.5 lens pitches from opposite the emitter. These additional beams are illustrated in FIGS. 17-22. When all three sets of beams are provided, namely, the initial beam directed at the detectors that are offset +/−0.5 lens pitches from opposite the emitter, plus the two additional beams, six sets of detection channels are provided. This is illustrated in FIGS. 23 and 24.


In other embodiments of the invention, different additional beams are used to provide a wider range of angles for ghost touch suppression; namely, a first additional beam is directed towards neighboring detectors that are offset 3.5 and 4.5 lens pitches from opposite the emitter, and a second additional beam is directed towards detectors that are offset −3.5 and −4.5 lens pitches from opposite the emitter. These additional beams are illustrated in FIGS. 25-29. When these beams are provided together with the initial beam directed at the detectors that are offset +/−0.5 lens pitches from the emitter, six sets of detection channels are provided, as illustrated in FIGS. 30 and 31. Still other embodiments of the invention provide additional beams with larger emitter-detector offsets, as illustrated in FIGS. 25-29, but provide additional beams with smaller emitter-detector offsets near the edges of the detection area, as illustrated in FIGS. 32 and 33.
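
The hypothetical helper below makes this channel geometry concrete: for a given emitter index it lists which detector indices receive each of that emitter's beams, under the shift-aligned offsets described above (±0.5 lens pitches for the central beam, plus either the 1.5/2.5 or the 3.5/4.5 offsets for the two additional beams). The index convention, names, and boundary handling are illustrative assumptions, not taken from the patent.

```python
# Detector offsets (in lens pitches) from the position directly opposite an
# emitter. Each beam lands on a pair of neighboring detectors.
CENTRAL = [(-0.5, 0.5)]
NARROW_EXTRA = [(1.5, 2.5), (-2.5, -1.5)]   # beams of FIGS. 17-22
WIDE_EXTRA = [(3.5, 4.5), (-4.5, -3.5)]     # beams of FIGS. 25-29

def detector_pairs(emitter_index, num_detectors, extra=NARROW_EXTRA):
    """Return the detector-index pairs that receive light from one emitter.

    Assumes detector k sits half a pitch to the 'negative' side of emitter k,
    so an offset of -0.5 pitches maps to detector index k and an offset of
    +0.5 maps to detector index k + 1, and so on. Pairs that would fall
    outside the detector row are dropped, which is why corner lenses are
    given different beams in the embodiments described above.
    """
    pairs = []
    for lo, hi in CENTRAL + list(extra):
        d_lo = emitter_index + int(lo + 0.5)
        d_hi = emitter_index + int(hi + 0.5)
        if 0 <= d_lo and d_hi < num_detectors:
            pairs.append((d_lo, d_hi))
    return pairs

# An emitter in the middle of a 15-detector row: central pair plus one
# right-leaning and one left-leaning pair.
print(detector_pairs(7, 15))              # [(7, 8), (9, 10), (5, 6)]
print(detector_pairs(7, 15, WIDE_EXTRA))  # [(7, 8), (11, 12), (3, 4)]
# Near a corner, the pair that would leave the detector row is dropped.
print(detector_pairs(0, 15))              # [(0, 1), (2, 3)]
```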


Reference is made to FIG. 5, which is a simplified illustration of light from an emitter being split into three optical channels, in accordance with an embodiment of the present invention. FIG. 5 illustrates splitting each emitter beam into three separate, diverging beams 484-486, each separate beam being shaped with a gradient as described hereinabove with respect to FIG. 4, such that the highest intensity is along the center of each separate beam, and the light intensity is reduced from the center of the beam outward. These split, shaped beams are formed by lenses 309, and similar lenses 311 are provided at the detectors for directing beams in each of the three directions onto each detector. Each of beams 484-486 arrives at two detectors. Specifically, beam 484 arrives at detectors 214 and 215; beam 485 arrives at detectors 212 and 213; and beam 486 arrives at detectors 210 and 211. As such, the minimum angle between beams 484 and 485, and between beams 485 and 486, is







θ_min = tan⁻¹(2p/L)






where p is the pitch between neighboring components and L is the distance between the row of emitters and the opposite row of detectors.
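
As a purely numeric illustration of this expression (the pitch and span values below are arbitrary examples, not taken from the patent), a few lines of Python evaluate θ_min:

```python
import math

def min_beam_angle_deg(pitch_mm, span_mm):
    """theta_min = arctan(2p / L), in degrees, where p is the component pitch
    and L is the distance between the emitter row and the detector row."""
    return math.degrees(math.atan2(2.0 * pitch_mm, span_mm))

# Illustrative values only: p = 15 mm pitch, L = 300 mm emitter-to-detector span.
print(round(min_beam_angle_deg(15.0, 300.0), 1))  # 5.7 (degrees)
```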


Although FIG. 5 shows each beam split into three separate, shaped beams, in other embodiments of the invention each beam is split into only two separate, shaped beams, providing a greater amount of light for each separate beam.


Reference is made to FIG. 6, which illustrates light from emitter 118 being split into two separate beams 484 and 485, each separate beam being shaped with a gradient as described hereinabove with respect to FIGS. 4 and 5, such that the highest intensity is along the center of each separate beam, and the light intensity is reduced from the center of the beam outward. These split, shaped beams are formed by lenses 309, and similar lenses 311 are provided at the detectors for directing beams in each of the two directions onto the detectors. Each of beams 484 and 485 arrives at two detectors. Specifically, beam 484 arrives at detectors 213 and 214, and beam 485 arrives at detectors 211 and 212. As such, the minimum angle between beams 484 and 485 is







θ_min = tan⁻¹(2p/L)






where p is the pitch between neighboring components and L is the distance between the row of emitters and the opposite row of detectors.


Beams 484 and 485 are both projected by emitter 118. Lenses 309 split and shape these two beams and direct them at detectors 211-214, via lenses 311. The detections at detectors 211-214 are indicated by sloping curves 711-714. The maximum intensity for each beam is along its center, indicated by the maximum of each curve 711-714 being near the beam center and declining outward.


Reference is made to FIG. 7, which is a simplified illustration of distinguishing an actual touch from a ghost touch, in accordance with embodiments of the present invention. FIG. 7 shows two touches in detection area 904 and illustrates that, in a system with two diverging beams for each emitter as illustrated in FIG. 6, the locations of the four candidate touch points indicated by blocked left-sloping beams 484 of FIG. 6 are slightly different than the four candidate touch points indicated by the blocked right-sloping beams 485 of FIG. 6. Arrows 931 indicate the directions of the right-sloping vertical and horizontal beams, and arrows 932 indicate the directions of the left-sloping vertical and horizontal beams. The right-sloping beams indicate possible touch locations 910, 911, 912′ and 913′, and the left-sloping beams indicate possible touch locations 910, 911, 912″ and 913″. Thus, locations 910 and 911 at which actual objects are present are identical for both sets of beams, whereas the ghost locations 912 and 913 are not the same. Accordingly, by comparing the touch locations indicated by the different sets of beams, it is determined that locations that are identical for the different sets of beams are actual touch locations, and locations that are not identical for the different sets of beams are not touch locations. Put differently, by combining the touch detections from all of the beams, it is determined that locations whose detections by different sets of beams are concentrated at specific locations are actual touch locations, and locations for which the combined detections from different sets of beams are more diffuse are not touch locations. Splitting each emitter beam into three diverging beams provides a more robust solution to the ghost touch problem. In addition, the larger the angle between the diverging beams as discussed hereinbelow, the more robust the elimination of ghost touches.
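
The following Python sketch illustrates one simple way to carry out the comparison described above: candidate locations computed independently from the right-sloping grid and from the left-sloping grid are matched, and only candidates that agree across grids (within a tolerance) are kept as actual touches. This is a schematic illustration of the idea, not the patent's algorithm; the tolerance, the matching rule, and the example coordinates are assumptions.

```python
from math import hypot

def confirm_touches(candidates_a, candidates_b, tolerance=3.0):
    """Keep only candidate touch points that both beam grids agree on.

    candidates_a, candidates_b : lists of (x, y) candidate locations derived
        from the right-sloping and left-sloping beam grids, respectively.
    tolerance : maximum distance (same units as the coordinates) for two
        candidates to be considered the same physical touch.
    """
    confirmed = []
    for ax, ay in candidates_a:
        # A candidate is confirmed if the other grid produced a candidate at
        # (nearly) the same place; ghost locations differ between the grids.
        if any(hypot(ax - bx, ay - by) <= tolerance for bx, by in candidates_b):
            confirmed.append((ax, ay))
    return confirmed

# In the scenario of FIG. 7 (coordinates are invented for illustration):
right_sloping = [(20, 20), (80, 80), (22, 78), (78, 22)]   # 910, 911, 912', 913'
left_sloping = [(20, 20), (80, 80), (28, 72), (72, 28)]    # 910, 911, 912'', 913''
print(confirm_touches(right_sloping, left_sloping))        # [(20, 20), (80, 80)]
```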


Thus, splitting the light from each emitter into two or three beams provides different angled beams that resolve many ghost touch situations that the touchscreen of FIG. 1 cannot similarly resolve. In addition, the split beams cover most of the display area with many different overlapping beams, providing greater resolution and precision for calculating touch locations.


Reference is made to FIG. 8, which is a simplified illustration of five different beams that may be created by a lens, in accordance with embodiments of the present invention. FIG. 8 shows five different beams 487-491 that may be created from emitter 118 by a lens. As discussed hereinabove, each of the five beams is shaped to have maximum intensity along the center with the intensity decreasing further from the central line. This beam shape enables identifying where within the width of the beam an object is located based on the amount of light it blocks. Each beam is directed toward a pair of detectors at the opposite edge of the detection area. In certain embodiments of the invention, each lens creates three of these five beams. In certain embodiments discussed hereinbelow the lenses farther from the corners of detection area 904 generate the central beam 489 and the two outer beams 487 and 491, in order to provide beams separated by large angles, whereas lenses closer to the corners of the detection area are configured to generate the three inner beams 488-490, as an outer beam 487 or 491 would be directed outside detection area 904 and would not reach a detector on the opposite edge.


Reference is made to FIGS. 9-12, which are illustrations of lenses used in an optical sensor, in accordance with an embodiment of the present invention. FIG. 9 shows a touchscreen having detection area 904 surrounded by lens structures 309-312, and cross-sections thereof.



FIG. 10 shows lens structure 309 of the touchscreen in FIG. 9. This lens structure is mounted above a series of light emitters, and it splits the light from each emitter into three separate beams. FIG. 10 shows focusing lenses 324 and 325, and lenses 327 and 328 that split the light into separate beams.



FIG. 11 shows one of focusing lenses 325 and the associated portion of lens 327, having a repeating three-facet pattern indicated by three arrows that split the light into three separate beams. In sensors configured with two beams for each emitter instead of three, lens 327 has a repeating sawtooth pattern of facets in two directions to create two diverging beams.



FIG. 12 shows a section of lens structure 309 near the corner of the touch detection area. This section includes focusing lens 324 and associated lens 328 for splitting the light. Because this lens is near a corner of the detection area, lens 328 is configured to direct the light so that the rightmost beam arrives at a detector on the opposite edge of the detection area, and in some embodiments this section splits the light into fewer beams than the other sections. In certain embodiments, this section of lenses is smaller than the others and is thus configured to create more concentrated beams of light than the other sections because an edge of the screen is covered by fewer overlapping beams than other screen sections.


Reference is made to FIG. 13, which is a simplified illustration of an optical touchscreen detection area, in accordance with an embodiment of the present invention. FIG. 13 shows a frame of lenses 308-311 surrounding detection area 904, in accordance with an embodiment of the present invention. Each lens in lens array 308 directs beams from a respective one of emitters 100-107 across detection area 904. Each lens in lens array 310 directs incoming light onto a respective one of detectors 200-208. The number of detectors 200-208 is greater than the number of opposite emitters 100-107, as the emitters and detectors are shift-aligned. This is evident from lens array 310 which features seven full-lenses and two half-lenses at either end of array 310, whereas lens array 308 features eight full lenses. Each lens in lens array 309 directs light beams from a respective one of emitters 108-121 across detection area 904. Each lens, or half-lens at either end, in lens array 311 directs incoming light onto a respective one of detectors 209-223. The number of detectors 209-223 is greater than the number of opposite emitters 108-121, as the emitters and detectors are shift-aligned. This is evident from lens array 311 which features thirteen full-lenses and two half-lenses at either end of array 311, whereas lens array 309 features fourteen full lenses. As discussed hereinabove with reference to FIGS. 5 and 6, light from each emitter is split into multiple separate shaped beams, and each beam arrives at two neighboring detectors along the opposite edge of detection area 904. An emitter-detector pair refers to a portion of the light beam from one emitter that arrives at one opposite detector. The following figures illustrate these portions to indicate the various emitter-detector pairs providing detection information in accordance with embodiments of the present invention.


Reference is made to FIGS. 14-16, which are simplified illustrations of partially overlapping central light beams in an optical touchscreen, in accordance with an embodiment of the present invention. FIG. 14 shows a first set of beam portions 500-513, namely, the left-hand portion of the central beam from each emitter. These beam portions provide a first set of emitter-detector detection channels.



FIG. 15 shows a second set of beam portions 514-527, namely, the right-hand portion of the central beam from each emitter. These beam portions provide a second set of emitter-detector detection channels.



FIG. 16 shows coverage of detection area 904 by beam portions 500-527 shown in FIGS. 14 and 15. FIG. 16 indicates that most of detection area 904 is covered by two beam portions; i.e., two detection channels, except for two narrow triangular sections at the outer edges of detection area 904, which are only covered by one detection channel. A similar coverage map exists for the horizontal beams traveling from emitters 100-107 to detectors 200-208.


Reference is made to FIGS. 17-19, which are simplified illustrations of partially overlapping left-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention. FIG. 17 shows a third set of beam portions 528-539, namely, the left-hand portion of the left-leaning beam from each emitter. These beam portions provide a third set of emitter-detector detection channels.



FIG. 18 shows a fourth set of beam portions 540-551, namely, the right-hand portion of the left-leaning beam from each emitter. These beam portions provide a fourth set of emitter-detector detection channels.



FIG. 19 shows additional coverage of detection area 904 by the left-leaning beams; i.e., beam portions 528-551 shown in FIGS. 17 and 18. FIG. 19 indicates that most of detection area 904 is covered by two beam portions; i.e., two detection channels, except for a first narrow triangular section along the outer left edge of left-leaning beam 528, and a wider triangular section at the outer right edge of left-leaning beam 551, which are only covered by one detection channel. FIG. 19 also illustrates two triangular sections at the outer edges of detection area 904 that are not covered by these left-leaning beams. A similar coverage map exists for the horizontal beams traveling from emitters 100-107 to detectors 200-208. FIG. 19 indicates series 921 of emitters and series 920 of detectors that are active in providing the illustrated detection channels 528-551.


Reference is made to FIGS. 20-22, which are simplified illustrations of partially overlapping right-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention. FIG. 20 shows a fifth set of beam portions 563-574; namely, the left-hand portion of the right-leaning beam from each emitter. These beam portions provide a fifth set of emitter-detector detection channels.



FIG. 21 shows a sixth set of beam portions 575-586, namely, the right-hand portion of the right-leaning beam from each emitter. These beam portions provide a sixth set of emitter-detector detection channels.



FIG. 22 shows additional coverage of detection area 904 by the right-leaning beams; i.e., beam portions 563-586 shown in FIGS. 20 and 21. FIG. 22 indicates that most of detection area 904 is covered by two beam portions; i.e., two detection channels, except for a first narrow triangular section along the outer right edge of right-leaning beam 586, and a wider triangular section at the outer left edge of right-leaning beam 563, which are only covered by one detection channel. FIG. 22 also illustrates two triangular sections at the outer edges of detection area 904 that are not covered by these right-leaning beams. A similar coverage map exists for the horizontal beams traveling from emitters 100-107 to detectors 200-208. FIG. 22 indicates series 923 of emitters and series 922 of detectors that are active in providing the illustrated detection channels 563-586.


Reference is made to FIGS. 23 and 24, which are simplified illustrations of the number of overlapping light beams provided for touch detection in different portions of a touchscreen that features the beams of FIGS. 14-22, in accordance with an embodiment of the present invention. FIG. 23 shows coverage of detection area 904 when all six sets of detection channels discussed hereinabove with reference to FIGS. 14-22 are combined. The central portion of detection area 904 is covered by multiple detection signals from all six sets of detection beams—the central, left-leaning and right-leaning beams. However, FIG. 23 indicates areas near the edges of detection area 904 that are covered by only three or four sets of detection signals, and areas covered by only one or two detection signals.



FIG. 24 indicates the number of detection channels covering different portions of detection area 904, in the embodiment illustrated in FIG. 23. The number of detection channels varies from 1 to 6. A similar map of detection channels applies to the horizontal beams, from emitters 100-107 to detectors 200-208.


Reference is made to FIGS. 25-27, which are simplified illustrations of partially overlapping far left-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention. FIG. 25 illustrates detection channels between each emitter and a respective detector that is offset 4.5 lens pitches from opposite the emitter; FIG. 26 illustrates detection channels between each emitter and a respective detector that is offset 3.5 lens pitches from opposite the emitter; and FIG. 27 illustrates the combined area covered by these two detection channels. FIG. 27 indicates series 925 of emitters and series 924 of detectors that are active in providing the illustrated detection channels 600-619.


Reference is made to FIGS. 28 and 29, which are simplified illustrations of partially overlapping far right-leaning light beams in an optical touchscreen, in accordance with an embodiment of the present invention. FIG. 28 illustrates detection channels between each emitter and a respective detector that is offset −3.5 lens pitches from opposite the emitter; FIG. 29 illustrates detection channels between each emitter and a respective detector that is offset −4.5 lens pitches from opposite the emitter.


Reference is made to FIGS. 30 and 31, which are simplified illustrations of the number of overlapping light beams provided for touch detection in different portions of a touchscreen, featuring the beams of FIGS. 4, 14, 15 and 25-29, in accordance with an embodiment of the present invention. FIGS. 30 and 31 show the number of channels covering different portions of the detection area that utilize the beams illustrated in FIGS. 4, 14, 15 and 25-29. FIGS. 30 and 31 show that the outer portions of the detection area have fewer detection channels than the central portion of the detection area. FIGS. 30 and 31 indicate series 925, 927 of emitters and series 924, 926 of detectors that are active in providing the illustrated detection channels 600-619.


Reference is made to FIGS. 32 and 33, which are simplified illustrations of optical touchscreens having different optics for outer and inner beams crossing the screen, in accordance with an embodiment of the present invention. FIG. 32 shows detection channels provided along the outer edges of a detection area by beams similar to beams 484 and 486 in FIG. 5. Adding these beams to the system illustrated in FIGS. 30 and 31, provides the outer edge-related portions of the detection area with additional detection channels.



FIGS. 33A and 33B are two views of detection channel coverage in the system of FIGS. 30 and 31 when the additional beams and channels shown in FIG. 32 are provided along the edges of the detection area. FIG. 33A is color-coded, and FIG. 33B uses text and arrows to indicate the number of detection channels for each section of the touchscreen. FIGS. 33A and 33B show that portions of the detection area have up to nine detection channels. Additional beams with even larger emitter-detector offsets can be added to the optical sensor by modifying the lenses and the emitter-detector activation sequence, and such systems are also within the scope of the present invention.


As discussed hereinabove, splitting light from each emitter into additional beams adds precision and enables better discrimination of ghost touches. Reference is made to FIG. 34, which is a simplified illustration of the number of overlapping light beams provided for touch detection in different portions of a touchscreen whose lenses are configured to split each emitter beam into two diverging beams, not three, in accordance with an embodiment of the present invention. FIG. 34 shows the number of beams, or detection channels, that cover each part of a touchscreen when light from each emitter is split into four beams, namely, the two beams 484 and 485 of FIG. 6 plus one beam to the left of beam 484 and another beam to the right of beam 485.


The edge portions of the touchscreen have fewer beams, or detection channels, than the center of the screen. In addition, a smaller portion of light from the emitters near the corners is used for touch detection, as fewer channels can be used along the screen edges. Therefore, in certain embodiments of the invention, the light channels traversing the screen along the screen edges are shaped to be wider, and therefore contain more signal strength, than the channels in the middle of the screen. This requires that the lenses near the screen corners be longer than the other lenses, and that the light channel be spread across a larger segment of the edge near the corner than the other light channels. In terms of calculating touch locations, each beam is assigned coordinates based on its center line, and a weighted sum of all of these coordinates is calculated according to the amount that the signal is blocked. Thus, coordinates are also assigned to the edge beams, and they are added to the weighted average in the same way as the other beams. As a result of the larger lenses and wider beams near the edges of the screen, the distance between the center line of a beam near an edge of the screen and its neighboring beam is greater than the distance between the center lines of other neighboring beams.
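
A minimal sketch of the weighted-sum calculation described in this paragraph is given below, assuming each detection channel is represented simply by the coordinate of its center line and the fraction of its signal that is blocked; the data structure and names are illustrative, not the patent's implementation.

```python
def weighted_touch_coordinate(channels):
    """Combine detection channels into one touch coordinate.

    channels : iterable of (center_coordinate, blocked_fraction) pairs, one per
        emitter-detector channel that sees the object, where blocked_fraction
        is the portion of that channel's baseline signal that the object blocks.
    Returns the blocked-signal-weighted average of the channel center lines,
    or None if no channel is blocked.
    """
    weighted_sum = 0.0
    total_weight = 0.0
    for center, blocked in channels:
        weighted_sum += center * blocked
        total_weight += blocked
    if total_weight == 0.0:
        return None
    return weighted_sum / total_weight

# Example with three overlapping channels: the edge channel is wider, so its
# center line lies farther from its neighbor, but it enters the average the
# same way as the other channels.
print(weighted_touch_coordinate([(10.0, 0.2), (17.5, 0.6), (25.0, 0.2)]))  # 17.5
```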


In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. An optical sensor detecting locations of objects, comprising: a circuit board; a plurality of light emitters mounted along a first edge of said circuit board, each light emitter operable when activated to emit light beams, wherein no light emitters are arranged along a second edge of said circuit board, the second edge being opposite the first edge; a plurality of light detectors mounted along the second edge of said circuit board, each light detector operable when activated to detect an intensity profile of a light beam entering the light detector; a plurality of focusing lenses mounted along a first edge of a rectangular detection area in relation to said light detectors such that each focusing lens refracts light beams from the detection area onto a respective one of said light detectors; a plurality of diverging lenses mounted along a second edge of the detection area, the second edge being opposite the first edge, and in relation to said light emitters such that each light emitter, when activated, emits single light beams through a respective one of the diverging lenses, and the diverging lens refracts each single light beam into multiple divergent light beams, wherein each divergent light beam travels across the detection area in a different direction relative to the second edge, and is directed to a respective pair of said focusing lenses, wherein an intensity profile of each divergent light beam has maximum intensity along the center of the beam,
  • 2. The optical sensor of claim 1, wherein said plurality of light emitters is shift-aligned with respect to said plurality of light detectors.
  • 3. The optical sensor of claim 1, wherein said focusing lenses along the first edge are shift-aligned with respect to said diverging lenses along the second edge.
  • 4. The optical sensor of claim 1, wherein the comparing performed by said processor comprises interpolating the received light profile outputs.
  • 5. The optical sensor of claim 1, wherein each of said diverging lenses that refracts each single light beam into a number of divergent light beams comprises a repeating pattern of that same number of facets and a single lens separated from the facets by an air gap.
  • 6. The optical sensor of claim 5, wherein said focusing lenses and said diverging lenses have the same structure.
  • 7. The optical sensor of claim 1, wherein, when multiple objects are located in the detection area, said processor calculates candidate locations of the multiple objects based on the comparing, and identifies ghost locations among the candidate locations based on comparing the candidate locations.
  • 8. The optical sensor of claim 1, wherein said processor initially calculates multiple candidate locations of the object in the detection area based on the comparing, and subsequently calculates the location of the object based on comparing the candidate locations.
  • 9. An optical method for detecting locations of objects in a plane, comprising: emit single light beams, one beam at a time, at a plurality of locations along a first edge of a detection area; refraining from emitting light beams from any location along a second edge of the detection area, the second edge being opposite the first edge; refract each single emitted light beam into multiple divergent light beams; direct each of the divergent light beams across the detection area to arrive at a respective pair of focusing lenses mounted along a second edge of the detection area, opposite the first edge, wherein an intensity profile of each divergent light beam has maximum intensity along the center of the beam; for each of the focusing lenses, measure an intensity profile of that portion of the divergent light beam that enters the focusing lens; for each pair of focusing lenses receiving a single divergent light beam, compare the measured intensity profiles for light arriving at each lens; and determine a location of an object in the detection area that partially blocks at least one of the divergent light beams, based on said compares.
  • 10. The optical method of claim 9, wherein each of said focusing lenses comprises a repeating pattern of facets and a single lens separated from the facets by an air gap.
  • 11. The optical method of claim 10, wherein said refract is performed by a diverging lens that has the same structure as said focusing lenses.
  • 12. The optical method of claim 9, wherein said compare comprises interpolation of the measured intensity profiles.
  • 13. The optical method of claim 9, further comprising: identify candidate locations of multiple objects in the detection area that partially block at least one of the divergent light beams, based on said compares; and determine ghost locations among the candidate locations based on comparing the candidate locations.
  • 14. The optical method of claim 9, further comprising: identify candidate locations of the object, based on said compares; and determine the location of the object based on comparing the candidate locations.
  • 15. An optical method for detecting locations of objects in a plane, comprising: emit single light beams, one beam at a time, at a plurality of locations along a first edge of a detection area; refraining from emitting light beams from any location along a second edge of the detection area, the second edge being opposite the first edge; refract each single emitted light beam into multiple divergent light beams; direct each of said divergent light beams across the detection area in a different direction, relative to the first edge, to arrive at a respective pair of focusing lenses mounted along a second edge of the detection area, opposite the first edge, wherein an intensity profile of each divergent light beam has maximum intensity along the center of the beam, whereby pairs of the divergent light beams, emitted at neighboring locations along the first edge, that travel across the detection area in the same direction relative to the first edge, arrive at two respective pairs of focusing lenses that share a common focusing lens; measure an intensity profile of the light that enters each focusing lens; for each of the pairs of divergent light beams, compare the measured intensity profiles of the light from each of the divergent light beams in the pair, that enters the common focusing lens for that pair; and determine a location of an object in the detection area that partially blocks at least one of the pairs of divergent light beams, based on said compares.
  • 16. The optical method of claim 15, wherein each of said focusing lenses comprises a repeating pattern of facets and a single lens separated from the facets by an air gap.
  • 17. The optical method of claim 16, wherein said refract is performed by diverging lenses that have the same structure as the focusing lenses.
  • 18. The optical method of claim 15, wherein said compare comprises interpolation of the measured intensity profiles.
  • 19. The optical method of claim 15, further comprising: identify candidate locations of multiple objects in the detection area that partially block at least one of the divergent light beams, based on said compares; and determine ghost locations among the candidate locations based on comparing the candidate locations.
  • 20. The optical method of claim 15, further comprising: identify candidate locations of the object, based on said compares; and determine the location of the object based on comparing the candidate locations.
PRIORITY REFERENCE TO PROVISIONAL APPLICATION

This application claims priority benefit from U.S. Provisional Patent Application No. 63/085,838, entitled OPTICAL TOUCH SENSOR, and filed on Sep. 30, 2020, by inventor Stefan Holmgren.

US Referenced Citations (398)
Number Name Date Kind
4243879 Carroll et al. Jan 1981 A
4267443 Carroll et al. May 1981 A
4301447 Funk et al. Nov 1981 A
4518249 Murata et al. May 1985 A
4550250 Mueller et al. Oct 1985 A
4588258 Hoopman May 1986 A
4641426 Hartman et al. Feb 1987 A
4645920 Carroll et al. Feb 1987 A
4672364 Lucas Jun 1987 A
4703316 Sherbeck Oct 1987 A
4710760 Kasday Dec 1987 A
4737626 Hasegawa Apr 1988 A
4737631 Sasaki et al. Apr 1988 A
4761637 Lucas et al. Aug 1988 A
4782328 Denlinger Nov 1988 A
4790028 Ramage Dec 1988 A
4847606 Beiswenger Jul 1989 A
4868912 Doering Sep 1989 A
4880969 Lawrie Nov 1989 A
4905174 Ouchi Feb 1990 A
4928094 Smith May 1990 A
5003505 McClelland Mar 1991 A
5016008 Gruaz May 1991 A
5036187 Yoshida et al. Jul 1991 A
5053758 Cornett et al. Oct 1991 A
5119079 Hube et al. Jun 1992 A
5162783 Moreno Nov 1992 A
5179369 Person et al. Jan 1993 A
5194863 Barker et al. Mar 1993 A
5196835 Blue et al. Mar 1993 A
5220409 Bures Jun 1993 A
5283558 Chan Feb 1994 A
5406307 Hirayama et al. Apr 1995 A
5414413 Tamaru et al. May 1995 A
5422494 West Jun 1995 A
5463725 Henckel et al. Oct 1995 A
5559727 Deley et al. Sep 1996 A
5577733 Downing Nov 1996 A
5579035 Beiswenger Nov 1996 A
5603053 Gough et al. Feb 1997 A
5605406 Bowen Feb 1997 A
5612719 Beernink et al. Mar 1997 A
5618232 Martin Apr 1997 A
5729250 Bishop et al. Mar 1998 A
5748185 Stephan et al. May 1998 A
5785439 Bowen Jul 1998 A
5825352 Bisset et al. Oct 1998 A
5838308 Knapp et al. Nov 1998 A
5880462 Hsia Mar 1999 A
5880743 Moran et al. Mar 1999 A
5886697 Naughton et al. Mar 1999 A
5889236 Gillespie et al. Mar 1999 A
5900863 Numazaki May 1999 A
5900875 Haitani et al. May 1999 A
5914709 Graham et al. Jun 1999 A
5936615 Waters Aug 1999 A
5943043 Furuhata et al. Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5946134 Benson et al. Aug 1999 A
5956030 Conrad et al. Sep 1999 A
5988645 Downing Nov 1999 A
6010061 Howell Jan 2000 A
6023265 Lee Feb 2000 A
6031989 Cordell Feb 2000 A
6035180 Kubes et al. Mar 2000 A
6052279 Friend et al. Apr 2000 A
6073036 Heikkinen et al. Jun 2000 A
6085204 Chijiwa et al. Jul 2000 A
6091405 Lowe et al. Jul 2000 A
6246395 Goyins et al. Jun 2001 B1
6259436 Moon et al. Jul 2001 B1
6292179 Lee Sep 2001 B1
6323846 Westerman et al. Nov 2001 B1
6333735 Anvekar Dec 2001 B1
6337873 Goering et al. Jan 2002 B1
6340979 Beaton et al. Jan 2002 B1
6346935 Nakajima et al. Feb 2002 B1
6356287 Ruberry et al. Mar 2002 B1
6359632 Eastty et al. Mar 2002 B1
6362468 Murakami et al. Mar 2002 B1
6411283 Murphy Jun 2002 B1
6421042 Omura et al. Jul 2002 B1
6429857 Masters et al. Aug 2002 B1
6456952 Nathan Sep 2002 B1
6529920 Arons et al. Mar 2003 B1
6542191 Yonezawa Apr 2003 B1
6549217 De Greef et al. Apr 2003 B1
6570557 Westerman et al. May 2003 B1
6597345 Hirshberg Jul 2003 B2
6628268 Harada et al. Sep 2003 B1
6639584 Li Oct 2003 B1
6677932 Westerman Jan 2004 B1
6690365 Hinckley et al. Feb 2004 B2
6690387 Zimmerman et al. Feb 2004 B2
6707449 Hinckley et al. Mar 2004 B2
6727917 Chew et al. Apr 2004 B1
6734883 Wynn et al. May 2004 B1
6757002 Oross et al. Jun 2004 B1
6762077 Schuurmans et al. Jul 2004 B2
6788292 Nako et al. Sep 2004 B1
6803906 Morrison et al. Oct 2004 B1
6833827 Lui et al. Dec 2004 B2
6836367 Seino et al. Dec 2004 B2
6857746 Dyner Feb 2005 B2
6864882 Newton Mar 2005 B2
6874683 Keronen et al. Apr 2005 B2
6888536 Westerman et al. May 2005 B2
6944557 Hama et al. Sep 2005 B2
6947032 Morrison et al. Sep 2005 B2
6954197 Morrison et al. Oct 2005 B2
6958749 Matsushita et al. Oct 2005 B1
6972401 Akitt et al. Dec 2005 B2
6972834 Oka et al. Dec 2005 B1
6985137 Kaikuranta Jan 2006 B2
6988246 Kopitzke et al. Jan 2006 B2
6992660 Kawano et al. Jan 2006 B2
7006077 Uusimaki Feb 2006 B1
7007239 Hawkins et al. Feb 2006 B1
7030861 Westerman et al. Apr 2006 B1
7042444 Cok May 2006 B2
7099553 Graham et al. Aug 2006 B1
7133032 Cok Nov 2006 B2
7155683 Williams Dec 2006 B1
7159763 Yap et al. Jan 2007 B2
7170590 Kishida Jan 2007 B2
7176905 Baharav et al. Feb 2007 B2
7184030 McCharles et al. Feb 2007 B2
7225408 ORourke May 2007 B2
7232986 Worthington et al. Jun 2007 B2
7254775 Geaghan et al. Aug 2007 B2
7265748 Ryynanen Sep 2007 B2
7283845 De Bast Oct 2007 B2
7286063 Gauthey et al. Oct 2007 B2
7339580 Westerman et al. Mar 2008 B2
7352940 Charters et al. Apr 2008 B2
7355594 Barkan Apr 2008 B2
7369724 Deane May 2008 B2
7372456 McLintock May 2008 B2
7429706 Ho Sep 2008 B2
7435940 Eliasson et al. Oct 2008 B2
7441196 Gottfurcht et al. Oct 2008 B2
7442914 Eliasson et al. Oct 2008 B2
7464110 Pyhalammi et al. Dec 2008 B2
7465914 Eliasson et al. Dec 2008 B2
7469381 Ording Dec 2008 B2
7474816 Payne Jan 2009 B2
7477241 Lieberman et al. Jan 2009 B2
7479949 Jobs et al. Jan 2009 B2
7520050 Graham Apr 2009 B2
7633300 Keroe et al. Dec 2009 B2
7663607 Hotelling et al. Feb 2010 B2
7705835 Eikman Apr 2010 B2
7800594 Nakamura et al. Sep 2010 B2
7812828 Westerman et al. Oct 2010 B2
7818691 Irvine Oct 2010 B2
7855716 McCreary et al. Dec 2010 B2
7880724 Nguyen et al. Feb 2011 B2
7880732 Goertz Feb 2011 B2
7957615 Juni et al. Jun 2011 B2
8022941 Smoot Sep 2011 B2
8023780 Juni Sep 2011 B2
8068101 Goertz Nov 2011 B2
8095879 Goertz Jan 2012 B2
8120595 Kukulj et al. Feb 2012 B2
8120625 Hinckley Feb 2012 B2
8130210 Saxena et al. Mar 2012 B2
8139045 Jang et al. Mar 2012 B2
8193498 Cavallucci et al. Jun 2012 B2
8243047 Chiang et al. Aug 2012 B2
8243048 Kent et al. Aug 2012 B2
8269740 Sohn et al. Sep 2012 B2
8289299 Newton Oct 2012 B2
8350831 Drumm Jan 2013 B2
8426799 Drumm Apr 2013 B2
8471830 Goertz Jun 2013 B2
8482547 Christiansson et al. Jul 2013 B2
8508505 Shin et al. Aug 2013 B2
8520323 Machida Aug 2013 B2
8531435 Drumm Sep 2013 B2
8581884 Fahraeus et al. Nov 2013 B2
8587562 Goertz et al. Nov 2013 B2
8674963 Cornish et al. Mar 2014 B2
8674966 Jansson et al. Mar 2014 B2
8933876 Galor et al. Jan 2015 B2
9292132 An et al. Mar 2016 B2
20010002694 Nakazawa et al. Jun 2001 A1
20010022579 Hirabayashi Sep 2001 A1
20010026268 Ito Oct 2001 A1
20010028344 Iwamoto et al. Oct 2001 A1
20010030641 Suzuki Oct 2001 A1
20010043189 Brisebois et al. Nov 2001 A1
20010055006 Sano et al. Dec 2001 A1
20020002326 Causey, III et al. Jan 2002 A1
20020018824 Buazza et al. Feb 2002 A1
20020027549 Hirshberg Mar 2002 A1
20020033805 Fujioka Mar 2002 A1
20020046353 Kishimoto Apr 2002 A1
20020060699 D'Agostini May 2002 A1
20020067346 Mouton Jun 2002 A1
20020067348 Masters et al. Jun 2002 A1
20020075243 Newton Jun 2002 A1
20020075244 Tani et al. Jun 2002 A1
20020108693 Veligdan Aug 2002 A1
20020109843 Ehsani et al. Aug 2002 A1
20020118177 Newton Aug 2002 A1
20020171691 Currans et al. Nov 2002 A1
20020173300 Shtivelman et al. Nov 2002 A1
20020175900 Armstrong Nov 2002 A1
20020191029 Gillespie et al. Dec 2002 A1
20030002809 Jian Jan 2003 A1
20030010043 Ferragut Jan 2003 A1
20030013483 Ausems et al. Jan 2003 A1
20030030656 Ang et al. Feb 2003 A1
20030043207 Duarte Mar 2003 A1
20030076306 Zadesky et al. Apr 2003 A1
20030095102 Kraft et al. May 2003 A1
20030098803 Gourgey et al. May 2003 A1
20030122787 Zimmerman et al. Jul 2003 A1
20030231308 Granger Dec 2003 A1
20030234346 Kao Dec 2003 A1
20040021643 Hoshino et al. Feb 2004 A1
20040021681 Liao Feb 2004 A1
20040046960 Wagner et al. Mar 2004 A1
20040100510 Milic-Frayling et al. May 2004 A1
20040109013 Goertz Jun 2004 A1
20040125143 Deaton et al. Jul 2004 A1
20040140961 Cok Jul 2004 A1
20040201579 Graham Oct 2004 A1
20040263482 Goertz Dec 2004 A1
20050035956 Sinclair et al. Feb 2005 A1
20050046621 Kaikuranta Mar 2005 A1
20050073508 Pittel et al. Apr 2005 A1
20050091612 Stabb et al. Apr 2005 A1
20050104860 McCreary et al. May 2005 A1
20050122308 Bell et al. Jun 2005 A1
20050133702 Meyer Jun 2005 A1
20050165615 Minar Jul 2005 A1
20050174473 Morgan et al. Aug 2005 A1
20050190162 Newton Sep 2005 A1
20050253818 Nettamo Nov 2005 A1
20050271319 Graham Dec 2005 A1
20060001654 Smits Jan 2006 A1
20060018586 Kishida Jan 2006 A1
20060132454 Chen et al. Jun 2006 A1
20060161870 Hotelling et al. Jul 2006 A1
20060161871 Hotelling et al. Jul 2006 A1
20060229509 Al-Ali et al. Oct 2006 A1
20070024598 Miller et al. Feb 2007 A1
20070052693 Watari Mar 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070084989 Lange et al. Apr 2007 A1
20070103436 Kong May 2007 A1
20070146318 Juh et al. Jun 2007 A1
20070150842 Chaudhri et al. Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070164201 Liess et al. Jul 2007 A1
20070165008 Crockett Jul 2007 A1
20070176908 Lipman et al. Aug 2007 A1
20080008472 Dress et al. Jan 2008 A1
20080012850 Keating, III Jan 2008 A1
20080013913 Lieberman et al. Jan 2008 A1
20080049341 Epstein et al. Feb 2008 A1
20080055273 Forstall Mar 2008 A1
20080056068 Yeh et al. Mar 2008 A1
20080068353 Lieberman et al. Mar 2008 A1
20080074402 Cornish et al. Mar 2008 A1
20080080811 Deane Apr 2008 A1
20080086703 Flynt et al. Apr 2008 A1
20080093542 Lieberman et al. Apr 2008 A1
20080100593 Skillman et al. May 2008 A1
20080111797 Lee May 2008 A1
20080117176 Ko et al. May 2008 A1
20080117183 Yu et al. May 2008 A1
20080121442 Boer et al. May 2008 A1
20080122792 Izadi et al. May 2008 A1
20080122796 Jobs et al. May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080136790 Hio Jun 2008 A1
20080158174 Land et al. Jul 2008 A1
20080165190 Min et al. Jul 2008 A1
20080168404 Ording Jul 2008 A1
20080192025 Jaeger et al. Aug 2008 A1
20080221711 Trainer Sep 2008 A1
20080259053 Newton Oct 2008 A1
20080266266 Kent et al. Oct 2008 A1
20080273019 Deane Nov 2008 A1
20080278460 Arnett et al. Nov 2008 A1
20080297409 Klassen et al. Dec 2008 A1
20080297487 Hotelling et al. Dec 2008 A1
20090006418 O'Malley Jan 2009 A1
20090009944 Yukawa et al. Jan 2009 A1
20090027357 Morrison Jan 2009 A1
20090031208 Robinson Jan 2009 A1
20090058833 Newton Mar 2009 A1
20090064055 Chaudhri et al. Mar 2009 A1
20090066673 Molne et al. Mar 2009 A1
20090096994 Smits Apr 2009 A1
20090102815 Juni Apr 2009 A1
20090135162 Van De Wijdeven et al. May 2009 A1
20090139778 Butler et al. Jun 2009 A1
20090153519 Suarez Rovere Jun 2009 A1
20090167724 Xuan et al. Jul 2009 A1
20090187840 Moosavi Jul 2009 A1
20090189878 Goertz et al. Jul 2009 A1
20090192849 Hughes et al. Jul 2009 A1
20090285383 Tsuei Nov 2009 A1
20090322699 Hansson Dec 2009 A1
20090322701 Dsouza et al. Dec 2009 A1
20100002291 Fukuyama Jan 2010 A1
20100017872 Goertz et al. Jan 2010 A1
20100023895 Benko et al. Jan 2010 A1
20100079407 Suggs Apr 2010 A1
20100079409 Sirotich et al. Apr 2010 A1
20100079412 Chiang et al. Apr 2010 A1
20100095234 Lane Apr 2010 A1
20100133424 Lindsay Jun 2010 A1
20100149073 Chaum et al. Jun 2010 A1
20100193259 Wassvik Aug 2010 A1
20100238138 Goertz et al. Sep 2010 A1
20100238139 Goertz et al. Sep 2010 A1
20100289755 Zhu et al. Nov 2010 A1
20100295821 Chang et al. Nov 2010 A1
20100302185 Han et al. Dec 2010 A1
20100321344 Yen et al. Dec 2010 A1
20110007032 Goertz Jan 2011 A1
20110043485 Goertz Feb 2011 A1
20110043826 Kiyose Feb 2011 A1
20110044579 Travis et al. Feb 2011 A1
20110050639 Challener et al. Mar 2011 A1
20110050650 McGibney et al. Mar 2011 A1
20110057906 Raynor et al. Mar 2011 A1
20110063214 Knapp Mar 2011 A1
20110074734 Wassvik et al. Mar 2011 A1
20110074735 Wassvik et al. Mar 2011 A1
20110074736 Takakura Mar 2011 A1
20110075418 Mallory et al. Mar 2011 A1
20110090176 Christiansson et al. Apr 2011 A1
20110115748 Xu May 2011 A1
20110116104 Kao et al. May 2011 A1
20110128554 Nakanishi Jun 2011 A1
20110134064 Goertz Jun 2011 A1
20110148820 Song Jun 2011 A1
20110157097 Hamada et al. Jun 2011 A1
20110163956 Zdralek Jul 2011 A1
20110163996 Wassvik et al. Jul 2011 A1
20110163998 Goertz et al. Jul 2011 A1
20110167628 Goertz et al. Jul 2011 A1
20110169780 Goertz et al. Jul 2011 A1
20110169781 Goertz et al. Jul 2011 A1
20110169782 Goertz et al. Jul 2011 A1
20110175533 Holman et al. Jul 2011 A1
20110175852 Goertz et al. Jul 2011 A1
20110179368 King et al. Jul 2011 A1
20110181552 Goertz et al. Jul 2011 A1
20110205175 Chen Aug 2011 A1
20110205186 Newton et al. Aug 2011 A1
20110210946 Goertz et al. Sep 2011 A1
20110216040 Yen et al. Sep 2011 A1
20110221706 McGibney et al. Sep 2011 A1
20110227487 Nichol et al. Sep 2011 A1
20110227874 Fahraeus et al. Sep 2011 A1
20110242056 Lee et al. Oct 2011 A1
20110248151 Holcombe et al. Oct 2011 A1
20110261020 Song et al. Oct 2011 A1
20120038584 Liu Feb 2012 A1
20120045170 Shibata et al. Feb 2012 A1
20120050226 Kato Mar 2012 A1
20120056821 Goh Mar 2012 A1
20120068971 Pemberton-Pigott Mar 2012 A1
20120068973 Christiansson et al. Mar 2012 A1
20120086672 Tseng et al. Apr 2012 A1
20120094723 Goertz Apr 2012 A1
20120098753 Lu Apr 2012 A1
20120098794 Kleinert et al. Apr 2012 A1
20120176343 Holmgren et al. Jul 2012 A1
20120188203 Yao et al. Jul 2012 A1
20120188206 Sparf et al. Jul 2012 A1
20120200536 Cornish et al. Aug 2012 A1
20120212456 Smits Aug 2012 A1
20120212457 Drumm Aug 2012 A1
20120212458 Drumm Aug 2012 A1
20120218226 Wang et al. Aug 2012 A1
20120218229 Drumm Aug 2012 A1
20120223916 Kukulj Sep 2012 A1
20120280942 Kent et al. Nov 2012 A1
20120306793 Liu et al. Dec 2012 A1
20120327039 Kukulj Dec 2012 A1
20130021302 Drumm Jan 2013 A1
20130027352 Holloway et al. Jan 2013 A1
20130044071 Hu et al. Feb 2013 A1
20130127788 Drumm May 2013 A1
20130135259 King et al. May 2013 A1
20130141395 Holmgren Jun 2013 A1
20130215034 Oh et al. Aug 2013 A1
20140320459 Pettersson et al. Oct 2014 A1
20180292053 Minor et al. Oct 2018 A1
20180296915 Holmgren et al. Oct 2018 A1
20190011567 Pacala et al. Jan 2019 A1
Foreign Referenced Citations (35)
Number Date Country
1924620 Mar 2007 CN
101387930 Mar 2009 CN
102053762 May 2011 CN
201965588 Sep 2011 CN
0330767 Sep 1989 EP
0513694 Nov 1992 EP
0601651 Jun 1994 EP
0618528 Oct 1994 EP
0703525 Mar 1996 EP
0884691 Dec 1998 EP
1059603 Dec 2000 EP
1906632 Apr 2008 EP
03-216719 Sep 1991 JP
04-322204 Nov 1992 JP
05-173699 Jul 1993 JP
06-39621 May 1994 JP
10-269012 Oct 1998 JP
11-232024 Aug 1999 JP
2014513375 May 2014 JP
8600446 Jan 1986 WO
8600447 Jan 1986 WO
0102949 Jan 2001 WO
0140922 Jun 2001 WO
02095668 Nov 2002 WO
03038592 May 2003 WO
03083767 Oct 2003 WO
2005026938 Mar 2005 WO
2008004103 Jan 2008 WO
2008133941 Nov 2008 WO
2008147266 Dec 2008 WO
2009008786 Jan 2009 WO
2010015408 Feb 2010 WO
2010093570 Aug 2010 WO
2010121031 Oct 2010 WO
2010134865 Nov 2010 WO
Non-Patent Literature Citations (27)
Hodges, S., Izadi, S., Butler, A., Rrustemi A., Buxton, B., “ThinSight: Versatile Multitouch Sensing for Thin Form-Factor Displays.” UIST'07, Oct. 7-10, 2007. <http://www.hci.iastate.edu/REU09/pub/main/telerobotics_team_papers/thinsight_versatile_multitouch_sensing_for_thin_formfactor_displays.pdf>.
Moeller, J. et al., ZeroTouch: An Optical Multi-Touch and Free-Air Interaction Architecture, Proc. CHI 2012 Proceedings of the 2012 annual conference extended abstracts on Human factors in computing systems, May 5, 2012, pp. 2165-2174. ACM New York, NY, USA.
Moeller, J. et al., ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field, CHI EA '11 Proceedings of the 2011 annual conference extended abstracts on Human factors in computing systems, May 2011, pp. 1165-1170. ACM New York, NY, USA.
Moeller, J. et al., IntangibleCanvas: Free-Air Finger Painting on a Projected Canvas, CHI EA '11 Proceedings of the 2011 annual conference extended abstracts on Human factors in computing systems, May 2011, pp. 1615-1620. ACM New York, NY, USA.
Moeller, J. et al., Scanning FTIR: Unobtrusive Optoelectronic Multi-Touch Sensing through Waveguide Transmissivity Imaging, TEI '10 Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction, Jan. 2010, pp. 73-76. ACM New York, NY, USA.
Van Loenen, Evert, et al., Entertaible: A Solution for Social Gaming Experiences, Tangible Play Workshop, Jan. 28, 2007, pp. 16-19, Tangible Play Research and Design for Tangible and Tabletop Games, Workshop at the 2007 Intelligent User Interfaces Conference, Workshop Proceedings.
Goertz, Magnus, “Optical Touch Screen with Wide Beam Transmitters and Receivers”, Provisional U.S. Appl. No. 61/317,255, filed Mar. 24, 2010.
Johnson, M., “Enhanced Optical Touch Input Panel”, IBM Technical Disclosure Bulletin vol. 28, No. 4, Sep. 1985 pp. 1760-1762.
John Wallace, "LED Optics: Efficient LED collimators have simple design", available at http://www.laserfocusworld.com/articles/print/volume-48/issue-06/world-news/efficient-led-collimators-have-simple-design.html [Jun. 1, 2014 11:00:04 AM].
Non-final Office action received for U.S. Appl. No. 13/053,229 mailed on Nov. 27, 2013, 11 pages.
Final Office action received for U.S. Appl. No. 13/053,229 mailed on Aug. 15, 2014, 15 pages.
Search Report and Written Opinion for PCT application No. PCT/US2011/029191 mailed on Jul. 26, 2011, 38 pages.
International Preliminary Report on Patentability for PCT application No. PCT/US2011/029191 mailed on Sep. 25, 2012, 17 pages.
Notice of Allowance for U.S. Appl. No. 13/311,366 mailed on Jan. 12, 2015, 10 pages.
International Preliminary Report on Patentability for PCT application No. PCT/US2014/040579 mailed on Dec. 8, 2015, 8 pages.
Search Report and Written Opinion for PCT application No. PCT/US2014/040579 mailed on Dec. 5, 2014, 11 pages.
First Office action for Korean patent application No. 10-2015-7036757 mailed on Nov. 2, 2016, 4 pages.
Non-final Office action received for U.S. Appl. No. 14/588,462 mailed on Apr. 8, 2015, 11 pages.
Notice of Allowance for U.S. Appl. No. 14/588,462 mailed on Aug. 14, 2015, 8 pages.
Non-final Office action received for U.S. Appl. No. 14/730,765 mailed on Sep. 22, 2016, 10 pages.
Non-final Office action received for U.S. Appl. No. 14/730,765 mailed on May 17, 2017, 12 pages.
Non-final Office action received for U.S. Appl. No. 14/730,880 mailed on Dec. 14, 2015, 8 pages.
Final Office action received for U.S. Appl. No. 14/730,880 mailed on Sep. 22, 2016, 9 pages.
Non-final Office action received for U.S. Appl. No. 14/730,880 mailed on Mar. 9, 2017, 14 pages.
Notice of Allowance for U.S. Appl. No. 14/960,369 mailed on Jan. 3, 2017, 9 pages.
Sumriddetchkajorn, Sarun & Amarit, Ratthasart. (2005). “Optical Touch Sensor Technology”, 2005 IEEE LEOS Annual Meeting Conference Proceedings, pp. 824-825.
International Patent Application No. PCT/US2021/052370. Search Report and Written Opinion, Dec. 23, 2021, 7 pages.
Related Publications (1)
Number Date Country
20230325035 A1 Oct 2023 US
Provisional Applications (1)
Number Date Country
63085838 Sep 2020 US
Continuations (1)
Number Date Country
Parent 17487195 Sep 2021 US
Child 18327419 US