Pen differentiation for touch displays

Information

  • Patent Grant
  • Patent Number: 11,016,605
  • Date Filed: Wednesday, October 16, 2019
  • Date Issued: Tuesday, May 25, 2021
Abstract
An optical IR touch sensing apparatus can determine, based on output signals of light detectors, a light energy value for each light path across a touch surface, and generate a transmission value for each light path based on the light energy value. A processor can operate an image reconstruction algorithm on at least part of the thus-generated transmission values and determine a position of a touching object on the touch surface, an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface, and an occlusion compensation value for compensating for the occlusion effect of other objects on the touch surface. Using these values, the processor can identify the type of object.
Description
INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.


BACKGROUND OF THE INVENTION
Field of the Invention

The present disclosure relates to techniques for detecting and identifying objects on a touch surface.


Description of the Related Art

To an increasing extent, touch-sensitive panels are being used for providing input data to computers, electronic measurement and test equipment, gaming devices, etc. The panel may be provided with a graphical user interface (GUI) for a user to interact with using e.g. a pointer, stylus or one or more fingers. The GUI may be fixed or dynamic. A fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel. A dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.


There are numerous known techniques for providing touch sensitivity to the panel, e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel.


In one category of touch-sensitive panels known as ‘above surface optical touch systems’ and known from e.g. U.S. Pat. No. 4,459,476, a plurality of optical emitters and optical receivers are arranged around the periphery of a touch surface to create a grid of intersecting light paths above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.


For most touch systems, a user may place a finger onto the surface of a touch panel in order to register a touch. Alternatively, a stylus may be used. A stylus is typically a pen shaped object with one end configured to be pressed against the surface of the touch panel. An example of a stylus according to the prior art is shown in FIG. 2. Use of a stylus 60 may provide improved selection accuracy and pointer precision over a simple finger touch. This can be due to the engineered stylus tip 160 providing a smaller and/or more regular contact surface with the touch panel than is possible with a human finger. Also, muscular control of an entire hand in a pen holding position can be more precise than a single finger for the purposes of pointer control due to lifelong training in the use of pens and pencils.


PCT/SE2016/051229 describes an optical IR touch sensing apparatus configured to determine a position of a touching object on the touch surface and an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface. Using these values, the apparatus can differentiate between different types of objects, including multiple stylus tips, fingers, and palms. The differentiation between the object types may be determined by a function that takes into account how the attenuation of a touching object varies across the touch surface, compensating for e.g. light field height, detection line density, detection line angular density, etc. However, identifying an object in this way becomes more difficult when other touching objects are close by, since they occlude many of the detection lines (also known as scan lines) passing through both the occluding objects and the object to be identified.


Therefore, what is needed is a way of improving the identification of objects touching an optical touch system that mitigates the above problem.


SUMMARY OF THE INVENTION

It is an objective of the disclosure to at least partly overcome one or more of the above-identified limitations of the prior art.


One or more of these objectives, as well as further objectives that may appear from the description below, are at least partly achieved by means of a method for data processing, a computer readable medium, devices for data processing, and a touch-sensing apparatus according to the independent claims, embodiments thereof being defined by the dependent claims.


An embodiment provides a touch sensing apparatus, comprising: a touch surface; a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation of the light; a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; and a processing element configured to: determine, based on output signals of the light detectors, a light energy value for each light path; generate a transmission value for each light path based on the light energy value; operate an image reconstruction algorithm on at least part of the thus-generated transmission values so as to determine, for each object: a position of the object on the touch surface, an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface, and an occlusion compensation value indicative of the occlusion, by other objects on the touch surface, of light paths intersecting with the object; and determine an object type of the object in dependence on the attenuation value and the occlusion compensation value.


Another embodiment provides a method of determining a type of object in contact with a touch surface of a touch sensing apparatus, said touch sensing apparatus including: a touch surface; a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation of the light; and a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; said method comprising the steps of: determining, based on output signals of the light detectors, a light energy value for each light path; generating a transmission value for each light path based on the light energy value; operating an image reconstruction algorithm on at least part of the thus-generated transmission values so as to determine, for each object: a position of the object on the touch surface, an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface, and an occlusion compensation value indicative of the occlusion, by other objects on the touch surface, of light paths intersecting with the object; and determining an object type of the object in dependence on the attenuation value and the occlusion compensation value.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention will now be described in more detail with reference to the accompanying schematic drawings.



FIG. 1 is a top plan view of an optical touch apparatus.



FIG. 2 shows a cross-section of an IR optical touch apparatus according to the prior art.



FIG. 3 shows a cross-section of another IR optical touch apparatus.



FIG. 4 shows a light field of an IR optical touch apparatus.



FIG. 5a is a flow chart showing a touch determination process.



FIG. 5b is a flow chart showing a touch determination process using position compensation.



FIG. 5c is a flow chart showing a touch determination process using speed compensation.



FIG. 6 is a histogram showing measured attenuation of light beams from eight unique objects applied to the touch surface.



FIG. 7 is a histogram showing measured attenuation of light beams from two objects applied to the touch surface, a stylus and a finger.



FIG. 8 is a histogram showing measured attenuation of light beams from three objects applied to the touch surface, a first stylus, a second stylus, and a finger.



FIG. 9 is a flow chart showing the process for determining an occlusion compensation value for a touching object.



FIG. 10a is a diagram showing an example of occluding objects on a touch surface.



FIG. 10b shows the occluded angle ranges of a touching object.



FIG. 11 shows an object entering a light field.



FIG. 12 shows an attenuation value of an object during a ‘touch down’ and a ‘touch up’ event.



FIG. 13 is a graph showing measured attenuation of light beams by an object in proportion to the object's distance from a corner of the touch surface.



FIG. 14 is an attenuation map showing the relative attenuation of an object at each location on a corner portion of the touch surface.



FIG. 15 is a graph showing measured attenuation of light beams by an object in proportion to the speed at which the object is moving across the touch surface.





DETAILED DESCRIPTION OF THE EMBODIMENT

The present disclosure relates to optical touch panels and the use of techniques for providing touch sensitivity to a display apparatus. Throughout the description the same reference numerals are used to identify corresponding elements.


In addition to their ordinary meanings, the following terms can also mean:


A “touch object” or “touching object” may be a physical object that touches, or is brought in sufficient proximity to, a touch surface so as to be detected by one or more sensors in the touch system. The physical object may be animate or inanimate.


An “interaction” can occur when the touch object affects a parameter measured by the sensor.


A “touch” can denote a point of interaction as seen in the interaction pattern.


A “light field” can be the light flowing between an emitter and a corresponding detector. Although an emitter may generate a large amount of light in many directions, only the light measured by a detector from an emitter defines the light field for the emitter and detector.



FIG. 1 is a top plan view of an optical touch apparatus which may correspond to the IR optical touch apparatus of FIG. 2. Emitters 30a are distributed around the periphery of touch surface 20, to project light across the touch surface 20 of touch panel 10. Detectors 30b are distributed around the periphery of touch surface 20, to receive part of the propagating light. The light from each of emitters 30a will thereby propagate to a number of different detectors 30b on a plurality of light paths 50.



FIG. 2 shows a cross-section of an IR optical touch apparatus according to the prior art. In the example apparatus shown in FIG. 2, object 60 will attenuate light propagating along at least one light path 50. In the example shown in FIG. 2, object 60 may even fully occlude the light on at least one light path 50.


Light paths 50 may conceptually be represented as “detection lines” that extend across the touch surface 20 to the periphery of touch surface 20 between pairs of emitters 30a and detectors 30b, as shown in FIG. 1. Thus, the detection lines 50 correspond to a projection of the light paths 50 onto the touch surface 20. Thereby, the emitters 30a and detectors 30b collectively define a grid of detection lines 50 (“detection grid”) on the touch surface 20, as seen in a top plan view. The spacing of intersections in the detection grid defines the spatial resolution of the touch-sensitive apparatus 100, i.e. the smallest object that can be detected on the touch surface 20. The width of a detection line is a function of the width of the corresponding emitter and detector. A wide detector detecting light from a wide emitter provides a wide detection line with broader surface coverage, minimising the spaces between detection lines, which provide no touch coverage. Disadvantages of broad detection lines may be reduced touch precision and a lower signal-to-noise ratio.
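By way of illustration only, the detection grid can be modelled in software as the set of emitter/detector pairs projected onto the touch surface. The following Python sketch uses a hypothetical layout; the dimensions, spacing, and names are assumptions, not the patent's geometry.

    import itertools
    import math

    # Hypothetical layout: emitters along the left edge, detectors along the
    # right edge of a 300 x 200 mm touch surface, spaced every 20 mm.
    WIDTH, HEIGHT = 300.0, 200.0
    emitters = [(0.0, float(y)) for y in range(0, 201, 20)]
    detectors = [(WIDTH, float(y)) for y in range(0, 201, 20)]

    # Each emitter/detector pair defines one detection line across the surface.
    detection_lines = list(itertools.product(emitters, detectors))

    def line_length(line):
        (x0, y0), (x1, y1) = line
        return math.hypot(x1 - x0, y1 - y0)

    print(len(detection_lines), "detection lines")            # 121
    print("longest: %.1f mm" % max(map(line_length, detection_lines)))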


As used herein, the emitters 30a may be any type of device capable of emitting radiation in a desired wavelength range, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc. The emitters 30a may also be formed by the end of an optical fibre. The emitters 30a may generate light in any wavelength range. The following examples presume that the light is generated in the infrared (IR), i.e. at wavelengths above about 750 nm. Analogously, the detectors 30b may be any device capable of converting light (in the same wavelength range) into an electrical signal, such as a photo-detector, a CCD device, a CMOS device, etc.


The detectors 30b collectively provide an output signal, which is received and sampled by a signal processor 140. The output signal contains a number of sub-signals, also denoted “transmission values”, each representing the energy of light received by one of light detectors 30b from one of light emitters 30a. Depending on implementation, the signal processor 140 may need to process the output signal for separation of the individual transmission values. The transmission values represent the received energy, intensity or power of light received by the detectors 30b on the individual detection lines 50. Whenever an object touches a detection line 50, the received energy on this detection line is decreased or “attenuated”. Where an object blocks the entire width of the detection line of an above-surface system, the detection line will be fully attenuated or occluded.
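In code, a transmission value per detection line can be obtained by normalising the sampled energy against a reference level recorded while the surface is untouched. A minimal Python sketch, assuming such a calibration reference is available (all names here are illustrative):

    def transmission_values(energies, reference):
        """Normalise received light energy per detection line: 1.0 means
        unattenuated, values near 0.0 mean the line is fully occluded."""
        return [e / r if r > 0.0 else 0.0 for e, r in zip(energies, reference)]

    reference = [100.0, 100.0, 100.0]    # untouched-surface calibration frame
    frame = [99.5, 40.0, 100.2]          # second line attenuated by a touch
    print(transmission_values(frame, reference))    # [0.995, 0.4, 1.002]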


In an embodiment, the touch apparatus is arranged according to FIG. 2. Light emitted by emitters 30a is transmitted through transmissive panel 10 in a manner that does not cause the light to undergo total internal reflection (TIR) within transmissive panel 10. Instead, the light exits transmissive panel 10 through touch surface 20 and is reflected by reflector surface 80 of edge reflector 70 to travel along a path 50 in a plane parallel with touch surface 20. The light will then continue until deflected by reflector surface 80 of the edge reflector 70 at an opposing edge of the transmissive panel 10, wherein the light will be deflected back down through transmissive panel 10 and onto detectors 30b. An object 60 (optionally having object tip 160) touching surface 20 will occlude light paths 50 that intersect with the location of the object on the surface, resulting in an attenuated light signal received at detector 30b. In an alternative embodiment shown in FIG. 3, emitters 30a and detectors 30b are arranged beyond the periphery of the panel and light is provided to the surface of panel 10 from beyond the edges of panel 10.



FIG. 4 shows the manner in which light travelling from emitters 30a to detectors 30b forms a light field 90 between reflector surfaces 80. In an embodiment, the top edge of reflector surface 80 is 2 mm above touch surface 20. This results in a light field 90 which is 2 mm deep. A 2 mm deep field is advantageous for this embodiment as it minimizes the distance that the object needs to travel into the light field to reach the touch surface and to maximally attenuate the light. The smaller the distance, the shorter the time between the object entering the light field and contacting the surface. This is particularly advantageous for differentiating between large objects entering the light field slowly and small objects entering the light field quickly. A large object entering the light field will initially cause a similar attenuation to a smaller object fully extended into the light field. The shorter the distance the objects have to travel, the fewer frames are required before a representative attenuation signal for each object can be observed. This effect is particularly apparent when the light field is between 0.5 mm and 2 mm deep.


Unless otherwise stated, the embodiments described in the specification apply to the arrangement shown in FIG. 2. However, some of these embodiments may also be applied to the arrangement shown in FIG. 3.


The signal processor 140 may be configured to process the transmission values so as to determine a property of the touching objects, such as a position (e.g. in an x,y coordinate system), a shape, or an area. This determination may involve a straightforward triangulation based on the attenuated detection lines, e.g. as disclosed in U.S. Pat. No. 7,432,893 and WO2010/015408, or a more advanced processing to recreate a distribution of attenuation values (for simplicity, referred to as an “attenuation pattern”) across the touch surface 20, where each attenuation value represents a local degree of light attenuation. The attenuation pattern may be further processed by the signal processor 140 or by a separate device (not shown) for determination of a position, shape or area of touching objects. The attenuation pattern may be generated e.g. by any available algorithm for image reconstruction based on transmission values, including tomographic reconstruction methods such as Filtered Back Projection, FFT-based algorithms, ART (Algebraic Reconstruction Technique), SART (Simultaneous Algebraic Reconstruction Technique), etc. Alternatively, the attenuation pattern may be generated by adapting one or more basis functions and/or by statistical methods such as Bayesian inversion. Examples of such reconstruction functions designed for use in touch determination are found in WO2009/077962, WO2011/049511, WO2011/139213, WO2012/050510, and WO2013/062471, all of which are incorporated herein by reference.
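As one illustrative member of the reconstruction families listed above, the following sketch performs a naive unfiltered back projection: the attenuation of each detection line (taken here as the negative logarithm of its transmission value) is smeared over the grid pixels the line crosses. This is a toy example, not the algorithm of the claimed apparatus; the grid shape, normalised line coordinates, and sampling density are assumptions.

    import numpy as np

    def back_project(lines, transmissions, shape=(64, 96), steps=200):
        """Naive unfiltered back projection. `lines` holds endpoint pairs
        ((x0, y0), (x1, y1)) in normalised surface coordinates in [0, 1]."""
        field = np.zeros(shape)
        hits = np.zeros(shape)
        rows, cols = shape
        for ((x0, y0), (x1, y1)), t in zip(lines, transmissions):
            attenuation = -np.log(max(t, 1e-6))      # line attenuation from T
            for s in np.linspace(0.0, 1.0, steps):   # sample along the line
                col = int(round((x0 + s * (x1 - x0)) * (cols - 1)))
                row = int(round((y0 + s * (y1 - y0)) * (rows - 1)))
                field[row, col] += attenuation
                hits[row, col] += 1
        return field / np.maximum(hits, 1)           # average per pixel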


For brevity, the term ‘signal processor’ is used throughout to describe one or more processing components for performing the various stages of processing required between receiving the signal from the detectors and outputting a determination of touch, including touch co-ordinates, touch properties, etc. Although the processing stages of the present disclosure may be carried out on a single processing unit (with a corresponding memory unit), the disclosure is also intended to cover multiple processing units and even remotely located processing units. In an embodiment, the signal processor 140 can include one or more hardware processors 130 and a memory 120. The hardware processors can include, for example, one or more computer processing units, and can also include microcontrollers and/or application specific circuitry such as ASICs and FPGAs. The flowcharts and functions discussed herein can be implemented as programming instructions stored, for example, in the memory 120 or a memory of the one or more hardware processors. The programming instructions can be implemented in machine code, C, C++, JAVA, or any other suitable programming languages. The signal processor 140 can execute the programming instructions and accordingly execute the flowcharts and functions discussed herein.



FIG. 5a shows a flow diagram according to an embodiment.


In step 510 of FIG. 5a, the signal processor 140 receives and samples output signals from detectors 30b.


In step 520, the output signals are processed for determination of the transmission values (or ‘transmission signals’). As described above, the transmission values represent the received energy, intensity or power of light received by the detectors 30b on the individual detection lines 50.


In step 530, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface. In an embodiment, the signal processor 140 is configured to process the transmission values to generate a two-dimensional estimation of the attenuation field across the touch surface, i.e. a spatial distribution of attenuation values, in which each touching object typically appears as a region of changed attenuation. From the attenuation field, two-dimensional touch data may be extracted and one or more touch locations may be identified. The transmission values may be processed according to a tomographic reconstruction algorithm to generate the two-dimensional estimation of the attenuation field.


In one embodiment, the signal processor 140 may be configured to generate an attenuation field for the entire touch surface. In an alternative embodiment, the signal processor 140 may be configured to generate an attenuation field for a sub-section of the touch surface, the sub-section being selected according to one or more criteria determined during processing of the transmission values.


In step 540, the signal processor 140 determines properties of the object at each touch location, including an attenuation value corresponding to the attenuation of the beams of light passing through the touch location resulting from the object touching the touch surface.


In one embodiment, the attenuation value is determined in the following manner: First, the attenuation pattern is processed for detection of peaks, e.g. using any known technique. In one embodiment, a global or local threshold is first applied to the attenuation pattern, to suppress noise. Any areas with attenuation values that fall above the threshold may be further processed to find local maxima. The identified maxima may be further processed for determination of a touch shape and a center position, e.g. by fitting a two-dimensional second-order polynomial or a Gaussian bell shape to the attenuation values, or by finding the ellipse of inertia of the attenuation values. There are also numerous other techniques as is well known in the art, such as clustering algorithms, edge detection algorithms, standard blob detection, watershed techniques, flood fill techniques, etc. Step 540 results in a collection of peak data, which may include values of position, attenuation, size, and shape for each detected peak. The attenuation value may be calculated from a maximum attenuation value or a sum of attenuation values within the peak shape.
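A sketch of this peak-extraction step using thresholding plus connected-component labelling, i.e. one of the standard blob-detection style approaches mentioned above (the threshold value is an assumption):

    import numpy as np
    from scipy import ndimage

    def extract_peaks(attenuation_pattern, noise_threshold=1.0e-3):
        """Suppress noise, then summarise each remaining blob as peak data:
        centre position, maximum and summed attenuation, and area."""
        field = np.where(attenuation_pattern > noise_threshold,
                         attenuation_pattern, 0.0)
        labels, count = ndimage.label(field > 0.0)
        peaks = []
        for i in range(1, count + 1):
            mask = labels == i
            region = np.where(mask, field, 0.0)
            cy, cx = ndimage.center_of_mass(region)   # weighted centroid
            peaks.append({"position": (cx, cy),
                          "max_attenuation": float(region.max()),
                          "sum_attenuation": float(region.sum()),
                          "area": int(mask.sum())})
        return peaks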


The attenuation value recorded for an object may vary due to noise, object angle, object material, or a number of other reasons. FIG. 6 is a histogram showing a count of attenuation values for each of eight unique objects applied to the touch surface. Each object demonstrates a roughly bell-shaped distribution of frequency of recorded attenuation values. It is clear from FIG. 6 that it is possible to differentiate between different objects using recorded attenuation values, particularly where multiple attenuation values for each object are recorded.


Certain objects may provide a wider distribution of attenuation values than others. FIG. 7 is a histogram showing measured attenuation of light beams from two objects applied to the touch surface, a stylus and a finger. Bell-shaped distribution of values 710 represents attenuation values for a specially designed stylus tip applied to the touch surface. Distribution 720 represents attenuation values for a population of different fingers applied to the touch surface. As people have different sized fingers and some fingers may be more oily than others, the range of possible attenuation values from objects in distribution 720 is much wider than the possible attenuation values for a specially designed stylus tip. Zone 730 represents attenuation values which are too small for the system to reliably record. In a typical example, zone 730 covers attenuation values smaller than 1.8×10⁻³. (Note: all attenuation values described in the present specification have units of mm⁻¹, but it is understood that attenuation may be measured in a number of different ways.) Depending on the touch resolution of the system, this may translate to objects smaller than 0.5 mm. Therefore, an embodiment of the disclosure includes a stylus tip configured to provide attenuation values in a range greater than values in zone 730 but smaller than the range of attenuation values occupied by distribution 720, e.g. 1.8×10⁻³ < stylus tip attenuation < 2.0×10⁻².



FIG. 8 is a histogram showing measured attenuation of light beams from three objects applied to the touch surface, a first stylus, a second stylus, and a finger. As in FIG. 7, bell-shaped distribution of values 810 represents attenuation values for a first specially designed stylus tip applied to the touch surface. Distribution 830 represents attenuation values for a finger applied to the touch surface. Zone 840 represents attenuation values which are too small for the system to reliably record. Bell-shaped distribution of values 820 represents attenuation values for a second specially designed stylus tip applied to the touch surface. Therefore, another embodiment of the disclosure includes a first stylus tip configured to provide attenuation values in a range greater than values in zone 840 but smaller than the range of attenuation values 820 occupied by a second stylus tip. The second stylus tip is configured to provide attenuation values in a range greater than values occupied by distribution 810 but smaller than the range of attenuation values occupied by distribution 830, e.g. 1.8×10⁻³ < first stylus tip attenuation < 7.2×10⁻³ < second stylus tip attenuation < 2.0×10⁻².


In step 542 of FIG. 5a, the signal processor 140 determines an occlusion compensation value of the object at each touch location. FIG. 9 shows a flow diagram for determining the occlusion compensation value for each touch location according to an embodiment.


In step 910, signal processor 140 determines all known objects within an occlusion region of the touch surface. A known object is typically a touch location identified in the present or a previous frame. In an embodiment, the occlusion region of the touch surface is determined by a radius around the touch location. The determination of known objects within a radius of the touch location may be made by measuring the distance between the coordinates of the touch location and the coordinates of the respective known object. If the distance is less than the radius, the known object is determined to be within the occlusion region.
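A minimal sketch of this membership test (coordinates and the radius value are illustrative):

    import math

    def objects_in_occlusion_region(touch, known_objects, radius):
        """Step 910: keep the known objects whose distance to the touch
        location is less than the radius defining the occlusion region."""
        tx, ty = touch
        return [obj for obj in known_objects
                if math.hypot(obj[0] - tx, obj[1] - ty) < radius]

    known = [(120.0, 80.0), (400.0, 300.0)]
    print(objects_in_occlusion_region((100.0, 90.0), known, radius=100.0))
    # -> [(120.0, 80.0)]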


In some embodiments, the occlusion region is the entire touch surface. In other embodiments, the occlusion region may be limited to a predefined cell of a plurality of cells covering the touch surface, wherein a cell is a portion of the touch surface. In this embodiment, the occlusion region may be limited to the cell occupied by the touch location.


Steps 920-940 may be executed for each known object in the occlusion region.



FIG. 10a shows an example set of objects 250, 260, 270, 280, and 290 in contact with an example touch surface.



FIG. 10b shows the occlusion of light at touch location 250 on the touch surface with respect to angle φ resulting from occluding objects 260, 270, 280, and 290. Occlusion shadow 265′ shows the shadow resulting from objects 260 and 270. Although the figure shows the accumulated occlusion where the objects overlap, occlusion level 262 shows the maximum occlusion used to determine occlusion of an object. This is a natural result of the fact that light cannot typically be occluded more than once by opaque objects. Occlusion shadow 280′ shows the shadow resulting from larger object 280. Occlusion shadow 290′ shows the shadow resulting from object 290. Occlusion shadows 265″, 280″, and 290″ are the corresponding occlusion shadows resulting from detection lines that must be treated as occluded as they also pass through the position of the respective occluding object, but from the far side of touch location 250.


In step 920, the angle φ1 between axis N and the vector between the first respective known object 260 and the touch location 250 is determined. In an embodiment, the centroid of the area of interaction between the object and the touch surface is used to determine co-ordinates of touching objects. In an embodiment, the angle may be calculated using the co-ordinates of the known object 260 and touch location 250 and ordinary Euclidean geometry.


In step 930, the size of the first respective known object 260 is determined. In one embodiment, an estimated size of the first known object 260 may already be known by the touch determination system or may be determined as a function of the reconstruction image generated by the signal processor 140, e.g. in dependence on the portion of the reconstruction image occupied by the first known object 260. For the purposes of the present embodiment, the size of an object is the diameter of the object in the plane of the touch surface as viewed from the touch location 250, i.e. the width of the silhouette of the known object as viewed from the touch location 250. Where first known object 260 is substantially circular, this diameter may be largely consistent regardless of the angle between the touch location 250 and first known object 260. Where first known object 260 is irregular, the diameter of the object in the plane of the touch surface as viewed from the touch location 250 may vary considerably in dependence on the orientation of first known object 260 and/or the angle between the touch location 250 and first known object 260. For smaller objects, this variation may be ignored, but for larger objects (e.g. a rectangular board eraser), the variation in the diameter of the large object as viewed from the touch location 250 may be as much as three times larger from one angle than from another. In one embodiment, the size of an object is determined as a function of the width, length, and azimuth of the object, wherein the length of the object is defined as the major axis of the object, the width of the object is defined as the minor axis of the object, and wherein the azimuth is the angle between the major axis of the object and axis N.


In an embodiment, the length, width, and azimuth are used to determine a rectangular bounding box B around the shape of the known object. The length and width of the known object define the length and width of bounding box B, and the azimuth defines the rotation of bounding box B. The co-ordinates of the four corners of the bounding box are then determined, and the two co-ordinates defining the minimum (Bmin) and maximum (Bmax) angles to the touch location 250 with respect to axis N are used to determine the diameter of the object in the plane of the touch surface as viewed from the touch location 250, and therefore the size of the object.
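The following sketch follows this bounding-box construction: it rotates the length × width box by the azimuth, takes the angles of the four corners as seen from the touch location, and converts the resulting angular span (Bmax − Bmin) back into a diameter. Angle wrap-around near ±π is ignored for brevity, and all names are illustrative.

    import math

    def apparent_size(touch, centre, length, width, azimuth):
        """Diameter of a known object as seen from `touch`, via the corners
        of a rotated length x width bounding box B (azimuth in radians)."""
        tx, ty = touch
        cx, cy = centre
        c, s = math.cos(azimuth), math.sin(azimuth)
        corners = [(cx + c * dx - s * dy, cy + s * dx + c * dy)
                   for dx in (-length / 2.0, length / 2.0)
                   for dy in (-width / 2.0, width / 2.0)]
        angles = [math.atan2(y - ty, x - tx) for x, y in corners]
        span = max(angles) - min(angles)                 # Bmax - Bmin
        distance = math.hypot(cx - tx, cy - ty)
        return 2.0 * distance * math.tan(span / 2.0)     # chord under the span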


In step 940, the angular range over which the light reaching touch location 250 is occluded by first known object 260 is calculated. In an embodiment, the angular range φΔ is determined to be:

φΔ = 2 · tan⁻¹(size / (2 · distance between objects))


In an embodiment, where the ratio between the known object size and the distance between the known object and the touch location is larger than a threshold, angular range φΔ is determined to be:

φΔ = size / distance between objects


In one embodiment, the threshold is between 0.15 and 0.5. In an embodiment, the above threshold is 0.3.
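In code, step 940 with the stated approximation reads as follows (the 0.3 threshold is the example value given above):

    import math

    def occluded_angular_range(size, distance, ratio_threshold=0.3):
        """Angular range (radians) occluded at the touch location by a known
        object of the given size at the given distance (step 940)."""
        if size / distance > ratio_threshold:
            return size / distance                        # approximate form
        return 2.0 * math.atan(size / (2.0 * distance))   # exact form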


In step 950, the angular ranges for all known objects are aggregated to determine the full occlusion for all the known objects. In an embodiment, the angular range for each of the known objects is summed to generate a summed angular range. For overlapping angular ranges (e.g. occlusion shadow 265′ of FIG. 10b), the overlapping portion is used only once for the generation of the summed angular range, i.e. the determined occlusion is capped at occlusion maximum 262. This is to ensure that the occlusion compensation value is not distorted by angular ranges where light has already been occluded once.


In step 960, the summed angular range is normalised to generate an occlusion ratio. In one example, the summed angular range is normalised by dividing the summed angular range by π radians or 180 degrees, depending on the angular units used to determine the summed angular range. The need for this normalisation step is shown in FIG. 10b, where occlusion shadows 265″, 280″, and 290″ must also be compensated for. In an alternative embodiment, the step of normalization is performed as part of step 940 for each angular range φΔ.


In step 970, an occlusion compensation value for the touch location is generated in dependence on the occlusion ratio. Because the attenuation measured for an object at touch location 250 will decrease in dependence on the occlusion ratio, it is useful to use this ratio to generate the occlusion compensation value. As the detection lines occluded by other objects cannot be used to determine the attenuation caused by the object at touch location 250, the measured attenuation is reduced in proportion to the number of occluded detection lines. Consequently, the measured attenuation can be compensated using the occlusion ratio. In an embodiment, the occlusion compensation value is equal to the occlusion ratio.
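Steps 950-970, together with the division performed later in step 548, can be sketched as below. Overlapping intervals are merged so that overlapped angles count only once, the merged total is divided by π, and a compensation value is derived. As an assumption for this sketch, the compensation value is taken to be the unoccluded fraction (one minus the occlusion ratio), so that the division in step 548 scales the measured attenuation back up.

    import math

    def occlusion_compensation(angular_ranges):
        """Merge overlapping occluded intervals (each a (start, end) pair in
        radians, folded into [0, pi) so far-side shadows coincide with
        near-side ones), normalise by pi, and return a compensation value."""
        merged = []
        for start, end in sorted(angular_ranges):
            if merged and start <= merged[-1][1]:
                merged[-1][1] = max(merged[-1][1], end)   # overlap counts once
            else:
                merged.append([start, end])
        occlusion_ratio = sum(e - s for s, e in merged) / math.pi
        return 1.0 - occlusion_ratio    # assumed: unoccluded fraction

    def compensated_attenuation(attenuation, compensation_value):
        return attenuation / compensation_value           # step 548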


In an alternative embodiment, the occlusion compensation value is generated in dependence solely on the number of known objects present on the touch surface.


In another alternative embodiment, the occlusion compensation value is generated solely in dependence on the number of known objects present on the touch surface within a selected distance of the object.


In step 548 of FIG. 5a, the signal processor 140 determines a compensated attenuation value of the object at each touch location in dependence on the attenuation value and the occlusion compensation value. In an embodiment, the compensated attenuation value is determined by dividing the attenuation value determined in step 540 by the occlusion compensation value determined in step 542/970.


In an embodiment, signal processor 140 is configured to store a plurality of object IDs in memory, each object ID having an associated attenuation value range. In the following example, three object types with associated Object IDs are shown.















Object ID:        001             002             003
Object type:      Stylus Thick    Stylus Thin     Finger
Output type:      Thick Blue Ink  Thick Red Ink   Thick Black Ink
Attenuation Max:  2.0 × 10⁻²      7.2 × 10⁻³      —
Attenuation Min:  7.2 × 10⁻³      1.8 × 10⁻³      2.0 × 10⁻²
In an embodiment, each Object ID has an attenuation value range, defined by an Attenuation Max value and an Attenuation Min value. The Object IDs may optionally include further values defining properties of an associated object, including a recognised object type and an output type (e.g. a brush type, ink colour, selection type, etc.).


In step 550, signal processor 140 matches each touch location to an Object ID. This is done by matching the compensated attenuation value of each touch location to the range of the matching Object ID, i.e. a touch location with a compensated attenuation value of 1.2×10⁻² will be matched to Object ID 001. In one embodiment, an Object ID exists with a range covering all values above a specific value. This allows all objects with a compensated attenuation value above the usual ranges of the Object IDs to be identified using the same ‘default large object’ Object ID. Similarly, in one embodiment, an Object ID exists with a range covering all values below a specific value, allowing objects with very low compensated attenuation values to be identified with a generic ‘default small object’ Object ID.
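A sketch of this lookup, using the ranges from the table above plus open-ended default bands (treating the finger band as open-ended is an assumption, since the table leaves its Attenuation Max blank):

    OBJECT_IDS = [
        # (object ID, attenuation min, attenuation max), per the table above
        ("001 Stylus Thick", 7.2e-3, 2.0e-2),
        ("002 Stylus Thin",  1.8e-3, 7.2e-3),
        ("003 Finger",       2.0e-2, float("inf")),  # also 'default large object'
    ]

    def match_object_id(compensated_attenuation):
        for object_id, amin, amax in OBJECT_IDS:
            if amin <= compensated_attenuation < amax:
                return object_id
        return "default small object"    # below every listed range

    print(match_object_id(1.2e-2))       # -> 001 Stylus Thick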


In step 560, signal processor 140 outputs the touch data, including the touch locations and corresponding Object IDs for each location.


In an alternative embodiment to that shown in FIG. 5a, the occlusion compensation value determined in step 542 is applied to the Attenuation Max and Attenuation Min values of the respective Object IDs, instead of directly to the attenuation value of the touch location. In an example of this embodiment, each of the Attenuation Max and Attenuation Min values of an Object ID is multiplied by the occlusion compensation value to determine a Compensated Attenuation Max value and a Compensated Attenuation Min value for the Object ID. The attenuation value is then compared against the Compensated Attenuation Max and Compensated Attenuation Min values to determine the matching Object ID.


When matching an attenuation value of a touch to an object ID, it is important to use a stable attenuation value which correctly reflects the attenuation of the light caused by the object once it is in contact with the surface. In an ‘above surface’ system such as the embodiment shown in FIG. 2, light field 90 has a depth, and so the object must travel a distance through the light field before contacting the touch surface. Consequently, there is a period of time between when the object enters the light field and when the object contacts the touch surface during which the attenuation caused by the object is likely to be increasing. Any attenuation values measured during this period will likely not accurately reflect the light attenuation of the object once it is contacting the touch surface. In one embodiment of the disclosure, step 540 is delayed until an attenuation value of an object is determined to be stable. In one embodiment, the attenuation value of an object is determined to be stable once it has changed by no more than 10% per frame for at least three frames.
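A sketch of that stability test over a short history of per-frame attenuation values (the 10% and three-frame figures come from the embodiment above):

    def is_stable(history, tolerance=0.10, frames=3):
        """True once the attenuation value has changed by no more than
        `tolerance` per frame for at least `frames` consecutive frames."""
        if len(history) < frames + 1:
            return False
        recent = history[-(frames + 1):]
        return all(abs(curr - prev) <= tolerance * abs(prev)
                   for prev, curr in zip(recent, recent[1:]))

    print(is_stable([0.004, 0.010, 0.0102, 0.0101, 0.0100]))   # True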


As an object is lowered into the light field, it occludes increasingly more light. As a consequence, the attenuation of light caused by the object increases until the object has hit the touch surface. The gradient of the attenuation (i.e. the rate of change of the attenuation) is therefore positive as the object travels towards the touch surface, flattening out when the object is in contact with the surface. FIG. 11 shows an object 60 with tip 160 having travelled into light field 90 for a distance of hmax − h. FIG. 12 shows an attenuation value 1020 of an object during a ‘touch down’ event 1040 (i.e. the application of a touching object to the touch surface) and a ‘touch up’ event 1050 (i.e. lifting the touching object off and away from the touch surface). A corresponding height h (shown as line 1010) of the object from the touch surface is also shown. The line 1030 showing the attenuation gradient (i.e. the rate of change of the attenuation value with respect to time) shows a typical attenuation gradient signature for both touch down and touch up events. An attenuation gradient signature is the shape of the attenuation gradient values during a touch down or touch up event.


Therefore, in an embodiment of the disclosure, signal processor 140 is configured to determine that an object attenuation value is stable and/or that a touch down event has occurred in dependence on an attenuation gradient signature (shown at time 1040 in FIG. 12) of an event. In an embodiment, the attenuation gradient signature corresponding to a touch down event is a first period of a first attenuation gradient, a second period of higher attenuation gradient, and a third period of attenuation gradient lower than the second period.


In one embodiment, a touch down event is determined to have occurred once the object attenuation value has exceeded a first attenuation value threshold. However, a determination that a touch down event has occurred is possible before this threshold is met, using the above method. Where the object attenuation value is below the first attenuation value threshold but an attenuation gradient signature is observed having a higher attenuation gradient equal to or greater than 20% of the first attenuation value threshold over a single frame, the object attenuation value may be determined to be stable and/or a touch down event may be determined to have occurred.
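A sketch combining the two rules (per-frame values; the 20% figure is the one stated above):

    def touch_down_detected(attenuation, gradient, first_threshold):
        """Touch down fires when the attenuation value exceeds the first
        threshold, or early, when a single-frame gradient spike reaches at
        least 20% of that threshold."""
        return (attenuation > first_threshold
                or gradient >= 0.2 * first_threshold)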


During a ‘touch up’ event, an attenuation value of the object decreases as the object is lifted out of the light field. Similarly to the above, the attenuation gradient signature of this event (shown at time 1050 in FIG. 12) can be recognized and actioned accordingly. Therefore, in an embodiment of the disclosure, signal processor 140 is configured to determine that an object attenuation value is reduced to zero and/or that a touch up event has occurred in dependence on an attenuation gradient signature of an event. In an embodiment, the attenuation gradient signature corresponding to a touch up event is a first period of a first attenuation gradient, a second period of negative attenuation gradient, and a third period of attenuation gradient corresponding to the first attenuation gradient.


In one embodiment, a touch up event is determined to have occurred once the object attenuation value is determined to have dropped below a second attenuation value threshold. However, a determination that a touch up event has occurred is possible before this threshold is met, using the above method. Where the object attenuation value is above the second attenuation value threshold but an attenuation gradient signature is observed having a negative attenuation gradient with a magnitude equal to or greater than 20% of the second attenuation value threshold over a single frame, a touch up event may be determined to have occurred.


In an embodiment, the attenuation gradient values required to trigger touch up/down events for an object may be scaled in dependence on the presence of other occluding objects in close proximity to the object. In an example, the attenuation gradient of the second period of a signature is scaled up to require an even larger value to trigger a touch down event for an object in close proximity to other occluding objects on the touch surface. In one embodiment, the higher attenuation gradient is scaled linearly according to the number of additional touches within a radius of up to 10 cm. The radius may be chosen in dependence on the screen size, touch resolution, and environmental noise.


‘Hooks’ are a problem observed in the flow of co-ordinates of user touch input over time when the user is providing rapidly changing touch input, e.g. when drawing or writing. An example of a ‘hook’ is where the user finishes drawing a stroke, lifts the touch object from the surface of the panel, and rapidly changes the direction of movement of the touching object to begin drawing the next stroke. The ‘hook’ is a small artifact seen at the end of the stroke pointing in the new direction of the user's touch object. A method of minimizing hooks is proposed. In an embodiment of the disclosure, once a negative attenuation gradient has been observed, the touch coordinates will not be updated with the object's position and the coordinates of the object's position are stored. If the object attenuation value drops below a threshold value, the stored coordinates are discarded and a ‘touch up’ event is signaled. If the object attenuation value does not drop below a threshold value and a positive attenuation gradient is subsequently observed, the stored touch coordinates for the intervening period are output and the touch coordinates continue to be output as before. In an embodiment, the method is only used when the direction of movement of the object contacting the touch surface, in the plane of the touch surface, is changing. In this embodiment, a vector α from the last touch coordinate of the object to the current coordinate is determined. A second vector β from the touch coordinate previous to the last coordinate to the last coordinate is determined. Vectors α and β allow a determination of the direction in which the interaction is moving and how it is changing. A rapid change of direction of the object may result in a scalar product α·β < 0. In one embodiment, if this condition is met, it may be determined that the direction of movement of the object contacting the touch surface has significantly changed, and the above method for minimizing hooks is then applied.
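A sketch of this hook-suppression rule follows. The direction test uses the scalar product of the movement vectors α and β; the filter withholds coordinates while the attenuation gradient is negative, discards them on a confirmed touch up, and flushes them if the touch survives. Class and parameter names are illustrative.

    class HookFilter:
        """Withhold coordinates while attenuation is falling; drop them if a
        touch up follows, flush them if the touch survives."""

        def __init__(self, up_threshold):
            self.up_threshold = up_threshold
            self.buffer = []

        def step(self, coord, attenuation, gradient):
            if attenuation < self.up_threshold:
                self.buffer.clear()           # touch up: discard stored points
                return []
            if gradient < 0.0:
                self.buffer.append(coord)     # withhold while gradient negative
                return []
            flushed = self.buffer + [coord]   # touch survived: emit the backlog
            self.buffer = []
            return flushed

    def direction_reversed(prev2, prev1, current):
        """True when the scalar product of alpha (prev1 -> current) and
        beta (prev2 -> prev1) is negative, i.e. a sharp direction change."""
        ax, ay = current[0] - prev1[0], current[1] - prev1[1]
        bx, by = prev1[0] - prev2[0], prev1[1] - prev2[1]
        return ax * bx + ay * by < 0.0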


Although the attenuation value of an object provides information regarding the light attenuated by the object touching the surface, some embodiments of the disclosure require that the attenuation value be compensated in order to provide a true reflection of the nature and/or position of the object.


In one embodiment of the disclosure shown in FIG. 5b, the compensated attenuation value of an object is determined in dependence on the attenuation value of the object determined in step 540, an occlusion compensation value determined in step 542, and a position compensation value determined in step 544.


In certain arrangements of the system shown in FIG. 1, certain positions on the touch surface are likely to result in lower attenuation values than others for equivalent objects. In particular, attenuation values towards the edge of the screen are likely to be lower than in the centre. A variety of factors may cause this to be the case. One is that efficient implementations of certain tomographic reconstruction algorithms make approximations resulting in lower reconstructed attenuation values towards the edges of the panel. In one example, attenuation values in a corner of a panel may be as low as 30% of attenuation values located at the centre of the panel for an equivalent object. FIG. 13 shows a graph of attenuation values (shown as relative attenuation) of an object touching a rectangular touch surface relative to the distance of the object from a corner of the touch surface. Consequently, an embodiment of the disclosure provides that the position compensation value is a function of at least the position of the object on the touch surface. In one embodiment, the position compensation value is determined as a function of the distance from a central point on the touch surface to the touch position. Alternatively, the compensation value may be determined as a function of the distance from the nearest corner of the touch surface to the touch position.


The relationship between the position of the touch and a required position compensation value may be a complex function of the geometry of the emitters and detectors. FIG. 14 shows a heat map of a corner of a rectangular touch surface showing relative attenuation of a touching object. When touching at co-ordinate (0,0), the object generates relatively little attenuation. When touching at co-ordinate (10,15), a much larger amount of attenuation occurs.


Consequently, an embodiment of the disclosure provides calculating a position compensation value as a function of the position of the corresponding touch on the touch surface. An alternative embodiment describes using a compensation map to determine a position compensation value given a position on the touch surface. The compensation map may include a 2D image corresponding to the dimensions of the touch surface with pixel values corresponding to position compensation values. A touch position is then used to determine the corresponding pixel on the compensation map and the pixel value at that position provides the corresponding position compensation value. In an embodiment, the compensation map has a resolution lower than or equal to the touch resolution of the touch determination system. The compensation map is preferably generated in advance but may also be generated dynamically as a function of environmental and performance variables. In one embodiment, the compensation map is generated manually or using a calibration robot during the manufacturing phase.
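A sketch of the map lookup (the map contents here are placeholders; FIG. 13 and FIG. 14 suggest corner values can fall to roughly 30% of the centre value, which a real map would encode):

    import numpy as np

    def position_compensation(comp_map, touch, surface_size):
        """Map a touch position (x, y in surface units) to the pixel of a
        low-resolution compensation map and return the value stored there."""
        rows, cols = comp_map.shape
        (x, y), (w, h) = touch, surface_size
        col = min(int(x / w * cols), cols - 1)
        row = min(int(y / h * rows), rows - 1)
        return comp_map[row, col]

    comp_map = np.ones((8, 12))    # placeholder: 1.0 = no compensation needed
    comp_map[0, 0] = 0.3           # corner: ~30% of the centre attenuation
    print(position_compensation(comp_map, (10.0, 10.0), (300.0, 200.0)))  # 0.3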


Another variable which may affect the recorded attenuation of a touch object is the speed at which the touching object is moving across the touch surface. The light attenuation of each light path is recorded sequentially over a series of frames. Therefore, a sufficiently fast-moving object may have moved away from a specific position before the attenuation of all light paths intersecting the position has been measured. Consequently, a moving object may generate a weaker attenuation signal. FIG. 15 shows a graph of recorded attenuation values of an object relative to the speed of the object across the touch surface. A relationship can be seen showing that a faster moving object is likely to generate a weaker attenuation value. Therefore, in one embodiment of the disclosure shown in FIG. 5c, the compensated attenuation value of an object is determined in dependence on at least the attenuation value of the object determined in step 540, an occlusion compensation value determined in step 542, and a speed compensation value determined in step 546. As the relationship between the speed of an object and the recorded attenuation value may also be complicated by the position of the moving object on the touch surface, an embodiment of the disclosure provides determining a compensation value as a function of both the position and speed of the object on the touch surface.

Claims
  • 1. A touch sensing apparatus, comprising: a touch surface, a plurality of emitters arranged around a periphery of the touch surface, the plurality of emitters configured to emit beams of light such that one or more objects touching the touch surface cause an attenuation of the light; a plurality of detectors arranged around the periphery of the touch surface, the plurality of detectors configured to receive light from the plurality of emitters on a plurality of light paths, wherein each detector in the plurality of detectors is arranged to receive light from more than one emitter in the plurality of emitters; and a hardware processor configured to: determine, based on output signals generated by the plurality of detectors, a light energy value for each light path of the plurality of light paths; generate a transmission value for each light path of the plurality of light paths based on the light energy value; and determine for each object from at least some of the generated transmission values: a position of the object on the touch surface, an attenuation value of the object corresponding to the attenuation of the light resulting from the object touching the touch surface, and an object type of the object in dependence on the attenuation value of the object and attenuation value(s) of other objects on the touch surface.
  • 2. The touch sensing apparatus of claim 1, wherein other objects on the touch surface are occluding objects and the occlusion compensation value is determined in dependence on the number of occluding objects.
  • 3. The touch sensing apparatus of claim 2, wherein the occlusion compensation value is determined in dependence on the number of occluding objects within a selected distance of the object.
  • 4. The touch sensing apparatus of claim 1, wherein the occlusion compensation value is determined as a function of the size and angle with respect to the object, of the occluding objects.
  • 5. The touch sensing apparatus of claim 4, wherein the size of an occluding object is determined as a function of a width, length, and azimuth of the occluding object.
  • 6. The touch sensing apparatus of claim 1, wherein the image reconstruction algorithm is an algorithm for transmission tomography.
  • 7. The touch sensing apparatus of claim 1, the processing element further configured to store a plurality of object IDs, each having an associated attenuation value range.
  • 8. The touch sensing apparatus of claim 7, the processing element further configured to: determine an occlusion compensated attenuation value for the object in dependence on the attenuation value and the occlusion compensation value; andidentify an object ID with an attenuation value range corresponding to the occlusion compensated attenuation value of the object and associating the object ID with the object.
  • 9. The touch sensing apparatus of claim 8, wherein the identifying step is not performed by the processing element until the attenuation value is determined to be stable.
  • 10. The touch sensing apparatus of claim 9, wherein the attenuation value is determined to be stable when the rate of change of the attenuation value is below a predefined threshold.
  • 11. The touch sensing apparatus of claim 9, wherein the attenuation value is determined to be stable when the rate of change of the attenuation value is determined to rise above a first threshold and then subsequently drop below a second threshold, wherein the second threshold is lower than the first threshold.
  • 12. The touch sensing apparatus of claim 1, wherein the attenuation value is generated in dependence on the attenuation of the light resulting from the object touching the touch surface and a compensation value.
  • 13. The touch sensing apparatus of claim 12, wherein the compensation value is a function of at least the position of the object on the touch surface.
  • 14. The touch sensing apparatus of claim 12, the compensation value being the value at a position on a compensation map corresponding to the position of the object on the touch surface.
  • 15. The touch sensing apparatus of claim 12, wherein the compensation value is a function of at least the speed of the object across the touch surface.
  • 16. The touch sensing apparatus of claim 12, wherein the compensation value is proportional to the speed of the object across the touch surface.
  • 17. A method of determining a type of object in contact with a touch surface of a touch sensing apparatus, said touch sensing apparatus comprising: a touch surface, a plurality of emitters, arranged around the periphery of the touch surface, configured to emit beams of light such that one or more objects touching the touch surface cause an attenuation of the light; and a plurality of detectors, arranged around the periphery of the touch surface, configured to receive light from the plurality of emitters on a plurality of light paths, wherein each detector in the plurality of detectors is arranged to receive light from more than one emitter in the plurality of emitters; said method comprising the steps of: determining, based on output signals generated by the plurality of detectors, a light energy value for each light path of the plurality of light paths; generating a transmission value for each light path of the plurality of light paths based on the light energy value; and determining for each object from at least some of the generated transmission values: a position of the object on the touch surface, an attenuation value of the object corresponding to the attenuation of the light resulting from the object touching the touch surface, and an object type of the object in dependence on the attenuation value of the object and attenuation value(s) of other objects on the touch surface.
Priority Claims (4)
Number Date Country Kind
1730073 Mar 2017 SE national
1730120 Apr 2017 SE national
17172910 May 2017 EP regional
1730276 Oct 2017 SE national
US Referenced Citations (667)
Number Name Date Kind
3440426 Bush Apr 1969 A
3553680 Cooreman Jan 1971 A
3673327 Johnson et al. Jun 1972 A
4129384 Walker et al. Dec 1978 A
4180702 Sick et al. Dec 1979 A
4209255 Heynau et al. Jun 1980 A
4213707 Evans, Jr. Jul 1980 A
4254333 Bergström Mar 1981 A
4254407 Tipon Mar 1981 A
4294543 Apple et al. Oct 1981 A
4346376 Mallos Aug 1982 A
4420261 Barlow et al. Dec 1983 A
4484179 Kasday Nov 1984 A
4507557 Tsikos Mar 1985 A
4521112 Kuwabara et al. Jun 1985 A
4542375 Alles et al. Sep 1985 A
4550250 Mueller et al. Oct 1985 A
4593191 Alles Jun 1986 A
4673918 Adler et al. Jun 1987 A
4688933 Lapeyre Aug 1987 A
4688993 Ferris et al. Aug 1987 A
4692809 Beining et al. Sep 1987 A
4710760 Kasday Dec 1987 A
4736191 Matzke et al. Apr 1988 A
4737626 Hasegawa Apr 1988 A
4746770 McAvinney May 1988 A
4751379 Sasaki et al. Jun 1988 A
4752655 Tajiri et al. Jun 1988 A
4772763 Garwin et al. Sep 1988 A
4782328 Denlinger Nov 1988 A
4812833 Shimauchi Mar 1989 A
4837430 Hasegawa Jun 1989 A
4868912 Doering Sep 1989 A
4891829 Deckman et al. Jan 1990 A
4916712 Bender Apr 1990 A
4933544 Tamaru Jun 1990 A
4949079 Loebner Aug 1990 A
4986662 Bures Jan 1991 A
4988983 Wehrer Jan 1991 A
5065185 Powers et al. Nov 1991 A
5073770 Loebner Dec 1991 A
5105186 May Apr 1992 A
5159322 Loebner Oct 1992 A
5166668 Aoyagi Nov 1992 A
5227622 Suzuki Jul 1993 A
5248856 Mallicoat Sep 1993 A
5254407 Sergerie et al. Oct 1993 A
5345490 Finnigan et al. Sep 1994 A
5383022 Kaser Jan 1995 A
5483261 Yasutake Jan 1996 A
5484966 Segen Jan 1996 A
5499098 Ogawa Mar 1996 A
5502568 Ogawa et al. Mar 1996 A
5515083 Casebolt et al. May 1996 A
5525764 Junkins et al. Jun 1996 A
5526422 Keen Jun 1996 A
5570181 Yasuo et al. Oct 1996 A
5572251 Ogawa Nov 1996 A
5577501 Flohr et al. Nov 1996 A
5600105 Fukuzaki et al. Feb 1997 A
5608550 Epstein et al. Mar 1997 A
5672852 Fukuzaki et al. Sep 1997 A
5679930 Katsurahira Oct 1997 A
5686942 Ball Nov 1997 A
5688933 Evans et al. Nov 1997 A
5729249 Yasutake Mar 1998 A
5736686 Perret, Jr. et al. Apr 1998 A
5740224 Müller et al. Apr 1998 A
5764223 Chang et al. Jun 1998 A
5767517 Hawkins Jun 1998 A
5775792 Wiese Jul 1998 A
5945980 Moissev et al. Aug 1999 A
5945981 Paull et al. Aug 1999 A
5959617 Bird et al. Sep 1999 A
6031524 Kunert Feb 2000 A
6061177 Fujimoto May 2000 A
6067079 Shieh May 2000 A
6122394 Neukermans et al. Sep 2000 A
6141104 Schulz et al. Oct 2000 A
6172667 Sayag Jan 2001 B1
6175999 Sloan et al. Jan 2001 B1
6227667 Halldorsson et al. May 2001 B1
6229529 Yano et al. May 2001 B1
6333735 Anvekar Dec 2001 B1
6366276 Kunimatsu et al. Apr 2002 B1
6380732 Gilboa Apr 2002 B1
6380740 Laub Apr 2002 B1
6390370 Plesko May 2002 B1
6429857 Masters et al. Aug 2002 B1
6452996 Hsieh Sep 2002 B1
6476797 Kurihara et al. Nov 2002 B1
6492633 Nakazawa et al. Dec 2002 B2
6495832 Kirby Dec 2002 B1
6504143 Koops et al. Jan 2003 B2
6529327 Graindorge Mar 2003 B1
6538644 Muraoka Mar 2003 B1
6587099 Takekawa Jul 2003 B2
6648485 Colgan et al. Nov 2003 B1
6660964 Benderly Dec 2003 B1
6664498 Forsman et al. Dec 2003 B2
6664952 Iwamoto et al. Dec 2003 B2
6690363 Newton Feb 2004 B2
6707027 Liess et al. Mar 2004 B2
6738051 Boyd et al. May 2004 B2
6748098 Rosenfeld Jun 2004 B1
6784948 Kawashima et al. Aug 2004 B2
6799141 Stoustrup et al. Sep 2004 B1
6806871 Yasue Oct 2004 B1
6927384 Reime et al. Aug 2005 B2
6940286 Wang et al. Sep 2005 B2
6965836 Richardson Nov 2005 B2
6972753 Kimura et al. Dec 2005 B1
6985137 Kaikuranta Jan 2006 B2
7042444 Cok May 2006 B2
7084859 Pryor Aug 2006 B1
7133031 Wang et al. Nov 2006 B2
7176904 Satoh Feb 2007 B2
7199932 Sugiura Apr 2007 B2
7359041 Xie et al. Apr 2008 B2
7397418 Doerry et al. Jul 2008 B1
7432893 Ma et al. Oct 2008 B2
7435940 Eliasson et al. Oct 2008 B2
7436443 Hirunuma et al. Oct 2008 B2
7442914 Eliasson et al. Oct 2008 B2
7465914 Eliasson et al. Dec 2008 B2
7528898 Hashimoto May 2009 B2
7613375 Shimizu Nov 2009 B2
7629968 Miller et al. Dec 2009 B2
7646833 He et al. Jan 2010 B1
7653883 Hotelling et al. Jan 2010 B2
7655901 Idzik et al. Feb 2010 B2
7705835 Eikman Apr 2010 B2
7729056 Hwang et al. Jun 2010 B2
7847789 Kolmykov-Zotov et al. Dec 2010 B2
7855716 McCreary et al. Dec 2010 B2
7859519 Tulbert Dec 2010 B2
7924272 Boer et al. Apr 2011 B2
7932899 Newton et al. Apr 2011 B2
7969410 Kakarala Jun 2011 B2
7995039 Eliasson et al. Aug 2011 B2
8013845 Ostergaard et al. Sep 2011 B2
8031186 Ostergaard Oct 2011 B2
8077147 Krah et al. Dec 2011 B2
8093545 Leong et al. Jan 2012 B2
8094136 Eliasson et al. Jan 2012 B2
8094910 Xu Jan 2012 B2
8149211 Hayakawa et al. Apr 2012 B2
8218154 Østergaard et al. Jul 2012 B2
8274495 Lee Sep 2012 B2
8325158 Yatsuda et al. Dec 2012 B2
8339379 Goertz et al. Dec 2012 B2
8350827 Chung et al. Jan 2013 B2
8384010 Hong et al. Feb 2013 B2
8407606 Davidson et al. Mar 2013 B1
8441467 Han May 2013 B2
8445834 Hong et al. May 2013 B2
8466901 Yen et al. Jun 2013 B2
8482547 Cobon et al. Jul 2013 B2
8542217 Wassvik et al. Sep 2013 B2
8567257 Van Steenberge et al. Oct 2013 B2
8581884 Fåhraeus et al. Nov 2013 B2
8624858 Fyke et al. Jan 2014 B2
8686974 Christiansson et al. Apr 2014 B2
8692807 Fåhraeus et al. Apr 2014 B2
8716614 Wassvik May 2014 B2
8727581 Saccomanno May 2014 B2
8745514 Davidson Jun 2014 B1
8780066 Christiansson et al. Jul 2014 B2
8830181 Clark et al. Sep 2014 B1
8860696 Wassvik et al. Oct 2014 B2
8872098 Bergström et al. Oct 2014 B2
8872801 Bergström et al. Oct 2014 B2
8884900 Wassvik Nov 2014 B2
8890843 Wassvik et al. Nov 2014 B2
8890849 Christiansson et al. Nov 2014 B2
8928590 El Dokor Jan 2015 B1
8963886 Wassvik Feb 2015 B2
8982084 Christiansson et al. Mar 2015 B2
9001086 Saini Apr 2015 B1
9024896 Chen May 2015 B2
9024916 Christiansson May 2015 B2
9035909 Christiansson May 2015 B2
9063614 Petterson et al. Jun 2015 B2
9063617 Eliasson et al. Jun 2015 B2
9086763 Johansson et al. Jul 2015 B2
9134854 Wassvik et al. Sep 2015 B2
9158401 Christiansson Oct 2015 B2
9158415 Song et al. Oct 2015 B2
9201520 Benko et al. Dec 2015 B2
9207800 Eriksson et al. Dec 2015 B1
9213445 King et al. Dec 2015 B2
9274645 Christiansson et al. Mar 2016 B2
9280237 Kukulj Mar 2016 B2
9317146 Hufnagel Apr 2016 B1
9317168 Christiansson et al. Apr 2016 B2
9323396 Han et al. Apr 2016 B2
9366565 Uvnäs Jun 2016 B2
9377884 Christiansson et al. Jun 2016 B2
9389732 Craven-Bartle Jul 2016 B2
9411444 Christiansson et al. Aug 2016 B2
9411464 Wallander et al. Aug 2016 B2
9430079 Christiansson et al. Aug 2016 B2
9442574 Fåhraeus et al. Sep 2016 B2
9547393 Christiansson et al. Jan 2017 B2
9552103 Craven-Bartle et al. Jan 2017 B2
9557846 Baharav et al. Jan 2017 B2
9588619 Christiansson et al. Mar 2017 B2
9594467 Christiansson et al. Mar 2017 B2
9618682 Yoon et al. Apr 2017 B2
9626018 Christiansson et al. Apr 2017 B2
9626040 Wallander et al. Apr 2017 B2
9639210 Wallander et al. May 2017 B2
9678602 Wallander Jun 2017 B2
9684414 Christiansson et al. Jun 2017 B2
9710101 Christiansson et al. Jul 2017 B2
9874978 Wall Jan 2018 B2
10013107 Christiansson et al. Jul 2018 B2
10019113 Christiansson et al. Jul 2018 B2
10282035 Kocovski et al. May 2019 B2
10649585 van Beek et al. May 2020 B1
20010002694 Nakazawa et al. Jun 2001 A1
20010005004 Shiratsuki et al. Jun 2001 A1
20010005308 Oishi et al. Jun 2001 A1
20010030642 Sullivan et al. Oct 2001 A1
20020067348 Masters et al. Jun 2002 A1
20020075243 Newton Jun 2002 A1
20020118177 Newton Aug 2002 A1
20020158823 Zavracky et al. Oct 2002 A1
20020158853 Sugawara et al. Oct 2002 A1
20020163505 Takekawa Nov 2002 A1
20030016450 Bluemel et al. Jan 2003 A1
20030034439 Reime et al. Feb 2003 A1
20030034935 Amanai et al. Feb 2003 A1
20030048257 Mattila Mar 2003 A1
20030052257 Sumriddetchkajorn Mar 2003 A1
20030095399 Grenda et al. May 2003 A1
20030107748 Lee Jun 2003 A1
20030137494 Tulbert Jul 2003 A1
20030156100 Gettemy Aug 2003 A1
20030160155 Liess Aug 2003 A1
20030210537 Engelmann Nov 2003 A1
20030214486 Roberts Nov 2003 A1
20040027339 Schulz Feb 2004 A1
20040032401 Nakazawa et al. Feb 2004 A1
20040090432 Takahashi et al. May 2004 A1
20040130338 Wang et al. Jul 2004 A1
20040174541 Freifeld Sep 2004 A1
20040201579 Graham Oct 2004 A1
20040212603 Cok Oct 2004 A1
20040238627 Silverbrook et al. Dec 2004 A1
20040239702 Kang et al. Dec 2004 A1
20040245438 Payne et al. Dec 2004 A1
20040252091 Ma et al. Dec 2004 A1
20040252867 Lan et al. Dec 2004 A1
20050012714 Russo et al. Jan 2005 A1
20050041013 Tanaka Feb 2005 A1
20050057903 Choi Mar 2005 A1
20050073508 Pittel et al. Apr 2005 A1
20050083293 Dixon Apr 2005 A1
20050128190 Ryynanen Jun 2005 A1
20050143923 Keers et al. Jun 2005 A1
20050156914 Lipman et al. Jul 2005 A1
20050162398 Eliasson et al. Jul 2005 A1
20050179977 Chui et al. Aug 2005 A1
20050200613 Kobayashi et al. Sep 2005 A1
20050212774 Ho et al. Sep 2005 A1
20050248540 Newton Nov 2005 A1
20050253834 Sakamaki et al. Nov 2005 A1
20050276053 Nortrup et al. Dec 2005 A1
20060001650 Robbins et al. Jan 2006 A1
20060001653 Smits Jan 2006 A1
20060007185 Kobayashi Jan 2006 A1
20060008164 Wu et al. Jan 2006 A1
20060017706 Cutherell et al. Jan 2006 A1
20060017709 Okano Jan 2006 A1
20060033725 Marggraff et al. Feb 2006 A1
20060038698 Chen Feb 2006 A1
20060061861 Munro et al. Mar 2006 A1
20060114237 Crockett et al. Jun 2006 A1
20060132454 Chen et al. Jun 2006 A1
20060139340 Geaghan Jun 2006 A1
20060158437 Blythe et al. Jul 2006 A1
20060170658 Nakamura et al. Aug 2006 A1
20060202974 Thielman Sep 2006 A1
20060227120 Eikman Oct 2006 A1
20060255248 Eliasson Nov 2006 A1
20060256092 Lee Nov 2006 A1
20060279558 Van Delden et al. Dec 2006 A1
20060281543 Sutton et al. Dec 2006 A1
20060290684 Giraldo et al. Dec 2006 A1
20070014486 Schiwietz et al. Jan 2007 A1
20070024598 Miller et al. Feb 2007 A1
20070034783 Eliasson et al. Feb 2007 A1
20070038691 Candes et al. Feb 2007 A1
20070052684 Gruhlke et al. Mar 2007 A1
20070070056 Sato et al. Mar 2007 A1
20070075648 Blythe et al. Apr 2007 A1
20070120833 Yamaguchi et al. May 2007 A1
20070125937 Eliasson et al. Jun 2007 A1
20070152985 Ostergaard et al. Jul 2007 A1
20070201042 Eliasson et al. Aug 2007 A1
20070296688 Nakamura et al. Dec 2007 A1
20080006766 Oon et al. Jan 2008 A1
20080007540 Ostergaard Jan 2008 A1
20080007541 Eliasson et al. Jan 2008 A1
20080007542 Eliasson et al. Jan 2008 A1
20080011944 Chua et al. Jan 2008 A1
20080029691 Han Feb 2008 A1
20080036743 Westerman et al. Feb 2008 A1
20080062150 Lee Mar 2008 A1
20080068691 Miyatake Mar 2008 A1
20080074401 Chung et al. Mar 2008 A1
20080080811 Deane Apr 2008 A1
20080088603 Eliasson et al. Apr 2008 A1
20080121442 Boer et al. May 2008 A1
20080122792 Izadi et al. May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080130979 Run et al. Jun 2008 A1
20080150846 Chung et al. Jun 2008 A1
20080150848 Chung et al. Jun 2008 A1
20080151126 Yu Jun 2008 A1
20080158176 Land et al. Jul 2008 A1
20080189046 Eliasson et al. Aug 2008 A1
20080192025 Jaeger et al. Aug 2008 A1
20080238433 Joutsenoja et al. Oct 2008 A1
20080246388 Cheon et al. Oct 2008 A1
20080252619 Crockett et al. Oct 2008 A1
20080266266 Kent et al. Oct 2008 A1
20080278460 Arnett et al. Nov 2008 A1
20080284925 Han Nov 2008 A1
20080291668 Aylward et al. Nov 2008 A1
20080297482 Weiss Dec 2008 A1
20090000831 Miller et al. Jan 2009 A1
20090002340 Van Genechten Jan 2009 A1
20090006292 Block Jan 2009 A1
20090040786 Mori Feb 2009 A1
20090066647 Kerr et al. Mar 2009 A1
20090067178 Huang et al. Mar 2009 A1
20090073142 Yamashita et al. Mar 2009 A1
20090077501 Partridge et al. Mar 2009 A1
20090085894 Gandhi et al. Apr 2009 A1
20090091554 Keam Apr 2009 A1
20090115919 Tanaka et al. May 2009 A1
20090122020 Eliasson et al. May 2009 A1
20090122027 Newton May 2009 A1
20090128508 Sohn et al. May 2009 A1
20090135162 Van De Wijdeven et al. May 2009 A1
20090143141 Wells et al. Jun 2009 A1
20090153519 Suarez Rovere Jun 2009 A1
20090161026 Wu et al. Jun 2009 A1
20090168459 Holman et al. Jul 2009 A1
20090187842 Collins et al. Jul 2009 A1
20090189857 Benko et al. Jul 2009 A1
20090189874 Chene et al. Jul 2009 A1
20090189878 Goertz et al. Jul 2009 A1
20090219256 Newton Sep 2009 A1
20090229892 Fisher et al. Sep 2009 A1
20090251439 Westerman et al. Oct 2009 A1
20090256817 Perlin et al. Oct 2009 A1
20090259967 Davidson et al. Oct 2009 A1
20090267919 Chao et al. Oct 2009 A1
20090273794 Østergaard et al. Nov 2009 A1
20090278816 Colson Nov 2009 A1
20090297009 Xu et al. Dec 2009 A1
20100033444 Kobayashi Feb 2010 A1
20100045629 Newton Feb 2010 A1
20100060896 Van De Wijdeven et al. Mar 2010 A1
20100066016 Van De Wijdeven et al. Mar 2010 A1
20100066704 Kasai Mar 2010 A1
20100073318 Hu et al. Mar 2010 A1
20100073327 Mau et al. Mar 2010 A1
20100078545 Leong et al. Apr 2010 A1
20100079407 Suggs et al. Apr 2010 A1
20100079408 Leong et al. Apr 2010 A1
20100097345 Jang et al. Apr 2010 A1
20100097348 Park et al. Apr 2010 A1
20100097353 Newton Apr 2010 A1
20100103133 Park et al. Apr 2010 A1
20100125438 Audet May 2010 A1
20100127975 Jensen May 2010 A1
20100134435 Kimura et al. Jun 2010 A1
20100142823 Wang et al. Jun 2010 A1
20100187422 Kothari et al. Jul 2010 A1
20100193259 Wassvik Aug 2010 A1
20100229091 Homma et al. Sep 2010 A1
20100238139 Goertz et al. Sep 2010 A1
20100245292 Wu Sep 2010 A1
20100265170 Norieda Oct 2010 A1
20100277436 Feng et al. Nov 2010 A1
20100283785 Satulovsky Nov 2010 A1
20100284596 Miao et al. Nov 2010 A1
20100289754 Sleeman et al. Nov 2010 A1
20100295821 Chang et al. Nov 2010 A1
20100302196 Han et al. Dec 2010 A1
20100302209 Large Dec 2010 A1
20100302210 Han et al. Dec 2010 A1
20100302240 Lettvin Dec 2010 A1
20100315379 Allard et al. Dec 2010 A1
20100321328 Chang et al. Dec 2010 A1
20100322550 Trott Dec 2010 A1
20110043490 Powell et al. Feb 2011 A1
20110049388 Delaney et al. Mar 2011 A1
20110050649 Newton et al. Mar 2011 A1
20110051394 Bailey Mar 2011 A1
20110068256 Hong et al. Mar 2011 A1
20110069039 Lee et al. Mar 2011 A1
20110069807 Dennerlein et al. Mar 2011 A1
20110074725 Westerman et al. Mar 2011 A1
20110074734 Wassvik et al. Mar 2011 A1
20110074735 Wassvik et al. Mar 2011 A1
20110080361 Miller et al. Apr 2011 A1
20110084939 Gepner et al. Apr 2011 A1
20110090176 Christiansson et al. Apr 2011 A1
20110102374 Wassvik et al. May 2011 A1
20110115748 Xu May 2011 A1
20110121323 Wu et al. May 2011 A1
20110122075 Seo et al. May 2011 A1
20110122091 King et al. May 2011 A1
20110122094 Tsang et al. May 2011 A1
20110134079 Stark Jun 2011 A1
20110141062 Yu et al. Jun 2011 A1
20110147569 Drumm Jun 2011 A1
20110157095 Drumm Jun 2011 A1
20110157096 Drumm Jun 2011 A1
20110163996 Wassvik et al. Jul 2011 A1
20110163997 Kim Jul 2011 A1
20110163998 Goertz et al. Jul 2011 A1
20110169780 Goertz et al. Jul 2011 A1
20110175852 Goertz et al. Jul 2011 A1
20110205186 Newton et al. Aug 2011 A1
20110216042 Wassvik et al. Sep 2011 A1
20110221705 Yi et al. Sep 2011 A1
20110221997 Kim et al. Sep 2011 A1
20110227036 Vaufrey Sep 2011 A1
20110227874 Fåhraeus et al. Sep 2011 A1
20110234537 Kim et al. Sep 2011 A1
20110254864 Tsuchikawa et al. Oct 2011 A1
20110261020 Song et al. Oct 2011 A1
20110267296 Noguchi et al. Nov 2011 A1
20110291989 Lee Dec 2011 A1
20110298743 Machida et al. Dec 2011 A1
20110309325 Park et al. Dec 2011 A1
20110310045 Toda et al. Dec 2011 A1
20110316005 Murao et al. Dec 2011 A1
20120019448 Pitkanen et al. Jan 2012 A1
20120026408 Lee et al. Feb 2012 A1
20120038593 Rönkä et al. Feb 2012 A1
20120056807 Chapman et al. Mar 2012 A1
20120062474 Weishaupt et al. Mar 2012 A1
20120062492 Katoh Mar 2012 A1
20120068973 Christiansson et al. Mar 2012 A1
20120086673 Chien et al. Apr 2012 A1
20120089348 Perlin et al. Apr 2012 A1
20120110447 Chen May 2012 A1
20120131490 Lin et al. May 2012 A1
20120141001 Zhang et al. Jun 2012 A1
20120146930 Lee Jun 2012 A1
20120153134 Bergström et al. Jun 2012 A1
20120154338 Bergström et al. Jun 2012 A1
20120162142 Christiansson et al. Jun 2012 A1
20120162144 Fåhraeus et al. Jun 2012 A1
20120169672 Christiansson Jul 2012 A1
20120170056 Jakobsen et al. Jul 2012 A1
20120181419 Momtahan Jul 2012 A1
20120182266 Han Jul 2012 A1
20120188206 Sparf et al. Jul 2012 A1
20120191993 Drader et al. Jul 2012 A1
20120200532 Powell et al. Aug 2012 A1
20120200538 Christiansson et al. Aug 2012 A1
20120212441 Christiansson et al. Aug 2012 A1
20120212457 Drumm Aug 2012 A1
20120217882 Wong et al. Aug 2012 A1
20120218229 Drumm Aug 2012 A1
20120223916 Kukulj Sep 2012 A1
20120249478 Chang et al. Oct 2012 A1
20120256882 Christiansson et al. Oct 2012 A1
20120268403 Christiansson Oct 2012 A1
20120268427 Slobodin Oct 2012 A1
20120274559 Mathai et al. Nov 2012 A1
20120305755 Hong et al. Dec 2012 A1
20120313865 Pearce Dec 2012 A1
20130021300 Wassvik Jan 2013 A1
20130021302 Drumm Jan 2013 A1
20130027404 Sarnoff Jan 2013 A1
20130044073 Christiansson et al. Feb 2013 A1
20130055080 Komer et al. Feb 2013 A1
20130076697 Goertz et al. Mar 2013 A1
20130082980 Gruhlke et al. Apr 2013 A1
20130106709 Simmons May 2013 A1
20130107569 Suganuma May 2013 A1
20130113715 Grant et al. May 2013 A1
20130120320 Liu et al. May 2013 A1
20130125016 Pallakoff et al. May 2013 A1
20130127790 Wassvik May 2013 A1
20130135258 King et al. May 2013 A1
20130135259 King et al. May 2013 A1
20130141388 Ludwig et al. Jun 2013 A1
20130141395 Holmgren et al. Jun 2013 A1
20130154983 Christiansson et al. Jun 2013 A1
20130155027 Holmgren et al. Jun 2013 A1
20130155655 Lee et al. Jun 2013 A1
20130181896 Gruhlke et al. Jul 2013 A1
20130181953 Hinckley et al. Jul 2013 A1
20130187891 Eriksson et al. Jul 2013 A1
20130201142 Suarez Rovere Aug 2013 A1
20130222346 Chen et al. Aug 2013 A1
20130234991 Sparf Sep 2013 A1
20130241887 Sharma Sep 2013 A1
20130249833 Christiansson et al. Sep 2013 A1
20130269867 Trott Oct 2013 A1
20130275082 Follmer et al. Oct 2013 A1
20130285920 Colley Oct 2013 A1
20130285968 Christiansson et al. Oct 2013 A1
20130300714 Goh et al. Nov 2013 A1
20130300716 Craven-Bartle et al. Nov 2013 A1
20130307795 Suarez Rovere Nov 2013 A1
20130321740 An et al. Dec 2013 A1
20130342490 Wallander et al. Dec 2013 A1
20140002400 Christiansson et al. Jan 2014 A1
20140015803 Drumm Jan 2014 A1
20140028575 Parivar et al. Jan 2014 A1
20140028604 Morinaga et al. Jan 2014 A1
20140028629 Drumm et al. Jan 2014 A1
20140036203 Guillou et al. Feb 2014 A1
20140055421 Christiansson et al. Feb 2014 A1
20140063853 Nichol et al. Mar 2014 A1
20140071653 Thompson et al. Mar 2014 A1
20140085241 Christiansson et al. Mar 2014 A1
20140092052 Grunthaner et al. Apr 2014 A1
20140098032 Ng et al. Apr 2014 A1
20140098058 Baharav et al. Apr 2014 A1
20140109219 Rohrweck et al. Apr 2014 A1
20140125633 Fåhraeus et al. May 2014 A1
20140152624 Piot et al. Jun 2014 A1
20140160762 Dudik et al. Jun 2014 A1
20140192023 Hoffman Jul 2014 A1
20140226084 Utukuri et al. Aug 2014 A1
20140232669 Ohlsson et al. Aug 2014 A1
20140237401 Krus et al. Aug 2014 A1
20140237408 Ohlsson et al. Aug 2014 A1
20140237422 Ohlsson et al. Aug 2014 A1
20140253520 Cueto et al. Sep 2014 A1
20140253831 Craven-Bartle Sep 2014 A1
20140259029 Choi et al. Sep 2014 A1
20140267124 Christiansson et al. Sep 2014 A1
20140292701 Christiansson et al. Oct 2014 A1
20140300572 Ohlsson et al. Oct 2014 A1
20140320460 Johansson et al. Oct 2014 A1
20140347325 Wallander et al. Nov 2014 A1
20140362046 Yoshida Dec 2014 A1
20140368471 Christiansson et al. Dec 2014 A1
20140375607 Christiansson et al. Dec 2014 A1
20150002386 Mankowski et al. Jan 2015 A1
20150009687 Lin Jan 2015 A1
20150015497 Leigh Jan 2015 A1
20150035774 Christiansson et al. Feb 2015 A1
20150035803 Wassvik et al. Feb 2015 A1
20150053850 Uvnäs Feb 2015 A1
20150054759 Christiansson et al. Feb 2015 A1
20150083891 Wallander Mar 2015 A1
20150103013 Huang Apr 2015 A9
20150121691 Wang May 2015 A1
20150130769 Björklund May 2015 A1
20150131010 Sugiyama May 2015 A1
20150138105 Christiansson et al. May 2015 A1
20150138158 Wallander et al. May 2015 A1
20150138161 Wassvik May 2015 A1
20150199071 Hou Jul 2015 A1
20150205441 Bergström et al. Jul 2015 A1
20150215450 Seo et al. Jul 2015 A1
20150242055 Wallander Aug 2015 A1
20150261323 Cui et al. Sep 2015 A1
20150286698 Gagnier et al. Oct 2015 A1
20150317036 Johansson et al. Nov 2015 A1
20150324028 Wassvik et al. Nov 2015 A1
20150331544 Bergström et al. Nov 2015 A1
20150331545 Wassvik et al. Nov 2015 A1
20150331546 Craven-Bartle et al. Nov 2015 A1
20150331547 Wassvik et al. Nov 2015 A1
20150332655 Krus et al. Nov 2015 A1
20150339000 Lee et al. Nov 2015 A1
20150346856 Wassvik Dec 2015 A1
20150346911 Christiansson Dec 2015 A1
20150363042 Krus et al. Dec 2015 A1
20150373864 Jung Dec 2015 A1
20160004898 Holz Jan 2016 A1
20160026297 Shinkai et al. Jan 2016 A1
20160026337 Wassvik et al. Jan 2016 A1
20160034099 Christiansson et al. Feb 2016 A1
20160041629 Rao et al. Feb 2016 A1
20160050746 Wassvik et al. Feb 2016 A1
20160062549 Drumm et al. Mar 2016 A1
20160070415 Christiansson et al. Mar 2016 A1
20160070416 Wassvik Mar 2016 A1
20160092021 Tu et al. Mar 2016 A1
20160103026 Povazay et al. Apr 2016 A1
20160117019 Michiaki Apr 2016 A1
20160124546 Chen et al. May 2016 A1
20160124551 Christiansson et al. May 2016 A1
20160077616 Durojaiye et al. Jun 2016 A1
20160154531 Wall Jun 2016 A1
20160154533 Eriksson et al. Jun 2016 A1
20160179261 Drumm Jun 2016 A1
20160202841 Christiansson et al. Jul 2016 A1
20160209886 Suh et al. Jul 2016 A1
20160216844 Bergström Jul 2016 A1
20160224144 Klinghult et al. Aug 2016 A1
20160255713 Kim et al. Sep 2016 A1
20160295711 Ryu et al. Oct 2016 A1
20160299583 Watanabe Oct 2016 A1
20160299593 Christiansson et al. Oct 2016 A1
20160306501 Drumm et al. Oct 2016 A1
20160328090 Klinghult Nov 2016 A1
20160328091 Wassvik et al. Nov 2016 A1
20160334942 Wassvik Nov 2016 A1
20160342282 Wassvik Nov 2016 A1
20160357348 Wallander Dec 2016 A1
20170010688 Fåhraeus et al. Jan 2017 A1
20170031516 Sugiyama et al. Feb 2017 A1
20170090090 Craven-Bartle et al. Mar 2017 A1
20170102827 Christiansson et al. Apr 2017 A1
20170115235 Ohlsson et al. Apr 2017 A1
20170139541 Christiansson et al. May 2017 A1
20170160871 Drumm Jun 2017 A1
20170177163 Wallander et al. Jun 2017 A1
20170185230 Wallander et al. Jun 2017 A1
20170220204 Huang et al. Aug 2017 A1
20170293392 Christiansson et al. Oct 2017 A1
20170344185 Ohlsson et al. Nov 2017 A1
20180031753 Craven-Bartle et al. Feb 2018 A1
20180107373 Cheng Apr 2018 A1
20180129354 Christiansson et al. May 2018 A1
20180136788 He et al. May 2018 A1
20180149792 Lee et al. May 2018 A1
20180210572 Wallander et al. Jul 2018 A1
20180225006 Wall Aug 2018 A1
20180253187 Christiansson et al. Sep 2018 A1
20180267672 Wassvik et al. Sep 2018 A1
20180275788 Christiansson et al. Sep 2018 A1
20180275830 Christiansson et al. Sep 2018 A1
20180275831 Christiansson et al. Sep 2018 A1
20180314206 Lee et al. Nov 2018 A1
20190004668 Jeong et al. Jan 2019 A1
20190025984 Weilbacher et al. Jan 2019 A1
20190050074 Kocovski Feb 2019 A1
20190107923 Drumm Apr 2019 A1
20190146630 Chen et al. May 2019 A1
20190196658 Skagmo et al. Jun 2019 A1
20190196659 Skagmo et al. Jun 2019 A1
20190227670 O'Cleirigh et al. Jul 2019 A1
20190235701 Han et al. Aug 2019 A1
20190258353 Drumm et al. Aug 2019 A1
20190196657 Skagmo et al. Oct 2019 A1
20190324570 Kolundzjia et al. Oct 2019 A1
20190377431 Drumm Dec 2019 A1
20190377435 Piot et al. Dec 2019 A1
20200012408 Drumm et al. Jan 2020 A1
20200073509 Shih et al. Mar 2020 A1
20200098147 Ha et al. Mar 2020 A1
20200125189 Kim et al. Apr 2020 A1
20200159382 Drumm May 2020 A1
20200167033 Kim et al. May 2020 A1
20200249777 Hou et al. Aug 2020 A1
20200310621 Piot et al. Oct 2020 A1
20200341587 Drumm Oct 2020 A1
20200348473 Drumm Nov 2020 A1
20200387237 Drumm Dec 2020 A1
Foreign Referenced Citations (150)
Number Date Country
2008 280 952 Mar 2009 AU
201233592 May 2009 CN
101174191 Jun 2009 CN
101644854 Feb 2010 CN
201437963 Apr 2010 CN
201465071 May 2010 CN
101882034 Nov 2010 CN
101019071 Jun 2012 CN
101206550 Jun 2012 CN
203189466 Sep 2013 CN
203224848 Oct 2013 CN
203453994 Feb 2014 CN
101075168 Apr 2014 CN
205015574 Feb 2016 CN
205384833 Jul 2016 CN
3511330 May 1988 DE
68902419 Mar 1993 DE
69000920 Jun 1993 DE
19809934 Sep 1999 DE
10026201 Dec 2000 DE
102010000473 Aug 2010 DE
0845812 Jun 1998 EP
0600576 Oct 1998 EP
1798630 Jun 2007 EP
0897161 Oct 2007 EP
2088501 Aug 2009 EP
1512989 Sep 2009 EP
2077490 Jan 2010 EP
1126236 Dec 2010 EP
2314203 Apr 2011 EP
2325735 May 2011 EP
2339437 Oct 2011 EP
2442180 Apr 2012 EP
2466429 Jun 2012 EP
2479642 Jul 2012 EP
1457870 Aug 2012 EP
2565770 Mar 2013 EP
2765622 Aug 2014 EP
2778849 Sep 2014 EP
2515216 Mar 2016 EP
3535640 Sep 2019 EP
3644167 Apr 2020 EP
2172828 Oct 1973 FR
2617619 Jan 1990 FR
2614711 Mar 1992 FR
2617620 Sep 1992 FR
2676275 Nov 1992 FR
1380144 Jan 1975 GB
2131544 Mar 1986 GB
2204126 Nov 1988 GB
H05190066 Jul 1993 JP
2000506655 May 2000 JP
2000172438 Jun 2000 JP
2000259334 Sep 2000 JP
2000293311 Oct 2000 JP
2003330603 Nov 2003 JP
2005004278 Jan 2005 JP
2008506173 Feb 2008 JP
2011530124 Dec 2011 JP
100359400 Jul 2001 KR
100940435 Feb 2010 KR
WO 1984003186 Aug 1984 WO
WO 1999046602 Sep 1999 WO
WO 0127867 Apr 2001 WO
WO 0184251 Nov 2001 WO
WO 0235460 May 2002 WO
WO 02077915 Oct 2002 WO
WO 02095668 Nov 2002 WO
WO 03076870 Sep 2003 WO
WO 2004081502 Sep 2004 WO
WO 2004081956 Sep 2004 WO
WO 2005026938 Mar 2005 WO
WO 2005029172 Mar 2005 WO
WO 2005029395 Mar 2005 WO
WO 2005125011 Dec 2005 WO
WO 2006095320 Sep 2006 WO
WO 2006124551 Nov 2006 WO
WO 2007003196 Jan 2007 WO
WO 2007058924 May 2007 WO
WO 2007112742 Oct 2007 WO
WO 2008004103 Jan 2008 WO
WO 2008007276 Jan 2008 WO
WO 2008017077 Feb 2008 WO
WO 2008034184 Mar 2008 WO
WO 2008039006 Apr 2008 WO
WO 2008068607 Jun 2008 WO
WO 2006124551 Jul 2008 WO
WO 2008017077 Feb 2009 WO
WO 2009048365 Apr 2009 WO
WO 2009077962 Jun 2009 WO
WO 2009102681 Aug 2009 WO
WO 2009137355 Nov 2009 WO
WO 2010006882 Jan 2010 WO
WO 2010006883 Jan 2010 WO
WO 2010006884 Jan 2010 WO
WO 2010006885 Jan 2010 WO
WO 2010006886 Jan 2010 WO
WO 2010015408 Feb 2010 WO
WO 2010046539 Apr 2010 WO
WO 2010056177 May 2010 WO
WO 2010064983 Jun 2010 WO
WO 2010081702 Jul 2010 WO
WO 2010112404 Oct 2010 WO
WO 2010123809 Oct 2010 WO
WO 2010134865 Nov 2010 WO
WO 2011028169 Mar 2011 WO
WO 2011028170 Mar 2011 WO
WO 2011049511 Apr 2011 WO
WO 2011049512 Apr 2011 WO
WO 2011049513 Apr 2011 WO
WO 2011057572 May 2011 WO
WO 2011078769 Jun 2011 WO
WO 2011082477 Jul 2011 WO
WO 2011139213 Nov 2011 WO
WO 2012002894 Jan 2012 WO
WO 2012010078 Jan 2012 WO
WO 2012018176 Feb 2012 WO
WO 2012050510 Apr 2012 WO
WO 2012082055 Jun 2012 WO
WO 2012105893 Aug 2012 WO
WO 2012121652 Sep 2012 WO
WO 2012158105 Nov 2012 WO
WO 2012172302 Dec 2012 WO
WO 2012176801 Dec 2012 WO
WO 2013036192 Mar 2013 WO
WO 2013048312 Apr 2013 WO
WO 2013055282 Apr 2013 WO
WO 2013062471 May 2013 WO
WO 2013089622 Jun 2013 WO
WO 2013115710 Aug 2013 WO
WO 2013133756 Sep 2013 WO
WO 2013133757 Sep 2013 WO
WO 2013159472 Oct 2013 WO
WO 2013176613 Nov 2013 WO
WO 2013176614 Nov 2013 WO
WO 2013176615 Nov 2013 WO
WO 2014044181 Mar 2014 WO
WO 2014055809 Apr 2014 WO
WO 2014065601 May 2014 WO
WO 2014086084 Jun 2014 WO
WO 2014098744 Jun 2014 WO
WO 2014104967 Jul 2014 WO
WO 2015123322 Aug 2015 WO
WO 2015175586 Nov 2015 WO
WO 2016130074 Aug 2016 WO
WO 2018096430 May 2018 WO
WO 2018106172 Jun 2018 WO
WO 2018106176 Jun 2018 WO
WO 2020078339 Apr 2020 WO
WO 2020168802 Aug 2020 WO
Non-Patent Literature Citations (23)
Entry
Ahn, Y., et al., “A slim and wide multi-touch tabletop interface and its applications,” BigComp2014, IEEE, 2014, in 6 pages.
Chou, N., et al., “Generalized pseudo-polar Fourier grids and applications in registering optical coherence tomography images,” 43rd Asilomar Conference on Signals, Systems and Computers, Nov. 2009, in 5 pages.
Fihn, M., “Touch Panel—Special Edition,” Veritas et Visus, Nov. 2011, in 1 page.
Fourmont, K., “Non-Equispaced Fast Fourier Transforms with Applications to Tomography,” Journal of Fourier Analysis and Applications, vol. 9, Issue 5, 2003, in 20 pages.
Iizuka, K., “Boundaries, Near-Field Optics, and Near-Field Imaging,” Elements of Photonics, vol. 1: In Free Space and Special Media, Wiley & Sons, 2002, in 57 pages.
Johnson, M., “Enhanced Optical Touch Input Panel”, IBM Technical Disclosure Bulletin, 1985, in 3 pages.
Kak, et al., “Principles of Computerized Tomographic Imaging”, Institute of Electrical Engineers, Inc., 1999, in 333 pages.
The Laser Wall, MIT, 1997, http://web.media.mit.edu/~joep/SpectrumWeb/captions/Laser.html.
Liu, J., et al. “Multiple touch points identifying method, involves starting touch screen, driving specific emission tube, and computing and transmitting coordinate of touch points to computer system by direct lines through interface of touch screen,” 2007, in 25 pages.
Natterer, F., “The Mathematics of Computerized Tomography”, Society for Industrial and Applied Mathematics, 2001, in 240 pages.
Natterer, F., et al. “Fourier Reconstruction,” Mathematical Methods in Image Reconstruction, Society for Industrial and Applied Mathematics, 2001, in 12 pages.
Paradiso, J.A., “Several Sensor Approaches that Retrofit Large Surfaces for Interactivity,” ACM Ubicomp 2002 Workshop on Collaboration with Interactive Walls and Tables, 2002, in 8 pages.
Tedaldi, M., et al. “Refractive index mapping of layered samples using optical coherence refractometry,” Proceedings of SPIE, vol. 7171, 2009, in 8 pages.
Supplementary European Search Report for European App. No. EP 16759213, dated Oct. 4, 2018, in 9 pages.
Extended European Search Report for European App. No. 16743795.3, dated Sep. 11, 2018, in 5 pages.
International Search Report for International App. No. PCT/SE2017/051224, dated Feb. 23, 2018, in 5 pages.
International Search Report for International App. No. PCT/IB2017/057201, dated Mar. 6, 2018, in 4 pages.
Extended European Search Report in European Application No. 19165019.1, dated Jul. 18, 2019 in 8 pages.
International Preliminary Report on Patentability received in International Application No. PCT/SE2017/051233, dated Jun. 11, 2019, in 6 pages.
International Search Report for International App. No. PCT/SE2018/050070, dated Apr. 25, 2018, in 4 pages.
Extended European Search Report in European Application No. 17750516.1, dated Jul. 16, 2019 in 5 pages.
Extended European Search Report in European Application No. 16873465.5, dated Jun. 25, 2019 in 9 pages.
Extended European Search Report for European App. No. 18772370.5, dated Dec. 9, 2020, in 8 pages.
Related Publications (1)
Number Date Country
20200150822 A1 May 2020 US
Continuations (1)
Number Date Country
Parent 15925333 Mar 2018 US
Child 16654393 US