The present invention relates to techniques for detecting and identifying objects on a touch surface.
To an increasing extent, touch-sensitive panels are being used for providing input data to computers, electronic measurement and test equipment, gaming devices, etc. The panel may be provided with a graphical user interface (GUI) for a user to interact with using e.g. a pointer, stylus or one or more fingers. The GUI may be fixed or dynamic. A fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel. A dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.
There are numerous known techniques for providing touch sensitivity to the panel, e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel.
In one category of touch-sensitive panels known as ‘above surface optical touch systems’ and known from e.g. U.S. Pat. No. 4,459,476, a plurality of optical emitters and optical receivers are arranged around the periphery of a touch surface to create a grid of intersecting light paths above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.
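The intercept determination described above can be sketched in a few lines. This is an illustrative model only, assuming an orthogonal grid of horizontal and vertical light paths; the function name, the grid pitch, and the centroid rule are invented for the example and are not taken from the cited patent.

```python
# Hypothetical sketch: locating a touch in an orthogonal emitter/receiver grid.
# Each blocked horizontal path contributes a y-coordinate; each blocked
# vertical path contributes an x-coordinate. The touch estimate is the
# centroid of the blocked coordinates, scaled by the grid pitch.

def locate_touch(blocked_rows, blocked_cols, pitch_mm=5.0):
    """Return (x, y) in mm for the intercept of blocked grid paths."""
    if not blocked_rows or not blocked_cols:
        return None  # need at least one blocked path in each direction
    x = sum(blocked_cols) / len(blocked_cols) * pitch_mm
    y = sum(blocked_rows) / len(blocked_rows) * pitch_mm
    return (x, y)

# A finger blocking columns 10-11 and rows 4-5 on a 5 mm pitch grid:
print(locate_touch([4, 5], [10, 11]))  # -> (52.5, 22.5)
```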
US patent publication 2004/0252091 discloses an alternative technique which is based on frustrated total internal reflection (FTIR). Light is coupled into a panel to propagate inside the panel by total internal reflection. Arrays of light sensors are located around the perimeter of the panel to detect the light. When an object comes into contact with a surface of the panel, the light will be locally attenuated at the point of touch. The location of the object is determined by triangulation based on the attenuation of the light from each source at the array of light sensors.
For most touch systems, a user may place a finger onto the surface of a touch panel in order to register a touch. Alternatively, a stylus may be used. A stylus is typically a pen shaped object with one end configured to be pressed against the surface of the touch panel. An example of a stylus according to the prior art is shown in
Two types of stylus exist for touch systems. An active stylus is a stylus typically comprising some form of power source and electronics to transmit a signal to the host touch system. The type of signal transmitted can vary but may include position information, pressure information, tilt information, stylus ID, stylus type, ink colour etc. The source of power for an active stylus may include a battery, capacitor, or an electrical field for providing power via inductive coupling. Without power, an active stylus may lose some or all of its functionality.
An active stylus may be readily identified by a host system by receiving an electronic stylus ID from the active stylus and associating the stylus ID with position information relating to the contact position between the stylus and the touch surface of the host system.
A passive stylus has no power source and does not actively communicate with the host system. Therefore, a passive stylus is cheaper to manufacture than an active stylus and does not require maintenance. However, advanced information like application pressure, tilt information, stylus ID, stylus type, ink colour etc. can be significantly more difficult to obtain from a passive stylus than from an active stylus.
U.S. Pat. No. 6,567,078 describes a method of marking a plurality of passive styluses with one or more colour films in a pattern unique to each stylus. A camera is arranged to record the colour markings on the stylus and identify the passive stylus in use in order to determine the appropriate choice of ink colour to be displayed on the screen.
For optical touch systems such as those described in US patent publication 2004/0252091 and U.S. Pat. No. 4,459,476, it can be difficult to identify an object with a tip as small as a stylus. In particular, stylus tips are typically small (i.e. smaller than 4 mm in diameter) and provide a relatively small amount of attenuation of the light signals compared with a finger or other large object. The stylus tip may also have a smaller diameter than the resolution of the touch system is able to resolve.
Furthermore, the low signal-to-noise ratio of such systems makes identifying each of a plurality of passive styluses by means of unique retro-reflective material arrangements difficult and unreliable.
Therefore, what is needed is a way of identifying objects touching an optical touch system that does not suffer from the above problems.
It is an objective of the invention to at least partly overcome one or more of the above-identified limitations of the prior art.
One or more of these objectives, as well as further objectives that may appear from the description below, are at least partly achieved by means of a method for data processing, a computer readable medium, devices for data processing, and a touch-sensing apparatus according to the independent claims, embodiments thereof being defined by the dependent claims.
Embodiments of the invention will now be described in more detail with reference to the accompanying schematic drawings.
The present invention relates to optical touch panels and the use of techniques for providing touch sensitivity to a display apparatus. Throughout the description the same reference numerals are used to identify corresponding elements.
Before describing embodiments of the invention, a few definitions will be given.
A “touch object” or “touching object” is a physical object that touches, or is brought in sufficient proximity to, a touch surface so as to be detected by one or more sensors in the touch system. The physical object may be animate or inanimate.
An “interaction” occurs when the touch object affects a parameter measured by the sensor.
A “touch” denotes a point of interaction as seen in the interaction pattern.
A “light field” is the light flowing between an emitter and a corresponding detector. Although an emitter may generate a large amount of light in many directions, only the light measured by a detector from an emitter defines the light field for the emitter and detector.
Light paths 50 may conceptually be represented as “detection lines” that extend across the touch surface 20 to the periphery of touch surface 20 between pairs of emitters 30a and detectors 30b, as shown in
As used herein, the emitters 30a may be any type of device capable of emitting radiation in a desired wavelength range, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc. The emitters 30a may also be formed by the end of an optical fibre. The emitters 30a may generate light in any wavelength range. The following examples presume that the light is generated in the infrared (IR), i.e. at wavelengths above about 750 nm. Analogously, the detectors 30b may be any device capable of converting light (in the same wavelength range) into an electrical signal, such as a photo-detector, a CCD device, a CMOS device, etc.
The detectors 30b collectively provide an output signal, which is received and sampled by a signal processor 130. The output signal contains a number of sub-signals, also denoted “transmission values”, each representing the energy of light received by one of light detectors 30b from one of light emitters 30a. Depending on implementation, the signal processor 130 may need to process the output signal for separation of the individual transmission values. The transmission values represent the received energy, intensity or power of light received by the detectors 30b on the individual detection lines 50. Whenever an object touches a detection line 50, the received energy on this detection line is decreased or “attenuated”. Where an object blocks the entire width of the detection line of an above-surface system, the detection line will be fully attenuated or occluded.
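The relationship between transmission values and attenuation described above can be illustrated with a minimal sketch. It assumes a per-detection-line reference value recorded with no object present; the function name and the fractional-attenuation convention (0 for an untouched line, 1 for a fully occluded line) are assumptions for the example.

```python
# Illustrative only: converting sampled transmission values into per-line
# attenuation, relative to a no-touch reference for each detection line.

def attenuation(transmission, reference):
    """Fractional attenuation per detection line: 0 = untouched, 1 = occluded."""
    return [1.0 - t / r for t, r in zip(transmission, reference)]

ref = [100.0, 100.0, 100.0]
sample = [100.0, 60.0, 0.0]   # middle line attenuated, last fully occluded
print(attenuation(sample, ref))  # -> [0.0, 0.4, 1.0]
```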
In the preferred embodiment, the touch apparatus is arranged according to
The arrangement shown in
Unless otherwise stated, the embodiments described in the specification apply to the arrangement shown in
The signal processor 130 may be configured to process the transmission values so as to determine a property of the touching objects, such as a position (e.g. in an x,y coordinate system), a shape, or an area. This determination may involve a straightforward triangulation based on the attenuated detection lines, e.g. as disclosed in U.S. Pat. No. 7,432,893 and WO2010/015408, or a more advanced processing to recreate a distribution of attenuation values (for simplicity, referred to as an “attenuation pattern”) across the touch surface 20, where each attenuation value represents a local degree of light attenuation. The attenuation pattern may be further processed by the signal processor 130 or by a separate device (not shown) for determination of a position, shape or area of touching objects. The attenuation pattern may be generated e.g. by any available algorithm for image reconstruction based on transmission values, including tomographic reconstruction methods such as Filtered Back Projection, FFT-based algorithms, ART (Algebraic Reconstruction Technique), SART (Simultaneous Algebraic Reconstruction Technique), etc. Alternatively, the attenuation pattern may be generated by adapting one or more basis functions and/or by statistical methods such as Bayesian inversion. Examples of such reconstruction functions designed for use in touch determination are found in WO2009/077962, WO2011/049511, WO2011/139213, WO2012/050510, and WO2013/062471, all of which are incorporated herein by reference.
For the purposes of brevity, the term ‘signal processor’ is used throughout to describe one or more processing components for performing the various stages of processing required between receiving the signal from the detectors through to outputting a determination of touch including touch co-ordinates, touch properties, etc. Although the processing stages of the present disclosure may be carried out on a single processing unit (with a corresponding memory unit), the disclosure is also intended to cover multiple processing units and even remotely located processing units.
In the illustrated example, the apparatus 100 also includes a controller 120 which is connected to selectively control the activation of the emitters 30a and, possibly, the readout of data from the detectors 30b. Depending on implementation, the emitters 30a and/or detectors 30b may be activated in sequence or concurrently, e.g. as disclosed in U.S. Pat. No. 8,581,884. The signal processor 130 and the controller 120 may be configured as separate units, or they may be incorporated in a single unit. One or both of the signal processor 130 and the controller 120 may be at least partially implemented by software executed by a processing unit 140.
In step 510 of
In step 520, the output signals are processed for determination of the transmission values (or ‘transmission signals’). As described above, the transmission values represent the received energy, intensity or power of light received by the detectors 30b on the individual detection lines 50.
In step 530, the signal processor 130 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface. In the preferred embodiment, the signal processor 130 is configured to process the transmission values to generate a two-dimensional estimation of the attenuation field across the touch surface, i.e. a spatial distribution of attenuation values, in which each touching object typically appears as a region of changed attenuation. From the attenuation field, two-dimensional touch data may be extracted and one or more touch locations may be identified. The transmission values may be processed according to a tomographic reconstruction algorithm to generate the two-dimensional estimation of the attenuation field.
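To make the reconstruction step concrete, the following is a deliberately simplified sketch of unfiltered back projection on a small grid, using only horizontal and vertical detection lines. A real system would use filtered back projection or an algebraic technique over detection lines at many angles; the grid model and names here are illustrative assumptions, not the patented method.

```python
# Simplified back projection: smear each detection line's attenuation back
# across the cells it crosses and sum. A touch shows up as the cell where
# the smeared contributions reinforce each other.

def back_project(row_atten, col_atten):
    """Return a 2D attenuation field from per-row and per-column attenuations."""
    field = [[0.0] * len(col_atten) for _ in row_atten]
    for y, a in enumerate(row_atten):          # horizontal detection lines
        for x in range(len(col_atten)):
            field[y][x] += a
    for x, a in enumerate(col_atten):          # vertical detection lines
        for y in range(len(row_atten)):
            field[y][x] += a
    return field

# One touch attenuating row 1 and column 2 lights up cell (y=1, x=2) most:
field = back_project([0.0, 0.5, 0.0], [0.0, 0.0, 0.5, 0.0])
peak = max((v, (y, x)) for y, row in enumerate(field) for x, v in enumerate(row))
print(peak)  # -> (1.0, (1, 2))
```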
In one embodiment, the signal processor 130 maybe configured to generate an attenuation field for the entire touch surface. In an alternative embodiment, the signal processor 130 maybe configured to generate an attenuation field for a sub-section of the touch surface, the sub-section being selected according to one or more criteria determined during processing of the transmission values.
In step 540, the signal processor 130 determines properties of the object at each touch location, including an attenuation value corresponding to the attenuation of the beams of light resulting from the object touching the touch surface.
In one embodiment, the attenuation value is determined in the following manner: First, the attenuation pattern is processed for detection of peaks, e.g. using any known technique. In one embodiment, a global or local threshold is first applied to the attenuation pattern, to suppress noise. Any areas with attenuation values that fall above the threshold may be further processed to find local maxima. The identified maxima may be further processed for determination of a touch shape and a center position, e.g. by fitting a two-dimensional second-order polynomial or a Gaussian bell shape to the attenuation values, or by finding the ellipse of inertia of the attenuation values. There are also numerous other techniques well known in the art, such as clustering algorithms, edge detection algorithms, standard blob detection, watershed techniques, flood fill techniques, etc. Step 540 results in a collection of peak data, which may include values of position, attenuation, size, and shape for each detected peak. The attenuation value may be calculated from a maximum attenuation value or a weighted sum of attenuation values within the peak shape.
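The threshold-then-local-maxima part of the peak-extraction step above can be sketched as follows. This is a minimal pure-Python illustration using 4-neighbour maxima and no sub-pixel fitting; the threshold value and data are invented for the example.

```python
# Sketch of peak extraction: suppress values below a noise threshold, then
# report every cell that is a local maximum among its 4-neighbours, together
# with its attenuation value.

def find_peaks(pattern, threshold):
    peaks = []
    h, w = len(pattern), len(pattern[0])
    for y in range(h):
        for x in range(w):
            v = pattern[y][x]
            if v <= threshold:
                continue  # below noise threshold
            neigh = [pattern[ny][nx]
                     for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w]
            if all(v >= n for n in neigh):
                peaks.append({"pos": (x, y), "attenuation": v})
    return peaks

pattern = [[0.0, 0.1, 0.0],
           [0.1, 0.9, 0.1],
           [0.0, 0.1, 0.0]]
print(find_peaks(pattern, threshold=0.2))  # -> [{'pos': (1, 1), 'attenuation': 0.9}]
```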
The attenuation value recorded for an object may vary due to noise, object angle, object material, or a number of other reasons.
Certain objects may provide a wider distribution of attenuation values than others.
In the preferred embodiment, signal processor 130 is configured to store a plurality of object IDs in memory, each object ID having an associated attenuation value range. In the following example, three object types with associated Object IDs are shown.
In the preferred embodiment, each Object ID has an attenuation value range, defined by an Attenuation Max value and an Attenuation Min value. The Object IDs may optionally comprise further values defining properties of an associated object, including a recognised object type, an output type (e.g. a brush type, ink colour, selection type, etc.)
In step 550, signal processor 130 matches each touch location to an Object ID. This is done by matching the attenuation value of each touch location to the range of the matching Object ID, e.g. a touch location with an attenuation value of 1.2×10⁻² will be matched to Object ID 001. In one embodiment, an Object ID exists with a range covering all values above a specific value. This allows all objects with an attenuation value above the usual ranges of the Object IDs to be identified using the same ‘default large object’ Object ID. Similarly, in one embodiment, an Object ID exists with a range covering all values below a specific value, allowing very low attenuation value objects to be identified with a generic ‘default small object’ Object ID.
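The range-matching in step 550 can be sketched as a simple table lookup. The IDs and attenuation ranges below are invented for the example (the patent does not specify numeric ranges); only the matching logic, including the ‘default small/large object’ fallbacks, follows the description above.

```python
# Illustrative Object ID table: each entry holds an attenuation range
# [min, max). The first and last entries act as the 'default small object'
# and 'default large object' catch-alls described in the text.

OBJECT_IDS = [
    {"id": "000", "min": 0.0,    "max": 0.5e-2},           # default small object
    {"id": "001", "min": 0.5e-2, "max": 2.0e-2},           # e.g. stylus tip
    {"id": "002", "min": 2.0e-2, "max": 8.0e-2},           # e.g. finger
    {"id": "003", "min": 8.0e-2, "max": float("inf")},     # default large object
]

def match_object_id(attenuation_value):
    for entry in OBJECT_IDS:
        if entry["min"] <= attenuation_value < entry["max"]:
            return entry["id"]
    return None

print(match_object_id(1.2e-2))  # -> 001
```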
In step 560, signal processor 130 outputs the touch data, including the touch locations and corresponding Object IDs for each location.
When matching an attenuation value of a touch to an object ID, it is important to use a stable attenuation value which correctly reflects the attenuation of the light caused by the object once it is in contact with the surface. In an ‘above surface’ system such as the embodiment shown in
As an object is lowered into the light field, it occludes increasingly more light. As a consequence, the attenuation of light caused by the object increases until the object has hit the touch surface. The gradient of attenuation (i.e. the rate of change of the attenuation) is therefore positive as the object travels towards the touch surface until it flattens out when the object is in contact with the surface.
Therefore, in a preferred embodiment of the invention, signal processor 130 is configured to determine that an object attenuation value is stable and/or that a touch down event has occurred in dependence on an attenuation gradient signature (shown at time 1040 in
In one embodiment, a touch down event is determined to have occurred once the object attenuation value has exceeded a first attenuation value threshold. However, a determination that a touch down event has occurred is possible before this threshold is met, using the above method. Where the object attenuation value is below the first attenuation value threshold but an attenuation gradient signature is observed having an attenuation gradient equal to or greater than 20% of the first attenuation value threshold over a single frame, the object attenuation value may be determined to be stable and/or a touch down event may be determined to have occurred.
During a ‘touch up’ event, an attenuation value of the object decreases as the object is lifted out of the light field. Similarly to the above, the attenuation gradient signature of this event (shown at time 1050 in
In one embodiment, a touch up event is determined to have occurred once the object attenuation value is determined to have dropped below a second attenuation value threshold. However, a determination that a touch up event has occurred is possible before this threshold is met, using the above method. Where the object attenuation value is above the second attenuation value threshold but an attenuation gradient signature is observed having a negative attenuation gradient equal to or greater than 20% of the second attenuation value threshold over a single frame, a touch up event may be determined to have occurred.
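The threshold-plus-gradient criteria of the two preceding paragraphs can be sketched as follows. The per-frame gradient model, the 20% fraction as a default, and the example numbers are illustrative assumptions; a real implementation would evaluate the full gradient signature over several frames.

```python
# Sketch of the touch down / touch up decisions: trigger on crossing the
# attenuation threshold, or earlier when the single-frame attenuation
# gradient reaches a set fraction (here 20%) of that threshold.

def touch_down(atten_prev, atten_now, threshold, gradient_fraction=0.2):
    if atten_now >= threshold:
        return True
    gradient = atten_now - atten_prev  # per-frame rate of change
    return gradient >= gradient_fraction * threshold

def touch_up(atten_prev, atten_now, threshold, gradient_fraction=0.2):
    if atten_now <= threshold:
        return True
    gradient = atten_now - atten_prev
    return -gradient >= gradient_fraction * threshold  # steep fall

print(touch_down(0.010, 0.016, threshold=0.020))  # -> True (steep rise)
print(touch_down(0.013, 0.014, threshold=0.020))  # -> False
```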
In a preferred embodiment, the attenuation gradient values required to trigger touch up/down events for an object may be scaled in dependence on the presence of other occluding objects in close proximity to the object. In a preferred example, the attenuation gradient of the second period of a signature is scaled up to require an even larger value to trigger a touch down event for an object in close proximity to other occluding objects on the touch surface. In one embodiment, the higher attenuation gradient is scaled linearly according to the number of additional touches within a radius of up to 10 cm. The radius may be chosen in dependence on the screen size, touch resolution, and environmental noise.
‘Hooks’ are a problem observed in the flow of coordinates of user touch input over time when the user is providing rapidly changing touch input, e.g. when drawing or writing. An example of a ‘hook’ is where the user finishes drawing a stroke, lifts the touch object from the surface of the panel and rapidly changes the direction of movement of the touching object to begin drawing the next stroke. The ‘hook’ is a small artifact seen at the end of the stroke, pointing in the new direction of the user's touch object. A method of minimizing hooks is proposed. In a preferred embodiment of the invention, once a negative attenuation gradient has been observed, the touch coordinates are not updated with the object's position and the coordinates of the object's position are stored. If the object attenuation value drops below a threshold value, the stored coordinates are discarded and a ‘touch up’ event is signaled. If the object attenuation value does not drop below the threshold value and a positive attenuation gradient is subsequently observed, the stored touch coordinates for the intervening period are output and the touch coordinates continue to be output as before. In a preferred embodiment, the method is only used when the direction of movement of the object contacting the touch surface, in the plane of the touch surface, is changing. In this embodiment, a vector a from the last touch coordinate of the object to the current coordinate is determined. A second vector p, from the touch coordinate previous to the last coordinate to the last coordinate, is also determined. Vectors a and p allow a determination of the direction in which the interaction is moving and how it is changing. A rapid change of direction of the object may result in a scalar product a·p < 0. In one embodiment, if this condition is met, it is determined that the direction of movement of the object contacting the touch surface has significantly changed, and the above method for minimizing hooks is then applied.
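The direction-reversal test used to gate the hook-minimization method can be sketched directly from the vector definitions above: form the previous and current movement vectors and check the sign of their scalar product. The function names and 2D-tuple representation are illustrative.

```python
# Sketch of the direction-change gate for hook minimisation: vector p is
# the previous movement step, vector a the current one. A negative dot
# product a.p means the movement direction has reversed.

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def direction_reversed(prev_pt, last_pt, cur_pt):
    p = (last_pt[0] - prev_pt[0], last_pt[1] - prev_pt[1])  # previous step
    a = (cur_pt[0] - last_pt[0], cur_pt[1] - last_pt[1])    # current step
    return dot(a, p) < 0

print(direction_reversed((0, 0), (2, 0), (1, 0)))  # -> True (moved right, then left)
print(direction_reversed((0, 0), (2, 0), (3, 0)))  # -> False
```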
Although the attenuation value of an object provides information regarding the light attenuated by the object touching the surface, some embodiments of the invention require that the attenuation value be compensated in order to provide a true reflection of the nature and/or position of the object.
In one embodiment of the invention, the attenuation value is determined in dependence on the attenuation of the light resulting from the object touching the touch surface and a compensation value. The attenuation value is determined as in step 540 above but wherein the attenuation value is calculated from the compensation value and a maximum attenuation value or a weighted sum of attenuation values within the peak shape.
In certain arrangements of the system shown
The relationship between the position of the touch and a required compensation value may be a complex function of the geometry of the emitters and detectors.
Consequently, a preferred embodiment of the invention provides for calculating a compensation value as a function of the position of the corresponding touch on the touch surface. An alternative embodiment uses a compensation map to determine a compensation value given a position on the touch surface. The compensation map may comprise a 2D image corresponding to the dimensions of the touch surface with pixel values corresponding to compensation values. A touch position is then used to determine the corresponding pixel on the compensation map and the pixel value at that position provides the corresponding compensation value. In a preferred embodiment, the compensation map has a resolution lower than or equal to the touch resolution of the touch determination system. The compensation map is preferably generated in advance but may also be generated dynamically as a function of environmental and performance variables.
The signal processor 130 may be configured to determine the compensation value at a position on the compensation map by interpolation in the x- and y-direction of the compensation map between pre-defined compensation values in the compensation map. Thus, it is possible to have a coarse grid of compensation values, and subsequently use interpolation over the coarse grid to obtain the compensation values at a particular coordinate.
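The coarse-grid interpolation just described can be sketched with a standard bilinear lookup. The map contents and the convention that coordinates are expressed in map cells are invented for the example.

```python
# Bilinear interpolation in a coarse compensation map, as described above:
# interpolate in x along the two bracketing rows, then in y between them.

def bilinear(comp_map, x, y):
    """comp_map[row][col]; (x, y) expressed in map cells."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(comp_map[0]) - 1)  # clamp at the map edge
    y1 = min(y0 + 1, len(comp_map) - 1)
    fx, fy = x - x0, y - y0
    top = comp_map[y0][x0] * (1 - fx) + comp_map[y0][x1] * fx
    bot = comp_map[y1][x0] * (1 - fx) + comp_map[y1][x1] * fx
    return top * (1 - fy) + bot * fy

comp_map = [[1.0, 2.0],
            [3.0, 4.0]]
print(bilinear(comp_map, 0.5, 0.5))  # -> 2.5
```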
The compensation value may be determined for each position in a grid of positions, where the resolution of the compensation map is determined by the pitch of the grid, i.e. the dimensions of a cell in the grid. The pitch may vary as a function of the position in the compensation map. For example, bi-linear interpolation in a coarse grid, i.e. one with a higher pitch, may work well in the center of the map, but near the edges, and especially the corners, the pitch may advantageously be decreased to correctly capture the attenuation variation.
Another variable which may affect the recorded attenuation of a touch object is the speed at which the touching object is moving across the touch surface. The light attenuation of each light path is recorded sequentially over a series of frames. Therefore, a sufficiently fast moving object may have moved away from a specific position before the attenuation of all light paths intersecting the position have been measured. Consequently, a moving object may generate a weaker attenuation signal.
As the relationship between the speed of an object and the recorded attenuation value may also be complicated by the position of the moving object on the touch surface, an embodiment of the invention provides determining a compensation value as a function of both the position and speed of the object on the touch surface.
The compensation value may be a function of the depth of the light field (hmax). This provides for improving the classification of different objects used simultaneously. For example, if a cone-shaped stylus tip is used, the attenuation will be affected by the current light field height to a larger extent than a tip having a uniform thickness in the longitudinal direction. Thus, by compensating for the light field height differences, it will be easier to distinguish between styluses having different tips, since the influence of the field height is minimized.
The signal processor 130 may be configured to determine the depth of the light field (hmax) based on the output signals of the light detectors 30b. A more warped touch surface 20, i.e. one that is more concave in the direction towards the user of the touch surface, may provide an increase in the signal strength detected by the light detectors 30b. Increased warp is also associated with an increased height of the light field. Thus, by having the signal processor 130 configured to detect an increase or decrease of the output signal in response to touch surface warp, the height of the light field can be estimated. The estimate of the light field height may then be used as an input to the light field height compensation discussed above.
Another factor which may affect the light attenuation resulting from an object touching the touch surface is the shape of the object tip and the angle at which the tip is applied to the touch surface 20.
However, as we can see from
For a system having a large number of emitters and detectors, a number of detection lines are likely to intersect stylus tip 270.
Therefore, in a preferred embodiment of the invention, the signal processor 130 is configured to determine the angle of an axis of a stylus relative to the normal of the touch surface in dependence on a ratio between the minimum amount of attenuation and maximum amount of attenuation for the light paths intersecting the stylus.
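The min/max-ratio approach above can be sketched with a toy tilt model. The linear mapping from ratio to angle below is an invented stand-in for a system-specific calibration; the actual relationship between the attenuation ratio and the tilt angle depends on the tip geometry and the light field.

```python
# Illustrative only: estimate stylus tilt from the ratio between the
# minimum and maximum attenuation over the detection lines crossing the
# tip. A symmetric (upright) tip gives ratio 1.0; tilting the stylus makes
# the tip cross-section asymmetric, lowering the ratio.

def tilt_estimate(attenuations, max_tilt_deg=45.0):
    """Map the min/max attenuation ratio linearly onto a tilt angle in degrees."""
    ratio = min(attenuations) / max(attenuations)
    return (1.0 - ratio) * max_tilt_deg

print(round(tilt_estimate([0.02, 0.02, 0.02]), 1))   # -> 0.0 (stylus upright)
print(round(tilt_estimate([0.01, 0.015, 0.02]), 1))  # -> 22.5 (tilted)
```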
Furthermore, once the phi angles of detection lines having the minimum and maximum transmission values are identified, it is possible to determine the direction that the stylus is pointing in the phi plane. As the profile shown in
In
In one embodiment, information on the user's left or right handedness is used to select between which phi value to use to determine tilt orientation, due to the difference in typical stylus tilt orientation between left and right handed users. E.g. For
In one embodiment, the signal processor 130 is configured to determine that a tilt angle of the stylus relative to the normal of the touch surface has exceeded a threshold if the attenuation value begins to increase again whilst the ratio between the minimum and maximum transmission values remains high. Such a signal output would likely be caused by the stylus being tilted at such an angle to the normal of the touch surface that a section of the stylus casing has entered the light field and is having an effect on the attenuation value.
The object touching the touch surface 20 may be a stylus 60, which may have a distal stylus tip 400 comprising a spherically-shaped portion 410, as illustrated in
Number | Date | Country | Kind |
---|---|---|---|
1551614 | Dec 2015 | SE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/SE2016/051229 | 12/7/2016 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2017/099657 | 6/15/2017 | WO | A |
Number | Date | Country | |
---|---|---|---|
20180356940 A1 | Dec 2018 | US |