The field of the present invention is proximity sensors used in connection with indicia or a display screen to enable a contactless input system. Applications of the present invention include, inter alia, vending machines, interactive kiosks, self-checkout terminals, automatic teller machines (ATMs) and elevator button panels. The proximity sensors of the present invention are also intended for rugged environments, including sensors in and around vehicles, automatic doors for vehicles and buildings, and sensors, inter alia in mobile phones, tablets and vehicle cabins, for detecting in-air hand gestures and approaching persons and objects.
Certain proximity sensors provide high-resolution detection of an object's location within a detection plane. Such proximity sensors are employed inter alia as sensors for touchscreens. Other proximity sensors provide only rudimentary object detection, such as parking sensors. It would be advantageous to optimize proximity sensor cost and performance for a range of applications that fall between these two extremes; namely, applications that do not require detection of an object's precise location within a detection zone, but rather, require detection of activity occurring within a zone, e.g., detection of in-air gestures.
Certain proximity sensors employed as sensors for touchscreens need to be extended along an entire edge of the screen, and limitations in automated assembly equipment may limit the maximum length of a sensor, or else result in high manufacturing cost when manufacturing long sensors for large screens. It would be advantageous to enable using multiple, small sensors to provide touch detection for larger screen sizes.
The COVID-19 pandemic of 2019-2020 generated interest in contactless user interfaces for touchscreens, buttons and knobs that do not require the user to touch the screen, button or knob, particularly for public terminals, such as automated teller machines (ATMs), self-checkout terminals in supermarkets, self-check-in terminals at airports, elevator button panels and vending machines. Contactless user interfaces for touchscreens, buttons and knobs are also useful for sterile environments, such as doctors' offices and hospitals, and for environments where people tend to have greasy or otherwise soiled hands, such as vehicle repair shops.
Another application of contactless user interfaces is automatic opening and closing mechanisms for car and elevator doors and liftgates. These doors and liftgates require supervision to prevent the moving door or liftgate panel from colliding with a nearby object, such as a curb, a tree or a neighboring parked car. It would be advantageous to provide higher resolution than today's rudimentary parking sensors for detecting objects that a moving door or liftgate panel is approaching.
Electromechanical systems need to be protected, particularly from moisture, without hindering the system's operation.
Embodiments of the present invention provide low-cost optics for a proximity sensor that is highly accurate with respect to its intended uses and that further enables low-cost assembly practices to be used in manufacturing the sensor. The improvements in cost and manufacturing are realized by employing an extruded plastic lens to collimate light in a first dimension, and a Fresnel lens array to collimate light in a second dimension. Part of the novelty lies in realizing that sensors intended for gesture detection are far more tolerant of systematic errors than sensors intended for detecting absolute coordinates of an object.
There is thus provided in accordance with an embodiment of the present invention a proximity sensor including a circuit board, at least one lens, a support structure suspending the at least one lens above the circuit board, a plurality of light emitters mounted on the circuit board, each emitter operable when activated to project light beams through the at least one lens along a common projection plane, a plurality of light detectors mounted on the circuit board, each detector operable when activated to detect amounts of light arriving through the at least one lens at the detector, wherein a reflective object located in the projection plane above the at least one lens reflects light projected at the reflective object from an emitter to one or more of the detectors, and wherein each emitter-detector pair, including one of the emitters and one of the detectors, when synchronously activated, is expected to generate a greater detection signal at the activated detector than the other detectors, were they also to be synchronously activated with any of the emitters, when the reflective object is located at a specific 2D location in the projection plane corresponding to the emitter-detector pair, and a processor connected to the emitters and to the detectors, configured to sequentially activate each emitter and synchronously co-activate one or more of the detectors, and to identify gestures performed by the reflective object above the at least one lens, based on amounts of light detected by the detector of each synchronously activated emitter-detector pair.
There is additionally provided in accordance with an embodiment of the present invention a proximity sensor including a circuit board, at least one lens, a support structure suspending the at least one lens above the circuit board, a plurality of light emitters mounted on the circuit board and arranged along a first planar curve, each emitter operable when activated to project light beams through the at least one lens through a detection volume, a plurality of light detectors mounted on the circuit board and arranged along a second planar curve, each detector operable when activated to detect amounts of light arriving through the at least one lens at the detector, wherein a reflective object located in the detection volume above the at least one lens reflects light projected at the reflective object from an emitter to one or more of the detectors, and wherein each emitter-detector pair, including one of the emitters and one of the detectors, when synchronously activated, is expected to generate a greater detection signal at the activated detector than the other detectors, were they also to be synchronously activated with any of the emitters, when the reflective object is located at a specific 3D location in the detection volume corresponding to the emitter-detector pair, and a processor connected to the emitters and to the detectors, configured to sequentially activate each emitter and synchronously co-activate one or more of the detectors, and to identify gestures performed by the reflective object above the at least one lens, based on amounts of light detected by the detector of each synchronously activated emitter-detector pair.
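By way of illustration only, the following Python sketch outlines the sequential activation scheme described above. The driver hooks activate_emitter, read_detector and co_activated are hypothetical placeholders, not part of any claimed embodiment; the sketch merely shows each emitter being activated in turn while its paired detectors are read synchronously to build a map of detection signals.

```python
# Illustrative scan loop: activate each emitter in turn and synchronously read
# its co-activated detectors. activate_emitter, read_detector and co_activated
# are hypothetical driver hooks; signals[e][d] holds the reflection value for
# the hotspot corresponding to emitter e and detector d.

NUM_EMITTERS = 12    # assumed counts, for illustration only
NUM_DETECTORS = 12

def scan_frame(activate_emitter, read_detector, co_activated):
    signals = [[0.0] * NUM_DETECTORS for _ in range(NUM_EMITTERS)]
    for e in range(NUM_EMITTERS):
        activate_emitter(e)                    # project beams into the detection plane or volume
        for d in co_activated(e):              # detectors paired with this emitter
            signals[e][d] = read_detector(d)   # reflection detected for hotspot (e, d)
    return signals
```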
Prior art proximity sensors formed as an elongated strip mounted along an edge of a detection area, designed for detecting absolute coordinates of an object, require precision assembly equipment. It is presently costly and inefficient to manufacture these proximity sensors suitable for screens whose edges are longer than 350 mm. Embodiments of the present invention are therefore also directed at enabling the use of prior art proximity sensors for screens whose edges are longer than 350 mm, by using multiple sensors, each sensor being operative to detect objects in a portion of the display area. Embodiments of the present invention are also applicable to the low-cost sensors discussed above.
There is thus further provided, in accordance with an embodiment of the present invention a user interface device including a display, a computer rendering a user interface on the display and operable to respond to touch gestures reported as having occurred at locations on the display, a pair of sensors mounted along opposite edges of the display, each sensor including a plurality of light emitters operable to project light beams in a plane above and across an upper surface of the display, wherein locations in the plane correspond to locations within the display, a plurality of light detectors operable to detect amounts of the projected light beams reflected by an object inserted into the plane, and a control unit operable to simultaneously activate individual ones of the emitters and the detectors, and to calculate a location of the object in the plane based on outputs from the detectors, and a processor receiving the calculated object locations from the sensors, and configured to remove object locations, received from one of the pair of sensors, that were generated by detection of light emitted by the other of the pair of sensors, calculate an output location based on a weighted sum of the non-removed object locations, temporally filter the output location based on previous output locations, and report the temporally filtered output location as a touch location on the display to the computer.
Embodiments of the present invention are also directed at enabling intuitive graphical user interfaces (GUIs) intended for use by a finger hovering above the display on which the GUI is presented, and not touching the display. In this case, due to parallax effects, the user is unsure how the hovering finger is mapped onto the GUI.
There is thus yet further provided, in accordance with an embodiment of the present invention a contactless user interface system, including a display rendering a plurality of graphical user interface (GUI) elements at locations within the display, a sensor configured to identify coordinates within a detection plane, above and across the display, of an object inserted into the detection plane, wherein coordinates within the detection plane correspond to locations within the display, and a processor causing the display to render a cursor when the coordinates within the detection plane, identified by the sensor, do not correspond to locations within the display where any of the GUI elements are located, and to erase the rendered cursor when the coordinates within the detection plane, identified by the sensor, correspond to locations within the display where any of the GUI elements are located.
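A minimal sketch of the cursor behavior described above, assuming a hypothetical display interface and GUI element objects with a contains() test; it shows the cursor being rendered only while the detected coordinates do not correspond to any GUI element.

```python
# Illustrative cursor logic: render a cursor while the detected coordinates do
# not correspond to any GUI element, and erase it once they do. The display
# methods and element.contains() test are hypothetical placeholders.

def update_cursor(display, gui_elements, coords):
    x, y = coords
    over_element = any(element.contains(x, y) for element in gui_elements)
    if over_element:
        display.erase_cursor()
    else:
        display.render_cursor(x, y)
    return over_element
```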
The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
In the disclosure and figures, the following numbering scheme is used. Like-numbered elements are similar but not necessarily identical.
In the present description and claims, the term “gesture” includes pointing gestures, such as tapping on an icon in a touchscreen user interface. In the context of in-air gestures, the term “gesture” includes both pointing at a location and jabbing at a location. A jab is similar to an in-air tap gesture; namely, a pointer is inserted into an in-air detection plane at a location in the plane and removed from the detection plane at the same location.
Reference is made to
The proximity sensor illustrated in
Sensors for detecting movement gestures are suitable inter alia for detecting in-air hand gestures. This is a different type of user interface than a touchscreen: a gesture is defined by relative positioning, namely, how an object moves while performing the gesture, rather than by the absolute positioning used for touchscreens, namely, where the object is located within a defined area. Another suitable application is vehicle liftgate mechanisms that open a liftgate at the rear of a vehicle in response to a user waving a hand opposite the liftgate or sliding a foot underneath the vehicle. Movement gesture sensors are also useful for collision avoidance systems. For example, a sensor along an automatic door detects objects that the door is approaching as the door opens, and sends a command to halt the door's movement before the door collides with the object. If the sensor can report characteristics of an approaching object, such as its size and shape, the door may be selectively halted according to the type of object; e.g., a curb or wall may be treated differently than a bush or hedge, which in turn may be treated differently than a person.
According to the teachings of the present invention a low-cost, robust and automotive-compliant hand gesture sensor incorporates the following features:
TABLE II below lists design characteristics that dictate corresponding geometric parameters for an embodiment of the gesture sensor according to the teachings of the present invention.
The parameters in Table II are illustrative for a sensor designed for detecting hand gestures. Other embodiments of the invention will have different pitches and other parameters. For example, if more powerful lasers and photodiodes are used, the cross section is significantly smaller. Similarly, reducing the detection range reduces the sensor dimensions.
Reference is made to
As shown in
TABLE III indicates that the maximum sideways tolerance for a hand-wave-gesture sensor (0.32 mm) is greater than the tolerance for a touch sensor by a factor of ten, making it feasible to use a standard soldering process to solder SMD components in a hand-wave-gesture sensor. The tolerance in Table III refers to random errors. However, tolerances for systematic errors, e.g., those resulting in a skewed detection area, are much greater. The hand-wave-gesture sensor is also suitable for use as a touch sensor for user interfaces in which a user activates icons or buttons, in particular when the icons or buttons are large and/or are arranged with large amounts of space between them, as such user interfaces have a high tolerance for error in the detected touch coordinates.
Reference is made to
Reference is made to
The gesture sensors in
One difference between gesture detection and touchscreen touch detection is that touchscreens require precision when correlating a detected location with a corresponding screen location, whereas gestures, and especially in-air gestures, do not need to map hotspots to specific locations; rather, gesture detection requires that the relationship between hotspots within the detection plane is maintained. This feature of gesture sensors relaxes many of the tolerance requirements for touchscreen sensors, as discussed below.
Reference is made to
Reference is made to
Reference is made to
Other offsets between components and lenses may occur in proximity sensors in accordance with an embodiment of the present invention, resulting inter alia from lens shrinkage and PCB shrinkage, and producing hotspot layouts that are trapezoidal. This would be a significant problem for touchscreen sensors, as it causes a mismatch between hotspots and screen locations, but it is less of a problem for gesture sensors, provided that the ordinal relationship between neighboring hotspots is maintained. For example, lens shrinkage moves hotspots closer to their neighbors the further they are located from the sensor, but a series of detections over time still indicates the direction of the object's movement. If, instead, the PCB shrinks in relation to the lenses, the distance between neighboring hotspots increases the further the hotspots are located from the sensor, and the hotspot arrangement expands in a fan shape moving away from the sensor. In this case too, a series of object detections over time indicates the object's movement. Thus, gesture sensors are extremely tolerant of systematic errors.
Reference is made to
Reference is made to
From a comparison between the proximity sensor of
In certain embodiments of the invention, reflections from an object near the sensor are detected together with stray reflections that make those detections unreliable. Thus, the nearest rows of hotspots shown in
Detection signals according to the present invention are mapped as an image, whereby the detected reflection value for each emitter-detector pair is mapped to the corresponding hotspot location in the sensor 2D detection plane. Thus, the sensors depicted in
In accordance with embodiments of the present invention, software for gesture detection takes a different approach than software for touch detection. Specifically, a sensor used for touch detection must identify the exact location within the detection area at which the object is located, whereas the object's exact location is not needed in order to identify a gesture. In practice, touch detection algorithms require extensive noise filtering, which is both costly and discards much of the sensor data, whereas matrix-based algorithms are less costly and retain more of the original data. Matrix-based algorithms are more useful for object recognition and movement detection than for calculating the exact location of an object. Correlation with defined shapes is used to identify the type or characteristics of the object performing the gesture, for example, to confirm that a hand is performing the gesture, and correlation with a time shift identifies movement.
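The following sketch illustrates, under stated assumptions, the two correlations mentioned above: correlation of the detection image with a stored shape template to characterize the object, and correlation with a spatially shifted earlier image to detect movement. The template, threshold and shift range are illustrative assumptions, and the detection signals are assumed to be available as a 2D NumPy array.

```python
import numpy as np

# Illustrative matrix-based processing: correlate the detection image with a
# stored shape template to characterize the object, and correlate it with a
# spatially shifted earlier image to detect movement. Template, threshold and
# shift range are assumed values; edge wrap-around from np.roll is ignored
# for simplicity.

def normalized_correlation(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def matches_hand(image, hand_template, threshold=0.6):
    # True if the detection image correlates well with a hand-shaped template.
    return normalized_correlation(image, hand_template) >= threshold

def estimate_shift(current, previous, max_shift=3):
    # Return the (dy, dx) shift of the previous image that best matches the
    # current image, indicating the direction of movement between frames.
    best, best_shift = -1.0, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(previous, dy, axis=0), dx, axis=1)
            c = normalized_correlation(current, shifted)
            if c > best:
                best, best_shift = c, (dy, dx)
    return best_shift
```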
Reference is made to
For slow-moving objects it may be difficult to ascertain whether a gesture is being performed based on image correlation using only pairs of successive images. In some embodiments of the invention, multiple images are stored and a current image is correlated with older images to identify movement gestures. In some embodiments, in order to limit the memory required for storing multiple images, only a small set of older images is stored, and each stored image is used for image correlation with a series of newer images. For example, for an image n, images n−1, n−2, n−4 and n−8 are stored and used for image correlation. At each new frame, the image stored as image n−1 is updated; at every second frame, the image stored as image n−2 is updated with what was previously stored as image n−1; at every fourth frame, the image stored as image n−4 is updated with what was previously stored as image n−2; and at every eighth frame, the image stored as image n−8 is updated with what was previously stored as image n−4. The system stores a timestamp for each image, used when correlating an older image with the current image, so that the movement indicated by the image correlation is normalized by the time interval between the current image and the older image.
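The following sketch illustrates the image-history scheme described above: stored images at ages n−1, n−2, n−4 and n−8 frames, each slot updated at its own interval, with a timestamp per image so that correlation results can be normalized by elapsed time. Class and method names are assumptions for illustration.

```python
# Illustrative image history: slots for images that are 1, 2, 4 and 8 frames
# old, each slot updated at its own interval, with a timestamp stored per
# image so movement can be normalized by elapsed time. Names are assumptions.

class ImageHistory:
    def __init__(self):
        self.slots = {1: None, 2: None, 4: None, 8: None}   # age in frames -> (image, timestamp)
        self.frame = 0

    def push(self, image, timestamp):
        self.frame += 1
        # Cascade oldest-first so each slot receives what the next-younger slot held.
        if self.frame % 8 == 0:
            self.slots[8] = self.slots[4]
        if self.frame % 4 == 0:
            self.slots[4] = self.slots[2]
        if self.frame % 2 == 0:
            self.slots[2] = self.slots[1]
        self.slots[1] = (image, timestamp)

    def velocity(self, current_image, current_time, age, correlate_shift):
        # Movement per unit time between the current image and the stored image
        # of the given age, using a caller-supplied shift-estimating correlator.
        stored = self.slots[age]
        if stored is None:
            return None
        old_image, old_time = stored
        dy, dx = correlate_shift(current_image, old_image)
        dt = current_time - old_time
        return (dy / dt, dx / dt) if dt > 0 else None
```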
In some embodiments of the invention the 2D array of detection signals is transformed into 1D arrays, namely, a horizontal profile and a vertical profile, to simplify processing. For example, the detection signals for all hotspots along each emitter beam are summed to provide a single value for each emitter beam. I.e., for the sensors in
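An illustrative reduction of the 2D signal matrix to the two 1D profiles described above; the row/column orientation of the matrix is an assumption.

```python
import numpy as np

# Illustrative reduction of the 2D signal matrix to two 1D profiles by summing
# along each axis; the orientation (rows = emitter beams) is an assumption.

def profiles(signals):
    signals = np.asarray(signals, dtype=float)
    per_beam = signals.sum(axis=1)        # one value per emitter beam (first profile)
    per_distance = signals.sum(axis=0)    # one value per distance from the sensor (second profile)
    return per_beam, per_distance
```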
Reference is made to
As discussed hereinabove with respect to
Lenses 410 and 411 are easy to mass produce, making proximity sensors according to the present invention cheaper to produce than sensor 351 of
Referring still to
Reference is made to
Sensors according to the present invention are modular. Reference is made to
Reference is made to
Reference is made to
In some embodiments of the invention, the proximity sensor housing is further encased in hermetically sealed plastic transparent to infrared light to protect the sensor from the environment, particularly when the sensor is mounted on an exposed surface such as an exterior panel of a car door or liftgate. One example of plastic sealing is heat-shrink tubing ordinarily made of polyolefin, which shrinks radially (but not longitudinally) when heated, to between one-half and one-sixth of its diameter.
Reference is made to
Reference is made to
Applications for a gesture sensor as described hereinabove include automatic car doors and liftgates. In automatic car doors and liftgates, a sensor according to the present invention, mounted in the moving door or liftgate panel, detects objects that the door or liftgate is approaching, such as a curb, tree or neighboring parked car. The distance between the object and the door or liftgate panel is reported to the door or liftgate control system. Correlation with defined shapes is used to identify the type or characteristics of the approaching object, e.g., whether that object is a curb, a shrub, or a person. The door or liftgate control system halts movement of the door or liftgate panel before it collides with the object.
Applications for a gesture sensor as described hereinabove include mobile phones, tablet computers, connected-home products and vehicle cabins. Certain embodiments of these products include a radar sensor and an optical proximity sensor in accordance with embodiments of the invention. The radar and optical sensors complement each other by providing redundant detection of gestures. For certain gestures, the optical sensor provides faster and more accurate detection than the radar sensor; for example, the optical gesture sensor detects the location of the object with greater precision than the radar sensor. Unlike time-of-flight cameras, the optical and radar sensors are not capable of capturing personal features in an image, so privacy concerns are alleviated. In certain applications, before a user is detected approaching the device, one of the sensors is active in a standby mode to detect the approaching user while the other sensor is in a sleep mode. The sensor in sleep mode is woken when the sensor in standby mode detects an approaching user.
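An illustrative sketch of the standby/sleep arrangement described above, assuming hypothetical sensor objects: one sensor watches for an approaching user while the other sleeps, and wakes its counterpart upon detection.

```python
# Illustrative standby/sleep coordination between the two sensors, using
# hypothetical sensor objects: one sensor watches for an approaching user in a
# low-power standby mode while the other sleeps, and wakes it upon detection.

def presence_watch(standby_sensor, sleeping_sensor):
    sleeping_sensor.sleep()
    standby_sensor.enter_standby()
    while not standby_sensor.detects_presence():
        standby_sensor.wait_for_next_sample()
    sleeping_sensor.wake()    # both sensors now provide redundant gesture detection
```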
In certain embodiments of the invention, the emitters and detectors in a gesture sensor with cylindrical and Fresnel lenses are arranged along a planar curve, thereby providing detection of objects passing through a volume, rather than a two-dimensional plane, above the sensor. This sensor is included in mobile phones, tablet computers, connected-home devices and vehicle cabins to detect in-air gestures. For example, the sensor is mounted behind the display cover glass in a mobile phone, tablet computer, connected-home device or vehicle infotainment system, and projects light beams through the cover glass to detect in-air gestures performed opposite the cover glass.
Reference is made to
In mobile phones, tablet computers, connected-home devices and vehicle cabins, the optical sensor components are mounted along one or more edges of the display. The emitter beams are directed upward through the display glass. In contrast to optical touchscreens that project beams parallel to the display cover glass, this gesture sensor configured to detect gestures opposite the display glass does not require a raised bezel around the display, as the light beams are directed out of the display rather than parallel to the display surface.
Reference is made to
Such inputs are not limited to selecting on-screen icons. Additional in-air gestures detected by sensor 384 and reported to the system include swipe gestures, e.g., to pan or scroll the display, and two-finger pinch gestures to zoom in and zoom out. If the system is used in a self-checkout kiosk, the user may enter a signature to validate credit card payment by gesturing in-air in detection plane 503, and sensor 384 reports the signature gesture coordinates to the system.
The system illustrated in
In another example a panel having physical slider controls and rotatable knobs replaces display 501. A swipe gesture detected in detection plane 503 opposite a slider control is reported as movement along the slider to increase or decrease an input value. A two-finger spread gesture detected in detection plane 503 opposite a rotatable knob is interpreted as rotating the knob to the right, and a two-finger pinch gesture detected in detection plane 503 opposite the rotatable knob is interpreted as rotating the knob to the left.
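A sketch of the gesture-to-control mapping described above, with hypothetical control and gesture objects: swipes adjust a slider value, and two-finger spread and pinch gestures rotate a knob right or left.

```python
# Illustrative mapping of detected in-air gestures to the physical controls
# behind the detection plane; control and gesture objects are hypothetical.

def apply_gesture(control, gesture):
    if control.kind == "slider" and gesture.kind == "swipe":
        control.value += gesture.displacement    # move along the slider to raise or lower the input value
    elif control.kind == "knob" and gesture.kind == "spread":
        control.rotate_right()                   # two-finger spread rotates the knob to the right
    elif control.kind == "knob" and gesture.kind == "pinch":
        control.rotate_left()                    # two-finger pinch rotates the knob to the left
```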
When the display or button panel dimensions are large, it is advantageous to enable a system in which multiple sensors are used instead of one large sensor. In embodiments of the invention, two sensors are mounted facing each other along opposite edges of the display or button panel, where each sensor's detection zone covers only a portion of the display, but the two sensors together cover the full display.
In order to combine the outputs from sensors 385 and 386 into a touchscreen input, both sensors 385 and 386 send their respective outputs to processor 520 that combines the outputs and then sends the combined output as touch input to computer 521 running the display and user interface. In some embodiments, processor 520 is an ARM® processor in a Raspberry Pi™ single board computer running the Linux® operating system. ARM is a registered trademark of Arm Limited. RASPBERRY PI is a trademark of RASPBERRY PI FOUNDATION. LINUX is a registered trademark of Linus Torvalds in the U.S. and other countries.
Reference is made to
Function 1005 Map Coordinates—the sensors are facing different directions, so their outputs are not addressing the same coordinate space. This function maps each sensor's outputs to a shared coordinate space.
Function 1006 Debounce—using the output from function 1005, this function tries to detect fast flickering between ‘down’ and ‘up’ states, and suppress such flickering. This mostly occurs when an object is in the overlapping areas 507 and 510, where the signals are weak and the border of the active/non-active area is wavy.
Function 1007 State Arbitrator—using the output from function 1006 this function defines a state for the current touch. The states are discussed hereinbelow with reference to
Function 1008 Deghost—this function identifies touch outputs from each sensor that were likely generated by the sensor detecting an emitter beam from the opposite sensor. This is based, for example, on a touch coordinate that suddenly appears at a distance away from a previous touch location.
Function 1009 Weighted Position—if two sensors both reported valid touch locations, this function combines the two locations based on a weighted sum, assigning greater weight to the output from the sensor closer to the touch location.
Function 1010 Smoother—this function applies a state-based rolling average filter to the output of function 1009. The result is the output of the Merge Touches function. All group threads are merged to provide a single output for the main thread, and this main thread output is sent to the system as touch data from a single HID (absolute mouse, digitizer, or other HID profiles) device.
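The following condensed sketch illustrates the deghosting, weighted-position and smoothing stages (functions 1008-1010) under stated assumptions: ghost points are rejected by a jump-distance test, valid points from the two sensors are combined by a distance-weighted sum favoring the closer sensor, and the result is smoothed with a rolling average. The threshold and window size are illustrative, not values taken from the embodiments.

```python
# Condensed illustration of the deghost, weighted-position and smoothing stages
# (functions 1008-1010). The jump threshold and window size are illustrative
# assumptions, not values taken from the embodiments.

DEGHOST_JUMP = 80.0     # reject a point that suddenly appears farther than this from the previous touch
SMOOTH_WINDOW = 4       # number of output locations in the rolling average

def deghost(point, previous_point):
    if point is None or previous_point is None:
        return point
    dx, dy = point[0] - previous_point[0], point[1] - previous_point[1]
    return None if (dx * dx + dy * dy) ** 0.5 > DEGHOST_JUMP else point

def weighted_position(p1, d1, p2, d2):
    # Combine two valid locations, weighting the sensor closer to the touch more heavily.
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    w1, w2 = 1.0 / (d1 + 1e-6), 1.0 / (d2 + 1e-6)
    s = w1 + w2
    return ((w1 * p1[0] + w2 * p2[0]) / s, (w1 * p1[1] + w2 * p2[1]) / s)

def smooth(history, point):
    # Rolling-average filter over the last few output locations.
    history.append(point)
    if len(history) > SMOOTH_WINDOW:
        history.pop(0)
    return (sum(p[0] for p in history) / len(history),
            sum(p[1] for p in history) / len(history))
```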
Reference is made to
The present invention covers systems featuring electronic display screens as well as systems featuring static images; e.g., printed matter. In particular, the invention includes printed matter that a user interacts with when the sensor of the present invention detects gestures at locations that correspond to symbols in the printed matter. The invention also includes a hologram of a printed or etched image that the user interacts with when the sensor of the present invention detects gestures at locations that correspond to symbols in the hologram. The present invention also includes input devices in which the input GUI is projected into airspace or onto a reflective surface, such as a head up display.
A contactless interface according to the present invention provides a GUI featuring one or more interactive elements and one or more in-air detection planes for detecting user input gestures. The system receives user input by mapping the detected gestures to respective ones of the GUI elements. As user interaction with the detection planes is performed in-air, without touching any surfaces, systems of the present invention prevent the transmission of pathogens that would otherwise occur via touch input surfaces touched by multiple individuals.
One of the issues addressed by the present invention is how the system distinguishes between user selection of a GUI element and activation of that element. Another issue addressed by the present invention is how to intuitively communicate to users when and how their gestures are being received and interpreted as input by the system.
Reference is made to
Indications to the user are not limited to graphical changes on the display. In some embodiments of the invention, sensors 315 and 316 include visible-light emitters that are selectively activated to provide feedback to the user. For example, when an object enters detection plane 270, a first visible-light emitter is activated to illuminate the object in a first color. When the object is translated into detection plane 272 of proximity sensor 316, a second visible-light emitter, on proximity sensor 315 or on proximity sensor 316, is activated to illuminate the object in a second color. When finger 512 moves within detection plane 270, visible-light emitter activations indicate whether the corresponding cursor on the display 514 has selected a GUI element; e.g., a low-intensity illumination of the finger indicates that the cursor is located between GUI elements, and a high-intensity illumination indicates that the cursor is located at one of the GUI elements and has thus selected that element.
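A sketch of the visible-light feedback described above, assuming a hypothetical LED driver: color indicates which detection plane the finger occupies, and intensity indicates whether the corresponding cursor has selected a GUI element.

```python
# Illustrative visible-light feedback, using a hypothetical LED driver: color
# indicates which detection plane the finger occupies, and intensity indicates
# whether the corresponding cursor has selected a GUI element.

def update_feedback(leds, in_plane_270, in_plane_272, element_selected):
    if in_plane_272:
        leds.set_color("second")     # object has been translated into the second detection plane
    elif in_plane_270:
        leds.set_color("first")      # object is in the first detection plane
        leds.set_intensity("high" if element_selected else "low")
    else:
        leds.off()
```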
Reference is made to
Reference is made to
In certain embodiments of the invention, display 514 is replaced by a reflection of a display, e.g., a head up display, where the reflection surface is set back from detection plane 270.
In certain embodiments of the invention, detection plane 270 is created by an optical sensor having emitters and detectors on opposite edges of plane 270. In other embodiments of the invention, two detection planes 270 and 272 are created by two stacked sensors having emitters and detectors on opposite edges of planes 270 and 272. Optical sensors having emitters and detectors on opposite edges of one or more detection planes are described in U.S. Pat. No. 8,416,217, the contents of which are hereby incorporated herein in their entirety by reference.
In certain embodiments of the invention, an interface for entering a personal PIN or other sensitive data is provided where the display is encased in a hood or box that blocks others from viewing the data being entered by the user. In this case, the hood or box opening provides a frame surrounding detection plane 270 for housing an optical sensor having emitters and detectors on opposite edges of the detection plane. When an inclined or sloping reflective surface at the far end of this hood is the display seen by the user of this hooded input system, users intuitively understand that they are not required to touch the distant, inclined display surface. This makes interaction with a detection plane near the opening of the hood intuitive to the user.
In other embodiments, a holograph with GUI elements, e.g., representing numbered buttons in an elevator for selecting a floor or product names in a vending machine, is generated in-air. The sensor is configured to create a detection plane that is coplanar with the holograph, enabling the user to interact with the holograph as a standard button interface, except that no surfaces are touched. A system having a holograph coplanar with a detection plane is described in U.S. Pat. No. 10,282,034 and illustrated therein at least at FIG. 40.
In other embodiments, a physical button keypad, such as an elevator keypad, or product buttons or printed selection options on a vending machine, are provided at a distance behind detection plane 270 discussed hereinabove to enable contactless button selection. In some embodiments, the detection plane is situated at a distance of 30 mm above the keypad. The selected button is highlighted in the same way a conventional button is highlighted when pressed. In this case, simply inserting finger 512 at the location in detection plane 270 corresponding to a specific one of the buttons activates that button.
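A sketch of contactless button selection as described above, assuming a hypothetical keypad interface: inserting a finger at a detection-plane location that maps onto a button highlights and activates that button.

```python
# Illustrative contactless button selection, using a hypothetical keypad
# interface: a plane location that maps onto a button highlights and
# activates that button.

def handle_insertion(keypad, plane_coords):
    button = keypad.button_at(plane_coords)   # map the detection-plane location to the button beneath it
    if button is not None:
        button.highlight()                    # visual feedback, as for a pressed button
        button.activate()
    return button
```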
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention. For example, technologies such as radar, cameras, time-of-flight cameras and capacitive sensors may be used instead of the optical sensors discussed hereinabove. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2020/067599 | 12/30/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/138516 | 7/8/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4243879 | Carroll et al. | Jan 1981 | A |
4267443 | Carroll et al. | May 1981 | A |
4301447 | Funk et al. | Nov 1981 | A |
4542291 | Zimmerman | Sep 1985 | A |
4588258 | Hoopman | May 1986 | A
4593191 | Alles | Jun 1986 | A |
4641426 | Hartman et al. | Feb 1987 | A |
4672364 | Lucas | Jun 1987 | A |
4703316 | Sherbeck | Oct 1987 | A |
4761637 | Lucas et al. | Aug 1988 | A |
4928094 | Smith | May 1990 | A |
4988981 | Zimmerman et al. | Jan 1991 | A |
5036187 | Yoshida et al. | Jul 1991 | A |
5070411 | Suzuki | Dec 1991 | A |
5103085 | Zimmerman | Apr 1992 | A |
5162783 | Moreno | Nov 1992 | A |
5194863 | Barker et al. | Mar 1993 | A |
5220409 | Bures | Jun 1993 | A |
5414413 | Tamaru et al. | May 1995 | A |
5463725 | Henckel et al. | Oct 1995 | A |
5559727 | Deley et al. | Sep 1996 | A |
5577733 | Downing | Nov 1996 | A |
5603053 | Gough et al. | Feb 1997 | A |
5729250 | Bishop et al. | Mar 1998 | A |
5748185 | Stephan et al. | May 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5880462 | Hsia | Mar 1999 | A |
5889236 | Gillespie et al. | Mar 1999 | A |
5900863 | Numazaki | May 1999 | A |
5914709 | Graham et al. | Jun 1999 | A |
5936615 | Waters | Aug 1999 | A |
5943044 | Martinelli et al. | Aug 1999 | A |
5946134 | Benson et al. | Aug 1999 | A |
5977888 | Fujita et al. | Nov 1999 | A |
5988645 | Downing | Nov 1999 | A |
6010061 | Howell | Jan 2000 | A |
6035180 | Kubes et al. | Mar 2000 | A |
6091405 | Lowe et al. | Jul 2000 | A |
6161005 | Pinzon | Dec 2000 | A |
6333735 | Anvekar | Dec 2001 | B1 |
6340979 | Beaton et al. | Jan 2002 | B1 |
6362468 | Murakami et al. | Mar 2002 | B1 |
6377238 | McPheters | Apr 2002 | B1 |
6421042 | Omura et al. | Jul 2002 | B1 |
6429857 | Masters et al. | Aug 2002 | B1 |
6492978 | Selig et al. | Dec 2002 | B1 |
6646633 | Nicolas | Nov 2003 | B1 |
6690365 | Hinckley et al. | Feb 2004 | B2 |
6690387 | Zimmerman et al. | Feb 2004 | B2 |
6707449 | Hinckley et al. | Mar 2004 | B2 |
6757002 | Oross et al. | Jun 2004 | B1 |
6762077 | Schuurmans et al. | Jul 2004 | B2 |
6788292 | Nako et al. | Sep 2004 | B1 |
6803906 | Morrison et al. | Oct 2004 | B1 |
6836367 | Seino et al. | Dec 2004 | B2 |
6864882 | Newton | Mar 2005 | B2 |
6874683 | Keronen et al. | Apr 2005 | B2 |
6875977 | Wolter et al. | Apr 2005 | B2 |
6947032 | Morrison et al. | Sep 2005 | B2 |
6954197 | Morrison et al. | Oct 2005 | B2 |
6972401 | Akitt et al. | Dec 2005 | B2 |
6972834 | Oka et al. | Dec 2005 | B1 |
6985137 | Kaikuranta | Jan 2006 | B2 |
7030861 | Westerman et al. | Apr 2006 | B1 |
7046232 | Inagaki et al. | May 2006 | B2 |
7054045 | McPheters et al. | May 2006 | B2 |
7133032 | Cok | Nov 2006 | B2 |
7162124 | Gunn, III et al. | Jan 2007 | B1 |
7170590 | Kishida | Jan 2007 | B2 |
7176905 | Baharav et al. | Feb 2007 | B2 |
7184030 | McCharles et al. | Feb 2007 | B2 |
7221462 | Cavallucci | May 2007 | B2 |
7225408 | ORourke | May 2007 | B2 |
7232986 | Worthington et al. | Jun 2007 | B2 |
7339580 | Westerman et al. | Mar 2008 | B2 |
7352940 | Charters et al. | Apr 2008 | B2 |
7369724 | Deane | May 2008 | B2 |
7372456 | McLintock | May 2008 | B2 |
7429706 | Ho | Sep 2008 | B2 |
7518738 | Cavallucci et al. | Apr 2009 | B2 |
7619617 | Morrison et al. | Nov 2009 | B2 |
7659887 | Larsen et al. | Feb 2010 | B2 |
7855716 | McCreary et al. | Dec 2010 | B2 |
7924264 | Ohta | Apr 2011 | B2 |
8022941 | Smoot | Sep 2011 | B2 |
8091280 | Hanzel et al. | Jan 2012 | B2 |
8115745 | Gray | Feb 2012 | B2
8120625 | Hinckley | Feb 2012 | B2 |
8139045 | Jang et al. | Mar 2012 | B2 |
8169404 | Boillot | May 2012 | B1 |
8193498 | Cavallucci et al. | Jun 2012 | B2
8243047 | Chiang et al. | Aug 2012 | B2 |
8269740 | Sohn et al. | Sep 2012 | B2 |
8289299 | Newton | Oct 2012 | B2 |
8316324 | Boillot | Nov 2012 | B2 |
8350831 | Drumm | Jan 2013 | B2 |
8426799 | Drumm | Apr 2013 | B2 |
8471814 | LaFave et al. | Jun 2013 | B2 |
8482547 | Christiansson et al. | Jul 2013 | B2 |
8508505 | Shin et al. | Aug 2013 | B2 |
8558815 | Van Genechten et al. | Oct 2013 | B2 |
8581884 | Fahraeus et al. | Nov 2013 | B2 |
8604436 | Patel et al. | Dec 2013 | B1 |
8648677 | Su et al. | Feb 2014 | B2 |
8922340 | Salter et al. | Dec 2014 | B2 |
8933876 | Galor et al. | Jan 2015 | B2 |
9050943 | Muller | Jun 2015 | B2 |
9207800 | Eriksson et al. | Dec 2015 | B1 |
9223431 | Pemberton-Pigott | Dec 2015 | B2 |
10282034 | Eriksson et al. | May 2019 | B2 |
10534479 | Holmgren et al. | Jan 2020 | B2 |
20010002694 | Nakazawa et al. | Jun 2001 | A1 |
20010022579 | Hirabayashi | Sep 2001 | A1 |
20010026268 | Ito | Oct 2001 | A1 |
20010028344 | Iwamoto et al. | Oct 2001 | A1 |
20010030642 | Sullivan et al. | Oct 2001 | A1 |
20010043189 | Brisebois et al. | Nov 2001 | A1 |
20010055006 | Sano et al. | Dec 2001 | A1 |
20020067348 | Masters et al. | Jun 2002 | A1 |
20020075243 | Newton | Jun 2002 | A1 |
20020103024 | Jeffway, Jr. et al. | Aug 2002 | A1 |
20020109843 | Ehsani et al. | Aug 2002 | A1 |
20020152010 | Colmenarez et al. | Oct 2002 | A1 |
20020175900 | Armstrong | Nov 2002 | A1 |
20030034439 | Reime et al. | Feb 2003 | A1 |
20030174125 | Torunoglu et al. | Sep 2003 | A1 |
20030231308 | Granger | Dec 2003 | A1 |
20030234346 | Kao | Dec 2003 | A1 |
20040031908 | Neveux et al. | Feb 2004 | A1 |
20040046960 | Wagner et al. | Mar 2004 | A1 |
20040056199 | OConnor et al. | Mar 2004 | A1 |
20040090428 | Crandall, Jr. et al. | May 2004 | A1 |
20040140961 | Cok | Jul 2004 | A1 |
20040198490 | Bansemer et al. | Oct 2004 | A1 |
20040201579 | Graham | Oct 2004 | A1 |
20050024623 | Xie et al. | Feb 2005 | A1 |
20050073508 | Pittel et al. | Apr 2005 | A1 |
20050093846 | Marcus et al. | May 2005 | A1 |
20050104860 | McCreary et al. | May 2005 | A1 |
20050122308 | Bell et al. | Jun 2005 | A1 |
20050133702 | Meyer | Jun 2005 | A1 |
20050174473 | Morgan et al. | Aug 2005 | A1 |
20050271319 | Graham | Dec 2005 | A1 |
20060001654 | Smits | Jan 2006 | A1 |
20060018586 | Kishida | Jan 2006 | A1 |
20060028455 | Hinckley et al. | Feb 2006 | A1 |
20060077186 | Park et al. | Apr 2006 | A1 |
20060132454 | Chen et al. | Jun 2006 | A1 |
20060161870 | Hotelling et al. | Jul 2006 | A1 |
20060161871 | Hotelling et al. | Jul 2006 | A1 |
20060229509 | Al-Ali et al. | Oct 2006 | A1 |
20060236262 | Bathiche et al. | Oct 2006 | A1 |
20060238517 | King et al. | Oct 2006 | A1 |
20060244733 | Geaghan | Nov 2006 | A1 |
20070024598 | Miller et al. | Feb 2007 | A1 |
20070052693 | Watari | Mar 2007 | A1 |
20070077541 | Champagne et al. | Apr 2007 | A1 |
20070084989 | Lange et al. | Apr 2007 | A1 |
20070103436 | Kong | May 2007 | A1 |
20070146318 | Juh et al. | Jun 2007 | A1 |
20070152984 | Ording et al. | Jul 2007 | A1 |
20070176908 | Lipman et al. | Aug 2007 | A1 |
20080008472 | Dress et al. | Jan 2008 | A1 |
20080012835 | Rimon et al. | Jan 2008 | A1 |
20080012850 | Keating, III | Jan 2008 | A1 |
20080013913 | Lieberman et al. | Jan 2008 | A1 |
20080016511 | Hyder et al. | Jan 2008 | A1 |
20080055273 | Forstall | Mar 2008 | A1 |
20080056068 | Yeh et al. | Mar 2008 | A1 |
20080068353 | Lieberman et al. | Mar 2008 | A1
20080080811 | Deane | Apr 2008 | A1 |
20080089587 | Kim et al. | Apr 2008 | A1 |
20080093542 | Lieberman et al. | Apr 2008 | A1 |
20080096620 | Lee et al. | Apr 2008 | A1 |
20080100572 | Boillot | May 2008 | A1 |
20080100593 | Skillman et al. | May 2008 | A1 |
20080117183 | Yu et al. | May 2008 | A1 |
20080122792 | Izadi et al. | May 2008 | A1 |
20080122796 | Jobs et al. | May 2008 | A1 |
20080122803 | Izadi et al. | May 2008 | A1 |
20080134102 | Movold et al. | Jun 2008 | A1 |
20080158172 | Hotelling et al. | Jul 2008 | A1 |
20080158174 | Land et al. | Jul 2008 | A1 |
20080211779 | Pryor | Sep 2008 | A1 |
20080221711 | Trainer | Sep 2008 | A1 |
20080224836 | Pickering | Sep 2008 | A1 |
20080259053 | Newton | Oct 2008 | A1 |
20080273019 | Deane | Nov 2008 | A1 |
20080278460 | Arnett et al. | Nov 2008 | A1 |
20080297487 | Hotelling et al. | Dec 2008 | A1 |
20090009944 | Yukawa et al. | Jan 2009 | A1 |
20090027357 | Morrison | Jan 2009 | A1 |
20090058833 | Newton | Mar 2009 | A1 |
20090066673 | Molne et al. | Mar 2009 | A1 |
20090096994 | Smits | Apr 2009 | A1 |
20090102815 | Juni | Apr 2009 | A1 |
20090122027 | Newton | May 2009 | A1 |
20090135162 | Van De Wijdeven et al. | May 2009 | A1 |
20090139778 | Butler et al. | Jun 2009 | A1 |
20090153519 | Suarez Rovere | Jun 2009 | A1 |
20090166098 | Sunder | Jul 2009 | A1 |
20090167724 | Xuan et al. | Jul 2009 | A1 |
20090173730 | Baier et al. | Jul 2009 | A1 |
20090189857 | Benko et al. | Jul 2009 | A1 |
20090195402 | Izadi et al. | Aug 2009 | A1 |
20090198359 | Chaudhri | Aug 2009 | A1 |
20090280905 | Weisman et al. | Nov 2009 | A1 |
20090322673 | Cherradi El Fadili | Dec 2009 | A1 |
20100002291 | Fukuyama | Jan 2010 | A1 |
20100013763 | Futter et al. | Jan 2010 | A1 |
20100023895 | Benko et al. | Jan 2010 | A1 |
20100031203 | Morris et al. | Feb 2010 | A1 |
20100066975 | Rehnstrom | Mar 2010 | A1 |
20100079407 | Suggs | Apr 2010 | A1 |
20100079409 | Sirotich et al. | Apr 2010 | A1 |
20100079412 | Chiang et al. | Apr 2010 | A1 |
20100095234 | Lane et al. | Apr 2010 | A1 |
20100134424 | Brisebois et al. | Jun 2010 | A1 |
20100149073 | Chaum et al. | Jun 2010 | A1 |
20100185341 | Wilson et al. | Jul 2010 | A1 |
20100208234 | Kaehler | Aug 2010 | A1 |
20100238138 | Goertz et al. | Sep 2010 | A1 |
20100238139 | Goertz et al. | Sep 2010 | A1 |
20100245289 | Svajda | Sep 2010 | A1 |
20100289755 | Zhu et al. | Nov 2010 | A1 |
20100295821 | Chang et al. | Nov 2010 | A1 |
20100299642 | Merrell et al. | Nov 2010 | A1 |
20100302185 | Han et al. | Dec 2010 | A1 |
20100321289 | Kim et al. | Dec 2010 | A1 |
20110005367 | Hwang et al. | Jan 2011 | A1 |
20110043325 | Newman et al. | Feb 2011 | A1 |
20110043826 | Kiyose | Feb 2011 | A1 |
20110044579 | Travis et al. | Feb 2011 | A1 |
20110050639 | Challener et al. | Mar 2011 | A1 |
20110050650 | McGibney et al. | Mar 2011 | A1 |
20110057906 | Raynor et al. | Mar 2011 | A1 |
20110063214 | Knapp | Mar 2011 | A1 |
20110074734 | Wassvik et al. | Mar 2011 | A1 |
20110074736 | Takakura | Mar 2011 | A1 |
20110075418 | Mallory et al. | Mar 2011 | A1 |
20110087963 | Brisebois et al. | Apr 2011 | A1 |
20110090176 | Christiansson et al. | Apr 2011 | A1 |
20110116104 | Kao et al. | May 2011 | A1 |
20110121182 | Wong et al. | May 2011 | A1 |
20110122560 | Andre et al. | May 2011 | A1 |
20110128234 | Lipman et al. | Jun 2011 | A1 |
20110128729 | Ng | Jun 2011 | A1 |
20110148820 | Song | Jun 2011 | A1 |
20110157097 | Hamada et al. | Jun 2011 | A1 |
20110163956 | Zdralek | Jul 2011 | A1 |
20110163996 | Wassvik et al. | Jul 2011 | A1 |
20110169773 | Luo | Jul 2011 | A1 |
20110169780 | Goertz et al. | Jul 2011 | A1 |
20110169781 | Goertz et al. | Jul 2011 | A1 |
20110175533 | Holman et al. | Jul 2011 | A1 |
20110175852 | Goertz et al. | Jul 2011 | A1 |
20110179368 | King et al. | Jul 2011 | A1 |
20110179381 | King | Jul 2011 | A1 |
20110205175 | Chen | Aug 2011 | A1 |
20110205186 | Newton et al. | Aug 2011 | A1 |
20110221706 | McGibney et al. | Sep 2011 | A1 |
20110227487 | Nichol et al. | Sep 2011 | A1 |
20110227874 | Fahraeus et al. | Sep 2011 | A1 |
20110242056 | Lee et al. | Oct 2011 | A1 |
20110248151 | Holcombe et al. | Oct 2011 | A1 |
20110249309 | McPheters et al. | Oct 2011 | A1 |
20110309912 | Muller | Dec 2011 | A1 |
20110310005 | Chen et al. | Dec 2011 | A1 |
20120050226 | Kato | Mar 2012 | A1 |
20120056821 | Goh | Mar 2012 | A1 |
20120068971 | Pemberton-Pigott | Mar 2012 | A1 |
20120068973 | Christiansson et al. | Mar 2012 | A1 |
20120071994 | Lengeling | Mar 2012 | A1 |
20120086672 | Tseng et al. | Apr 2012 | A1 |
20120098746 | Ogawa | Apr 2012 | A1 |
20120098753 | Lu | Apr 2012 | A1 |
20120098794 | Kleinert et al. | Apr 2012 | A1 |
20120116548 | Goree et al. | May 2012 | A1 |
20120127317 | Yantek et al. | May 2012 | A1 |
20120131186 | Klos et al. | May 2012 | A1 |
20120133956 | Findlay et al. | May 2012 | A1 |
20120162078 | Ferren et al. | Jun 2012 | A1 |
20120176343 | Holmgren et al. | Jul 2012 | A1 |
20120188203 | Yao et al. | Jul 2012 | A1 |
20120188205 | Jansson et al. | Jul 2012 | A1 |
20120212457 | Drumm | Aug 2012 | A1 |
20120212458 | Drumm | Aug 2012 | A1 |
20120218229 | Drumm | Aug 2012 | A1 |
20120223231 | Nijaguna | Sep 2012 | A1 |
20120262408 | Pasquero et al. | Oct 2012 | A1 |
20120306793 | Liu et al. | Dec 2012 | A1 |
20130044071 | Hu et al. | Feb 2013 | A1 |
20130057594 | Pryor | Mar 2013 | A1 |
20130127788 | Drumm | May 2013 | A1 |
20130127790 | Wassvik | May 2013 | A1 |
20130135259 | King et al. | May 2013 | A1 |
20130141395 | Holmgren et al. | Jun 2013 | A1 |
20130215034 | Oh et al. | Aug 2013 | A1 |
20130234171 | Heikkinen et al. | Sep 2013 | A1 |
20130263633 | Minter et al. | Oct 2013 | A1 |
20140049516 | Heikkinen et al. | Feb 2014 | A1 |
20140069015 | Salter et al. | Mar 2014 | A1 |
20140104160 | Eriksson et al. | Apr 2014 | A1 |
20140104240 | Eriksson et al. | Apr 2014 | A1 |
20140213323 | Holenarsipur et al. | Jul 2014 | A1 |
20140291703 | Rudmann et al. | Oct 2014 | A1 |
20140292665 | Lathrop et al. | Oct 2014 | A1 |
20140293226 | Hainzl et al. | Oct 2014 | A1 |
20140320459 | Pettersson et al. | Oct 2014 | A1 |
20140362206 | Kossin | Dec 2014 | A1 |
20150015481 | Li | Jan 2015 | A1 |
20150153777 | Liu et al. | Jun 2015 | A1 |
20150185945 | Lauber | Jul 2015 | A1 |
20150227213 | Cho | Aug 2015 | A1 |
20150248796 | Lyer | Sep 2015 | A1 |
20160026250 | Eriksson et al. | Jan 2016 | A1 |
20160154475 | Eriksson et al. | Jun 2016 | A1 |
20160154533 | Eriksson et al. | Jun 2016 | A1 |
20170115825 | Eriksson et al. | Apr 2017 | A1 |
20170160427 | Costello et al. | Jun 2017 | A1 |
20170185160 | Cho et al. | Jun 2017 | A1 |
20170262134 | Eriksson et al. | Sep 2017 | A1 |
20180045827 | Yoon et al. | Feb 2018 | A1 |
20180120892 | von Badinski | May 2018 | A1 |
20180267216 | Otsubo | Sep 2018 | A1 |
20200001556 | Otsubo | Jan 2020 | A1 |
20200319720 | Murayama et al. | Oct 2020 | A1 |
Number | Date | Country |
---|---|---|
202014104143 | Oct 2014 | DE |
0601651 | Jun 1994 | EP |
1906632 | Apr 2008 | EP |
10-148640 | Jun 1998 | JP |
11-232024 | Aug 1999 | JP |
3240941 | Dec 2001 | JP |
2003-029906 | Jan 2003 | JP |
2013-149228 | Aug 2013 | JP |
2014183396 | Sep 2014 | JP |
1020120120097 | Nov 2012 | KR |
1012682090000 | May 2013 | KR |
1020130053363 | May 2013 | KR |
1020130053364 | May 2013 | KR |
1020130053367 | May 2013 | KR |
1020130053377 | May 2013 | KR |
1020130054135 | May 2013 | KR |
1020130054150 | May 2013 | KR |
1020130133117 | Dec 2013 | KR |
8600446 | Jan 1986 | WO |
8600447 | Jan 1986 | WO |
2008004103 | Jan 2008 | WO |
2008133941 | Nov 2008 | WO |
2010011929 | Jan 2010 | WO |
2010015408 | Feb 2010 | WO |
2010134865 | Nov 2010 | WO |
2012017183 | Feb 2012 | WO |
2012089957 | Jul 2012 | WO |
2012089958 | Jul 2012 | WO |
2013102551 | Jul 2013 | WO |
2014041245 | Mar 2014 | WO |
2014194151 | Dec 2014 | WO |
2015161070 | Oct 2015 | WO |
2016048590 | Mar 2016 | WO |
2016122927 | Aug 2016 | WO |
2018216619 | Nov 2018 | WO |
Entry |
---|
Hodges, S., Izadi, S., Butler, A., Rrustemi A., Buxton, B., “ThinSight: Versatile Multitouch Sensing for Thin Form-Factor Displays.” UIST'07, Oct. 7-10, 2007. <http://www.hci.iastate.edu/REU09/pub/main/telerobotics_team_papers/thinsight_versatile_multitouch_sensing_for_thin_formfactor_displays.pdf>. |
Miyamoto, I., et al., Basic Study of Touchless Human Interface Using Net Structure Proximity Sensors, Journal of Robotics and Mechatronics vol. 25 No. 3, 2013, pp. 553-558. |
Miyamoto, I., et al., Basic Study of Touchless Human Interface Using Net Structure Proximity Sensors, No. 12-3 Proceedings of the 2012 JSME Conference on Robotics and Mechanics, Hamamatsu, Japan, May 27-29, 2012, 2P1-P03(1) to 2P1-P03(3). |
Moeller, J. et al., ZeroTouch: An Optical Multi-Touch and Free-Air Interaction Architecture, Proc. CHI 2012 Proceedings of the 2012 Annual Conference Extended Abstracts on Human Factors in Computing Systems, May 5, 2012, pp. 2165-2174. ACM New York, NY, USA. |
Moeller, J. et al., ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field, CHI EA '11 Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems, May 2011, pp. 1165-1170. ACM New York, NY, USA. |
Moeller, J. et al., IntangibleCanvas: Free-Air Finger Painting on a Projected Canvas, CHI EA '11 Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems, May 2011, pp. 1615-1620. ACM New York, NY, USA. |
Moeller, J. et al., Scanning FTIR: Unobtrusive Optoelectronic Multi-Touch Sensing through Waveguide Transmissivity Imaging, TEI '10 Proceedings of the Fourth International Conference on Tangible, Embedded, and Embodied Interaction, Jan. 2010, pp. 73-76. ACM New York, NY, USA. |
Plaisant, C., Wallace, D. (1992): Touchscreen Toggle Design. In: Bauersfeld, Penny, Bennett, John, Lynch, Gene (eds.) Proceedings of the ACM CHI 92 Human Factors in Computing Systems Conference Jun. 3-7, 1992, Monterey, California. pp. 667-668. |
Van Loenen, Evert, et al., Entertaible: A Solution for Social Gaming Experiences, Tangible Play Workshop, Jan. 28, 2007, pp. 16-19, Tangible Play Research and Design for Tangible and Tabletop Games, Workshop at the 2007 Intelligent User Interfaces Conference, Workshop Proceedings. |
Butler et al., “SideSight: Multi-touch Interaction Around Smart Devices.” UIST'08, Oct. 2008. http://131.107.65.14/en-us/um/people/shahrami/papers/sidesight.pdf. |
Johnson, M., “Enhanced Optical Touch Input Panel”, IBM Technical Disclosure Bulletin vol. 28, No. 4, Sep. 1985 pp. 1760-1762. |
Rakkolainen, I., Höllerer, T., Diverdi, S. et al., Mid-air display experiments to create novel user interfaces, Multimed. Tools Appl. (2009) 44: 389, doi:10.1007/s11042-009-0280-1. |
Hasegawa, K. et al., SIGGRAPH '15, ACM, SIGGRAPH 2015 Emerging Technologies, Article 18, Jul. 31, 2015, ACM, New York, NY, USA, ISBN: 978-1-4503-3635-2, doi: 10.1145/2782782.2785589. |
U.S. Appl. No. 14/312,787, Non-final Office Action, dated Jan. 8, 2015, 15 pages. |
U.S. Appl. No. 14/312,787, Notice of Allowance, dated Jun. 22, 2015, 9 pages. |
PCT Application No. PCT/US2014/040112, International Preliminary Report on Patentability, dated Dec. 1, 2015, 18 pages. |
PCT Application No. PCT/US2014/040112, Search Report and Written Opinion, dated Dec. 2, 2014, 21 pages. |
European Patent Application No. 14 804 520.6, Extended European Search Report, dated May 24, 2016, 11 pages. |
European Patent Application No. 14 804 520.6, First Office Action, dated May 10, 2017, 9 pages. |
Chinese Patent Application No. 201480030571.8, First Office Action, dated Aug. 16, 2016, 6 pages. |
Chinese Patent Application No. 201480030571.8, Second Office Action, dated May 4, 2017, 4 pages. |
U.S. Appl. No. 14/555,731, dated Dec. 2, 2016, 8 pages. |
PCT Application No. PCT/US2015/057460, International Search Report, dated Jan. 21, 2016, 2 pages. |
PCT Application No. PCT/US2015/057460, Written Opinion, dated Jan. 21, 2016, 6 pages. |
U.S. Appl. No. 15/000,815, Non-final Office Action, dated Jun. 3, 2016, 7 pages. |
U.S. Appl. No. 15/000,815, Final Office Action, dated Jan. 23, 2017, 8 pages. |
PCT Application No. PCT/US2016/013027, International Search Report and Written Opinion, dated May 26, 2016, 13 pages. |
European Patent Application No. 16743860.5, Extended European Search Report, dated Jul. 18, 2018, 8 pages. |
Japanese Patent Application No. 2017-539236, First Office Action, dated Sep. 10, 2018, 3 pages. |
U.S. Appl. No. 15/616,106, Non-Final Office Action, dated Mar. 7, 2019, 10 pages. |
U.S. Appl. No. 15/898,585, Non-Final Office Action, dated Sep. 13, 2018, 9 pages. |
U.S. Appl. No. 15/990,587, Notice of Allowance, dated Sep. 4, 2019, 8 pages. |
U.S. Appl. No. 16/127,238, Non-Final Office Action, dated Mar. 5, 2020, 18 pages. |
U.S. Appl. No. 16/365,662, Non-Final Office Action, dated Aug. 21, 2020, 9 pages. |
U.S. Appl. No. 16/365,662, Notice of Allowance, dated Nov. 20, 2020, 10 pages. |
U.S. Appl. No. 16/694,018, Non-Final Office Action, dated Apr. 7, 2020, 9 pages. |
U.S. Appl. No. 16/739,142, Notice of Allowance, dated Apr. 28, 2021, 8 pages. |
PCT Application No. PCT/US2020/067599, Search Report and Written Opinion, dated May 3, 2021, 14 pages. |
U.S. Appl. No. 17/198,273, Notice of Allowance, dated Aug. 9, 2022, 10 pages. |
U.S. Appl. No. 17/385,260, Non-final Office action, dated Aug. 16, 2022, 11 pages. |
Number | Date | Country | |
---|---|---|---|
20230037571 A1 | Feb 2023 | US |
Number | Date | Country | |
---|---|---|---|
63080656 | Sep 2020 | US | |
63030919 | May 2020 | US | |
62956058 | Dec 2019 | US |