The present invention relates to industrial process control or monitoring systems. More specifically, the present invention relates to process variable measurement in an industrial process.
In industrial settings, control systems are used to monitor and control inventories of industrial and chemical processes, and the like. Typically, the control system that performs these functions uses field devices distributed at key locations in the industrial process and coupled to control circuitry in a control room by a process control loop. The term “field device” refers to any device that performs a function in a distributed control or process monitoring system, including all devices used in the measurement, control and monitoring of industrial processes.
Some field devices include a process variable sensor used to sense a process variable. Example process variables include flow rate, pressure, level, temperature, pH, valve or motor position, motor speed, actuator position, etc.
Many types of process variable sensors are based upon intrusive technologies in which a sensor must be exposed directly or indirectly to process fluid in order to obtain the process variable measurement.
A field device for monitoring a process variable of an industrial process includes an image capture device. A process component exhibits relative motion as a function of a process variable. The image capture device captures an image which changes due to the relative motion of the process component. An image processor coupled to the image capture device detects the relative motion of the process component and measures the process variable based upon the detected relative motion. Output circuitry provides an output related to the measured process variable.
A process variable of an industrial process is measured using image capture techniques. More specifically, imaging techniques are used to observe a process component which exhibits relative motion as a function of a process variable. Changes in the captured images due to the relative motion can be correlated with changes in the process variable and used to measure the process variable. The correlation can be through curve fitting or other techniques which relate the relative motion to the process variable. The correlation may be based on any property of the motion including amplitude, frequency, spectrum of motion, particular patterns of motion, the presence or absence of motion, etc.
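By way of illustration only, the following sketch (assuming the numpy library and a hypothetical calibration data set; none of these names or values come from the embodiments described herein) shows one way a detected motion property, such as an oscillation frequency, might be correlated with a process variable through simple curve fitting:

```python
import numpy as np

# Hypothetical calibration data: detected oscillation frequency (Hz) of a
# process component versus a reference value of the process variable
# (e.g., flow rate in arbitrary units).
calib_frequency_hz = np.array([2.0, 4.1, 6.2, 8.0, 10.1])
calib_flow_rate = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Fit a low-order polynomial mapping the motion property to the process
# variable; curve fitting is only one of the possible correlation methods.
coefficients = np.polyfit(calib_frequency_hz, calib_flow_rate, deg=1)
frequency_to_flow = np.poly1d(coefficients)

# A frequency later detected from captured images is converted to a
# process variable value using the fitted correlation.
detected_frequency_hz = 7.3
print(f"Estimated flow rate: {frequency_to_flow(detected_frequency_hz):.2f}")
```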
Many techniques used for measuring a process variable require an intrusive technology in which a process variable sensor is coupled directly or indirectly to the process fluid. However, there are situations in which a non-intrusive, or less intrusive, technology would be desirable to increase reliability and life span, improve safety, reduce environmental concerns, reduce costs and increase operational flexibility. One type of non-intrusive measurement presently available uses infrared detectors which are capable of measuring process temperatures at a distance. However, for other process variables such as flow rate, level or pressure, sensing components typically physically couple directly or indirectly with the process fluid. Recently, researchers at the Massachusetts Institute of Technology (MIT) have used non-invasive video detection techniques to detect the pulse of a patient as blood flows through the patient's face based upon changes in skin color (see Larry Hardesty, “Researchers amplify variations in video, making the invisible visible,” MIT News, Jun. 22, 2012, http://web.mit.edu/newsoffice/2012/amplifying-invisible-video-0622.html).
In example embodiments, a method and apparatus are provided for measurement of process variables using image capture devices 100 to capture images of process component 106. The process component 106 exhibits relative motion as a function of a process variable. Changes in captured images are used to detect the relative motion of the process component (displacement, deformation, etc.). These changes are correlated with a process variable. As discussed below, various techniques can be used to induce motion in the process component 106 as a function of a process variable.
In one specific example, an image capture device 100 can be used to measure flow rate in a vortex flow meter.
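For context, vortex flow meters relate the vortex shedding frequency to the flow velocity through the Strouhal relation; the values below are assumptions for this illustration only and are not taken from the embodiments described herein:

```latex
f = \frac{St\,V}{d}
\qquad\Longrightarrow\qquad
V = \frac{f\,d}{St}
```

For example, with a Strouhal number St ≈ 0.2, a shedding-bar width d = 0.02 m, and a detected shedding frequency f = 50 Hz, the flow velocity is V = (50 × 0.02)/0.2 = 5 m/s.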
Typically, in the configuration of
There are many types of devices which can be used to capture two-dimensional (raster) video images. Typically, the images from such devices are sufficient for capturing slow moving objects. However, higher speed motion may be problematic. For example, an area scan camera with a 1,024×1,024 pixel sensor operating at a 20 MHz pixel clock rate requires about 52 milliseconds to capture an entire image. This can be used to determine the maximum detectable frequency. Specifically, with 1,024 pixels per line at a 20 MHz clock rate, it takes about 51 microseconds to scan one line; for all 1,024 lines, a total of about 52 milliseconds is required to capture an entire frame. This yields about 19 frames per second. According to Nyquist's theorem, the maximum detectable frequency is half the frame rate, or about 9.5 Hz.
Single line (one-dimensional) line scanners provide an increased capture rate. Examples of line scanners include those used in fax machines and computer scanners. Using a single line of pixels, a line scanner builds a continuous image from the relative motion between the scanner and the scene; the vertical resolution is therefore determined by this relative motion. For example, a line scanner can produce a 1,024 by N image, where N grows continuously as long as the scanner is operating. Because line scan cameras output only a single line of pixels per exposure, less time is required to capture the image. Once pixel information is transferred to an output register, the active pixels are available for the next exposure. The line scan rate is the reciprocal of the line readout time, or 1/51 microseconds, giving about 19,000 lines per second. The pixel resolution determines the smallest feature that can be scanned. For example, scanning one inch with 1,024 pixels yields a horizontal resolution of 1/1,024 ≈ 0.001 inches. In the context of sensing a process variable such as flow rate, the speed of the scanner limits the maximum frequency which can be measured, which is proportional to flow rate. However, to obtain information related to mass flow, both high speed and fine pixel resolution are required in order to measure the magnitude of the motion. In this configuration, the maximum detectable frequency of the movement would be about 9.5 kHz.
Another example image capture device is a single pixel sensor. A single pixel can be used to measure motion as an object moves into and out of the field of view of the single pixel. According to Nyquist's theorem, a single pixel sampled at a 20 MHz clock rate can detect motion at frequencies of up to 10 MHz.
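The capture-rate figures above can be restated in a short worked form, taking the maximum detectable frequency as half the sampling rate per Nyquist's theorem:

```latex
\text{Area scan: } t_{\text{frame}} = \frac{1024 \times 1024}{20\ \text{MHz}} \approx 52\ \text{ms}
\;\Rightarrow\; \approx 19\ \text{fps}
\;\Rightarrow\; f_{\max} \approx 9.5\ \text{Hz}

\text{Line scan: } t_{\text{line}} = \frac{1024}{20\ \text{MHz}} \approx 51\ \mu\text{s}
\;\Rightarrow\; \approx 19{,}000\ \text{lines/s}
\;\Rightarrow\; f_{\max} \approx 9.5\ \text{kHz}

\text{Single pixel: } f_{\text{sample}} = 20\ \text{MHz}
\;\Rightarrow\; f_{\max} \approx 10\ \text{MHz}
```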
When process device 12 is configured as a process controller, the image capture device 100 can be used to provide feedback for use in controlling the control element 20. For example, the image capture device 100 can be used to obtain a process variable measurement. The measured value of the process variable is then used to change the control signal applied to the control element 20. The process variable may be a process variable related to a process fluid such as pressure, temperature, flow rate, etc., or may be a process variable related to the control element itself such as valve position, motor speed, motor position, actuator position, etc.
As discussed above, image capture techniques are used to obtain process variable information from an industrial process. Image capture device 100 is arranged to receive light or other electromagnetic radiation 104 from process component 106. Device 100 is preferably directional and includes at least one radiation sensor. Device 100 may be an individual (discrete) sensor or may comprise a plurality of sensors fabricated in a single device. The output from device 100 is provided to processing circuitry 102 which provides a processed output to the microprocessor 24. For example, processing circuitry 102 can include amplification circuitry, noise reduction circuitry, an analog to digital converter, comparison circuitry, etc. The output from processing circuitry 102 is provided to microprocessor 24 in a digital format. Processing circuitry 102 can be implemented in device 100, as separate circuitry, or by microprocessor 24, and may comprise analog or digital circuitry.
Processing circuitry 102 or microprocessor 24 can detect motion based upon changes in an image, for example based upon comparison to a threshold or a relative comparison between various regions in a captured image. In another example configuration, a baseline image is stored in memory 26. Changes in a captured image with respect to the baseline image are detected and used to sense motion of the process component 106. The detection of motion of the process component 106 can be based upon outputs from one or more individual sensors (pixels), or may be a function of a relationship between the outputs of multiple sensors. Comparing sensor outputs can assist in reducing erroneous measurements due to background noise, ambient light conditions, etc.
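As a minimal sketch of the baseline-comparison approach described above (assuming grayscale frames available as numpy arrays; the threshold value and image sizes are placeholders, not values prescribed by the embodiments):

```python
import numpy as np

def detect_motion(frame: np.ndarray, baseline: np.ndarray,
                  threshold: float = 10.0) -> bool:
    """Flag motion when a captured frame differs from the stored baseline.

    frame, baseline: 2-D grayscale images of identical shape.
    threshold: minimum mean absolute pixel difference treated as motion;
               comparing image-wide statistics rather than single pixels
               helps reject background noise and ambient-light changes.
    """
    difference = np.abs(frame.astype(float) - baseline.astype(float))
    return float(difference.mean()) > threshold

# Example: a stored baseline image and a newly captured frame in which a
# simulated process component has moved into part of the field of view.
baseline_image = np.zeros((64, 64))
new_frame = np.zeros((64, 64))
new_frame[10:30, 10:30] = 255.0
print(detect_motion(new_frame, baseline_image))  # True
```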
Although measurement of vortex shedding frequency or magnitude is described above, other process variables may also be measured. For example, fluid level can be measured with a process component 106 configured as a float which floats in a process fluid in a container. Position of component 106 is related to fluid level. Position of the float can be detected by capturing an image and locating the float position in the image by monitoring changes in the image as the float moves up and down with the fluid level. In another example, position can be detected by providing component 106 with a reflectivity that varies across its surface. The variations in reflectivity will cause the amount of reflected radiation 104 which reaches device 100 to vary as the component 106 moves with the fluid level. These amplitude variations are sensed by device 100 and are a function of fluid level. Similarly, color variations can be provided across a surface of process component 106 which can be detected by detector 100 and correlated with movement of the process component 106. Texturing of process component 106 may also be employed to enhance detection of movement.
Other types of mechanical motion can also be detected and used to measure process variables. Such motion includes motion from motors, mixers, valves, actuators, vibrators, lifts, etc. If component 106 rotates, process variables such as speed (RPM) and angular position can be determined based upon the observed motion. Motion due to mechanical expansion or contraction can also be detected and used to measure pressure, temperature or level changes. Example motion of a repetitive nature which can be used to obtain a process variable includes motion due to flow passing a vortex shedding bar or due to the Coriolis effect. Motion due to vibrations can be observed and used to determine a process variable. In another example, changes in the shape of a storage tank are observed and correlated with fill level in the tank in a manner similar to a strapping table. Similarly, pressure can be determined by detecting motion due to deformation of piping acting as a Bourdon tube. In yet another example embodiment, motion of the process component due to an applied weight or force is detected. In such a configuration, the value of the applied force or the amount of the applied weight is related to the amount of motion of the process component. For example, a weight or force applied to an end of an elongate armature will cause the end of the armature to move. The amount of movement is related to the applied weight or force and the stiffness of the armature. In another example, a process component with a large temperature coefficient can be used to measure temperature as the component expands or contracts with temperature.
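For the force- and temperature-driven examples just described, the amount of motion follows familiar first-order relations; these are shown only as background physics, not as limitations of the embodiments:

```latex
x = \frac{F}{k}
\qquad\text{(deflection of an armature of effective stiffness } k \text{ under applied force } F\text{)}

\Delta L = \alpha\, L_{0}\, \Delta T
\qquad\text{(length change of a component with expansion coefficient } \alpha \text{, length } L_{0} \text{, temperature change } \Delta T\text{)}
```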
Motion of component 106 can be amplified, or the strength of its detected signal otherwise increased, using appropriate techniques. For example, a mechanical extension on an element acts as a lever arm to increase the amount of movement. Techniques such as different surface colors or changes in surface texturing can be used to enhance detection of motion. In order to reduce noise which may lead to the false detection of movement, reference marks or other indicators may be provided on a surface. The thickness of walls or other components may be reduced to increase the amount they move (deflect or deform) when a force is applied. An optional lensing system 130 can provide magnification or filtering of the image.
The signal from the process component 106 travels by electromagnetic waves 104 to the image capture device 100. Any appropriate frequency (wavelength) range may be used as appropriate including infrared, visible and/or ultraviolet radiation. An optional radiation source 200 shown in
The image capture device can comprise any appropriate image detector or array including a zero-dimensional (single pixel sensor), line (one-dimensional array), raster (two-dimensional array) or cube (three-dimensional array). Further, combinations of these various types of arrays may also be implemented. The image capture device 100 may be implemented using standard video camera technology, or other technologies including infrared cameras, etc. In some configurations, a single image capture device 100 can be used to capture images of multiple components to detect their motion and thereby determine multiple process variables.
Once the image has been captured, the processing circuitry 102 or microprocessor 24 may perform additional computations on the captured image. Example signal processing techniques include using a Fast Fourier Transform (FFT) to obtain frequency related information, using derivatives or the like to identify motion in the image, digital filtering techniques to reduce noise, and amplitude enhancing techniques to increase sensitivity. Anti-aliasing techniques may also be employed to reduce erroneous readings. Diagnostic algorithms can identify a failing component or other diagnostic condition in the process. A diagnostic algorithm can also detect a failing component in the image capture device itself, such as failing pixels.
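A minimal sketch of the FFT-based processing mentioned above, assuming a time series of intensity values sampled from one pixel or region at a known rate (the function and signal below are illustrative assumptions, not part of the described embodiments):

```python
import numpy as np

def dominant_frequency(intensity: np.ndarray, sample_rate_hz: float) -> float:
    """Return the strongest non-DC frequency in a pixel-intensity time series."""
    intensity = intensity - intensity.mean()        # remove the DC component
    spectrum = np.abs(np.fft.rfft(intensity))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(intensity), d=1.0 / sample_rate_hz)
    return float(freqs[np.argmax(spectrum)])

# Example: a 12 Hz oscillation (e.g., vortex shedding) sampled at 1 kHz.
t = np.arange(0, 1.0, 1.0 / 1000.0)
intensity_series = 1.0 + 0.5 * np.sin(2 * np.pi * 12.0 * t)
print(dominant_frequency(intensity_series, 1000.0))  # approximately 12.0
```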
Techniques may be used to increase the effective resolution of an image capture device. For example, regions of interest, such as regions where a change is occurring, can be scanned at a higher rate than regions in an image where no change is occurring. Note that an image capture device which is used to capture motion should be sufficiently stabilized such that its own motion due to vibrations or the like does not introduce errors into the measurements.
In the above discussion, reference is made to obtaining images of a process component which moves; however, the discussion is equally applicable to relative movement arising from motion of the image capture device. More specifically, in the above discussion the image capture device provides a reference location from which motion of the process component is observed. In another example configuration, the image capture device 100 moves with the process component 106 and captures an image of a reference element. For example, in
A measured process variable can be transmitted over any appropriate process control loop and communicated to a control system as desired. This allows standard control and monitoring systems, including a safety system overlay or the like, to utilize an image capture device 100. The use of an image capture device 100 provides a number of advantages over many competing technologies, including a less invasive configuration, low cost, low power, simplified installation and limited user configuration.
Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention. Although the term “image” has been used herein, the present invention may utilize radiation of any appropriate frequency or frequency range, including visible light, infrared and ultraviolet radiation. In general aspects, a process variable is measured using relative motion of a process component. This includes observing motion in which the amount or manner of the motion is related to a process variable, motion in which the rate or speed of the motion is related to a process variable, as well as configurations in which the shape or contours of a process component change as a function of a process variable, among others. As used herein, the term “motion” includes displacement type motion, deformation or shape changes, vibrations, repetitive motion, linear or non-linear motion, etc. The value of a process variable can be correlated to the detected motion using empirical or modeling techniques as desired; any appropriate technique may be employed including, for example, curve fitting, neural networks or the like.
Number | Name | Date | Kind |
---|---|---|---|
3857277 | Moore | Dec 1974 | A |
4306457 | Fukui | Dec 1981 | A |
4736250 | Blazo | Apr 1988 | A |
4900161 | Wolf et al. | Feb 1990 | A |
4933545 | Saaski et al. | Jun 1990 | A |
4947247 | Farver | Aug 1990 | A |
5056046 | Mutchler | Oct 1991 | A |
5109277 | James | Apr 1992 | A |
5128537 | Halg | Jul 1992 | A |
5144430 | Boelart | Sep 1992 | A |
5292195 | Crisman, Jr. | Mar 1994 | A |
5404218 | Nave et al. | Apr 1995 | A |
5619046 | Engstrom et al. | Apr 1997 | A |
5638174 | Henderson | Jun 1997 | A |
5654977 | Morris | Aug 1997 | A |
6000844 | Cramer et al. | Dec 1999 | A |
6040191 | Grow | Mar 2000 | A |
6059453 | Kempf et al. | May 2000 | A |
6259810 | Gill et al. | Jul 2001 | B1 |
6346704 | Kenway | Feb 2002 | B2 |
6461573 | Yamamoto et al. | Oct 2002 | B1 |
6484585 | Sittler et al. | Nov 2002 | B1 |
6518744 | Tallman | Feb 2003 | B1 |
6573331 | Camberlin | Jun 2003 | B1 |
6631287 | Newman et al. | Oct 2003 | B2 |
6820487 | Esashi et al. | Nov 2004 | B2 |
6901101 | Frick | May 2005 | B2 |
6941813 | Boukhny et al. | Sep 2005 | B2 |
6967571 | Tsujita | Nov 2005 | B2 |
7019630 | Katou | Mar 2006 | B2 |
7060965 | Vidovic et al. | Jun 2006 | B2 |
7208735 | Sierra et al. | Apr 2007 | B2 |
7248297 | Catrysse et al. | Jul 2007 | B2 |
7372485 | Bodnar et al. | May 2008 | B1 |
7407323 | Hutcherson | Aug 2008 | B2 |
7409867 | Toy et al. | Aug 2008 | B2 |
7466240 | Evans et al. | Dec 2008 | B2 |
7472215 | Mok et al. | Dec 2008 | B1 |
7636114 | Aoyama | Dec 2009 | B2 |
7680460 | Nelson et al. | Mar 2010 | B2 |
7768425 | Evans et al. | Aug 2010 | B2 |
7809379 | Hedtke et al. | Oct 2010 | B2 |
7852271 | Grunig et al. | Dec 2010 | B2 |
7852383 | Harada | Dec 2010 | B2 |
8098302 | Fukuda et al. | Jan 2012 | B2 |
8108790 | Morrison, Jr. et al. | Jan 2012 | B2 |
8121078 | Siann et al. | Feb 2012 | B2 |
8191005 | Baier et al. | May 2012 | B2 |
8208752 | Ishii | Jun 2012 | B2 |
8310541 | Moore | Nov 2012 | B2 |
8410946 | Ansari et al. | Apr 2013 | B2 |
8538560 | Brown et al. | Sep 2013 | B2 |
8706448 | Orth | Apr 2014 | B2 |
8898036 | Sittler et al. | Nov 2014 | B2 |
9019108 | Chillar et al. | Apr 2015 | B2 |
9049239 | Kenney et al. | Jun 2015 | B2 |
9201414 | Kantzes et al. | Dec 2015 | B2 |
9201419 | Timsjo et al. | Dec 2015 | B2 |
9247374 | Tomimatsu et al. | Jan 2016 | B2 |
9537699 | Kenney et al. | Jan 2017 | B2 |
9696429 | Turon et al. | Jul 2017 | B2 |
20010042834 | Kenway | Nov 2001 | A1 |
20030027949 | Yamamoto et al. | Feb 2003 | A1 |
20040041538 | Sklovsky | Mar 2004 | A1 |
20040156549 | Persiantsev | Aug 2004 | A1 |
20040218099 | Washington | Nov 2004 | A1 |
20040233458 | Frick | Nov 2004 | A1 |
20050008072 | Angerer | Jan 2005 | A1 |
20050012817 | Hampapur et al. | Jan 2005 | A1 |
20050025368 | Glukhovsky | Feb 2005 | A1 |
20050063444 | Frick | Mar 2005 | A1 |
20050111696 | Baer | May 2005 | A1 |
20050164684 | Chen et al. | Jul 2005 | A1 |
20050220331 | Kychakoff et al. | Oct 2005 | A1 |
20060026971 | Sharpe | Feb 2006 | A1 |
20060092153 | Chu et al. | May 2006 | A1 |
20060148410 | Nelson | Jul 2006 | A1 |
20060278827 | Sierra et al. | Dec 2006 | A1 |
20070019077 | Park | Jan 2007 | A1 |
20070052804 | Money et al. | Mar 2007 | A1 |
20070073439 | Habibi et al. | Mar 2007 | A1 |
20070125949 | Murata et al. | Jun 2007 | A1 |
20080165195 | Rosenberg | Jul 2008 | A1 |
20080278145 | Wenger | Nov 2008 | A1 |
20090078047 | Dam | Mar 2009 | A1 |
20090249405 | Karaoguz et al. | Oct 2009 | A1 |
20090285259 | Allen et al. | Nov 2009 | A1 |
20100013918 | Ta 'Eed | Jan 2010 | A1 |
20100220180 | Lee et al. | Sep 2010 | A1 |
20110230942 | Herman et al. | Sep 2011 | A1 |
20110317066 | Cabman et al. | Dec 2011 | A1 |
20120025081 | Rapp et al. | Feb 2012 | A1 |
20120041582 | Wallace | Feb 2012 | A1 |
20120109342 | Braun et al. | May 2012 | A1 |
20120157009 | Hollander | Jun 2012 | A1 |
20120161958 | Turon et al. | Jun 2012 | A1 |
20130009472 | Orth | Jan 2013 | A1 |
20130085688 | Miller et al. | Apr 2013 | A1 |
20130099922 | Lohbihler | Apr 2013 | A1 |
20130120561 | Heintze | May 2013 | A1 |
20130163812 | Mukasa | Jun 2013 | A1 |
20130176418 | Pandey et al. | Jul 2013 | A1 |
20130222608 | Baer | Aug 2013 | A1 |
20130250125 | Garrow et al. | Sep 2013 | A1 |
20130294478 | Puroll et al. | Nov 2013 | A1 |
20140003465 | Elke | Jan 2014 | A1 |
20140128118 | Tomimatsu et al. | May 2014 | A1 |
20150116482 | Bronmark et al. | Apr 2015 | A1 |
20150130927 | Luxen et al. | May 2015 | A1 |
20160091370 | Schnaare | Mar 2016 | A1 |
Number | Date | Country |
---|---|---|
1882078 | Dec 2006 | CN |
101014091 | Aug 2007 | CN |
101019419 | Aug 2007 | CN |
101277383 | Oct 2008 | CN |
101460971 | Jun 2009 | CN |
201322868 | Oct 2009 | CN |
101600046 | Dec 2009 | CN |
101647216 | Feb 2010 | CN |
101681161 | Mar 2010 | CN |
101685295 | Mar 2010 | CN |
1012483618 | May 2012 | CN |
102830669 | Dec 2012 | CN |
102999022 | Mar 2013 | CN |
103380446 | Oct 2013 | CN |
103947170 | Jul 2014 | CN |
204350309 | May 2015 | CN |
0 423 903 | Apr 1991 | EP |
1 026 493 | Aug 2000 | EP |
1 244 899 | Dec 2000 | EP |
52-140779 | Nov 1977 | JP |
53-86111 | Jul 1978 | JP |
S58-090882 | May 1983 | JP |
61-136340 | Jun 1986 | JP |
62-179647 | Aug 1987 | JP |
64-73880 | Mar 1989 | JP |
H07-325900 | Dec 1995 | JP |
H09-265316 | Oct 1997 | JP |
H10-294933 | Nov 1998 | JP |
11-23350 | Jan 1999 | JP |
H11-75176 | Mar 1999 | JP |
11-189603 | Jul 1999 | JP |
2001-84031 | Mar 2001 | JP |
2001-221666 | Aug 2001 | JP |
2001-238198 | Aug 2001 | JP |
2001-256475 | Sep 2001 | JP |
2002-300569 | Oct 2002 | JP |
2004-288092 | Oct 2004 | JP |
2006-031418 | Feb 2006 | JP |
2007-108836 | Apr 2007 | JP |
2008-527493 | Jul 2008 | JP |
2008-257513 | Oct 2008 | JP |
2009-210042 | Sep 2009 | JP |
2012-175631 | Sep 2010 | JP |
2010-536092 | Nov 2010 | JP |
2010-283444 | Dec 2010 | JP |
2011-185926 | Sep 2011 | JP |
2011-209033 | Oct 2011 | JP |
2012-037519 | Feb 2012 | JP |
2013-009079 | Jan 2013 | JP |
2013-533570 | Aug 2013 | JP |
2014-523033 | Sep 2014 | JP |
2419926 | May 2011 | RU |
I220364 | Aug 2004 | TW |
WO 0159419 | Aug 2001 | WO |
WO 0223148 | Mar 2002 | WO |
WO 2004011935 | Feb 2004 | WO |
WO 2005033643 | Apr 2005 | WO |
WO 2006092052 | Sep 2006 | WO |
WO 2007-019676 | Feb 2007 | WO |
WO 2006081154 | Sep 2007 | WO |
WO 2007-139123 | Dec 2007 | WO |
WO 2008136752 | Nov 2008 | WO |
WO 2009074708 | Jun 2009 | WO |
WO 2011004020 | Jan 2011 | WO |
WO 2011137264 | Nov 2011 | WO |
WO 2013006307 | Jan 2013 | WO |
WO 2013009715 | Jan 2013 | WO |
Entry |
---|
Stephens et al., “Heat transfer performance for batch oscillatory flow mixing”, Elsevier, 2002. |
“Notification of Transmittal of the International Search Report and the Written Opinion” for PCT/US2007/012050, dated Feb. 4, 2008. |
Journal of Lightwave Technology, vol. 19, No. 10, Oct. 2001, “Self-Calibrated Interferometric-Intensity-Based Optical Fiber Sensors”, Wang et al., pp. 1495-1501. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, PCT/US2014/069968, dated Mar. 19, 2015. |
Invitation to Pay Additional Fees, PCT/US2014/051628, dated Nov. 25, 2014. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, PCT/US2014/051628, dated Apr. 13, 2015. |
Office Action from related Chinese Patent Application No. 201320868039.6, dated May 19, 2014 (2 pages). |
First Correct Notification for Chinese Patent Application No. 201420426405.7, dated Oct. 31, 2014, 4 pages. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, PCT/US2014/051625, dated Oct. 23, 2014. |
Hardesty, Larry (MIT News Office), “Researchers amplify variations in video, making the invisible visible,” MIT News, dated Jun. 22, 2012, 3 pgs. Found at http://web.mit.edu/newsoffice/2012/amplifying-invisible-video-0622.html. |
Office Action from U.S. Appl. No. 14/224,814, dated Jun. 15, 2016. |
Office Action from U.S. Appl. No. 14/037,989, dated Jun. 3, 2016. |
Office Action from European Application Serial No. 14761468.9, dated May 4, 2016. |
Office Action from U.S. Appl. No. 14/038,090, dated Jun. 28, 2016. |
Office Action from European Application Serial No. 14761467.1, dated May 4, 2016. |
Office Action from Russian Application Serial No. 2016116020, dated May 31, 2016. |
Office Action from European Application Serial No. 14783924.5, dated Jun. 3, 2016. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, PCT/US2014/051432, dated Jan. 12, 2015. |
Office Action from U.S. Appl. No. 14/224,814, dated Jul. 8, 2015. |
Office Action from U.S. Appl. No. 14/224,858, dated Jun. 12, 2015. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, PCT/US2015/011958, dated May 18, 2015. |
“Integrated Wireless Gas Detection Solution”, www.gassecure.com, Jun. 2014, 2 pgs. |
“GS01 Wireless Gas Detector”, www.gassecure.com, Jun. 2014, 2 pgs. |
Office Action from U.S. Appl. No. 14/224,858, dated Oct. 2, 2015. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, PCT/US2015/040310, dated Nov. 11, 2015. |
Final Office Action from U.S. Appl. No. 14/224,814, dated Feb. 1, 2016. |
Office Action from U.S. Appl. No. 14/224,858, dated Feb. 5, 2016. |
Office Action from Chinese Application Serial No. 201310737591.6, dated Oct. 17, 2016. |
Examination Report from Australian Application Serial No. 2014328576, dated Jul. 21, 2016. |
Examination Report from Australian Application Serial No. 2014328666, dated Oct. 11, 2016. |
Office Action from Chinese Application Serial No. 201410024656.7, dated Oct. 9, 2016. |
“The World's Smallest High-Temperature Pressure Sensor is Developed by Researchers at Virginia Tech's Center for Photonics Technology”, http://www.ee.vt.edu/˜photonics/newsStorysmallestsensor.html, one page, Dec. 27, 2005. |
Office Action from Canadian Application Serial No. 2,923,153, dated Jan. 16, 2017. |
Office Action from Japanese Patent Application No. 2016-516983; dated Mar. 8, 2017. |
Office Action from Chinese Patent Application No. 201410061865.9 dated Oct. 28, 2016. |
Communication from European Patent Application No. 15706956.8, dated Nov. 7, 2016. |
Office Action from Canadian Application Serial No. 2,923,156, dated Feb. 2, 2017. |
Office Action from U.S. Appl. No. 14/037,989, dated Feb. 10, 2017. |
Office Action from U.S. Appl. No. 14/499,719, dated Mar. 23, 2017. |
Office Action (including Search Report) from Russian Application Serial No. 2016116020, dated Feb. 10, 2017. |
Office Action from Australian Patent Application No. 2014328576, dated Feb. 24, 2017. |
Office Action from Canadian Patent Application No. 2,923,159, dated Mar. 7, 2017. |
Office Action from Japanese Patent Application No. 2016-516988, dated Mar. 24, 2017. |
Office Action from Canadian Patent Application No. 2,923,159 dated May 19, 2017. |
Office Action from Chinese Patent Application No. 201410366848.6, dated Feb. 24, 2017. |
Communication from European Patent Application No. 15744804.4, dated May 9, 2017. |
Office Action from Chinese Patent Application No. 201310737591.6, dated Jun. 1, 2017. |
Examination Report No. 2 from Australian Patent Application No. 2014328666, dated Jun. 16, 2017. |
“ADNS-5090 Low Power Optical Mouse Sensor Data Sheet”, Avago Technologies, dated Apr. 25, 2012. |
Office Action from Chinese Patent Application No. 201410024656.7, dated Jun. 8, 2017. |
Office Action from Japanese Patent Application No. 2016-517425, dated Jun. 6, 2017. |
Office Action from Chinese Patent Application No. 201410061865.9, dated Jun. 9, 2017. |
Third Examination Report from Australian Patent Application No. 2014328576, dated Jun. 29, 2017. |
Office Action from U.S. Appl. No. 14/038,090, dated Jul. 28, 2017. |
Office Action from Canadian Patent Application No. 2,943,542, dated Jul. 31, 2017. |
Office Action from Russian Patent Application No. 2016116017, dated Jun. 8, 2017. |
Office Action from Russian Patent Application No. 2016116039, dated Jul. 13, 2017. |
Third Examination Report from Australian Patent Application No. 2014328666, dated Oct. 10, 2017. |
Office Action from Japanese Patent Application No. 2016-558794, dated Oct. 24, 2017. |
Examination Report from Australian Patent Application No. 2015324515 dated Sep. 4, 2017. |
Office Action from Chinese Patent Application No. 201410366848.6, dated Nov. 6, 2017. |
Office Action from U.S. Appl. No. 14/499,719, dated Oct. 6, 2017. |
Final Rejection from Japanese Patent Application No. 2016-516988, dated Nov. 8, 2017, 11 pages. |
Office Action from Canadian Patent Application No. 2,923,156, dated Nov. 30, 2017. |
Office Action from Canadian Patent Application No. 2,957,246, dated Dec. 8, 2017. |
Final Office Action from U.S. Appl. No. 14/038,090, dated Jan. 24, 2018, 33 pages. |
Office Action from Chinese Patent Application No. 201310737591.6, dated Nov. 29, 2017. |
Office Action from Japanese Patent Application No. 2016-516983, dated Dec. 6, 2017. |
Office Action from Canadian Patent Application No. 2,923,153, dated Dec. 13, 2017. |
Office Action from Canadian Patent Application No. 2,923,153, dated Aug. 24, 2018. |
Office Action from Chinese Patent Application No. 201410831781.9, dated Nov. 28, 2017, 13 pages. |
Office Action from Chinese Patent Application No. 201410024656.7 dated Dec. 28, 2017. |
Office Action from Japanese Patent Application No. 2016-517425, dated Jan. 9, 2018. |
Office Action from Japanese Patent Application No. 2017-516333, dated Mar. 20, 2018. |
Office Action from U.S. Appl. No. 14/037,989, dated Dec. 29, 2017. |
Office Action from Russian Patent Application No. 2017114674, dated May 31, 2018. |
Office Action from Canadian Patent Application No. 2,957,246, dated Jul. 30, 2018. |
Office Action from Japanese Patent Application No. 2017-516333, dated Jul. 31, 2018. |
Office Action from Chinese Patent Application No. 201310737591.6, dated May 24, 2018. |
Office Action from U.S. Appl. No. 14/037,989, dated Aug. 16, 2018. |
Office Action from U.S. Appl. No. 14/038,090, dated Aug. 9, 2018. |
Office Action from Canadian Patent Application No. 2,923,156, dated Jun. 19, 2018. |
Office Action from Chinese Patent Application No. 201410024656.7, dated Sep. 20, 2018. |
Office Action from Chinese Patent Application No. 201410831781.9, dated Aug. 9, 2018. |
Office Action from U.S. Appl. No. 14/499,719, dated Jul. 9, 2018. |
Office Action from U.S. Appl. No. 14/037,989, dated Nov. 29, 2018. |
Office Action from Russian Patent Application No. 2017114674, dated Oct. 26, 2018. |
Trial Decision from Japanese Patent Application No. 2016-516988 (Appeal No. 2017-18657), dated Oct. 31, 2018. |
Examination Report from Indian Patent Application No. 201627005256, dated Dec. 22, 2018. |
Final Office Action from U.S. Appl. No. 14/038,090, dated Feb. 7, 2019. |
Communication from European Patent Application No. 15744804.4, dated Jan. 31, 2019. |
Office Action from Chinese Patent Application No. 201410831781.9, dated Mar. 4, 2019. |
Office Action from Japanese Patent Application No. 2017-516333, dated Dec. 18, 2018. |
Office Action from Canadian Patent Application No. 2,923,156, dated Mar. 21, 2019. |
Office Action from Canadian Patent Application No. 2,923,153 dated Mar. 21, 2019. |
Examination Report from Indian Patent Application No. 201627004690, dated Mar. 27, 2019. |
Office Action from U.S. Appl. No. 14/037,989, dated Jun. 6, 2019. |
Office Action from U.S. Appl. No. 14/038,090, dated Jun. 28, 2019. |
Communication from European Patent Application No. 14783924.5, dated Jan. 2, 2020. |
Office Action from Japanese Patent Application No. 2018-004260, dated May 28, 2019. |
Communication from European Patent Application No. 14761467.1, dated May 29, 2019. |
Office Action from Chinese Patent Application No. 201410024656.7, dated Jun. 20, 2019. |
Office Action from U.S. Appl. No. 14/499,719, dated Aug. 21, 2019. |
Office Action from U.S. Appl. No. 14/037,989, dated Sep. 17, 2019. |
Office Action from Australian Patent Application No. 2018222951, dated Jul. 12, 2019. |
Office Action from Chinese Patent Application No. 201410831781.9, dated Sep. 18, 2019. |
Appeal Decision from Japanese Patent Application No. 2016-517425, dated Oct. 29, 2019. |
Examination Report from Indian Patent Application No. 201627004614, dated Dec. 12, 2019. |
Communication from European Patent Application No. 14761468.9, dated Nov. 7, 2019. |
Communication from European Patent Application No. 14761467.1, dated Dec. 5, 2019. |
Office Action from European Patent Application No. 14783924.5, dated Mar. 16, 2018. |
Number | Date | Country | |
---|---|---|---|
20150085104 A1 | Mar 2015 | US |