The technology disclosed herein may be applicable to laboratory instruments.
From time to time, laboratory instruments, such as instruments used to analyze blood samples from patients, may experience issues that are beyond the ability of the individuals at the location of the instrument to solve. To address this, external service personnel having greater expertise may be involved to provide either onsite or remote support. However, the involvement of these external service personnel may be complicated by the fact that the laboratory instruments may generate or display highly sensitive data that should not be exposed beyond a highly select population, and that population may not include the service personnel. Accordingly, there is a need for technology that can allow external service personnel to assist in resolving issues related to laboratory equipment while preventing those service personnel from being exposed to highly sensitive data.
Embodiments disclosed herein may be used to implement methods and machines for providing service to laboratory instruments without inappropriately exposing sensitive information. For example, embodiments disclosed herein may be used to perform a method which comprises capturing an image of a field of view of a wearer of a head mounted camera and generating a modified image by performing a set of steps. Such steps may include identifying one or more display screens in the wearer's field of view. Such steps may also include, for each of the one or more display screens, determining whether an interface is displayed on that display screen that matches an item from a predefined interface library. Such steps may also include, for each displayed interface with a matching item from the predefined interface library, either overlaying an image corresponding to that interface from the predefined interface library or masking confidential information in that interface based on information from the predefined interface library indicating one or more confidential information locations in that interface. Such a method may also include presenting the modified image, and such presentation may comprise transmitting the modified image to a remote service technician or displaying the modified image on a display of an augmented reality headpiece on which the head mounted camera is integrated.
Other embodiments are also possible. For example, the disclosed technology may be used to implement a system comprising a head mounted camera and a processor. In such a system, the processor may be configured with a set of computer instructions operable to, when executed, determine whether images captured by the head mounted camera should be made available for non-immediate viewing. In some embodiments, such a determination may be made after the head mounted camera has been activated. Additionally, in some embodiments of this type of system, the computer instructions configuring the processor may be operable to make an image captured by the head mounted camera available for non-immediate viewing if and only if a determination was made that images captured by the head mounted camera should be made available for non-immediate viewing prior to that image being captured, and no determination was made that images captured by the head mounted camera should not be made available for non-immediate viewing more recently than the most recent determination that images captured by the head mounted camera should be made available for non-immediate viewing.
Further information on how the disclosed technology could potentially be implemented is set forth herein, and variations on the examples will be immediately apparent to and could be practiced without undue experimentation by those of ordinary skill in the art based on the material which is set forth in this document. Accordingly, exemplary methods and machines described in this summary should be understood as being illustrative only, and should not be treated as limiting the scope of protection provided by this or any related document.
The technology disclosed herein can be used to address problems related to servicing laboratory equipment while preventing exposure of confidential information.
According to a first aspect, some embodiments may include a method comprising capturing an image of a field of view of a wearer of a head mounted camera, automatically generating a modified image in real time, and presenting the modified image. In such methods, automatically generating the modified image may comprise steps such as identifying one or more portions of the image of the field of view of the wearer of the head mounted camera to be masked and masking each of the one or more identified portions. Additionally, in methods such as referred to above, presenting the modified image may comprise transmitting the modified image to a remote service technician, displaying the modified image on a display of an augmented reality headpiece wherein the head mounted camera is integrated with the augmented reality headpiece, or displaying the modified image on a display of a laboratory instrument.
In some embodiments such as described in the context of the first aspect, the step of identifying one or more portions of the image of the field of view of the wearer of the head mounted camera to be masked may comprise a set of steps. Such steps may include identifying one or more display screens in the wearer's field of view. Such steps may also comprise, for each of the one or more display screens, determining whether an interface is displayed on that display screen that matches an item from a predefined interface library. Such steps may also include, for each displayed interface with a matching item from the predefined interface library, identifying one or more portions of that interface as portions of the image to be masked based on information from the matching item from the predefined interface library. In some such embodiments, masking each of the one or more identified portions may comprise, for each of the one or more display screens having a matching item from the predefined interface library, overlaying an image corresponding to that interface from the predefined interface library.
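For illustration only, the interface-library matching and masking described above might be sketched as follows in Python (assuming the OpenCV and NumPy libraries; the InterfaceLibraryItem structure, the template-matching approach, and the 0.8 threshold are assumptions of this sketch rather than requirements of the disclosure):

```python
# A minimal sketch, not the disclosed implementation: match a captured screen
# region against a predefined interface library, then overlay or mask it.
import cv2
import numpy as np


class InterfaceLibraryItem:
    def __init__(self, template, generic_image, confidential_rects):
        self.template = template            # grayscale exemplar of the interface
        self.generic_image = generic_image  # scrubbed overlay image, or None
        self.confidential_rects = confidential_rects  # [(x, y, w, h), ...]


def mask_screen_region(screen_bgr, library, match_threshold=0.8):
    """Return a copy of the screen region with any matched interface masked."""
    gray = cv2.cvtColor(screen_bgr, cv2.COLOR_BGR2GRAY)
    out = screen_bgr.copy()
    for item in library:
        scores = cv2.matchTemplate(gray, item.template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(scores)
        if max_val < match_threshold:
            continue  # this library item does not appear on the screen
        x, y = max_loc
        h, w = item.template.shape
        if item.generic_image is not None:
            # Overlay the entire interface with a generic, scrubbed image.
            out[y:y + h, x:x + w] = cv2.resize(item.generic_image, (w, h))
        else:
            # Mask only the confidential information locations recorded
            # in the library for this interface.
            for rx, ry, rw, rh in item.confidential_rects:
                cv2.rectangle(out, (x + rx, y + ry),
                              (x + rx + rw, y + ry + rh), (0, 0, 0), -1)
        break
    return out
```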
In some embodiments such as described in the context of the first aspect, identifying one or more portions of the image of the field of view of the wearer of the head mounted camera to be masked may comprise determining a focal distance of the head mounted camera and identifying each portion of the image of the field of view of the wearer of the head mounted camera that is farther from the head mounted camera than the identified focal distance as a portion of the image of the field of view of the wearer of the head mounted camera to be masked.
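A minimal sketch of this focal-distance approach, assuming a per-pixel depth map is available (e.g., from a stereo or time-of-flight sensor; the disclosure does not specify a depth source):

```python
import numpy as np


def mask_beyond_focal_distance(image, depth_map, focal_distance_m):
    """Black out every pixel estimated to be farther from the camera than
    the camera's current focal distance.

    `depth_map` is assumed to hold per-pixel distances in meters with the
    same height and width as `image`.
    """
    masked = image.copy()
    masked[depth_map > focal_distance_m] = 0
    return masked
```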
In some embodiments such as described in the context of the first aspect, the method might comprise, prior to capturing the image of the field of view of the wearer of the head mounted camera, providing one or more notation media exemplar images. Further, in such an embodiment, identifying one or more portions of the image of the field of view of the wearer of the head mounted camera to be masked may comprise identifying one or more notation media in the wearer's field of view based on information from the notation media exemplar images, and identifying each notation media in the wearer's field of view as a portion of the image of the field of view of the wearer to be masked. Similarly, in some such embodiments, the information from the notation media exemplar images may comprise one or more notation media exemplar colors.
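For example, if the exemplar information amounted to one or more exemplar colors, the masking might be sketched as follows (assuming OpenCV; the HSV range below, approximating a yellow sticky note, is purely illustrative):

```python
import cv2
import numpy as np

# Color range that might be derived in advance from the notation media
# exemplar images (illustrative values only).
NOTATION_HSV_LO = np.array([20, 80, 80])
NOTATION_HSV_HI = np.array([35, 255, 255])


def mask_notation_media(image_bgr):
    """Black out regions whose color matches a notation media exemplar color."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, NOTATION_HSV_LO, NOTATION_HSV_HI)
    out = image_bgr.copy()
    out[mask > 0] = 0
    return out
```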
In some embodiments such as described in the context of the first aspect, the method may comprise, prior to capturing the image of the field of view of the wearer of the head mounted camera, specifying an imagable area of a laboratory. In such an embodiment, the method may also comprise, after capturing the image of the field of view of the wearer of the head mounted camera, determining (which determination may be based on one or more wireless transceivers located at a border of the imagable area, or based on distance from a wireless transceiver located inside the imagable area) whether the head mounted camera is located in the imagable area of the laboratory. Additionally, in some embodiments where presenting the modified image comprises transmitting the modified image to a remote service technician, based on a determination that the head mounted camera is not located in the imagable area of the laboratory, the method may include automatically deactivating transmission functionality of the head mounted camera. In some such embodiments, deactivating transmission functionality of the head mounted camera may be done by deactivating the head mounted camera.
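One way the distance-based variant of this check might be sketched is shown below; the log-distance path loss constants and the camera's disable_transmission()/enable_transmission() methods are assumptions of this sketch, not part of the disclosure:

```python
def estimated_distance_m(rssi_dbm, rssi_at_1m_dbm=-45.0, path_loss_exponent=2.0):
    """Rough distance estimate from received signal strength using a
    log-distance path loss model; the constants would need calibration."""
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))


def update_transmission_state(camera, rssi_dbm, imagable_radius_m):
    """Deactivate transmission functionality whenever the head mounted
    camera appears to be outside the imagable area of the laboratory."""
    if estimated_distance_m(rssi_dbm) > imagable_radius_m:
        camera.disable_transmission()  # hypothetical camera API
    else:
        camera.enable_transmission()
```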
In some embodiments such as described in the context of the first aspect, presenting the modified image may comprise transmitting the modified image to a remote service technician using an internet connection. Additionally, in some such embodiments, the method may comprise, simultaneously with transmitting the modified image to a remote service technician, displaying the image of the field of view of the wearer of the head mounted camera on a display of an augmented reality headpiece. In some such embodiments, the head mounted camera may be integrated with the augmented reality headpiece. Additionally, in some embodiments the head mounted camera may be comprised by a pair of instrumented safety glasses.
In some embodiments such as described in the context of the first aspect, the image of the field of view of the wearer of the head mounted camera may be captured as part of a video stream. In some such embodiments, the method may comprise transmitting the modified image to the remote service technician, and that transmission may be performed by transmitting a version of the video stream that includes the modified image rather than the captured image.
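A trivial sketch of substituting modified frames into the transmitted stream (generate_modified_image stands in for whatever masking pipeline an embodiment uses):

```python
def modified_stream(captured_frames, generate_modified_image):
    """Yield a version of the video stream in which each captured frame
    is replaced by its modified counterpart before transmission."""
    for frame in captured_frames:
        yield generate_modified_image(frame)
```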
In some embodiments such as described in the context of the first aspect, the automatic generation of the modified image may be completed no more than ten seconds after capturing the image of the field of view of the wearer of the head mounted camera. In some such embodiments, the delay between capturing the field of view of the wearer of the head mounted camera and completion of generation of the modified image may be no more than 5 seconds, no more than 1 second, no more than 0.5 second, or no more than 0.1 second. Additionally, in some embodiments such as described in the context of the first aspect, the image of the field of view of the wearer of the head mounted camera may include confidential information outside of the one or more portions identified as portions to be masked.
According to a second aspect, some embodiments may include a system comprising a head mounted camera and a processor. In such embodiments, the processor may be configured with a set of computer instructions operable to, when executed, perform a variety of steps. Such steps may include, after the head mounted camera has been activated, determining whether images captured by the head mounted camera should be made available for non-immediate viewing and making an image captured by the head mounted camera available for non-immediate viewing if and only if various conditions are satisfied. Such conditions may include, in some embodiments, that a determination was made that that image should be made available for non-immediate viewing, and/or that both of the following statements are true: (1) a determination was made that images captured by the head mounted camera should be made available for non-immediate viewing prior to that image being captured and (2) no determination was made that images captured by the head mounted camera should not be made available for non-immediate viewing more recently than the most recent determination that images captured by the head mounted camera should be made available for non-immediate viewing.
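The second ("if and only if") condition operates like a latch: an image may be released only if the most recent determination made before it was captured was an "allow" determination. A minimal sketch, with illustrative names and timestamps in seconds:

```python
class AvailabilityGate:
    """Tracks allow/deny determinations and applies the release rule."""

    def __init__(self):
        self.last_allow_time = None
        self.last_deny_time = None

    def record_determination(self, allow, timestamp):
        if allow:
            self.last_allow_time = timestamp
        else:
            self.last_deny_time = timestamp

    def may_release(self, capture_time):
        if self.last_allow_time is None or self.last_allow_time > capture_time:
            return False  # no "allow" determination prior to capture
        if self.last_deny_time is not None and self.last_deny_time > self.last_allow_time:
            return False  # a "deny" is more recent than the last "allow"
        return True
```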
In some embodiments such as described in the context of the second aspect, the computer instructions may be operable to, when executed, determine whether images captured by the head mounted camera should be made available for non-immediate viewing by performing a set of steps comprising determining if the head mounted camera is located within a predefined area. Such a set of steps may also include, based on a determination that the head mounted camera is not located within the predefined area, determining that images captured by the head mounted camera should not be made available for non-immediate viewing. In some such embodiments, the system may comprise a wireless transceiver located inside the predefined area, and the determination that the head mounted camera is not located within the predefined area is based on a distance between the head mounted camera and the wireless transceiver. In some other such embodiments, the system may comprise one or more wireless transceivers located at a border of the predefined area, and the determination that the head mounted camera is not located within the predefined area is based on detection of the head mounted camera crossing the border of the predefined area.
In some embodiments such as described in the context of the second aspect, the system may comprise a laboratory instrument, and the computer instructions may be operable to, when executed, determine whether images captured by the head mounted camera should be made available for non-immediate viewing by performing a set of steps. Such steps may include determining an orientation of the head mounted camera relative to the laboratory instrument, and, based on a determination that the orientation of the head mounted camera is offset from the laboratory instrument by 90 degrees or more, determining that images captured by the head mounted camera should not be made available for non-immediate viewing.
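A sketch of the orientation test (headings in degrees; how the bearing to the instrument would be measured is left open by the disclosure):

```python
def blocked_by_orientation(camera_heading_deg, bearing_to_instrument_deg):
    """Return True when the camera's orientation is offset from the
    laboratory instrument by 90 degrees or more."""
    offset = abs(camera_heading_deg - bearing_to_instrument_deg) % 360.0
    if offset > 180.0:
        offset = 360.0 - offset  # take the smaller of the two arc angles
    return offset >= 90.0
```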
In some embodiments such as described in the context of the second aspect, the computer instructions may be operable to, when executed, determine whether images captured by the head mounted camera should be made available for non-immediate viewing by performing a set of steps comprising, based on a determination that elapsed time since a most recent occurrence of an authorization action is greater than a threshold duration, determining that images captured by the head mounted camera should not be made available for non-immediate viewing.
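This timeout test might be sketched as follows (using time.monotonic() as one plausible clock):

```python
import time


def blocked_by_inactivity(last_authorization_time, threshold_s):
    """Return True when the elapsed time since the most recent
    authorization action exceeds the threshold duration."""
    return (time.monotonic() - last_authorization_time) > threshold_s
```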
In some embodiments such as described in the context of the second aspect, the system may comprise a laboratory instrument configured to wirelessly communicate with the head mounted camera and/or to encrypt images captured by the head mounted camera.
In some embodiments such as described in the context of the second aspect, the head mounted camera may comprise an exterior light source. In such an embodiment, the head mounted camera may be configured to activate the exterior light source when the head mounted camera is activated and when a determination is made that images captured by the head mounted camera should be made available for non-immediate viewing. Similarly, in some such embodiments, the head mounted camera may be configured to deactivate the exterior light source when the head mounted camera is deactivated and when a determination is made that images captured by the head mounted camera should not be made available for non-immediate viewing.
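The light's behavior reduces to a simple rule — on exactly when the camera is active and images are releasable — as in this sketch (the light driver object with on()/off() methods is hypothetical):

```python
def update_exterior_light(camera_active, images_releasable, light):
    """Drive the exterior light so bystanders can tell when captured
    images may be made available for non-immediate viewing."""
    if camera_active and images_releasable:
        light.on()
    else:
        light.off()
```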
In a third aspect, some embodiments may include a machine comprising a head mounted camera and a means for generating modified images which lack confidential information included in images captured by the head mounted camera.
Various other aspects and embodiments are also possible and could be implemented by those of ordinary skill in the art without undue experimentation based on the disclosure set forth herein. Accordingly, the above discussion and the features from the examples set forth in this description should not be treated as implying limitations on the protection provided by this document or by any other document which claims the benefit of this disclosure.
Turning now to
It should be understood that, while
Of course, it should be understood that, like the configurations of components described above, instrumented safety glasses 101 such as shown in
Turning now to
It should be understood that, just as it is possible that some embodiments may use devices that vary from the instrumented glasses 101 shown in
Turning now to
As with the glasses 101 and environment of
Other types of variations in addition (or alternative) to optimization for video processing are also possible. For example, in some embodiments, an additional step might be included of determining whether an image should be provided for non-immediate viewing at all (i.e., made viewable other than locally in real time, such as by sending it to a remote service technician, or by saving it for later review), and conditioning the performance of steps such as shown in
Alternatively, in some embodiments image capture equipment (and/or various functionality of that equipment, such as image transmission or storage functionality) might be configured to only remain active as long as it was within a predetermined radius (which, preferably, would be configurable by a user) or was oriented within a set tolerance (e.g., within 90 degrees) of a set point in an imaging area, such as an access point 206 or a transceiver incorporated into the relevant piece of equipment 204. As another alternative, in some embodiments image capture equipment could be equipped with navigation measurement equipment such as one or more accelerometers and/or GPS receiver and could use that equipment to determine whether its current position in a building was within an acceptable imaging area. Non-location-based approaches to automatically activating/deactivating image capture, processing and/or provision functionality could also (or alternatively) be included in some embodiments. For example, in some embodiments, software controlling operation of a pair of instrumented safety glasses could be configured to automatically check captured images for a prespecified symbol and deactivate some or all of the functionality of those glasses (e.g., by ceasing transmission of images to a remote technician) unless the symbol was recognized. In such an embodiment, the relevant symbol would preferably be affixed to or displayed on a laboratory instrument that may need service, so that if the instrumented glasses' wearer did something unrelated to the machine after service was finished but without turning off the glasses, the glasses could automatically stop capturing images (or performing other tasks) based on the fact that the user was no longer looking at the machine.
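By way of illustration, the symbol check could be implemented with something as simple as template matching (assuming OpenCV; the 0.7 threshold and the transmitter object with send()/stop() methods are assumptions of this sketch):

```python
import cv2


def symbol_visible(frame_bgr, symbol_template_gray, threshold=0.7):
    """Check a captured frame for the prespecified symbol affixed to or
    displayed on the laboratory instrument."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(gray, symbol_template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(scores)
    return max_val >= threshold


def gate_transmission(frame_bgr, symbol_template_gray, transmitter):
    """Cease transmission to the remote technician unless the symbol is
    recognized in the current frame."""
    if symbol_visible(frame_bgr, symbol_template_gray):
        transmitter.send(frame_bgr)
    else:
        transmitter.stop()
```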
As another example of a type of variation that could be present in some embodiments, in some cases a method such as shown in
Of course, it should be understood that the advance configuration steps described above are also intended to be illustrative, and should not be treated as limiting. As an example of an alternative approach to advance configuration, consider the scenario in which instrumented safety glasses were provided by a manufacturer of laboratory instruments to assist the manufacturer in providing service to its customers during those instruments' operational lives. In such a scenario, the manufacturer might maintain a library of interfaces that would be presented by its machines, and when a pair of instrumented glasses captured an image of a machine, interfaces presented by the machine could be matched against the library. Then, when a match was found, the picture of the interface in a captured image could be overlaid with a generic interface image from the library or specific aspects of the interface in the captured image could be masked based on information in the library indicating locations in the relevant interface where confidential information would be displayed. An example of this type of selective masking is shown in
As yet another variation, in some embodiments, this type of predefined interface library may be configurable and/or created by a user rather than by an instrument manufacturer. For example, a user may be provided with an interface through which he or she could upload examples of interfaces that would be displayed by instruments in his or her lab, and could then specify how those interfaces should be masked (e.g., by selecting one or more portions of the interface displaying information that can be used to identify, contact or locate an individual, or other confidential information; by specifying that the entire interface should be masked, etc.). Of course, it is also possible that, in some embodiments, a manufacturer could use this type of configuration interface as well, for example, to add interfaces presented by other manufacturers' machines to a predefined interface library. Accordingly, the preceding discussion of pre-configuration, like the discussion of the operation and components of a pair of instrumented glasses, should not be treated as implying limits on the scope of protection provided by this document or any other document claiming the benefit of this disclosure.
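The data a user-created entry would need to carry might look like the following (an illustrative shape only; the field names are assumptions):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class UserInterfaceEntry:
    """One user-created entry in a configurable interface library."""
    name: str
    exemplar_path: str  # uploaded screenshot of the interface
    mask_whole_interface: bool = False
    confidential_rects: List[Tuple[int, int, int, int]] = field(default_factory=list)
    generic_overlay_path: Optional[str] = None  # optional scrubbed replacement
```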
Another example of a type of variation which may be supported by some embodiments is functionality that could be adapted for allowing aspects of the disclosed technology to be utilized by technicians that are local rather than remote relative to the relevant laboratory instrument. For example, to prevent confidential information from being inadvertently exposed to an onsite service technician, such a technician could be required to wear virtual reality headgear that would block out his or her vision and replace it with a modified image generated in a manner such as described above, so that he or she could still service the relevant machine without the risk of breaking confidentiality. Similarly, in some embodiments, aspects of the disclosed technology could be used to allow a remote technician to provide guidance to the operator of a laboratory instrument, enabling him or her to perform service that would otherwise require a specialized technician. For example, if the operator of a laboratory instrument was outfitted with augmented or virtual reality headgear, in some embodiments a remote service technician could be provided an interface that could allow him or her to highlight portions of a field of view image captured by the headgear, and that highlighting could then be presented to the operator (potentially as an overlay on an image that had been scrubbed of confidential information to match what was presented to the remote service technician, or as an overlay on the original image, even though that image would differ from what was made available to the remote technician) as guidance for servicing the instrument. This type of approach could also be applied in contexts where no augmented or virtual reality headgear is available, for example by causing information that would otherwise be presented to a user of the laboratory instrument through augmented reality headgear to be presented instead (or in addition) on a display of the laboratory instrument itself, and/or on a display of a device proximate to the laboratory instrument (e.g., a tablet held by the operator, a local computer, etc.).
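The highlighting overlay described above might be sketched as follows (assuming OpenCV; representing highlights as (x, y, w, h) rectangles is an assumption of this sketch):

```python
import cv2


def apply_technician_highlights(image_bgr, highlight_rects,
                                color=(0, 255, 0), thickness=3):
    """Overlay the remote technician's highlight rectangles on the image
    presented to the local operator."""
    out = image_bgr.copy()
    for x, y, w, h in highlight_rects:
        cv2.rectangle(out, (x, y), (x + w, y + h), color, thickness)
    return out
```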
As another example of a type of alternative application of the disclosed technology, it is possible that images captured by a pair of instrumented glasses or other type of recording headgear could be saved and used as material for subsequent training and/or evaluation, with unnecessary material removed using approaches such as described previously so that such training and evaluation would not expose confidential information to those who should not have access to it. Accordingly, the discussion of using the disclosed technology to facilitate remote service while maintaining confidentiality should be understood as being illustrative only, and should not be treated as limiting.
Further variations on, features for, and potential implementations and applications of the inventors' technology will be apparent to, and could be practiced without undue experimentation by, those of ordinary skill in the art in light of this disclosure. Accordingly, neither this document, nor any document which claims the benefit of this document's disclosure, should be treated as being limited to the specific embodiments of the inventors' technology which are described herein.
As used herein, the singular forms “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise. The invention has now been described in detail for the purposes of clarity and understanding. However, it will be appreciated that certain changes and modifications may be practiced within the scope of the appended claims.
As used herein, the term “based on” means that something is determined at least in part by the thing that it is indicated as being “based on.” To indicate that something must be completely determined based on something else, it would be described as being based “exclusively” on whatever it is completely determined by.
As used herein, the term “camera” means a device for capturing and/or recording visual images. Examples of “cameras” include digital cameras that capture images using a charge coupled device and/or complementary metal-oxide-semiconductor sensor.
As used herein, a “computer” should be understood to refer to a device or group of devices (e.g., a device comprising a processor and a memory) capable of storing and executing instructions for performing one or more logical and/or physical operations on data to produce a result. A “computer” may include, for example, a single-core or multi-core microcontroller or microcomputer, a desktop, laptop or tablet computer, a smartphone, a server, or groups of the foregoing devices (e.g., a cluster of servers which are used in combination to perform operations on data for purposes such as redundancy and availability). In the claims, the word “server” should be understood as being a synonym for “computer,” and the use of different words should be understood as intended to improve the readability of the claims, and not to imply that a “server” is not a computer. Similarly, the various adjectives preceding the words “server” and “computer” in the claims are intended to improve readability, and should not be treated as limitations.
As used herein, the term “machine” refers to a device or combination of devices.
As used herein, “means for generating modified images which lack confidential information included in images captured by the head mounted camera” should be understood as a limitation set forth as a means for performing a specified function as provided for in 35 U.S.C. § 112(f), where the function is “generating modified images which lack confidential information included in images captured by the head mounted camera,” and the corresponding structure is a computer configured as described in paragraphs 34 and 37-39, and depicted in
As used herein, the term “network” refers to any collection of networks using standard protocols. For example, the term includes a collection of interconnected (public and/or private) networks that are linked together by a set of standard protocols (such as TCP/IP, HTTP, etc.) to form a global, distributed network. The term is also intended to encompass variations that may be made in the future, including changes and additions to existing standard protocols or integration with other media (e.g., television, radio, etc.).
As used herein, the term “sample” refers to any biological sample, and the phrase “biological sample” is meant to cover any specimen of biological material which has been isolated from its natural environment, such as the body of an animal or a human being. It can be in solid form such as tissues, bones, ligaments, and the like. It can also be in liquid form such as blood, spinal fluid, and the like.
As used herein, the term “set” refers to a number, group, or combination of zero or more things of similar nature, design, or function.
As used herein, modifiers such as “first,” “second,” and so forth are simply labels used to improve readability, and are not intended to imply any temporal or substantive difference between the items they modify. For example, referring to items as a “first program” and a “second program” in the claims should not be understood to indicate that the “first program” is created first, or that the two programs would necessarily cause different things to happen when executed by a computer. Similarly, when used in the claims, the words “computer” and “server” should be understood as being synonyms, with the different terms used to enhance the readability of the claims and not to imply any physical or functional difference between items referred to using those different terms.
As used herein, “laboratory” should be understood as a facility in which experiments or tests are performed on materials such as samples of biological materials collected from people or animals for purposes of medical diagnosis or treatment.
This is a continuation of International Patent Application No. PCT/US19/056473, entitled “Service Glasses with Selective Data Provision,” filed Oct. 16, 2019, which claims the benefit of provisional patent application 62/758,147, titled “Service Glasses with Selective Data Provision,” filed in the United States Patent Office on Nov. 9, 2018, the disclosures of which are hereby incorporated by reference in their entirety.