This disclosure relates generally to instruments.
Generally, a laboratory environment may include laboratory instruments which perform various operations, as well as personnel operating those instruments. The operations of the laboratory instruments may be captured in the form of images and videos by an image capturing device, such as a camera, associated with a laboratory instrument. As a result, the image capturing device may capture images and videos of objects present in proximity of the laboratory instruments along with the images and videos of the operations themselves. The objects present in proximity of the laboratory instrument may include the face regions of one or more personnel in the laboratory environment, one or more pieces of equipment in the laboratory environment such as standalone analyzers and table-top analyzers, and one or more regions of that equipment such as consoles or display regions. Capturing images and videos of such objects may carry a risk of exposing confidential information, such as personal information of a user proximal to the laboratory instrument or information associated with one or more objects, thereby compromising data privacy and security.
The present disclosure provides a method, a system and a laboratory instrument for securing data associated with the objects in the laboratory environment, thereby overcoming this disadvantage.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the disclosure and should not be taken as an acknowledgement or any form of suggestion that this information forms prior art already known to a person skilled in the art.
One or more shortcomings of the prior art may be overcome, and additional advantages may be provided through embodiments of the present disclosure. Additional features and advantages may be realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
Embodiments of the present disclosure relate to a method for securing data associated with objects captured in a laboratory environment. In one embodiment, the method includes receiving images of a plurality of objects in the laboratory environment. In a further embodiment, the method includes identifying, from the images, one or more objects which match predefined target objects. The predefined target objects may include secure data. The method further includes applying a virtual masking object on or around the one or more objects matching the predefined target objects to secure the data associated with those objects. In some embodiments, the virtual masking object is applied on the one or more objects when the identified one or more objects are a face region of a user in the laboratory environment or an ID card of the user which contains confidential data of the user. In some embodiments, the virtual masking object is applied around the one or more objects when the identified one or more objects are equipment or regions of equipment which match the predefined target objects. In that case, the virtual masking object is applied at a predefined distance from a laboratory instrument configured with an image capturing device for capturing the images.
The foregoing summary is only illustrative in nature and is not intended to be in any way limiting on the embodiments disclosed herein. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same reference numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the teachings of this disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense. A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
In the present document, the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or implementation of the present subject matter described herein as “exemplary” is not necessarily construed to be as preferred or advantageous over other embodiments that may be disclosed.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been illustrated by way of example in the drawings and will be described in detail below. It should be understood, however, that this is not intended to limit the disclosure to the forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes” or any other variations thereof are intended to cover a non-exclusive inclusion, such that a setup, device or method that includes a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises . . . a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open-ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
According to one embodiment, a method of securing data associated with objects captured in a specific environment may be provided. In an example embodiment, such a specific environment may be a laboratory environment. Such a method may comprise receiving images of a plurality of objects, wherein the plurality of objects may be associated with a device. Such a method may comprise identifying, from the images, one or more objects matching with predefined target objects. Such a method may comprise applying a virtual masking object on the one or more objects matching with the predefined target objects for securing data associated with the one or more matching objects.
According to a further embodiment, a method such as described in the preceding paragraph may be configured to allow a user to mark the plurality of objects dynamically in real time. According to a further embodiment, a method such as described in the preceding paragraph may be provided in which applying the virtual masking object around the one or more objects may comprise identifying a predefined distance around an equipment in the specific environment comprising an image capturing device configured to capture the image of the plurality of objects in the specific environment. In such methods, applying the virtual masking object around the one or more objects may also comprise applying the virtual masking object at the predefined distance from the image capturing device. According to a further embodiment, a method such as described in the preceding paragraph may be provided in which applying the virtual masking object around the one or more objects comprises marking in real time by a user a predefined distance around an equipment in the specific environment comprising an image capturing device configured to capture the image of the plurality of objects in the specific environment. In such methods, applying the virtual masking object around the one or more objects may also comprise applying the virtual masking object at the predefined distance from the image capturing device.
According to a further embodiment, a method such as described in any of the preceding two paragraphs may be provided in which the one or more objects may comprise at least one of: a face region of one or more personnel in the specific environment or a region selected in real time by the user, an identification card of the one or more personnel in the specific environment, one or more pieces of equipment in the specific environment, and one or more regions of the one or more pieces of equipment.
According to a further embodiment, a method such as described in any of the preceding three paragraphs may be provided in which the one or more objects matching with the predefined target objects may be identified based on processing of the one or more objects using a machine learning technique.
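The matching step described in the preceding paragraph could be sketched, purely for illustration, as a threshold comparison between feature vectors of detected objects and stored target vectors. The vector values, function names and threshold below are assumptions for the sketch, not details taken from the disclosure.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def matches_target(object_features, target_features, threshold=0.9):
    # An object is treated as matching a predefined target when its
    # feature vector is sufficiently close to any stored target vector.
    return any(cosine_similarity(object_features, t) >= threshold
               for t in target_features)

# Hypothetical stored feature vectors for two predefined target objects.
targets = [[1.0, 0.0, 0.5], [0.2, 0.9, 0.1]]
print(matches_target([0.98, 0.05, 0.49], targets))  # True (close to first target)
print(matches_target([0.0, 0.0, 1.0], targets))     # False
```

In practice the feature vectors would come from a trained model rather than being hand-written, but the decision rule has the same shape.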
According to a further embodiment, a method such as described in any of the preceding four paragraphs may be provided in which an image of the plurality of objects in the laboratory environment may be a still image or a video.
According to a further embodiment, a method such as described in any of the preceding five paragraphs may be provided wherein the specific environment is a laboratory environment, and wherein the device is a laboratory instrument.
According to a further embodiment, a system for securing data associated with objects captured in a laboratory environment may be provided. Such a system may comprise a processor and a memory unit communicatively coupled to the processor and storing processor-executable instructions which, on execution, cause the processor to perform a method as described in the context of any of the preceding six paragraphs. According to another embodiment, a laboratory instrument may be provided which comprises at least one image capturing device to capture an image of a plurality of objects in a laboratory environment and a processor configured to perform a method such as described in the context of any of the preceding six paragraphs.
Embodiments disclosed herein may include a method, a system and a laboratory instrument (generally referred to as diagnostic instruments) for securing data associated with objects in a laboratory environment. In some embodiments, the objects may include, but are not limited to, equipment such as laboratory instruments, one or more personnel in the laboratory environment, and one or more regions of the equipment. In some embodiments, the data may be personal information associated with the one or more personnel and confidential data associated with the equipment. In some embodiments, the laboratory instrument may include, but is not limited to, a diagnostic instrument and a non-diagnostic instrument. In a further embodiment, the laboratory instrument may include health-care-related instruments. In some embodiments, the phrase “one or more personnel” and the word “personnel” may be used interchangeably. The laboratory instrument may be associated with one or more devices to perform at least one operation of the laboratory instrument. In some embodiments, at least one image capturing device and a processor may be coupled to the laboratory instrument. In some embodiments, the phrase “at least one image capturing device” and the phrase “image capturing device(s)” may be used interchangeably. In some embodiments, the image capturing device(s) may be configured to capture at least one of images and videos during operations of the laboratory instrument. In some embodiments, the image capturing device(s) may be stationary, having a fixed Field of View (FOV). In some other embodiments, the image capturing device(s) may be movable, having a varying FOV. In some embodiments, the image capturing device associated with the laboratory instrument may capture images of the plurality of objects in the laboratory environment. In certain other embodiments, the processor may receive the captured image.
Upon receiving the captured image, the processor may identify one or more objects matching predefined target objects. In some embodiments, the predefined target objects may be face regions of one or more personnel in the laboratory environment. In some other embodiments, the predefined target objects may be equipment such as table-top analyzers and standalone analyzers. In yet other embodiments, the phrase “one or more equipment” and the word “equipment” may be used interchangeably. In some other embodiments, the predefined target object may be one or more regions of the one or more equipment, such as a console screen. In some other embodiments, a user may be allowed to mark a region of interest in real time, defining the predefined target object.
In some embodiments, the processor may apply a virtual masking object on or around the identified one or more objects to secure the data associated with them. In some other embodiments, if the identified object is a face region of the one or more personnel in the laboratory environment or an identification (ID) card of the one or more personnel, the virtual masking object, which may include but is not limited to an emoticon, may be applied on the identified face region or ID card. In some embodiments, if the identified one or more objects are one or more equipment or one or more regions of the one or more equipment, such as a table-top analyzer or a standalone analyzer, the processor may apply the virtual masking object around the identified one or more objects. In some embodiments, the virtual masking object may be applied at a predefined distance from the laboratory instrument comprising the image capturing device. In some embodiments, the virtual masking object may include, but is not limited to, an augmented-reality-based curtain panel. In some embodiments, the virtual masking object may be applied to mask the identified objects and thereby prevent exposure of the data related to the identified objects to the outside environment. In some embodiments, the outside environment may be a remote server associated with the one or more equipment in the laboratory environment. In some other embodiments, the outside environment may be any other environment external to the laboratory environment.
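Applying a masking object "on" an identified region amounts to overwriting the pixels inside its bounding box. A minimal sketch, using a character grid in place of real pixel data and a hypothetical `apply_mask_on` helper (the box coordinates and mask value are illustrative assumptions):

```python
def apply_mask_on(image, box, mask_value="X"):
    # Overwrite the pixels inside the bounding box (x0, y0, x1, y1),
    # e.g. placing an emoticon over a detected face region or ID card.
    x0, y0, x1, y1 = box
    for y in range(y0, y1):
        for x in range(x0, x1):
            image[y][x] = mask_value
    return image

# A 6x4 "image" of background pixels, with a masked 3x2 region.
image = [["." for _ in range(6)] for _ in range(4)]
apply_mask_on(image, (1, 1, 4, 3))
for row in image:
    print("".join(row))
# ......
# .XXX..
# .XXX..
# ......
```

With real images the same idea would be expressed by blending an overlay into the detected region rather than writing a constant value.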
Reference is now made to
The laboratory environment 100 may include one or more pieces of equipment such as one or more laboratory instruments (laboratory instrument 1011 to laboratory instrument 101n, collectively referred to as laboratory instruments 101) and one or more users or personnel (collectively referred to as users/user 103) in the laboratory environment 100 for operating or viewing operations of the laboratory instruments 101. As an example, the laboratory instruments 101 may be diagnostic instruments, non-diagnostic instruments or any other health care instruments. As an example, the laboratory instruments 101 may include, but are not limited to, table-top analyzers and standalone analyzers. In some embodiments, a user 103 may be a laboratory technician, a visitor or any other person viewing the laboratory instruments 101 or present proximate to the laboratory instruments 101.
Reference is now made to
In some embodiments, the predefined target objects may be defined by the user 103 in real time. As an example, the predefined target objects may be the face regions of users 103 in the laboratory environment 100 and a standalone equipment in the laboratory environment 100. The features of the face region and the standalone equipment are extracted and stored in the memory unit 109. As an example, the features of the face region may be eyes, nose and mouth. As an example, the features of the standalone equipment may be a display region, an input region and the like.
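The stored target features described above can be thought of as a registry keyed by target object. The sketch below is purely illustrative: the registry, the helper names, and the overlap rule are assumptions standing in for the memory unit 109 and the actual matching logic.

```python
# Hypothetical registry mapping each predefined target object to its
# stored feature set, standing in for the memory unit 109.
target_registry = {}

def register_target(name, features):
    target_registry[name] = set(features)

def is_predefined_target(name, observed_features, min_overlap=2):
    # A detected object counts as a predefined target when enough of its
    # observed features overlap with the stored feature set.
    stored = target_registry.get(name, set())
    return len(stored & set(observed_features)) >= min_overlap

register_target("face", {"eyes", "nose", "mouth"})
register_target("standalone_analyzer", {"display_region", "input_region"})

print(is_predefined_target("face", {"eyes", "mouth", "chin"}))          # True
print(is_predefined_target("standalone_analyzer", {"display_region"}))  # False
```

A deployed system would compare learned feature embeddings rather than symbolic labels, but the registry lookup and threshold comparison follow the same pattern.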
The image capturing device 105 may capture images of the plurality of objects in the environment. Some of the objects being captured in the images may contain secure data associated with the objects. If the objects being captured match with the predefined objects, the processor 107 may identify such objects as the objects containing secure data and hence mask them to prevent exposure of such data to the outside environment.
In some embodiments, when the one or more objects which match the predefined target objects are identified, the processor 107 may apply a virtual masking object on or around the identified one or more objects to prevent exposure of the secure data associated with them. In some other embodiments, whether the virtual masking object is applied on or around the one or more objects may depend on the identified one or more objects.
In some embodiments, the predefined target object may be the face region of the user 103, which discloses the identity of the user 103. One or more features of the face region may be extracted and stored in the memory unit 109 associated with the processor 107. The image capturing device 105 configured in the laboratory instrument 101 may capture images of the plurality of objects in the laboratory environment 100. The plurality of objects may include the face region of the user 103 and one or more pieces of equipment. The processor 107 may identify, from the images, the one or more objects which match the predefined target objects. The processor 107 may identify the one or more objects based on comparison of one or more features of the one or more objects with features of the target objects using a machine learning technique. If the features of the one or more objects match the features of the predefined target objects, the processor 107 may identify the one or more objects as matching the predefined target objects. In this scenario, there may be one or more users 103 in the image captured by the image capturing device 105. The processor 107 may detect the face regions of the one or more users 103 matching the predefined target object. The processor 107 may apply the virtual masking object on the face region to prevent exposure of the face region of the user 103 to the outside environment. In some embodiments, the virtual masking object may be an emoticon or any other masking object that masks the face region of the user 103 such that the face region, which discloses the identity of the user 103, is not exposed to the outside environment.
In some embodiments, the predefined target object may be a standalone equipment. The features of the standalone equipment may be extracted and stored in the memory unit 109. The image capturing device 105 configured in the laboratory instrument 101 may capture images of the plurality of objects in the laboratory environment 100. The plurality of objects may include one or more users 103 and one or more pieces of equipment. The processor 107 may identify, from the images, the one or more objects which match the predefined target objects, based on comparison of one or more features of the one or more objects with features of the target objects using a machine learning technique. If the one or more features of the one or more objects match the one or more features of the predefined target objects, the processor 107 may identify the one or more objects as matching the predefined target objects. In this scenario, there may be several pieces of equipment in the laboratory environment 100, and the features of one of them may match the features of the predefined target object. The processor 107 identifies that object and applies a virtual masking object around it such that the identified object is not exposed to the outside environment.
As an example, the virtual masking object may be a virtual curtain panel. The virtual curtain panel may be placed at a predefined distance from the image capturing device 105 configured in the laboratory instrument 101; as an example, the predefined distance may be 2 meters from the laboratory instrument 101. Once the virtual curtain panel is placed, the view of the identified objects which match the predefined target objects is blocked for the laboratory instrument 101 comprising the image capturing device 105. This prevents the exposure of the secure data associated with the identified objects to the outside environment.
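The effect of the virtual curtain can be sketched as a simple distance filter: anything detected at or beyond the curtain distance is excluded from what the instrument's camera exposes. The detection names and distances below are hypothetical, and the 2-meter value is the example distance given above.

```python
MASKING_DISTANCE_M = 2.0  # the example predefined distance from the instrument's camera

def visible_objects(detections, masking_distance=MASKING_DISTANCE_M):
    # Keep only detections closer than the virtual curtain; anything at or
    # beyond the curtain distance is blocked from the captured image.
    return [name for name, dist in detections if dist < masking_distance]

detections = [("sample_rack", 0.8), ("neighboring_analyzer", 3.5), ("console", 2.0)]
print(visible_objects(detections))  # ['sample_rack']
```

A real augmented-reality curtain would render an opaque panel in the image rather than dropping detections, but the depth test deciding what the panel occludes is the same comparison.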
Reference is now made to
Reference is now made to
Reference is now made to
Reference is now made to
As illustrated in
The order in which method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement method 400. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 400 can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 401, the method 400 may include receiving, by a processor configured in a laboratory instrument, images of a plurality of objects in a laboratory environment. The plurality of objects may include, but are not limited to, equipment, regions of the equipment, face regions of users and identification (ID) cards of the users. The images may be captured by an image capturing device associated with the laboratory instrument.
At block 403, the method 400 may include identifying, by the processor, from the images, one or more objects matching predefined target objects. The predefined target objects may be defined by the user in real time. The predefined target objects are the objects which may include secure data and hence must be masked to prevent exposure of that data to the outside environment.
At block 405, the method 400 may include applying, by the processor, a virtual masking object on or around the one or more objects which match the predefined target objects to secure the data associated with the identified one or more objects. The virtual masking object may be applied on the identified object when the identified object is either a face region of a user or an ID card of the user. The virtual masking object may be applied around the identified object, at a predefined distance from the laboratory instrument comprising the image capturing device, when the identified object is equipment or a region of equipment which may comprise secure data.
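Blocks 401-405 can be sketched end to end as a single decision routine. This is an illustrative assumption of how the flow might be coded: the object labels, the `secure_capture` name and the dictionary shape are invented for the sketch, and the identification step is reduced to a set-membership test standing in for the machine learning comparison.

```python
def secure_capture(frame_objects, predefined_targets):
    # Block 401: receive detected objects from the captured image.
    # Block 403: identify those matching the predefined target objects.
    # Block 405: decide whether the virtual mask goes on the object
    # (face / ID card) or around it (equipment or equipment region).
    actions = []
    for obj in frame_objects:
        if obj["type"] not in predefined_targets:
            continue  # no secure data; leave the object unmasked
        placement = "on" if obj["type"] in ("face", "id_card") else "around"
        actions.append((obj["type"], placement))
    return actions

frame = [{"type": "face"}, {"type": "table_top_analyzer"}, {"type": "chair"}]
targets = {"face", "id_card", "table_top_analyzer"}
print(secure_capture(frame, targets))
# [('face', 'on'), ('table_top_analyzer', 'around')]
```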
In an embodiment, the present disclosure provides a method, a system and a laboratory instrument for securing data associated with one or more objects in a laboratory environment. In a further embodiment, the present disclosure provides a method for applying a virtual masking object on or around one or more objects matched with predefined target objects. The virtual masking object may be placed on the one or more objects when the one or more objects are, for example, face regions of users in the laboratory environment. The virtual masking object may be placed around the one or more objects when the one or more objects are one or more equipment, or regions of one or more equipment, that must be blocked from the view of the image capturing device. By applying the virtual masking object on or around the one or more objects, the exposure of the objects to the outside environment is prevented, thereby providing data security.
When a single device or article is described herein, it will be apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the disclosure need not include the device itself.
The specification describes a method, system and laboratory instrument for securing data associated with objects in a laboratory environment. The illustrated steps are set out to explain exemplary embodiments shown, and it should be anticipated that on-going technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not as a limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the embodiments of the present disclosure are intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.
This application is a National Stage Entry of PCT Application No. PCT/US20/28652, entitled “Securing Data of Objects in a Laboratory Environment,” filed Apr. 17, 2020, which claims priority to U.S. Provisional Application No. 62/835,833, entitled “Securing Data of Objects in a Laboratory Environment,” filed Apr. 18, 2019, the disclosures of which are incorporated by reference herein in their entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2020/028652 | 4/17/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2020/214897 | 10/22/2020 | WO | A |
Entry |
---|
International Search Report and Written Opinion dated Feb. 6, 2020, for International Application No. PCT/US2019/056473, 17 pages. |
International Search Report and Written Opinion dated Jun. 12, 2020, for International Application No. PCT/US2020/028652, 8 pages. |
U.S. Appl. No. 17/314,247, entitled “Service Glasses with Selective Data Provision,” filed May 7, 2021. |
Number | Date | Country | |
---|---|---|---|
20210397821 A1 | Dec 2021 | US |
Number | Date | Country | |
---|---|---|---|
62835833 | Apr 2019 | US |