The present invention relates generally to a near eye display system, and specifically to a display able to combine video-based and optic-based augmented reality.
A near eye display system may be used in an augmented reality situation, where a scene that is being viewed by a user of the assembly is altered, typically by being augmented or supplemented. The alteration is computer processor generated, and typically involves presenting real time video, and/or non-real time images, to the user while the user is gazing at the scene.
U.S. Patent Application 2010/0149073, to Chaum et al., whose disclosure is incorporated herein by reference, describes a near eye display system. The system includes a source of modulated light, and a “proximal optic” positionable adjacent to an eye of a system user to receive the modulated light. The proximal optic has a plurality of groups of optically redirecting regions.
U.S. Patent Application 2012/0068913, to Bar-Zeev et al., whose disclosure is incorporated herein by reference, describes an optical see-through head-mounted display device. The device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly.
U.S. Patent Application 2013/0050258, to Liu et al., whose disclosure is incorporated herein by reference, describes a see-through head-mounted display device that provides an augmented reality image which is associated with a real-world object. Initially, the object is identified by a user, e.g., based on the user gazing at the object for a period of time, making a gesture such as pointing at the object and/or providing a verbal command.
Documents incorporated by reference in the present patent application are to be considered an integral part of the application except that, to the extent that any terms are defined in these incorporated documents in a manner that conflicts with definitions made explicitly or implicitly in the present specification, only the definitions in the present specification should be considered.
An embodiment of the present invention provides apparatus, including:
a retaining structure, configured to be positioned in proximity to an eye of a subject, the eye of the subject having a pupil with a pupil diameter;
an optical combiner mounted on the structure in front of the eye;
a pixelated screen, having an array of variably transparent pixels, coating the optical combiner;
at least one image capturing device mounted on the structure configured to capture an image of a scene viewed by the eye;
a projector mounted on the structure and configured to project at least one of a portion of the captured image and a stored image onto a section of the screen at a selected location thereof; and
a processor, configured to render the section of the screen at least partially opaque, to select the location of the section in response to a region of interest in the scene identified by analysis of the captured image, and to determine a dimension of the section in response to the pupil diameter.
The processor may be configured to identify the region of interest in response to radiation received by the image capturing device from at least one marker located at the region of interest.
The apparatus may include a further image capturing device configured to identify the region of interest in response to received radiation from at least one marker located at the region of interest. The at least one image capturing device may be configured to operate in the visible spectrum, and the further image capturing device may be configured to operate in the non-visible spectrum. The apparatus may include a radiator configured to radiate radiation in the non-visible spectrum towards the region of interest.
In a disclosed embodiment the apparatus includes at least one marker positioned in proximity to the region of interest, and wherein the processor is configured to detect the marker in the captured image so as to identify the region of interest.
In a further disclosed embodiment the processor is configured to determine an initial pupil diameter in response to the dimension of the section being set by the subject to occlude an object of known size while the subject gazes at the object in a known ambient light brightness. Typically, the processor is configured to determine a brightness of the scene in response to the captured image of the scene, and the processor is configured to determine the pupil diameter in response to the initial pupil diameter and the brightness of the scene.
In a yet further disclosed embodiment the processor is configured to determine an initial pupil diameter in response to analysis of a reflected image of the subject while the subject gazes into a mirror in a known ambient light brightness. Typically, the processor is configured to determine a brightness of the scene in response to the captured image of the scene, and the processor is configured to determine the pupil diameter in response to the initial pupil diameter and the brightness of the scene.
In an alternative embodiment the dimension of the section is determined so that the region of interest is occluded. Typically, a region surrounding the region of interest is partially occluded. A fraction of occlusion in the region surrounding the region of interest may be determined in response to the pupil diameter. The processor may be configured to derive from the captured image an image corresponding to the region surrounding the region of interest, and the projector may be configured to project the derived image onto an area of the screen surrounding the at least partially opaque section of the screen. An intensity of the projected derived image may be determined in response to the fraction of occlusion.
In a further alternative embodiment the dimension of the section is determined in response to a size of the region of interest.
The dimension of the section may be determined so that an area greater than the region of interest is occluded. Alternatively, the dimension of the section may be determined so that an area less than the region of interest is occluded.
In a yet further alternative embodiment the retaining structure is a spectacle frame. Alternatively, the retaining structure is a helmet having a head-up display.
Typically, the at least one image capturing device includes two image capturing devices capturing respective images of the scene, and the processor is configured to identify the region of interest by analysis of the respective images.
There is further provided, according to an embodiment of the present invention, a method, including:
positioning a retaining structure in proximity to an eye of a subject, the eye of the subject having a pupil with a pupil diameter;
mounting an optical combiner on the structure in front of the eye;
coating the optical combiner with a pixelated screen, having an array of variably transparent pixels;
mounting at least one image capturing device on the structure so as to capture an image of a scene viewed by the eye;
mounting a projector on the structure, the projector being configured to project at least one of a portion of the captured image and a stored image onto a section of the screen at a selected location thereof;
rendering the section of the screen at least partially opaque;
selecting the location of the section in response to a region of interest in the scene identified by analysis of the captured image; and
determining a dimension of the section in response to the pupil diameter.
There is further provided, according to an embodiment of the present invention, apparatus, including:
a retaining structure, configured to be positioned in proximity to an eye of a subject;
an optical combiner mounted on the structure in front of the eye;
a pixelated screen, having an array of variably transparent pixels, coating the optical combiner;
at least one image capturing device mounted on the structure configured to capture an image of a scene viewed by the eye;
a processor, configured to render a section of the screen at least partially opaque; and
a projector mounted on the structure and configured to project at least one of a portion of the captured image and a stored image onto the section of the screen so that there is misalignment between the scene viewed by the eye through the combiner and the at least one portion of the captured image and the stored image.
Typically, for a scene at 50 cm from the eye, the misalignment is no more than 2 cm.
The projector may be configured to project the portion of the captured image and the stored image, in registration with each other, onto the section of the screen.
There is further provided, according to an embodiment of the present invention, apparatus, including:
a retaining structure, configured to be positioned in proximity to an eye of a subject;
an optical combiner mounted on the structure in front of the eye;
a rotator connected to the optical combiner and configured to rotate the optical combiner about an axis;
a pixelated screen, having an array of variably transparent pixels, coating the optical combiner;
at least one image capturing device mounted on the structure configured to capture an image of a scene viewed by the eye; and
a processor, configured to render a section of the screen at least partially opaque, and to activate the rotator so that the optical combiner is oriented to be orthogonal to a region of interest in the scene.
The processor is typically configured to select the section of the screen so as to occlude the region of interest.
The axis may be a vertical axis.
There is further provided, according to an embodiment of the present invention, a method, including:
positioning a retaining structure in proximity to an eye of a subject;
mounting an optical combiner on the structure in front of the eye;
coating the optical combiner with a pixelated screen comprising an array of variably transparent pixels;
mounting at least one image capturing device on the structure, the device being configured to capture an image of a scene viewed by the eye;
rendering a section of the screen at least partially opaque;
mounting a projector on the structure; and
configuring the projector to project at least one of a portion of the captured image and a stored image onto the section of the screen so that there is misalignment between the scene viewed by the eye through the combiner and the at least one portion of the captured image and the stored image.
There is further provided, according to an embodiment of the present invention, a method, including:
positioning a retaining structure in proximity to an eye of a subject;
mounting an optical combiner on the structure in front of the eye;
connecting a rotator to the optical combiner, the rotator being configured to rotate the optical combiner about an axis;
coating the optical combiner with a pixelated screen having an array of variably transparent pixels;
mounting at least one image capturing device on the structure, the device being configured to capture an image of a scene viewed by the eye;
rendering a section of the screen at least partially opaque; and
activating the rotator so that the optical combiner is oriented to be orthogonal to a region of interest in the scene.
The present disclosure will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings, in which:
An embodiment of the present invention provides a near eye assembly having a retaining structure that is configured to be positioned in proximity to the eye of a user of the assembly. Typically, the retaining structure comprises a spectacle frame. Alternatively, the retaining structure comprises a head-up display which may be mounted on a helmet worn by the assembly user.
An optical combiner is mounted on the structure in front of the user's eye. Typically, two combiners are mounted, one in front of each eye. The optical combiner at least partially transmits elements of a scene in front of the assembly through the combiner. In addition, the optical combiner may receive a visible radiation transmission derived from a scene, and/or a visual transmission such as a presentation of data or a marker, and redirect the transmission back to the user's eye.
A pixelated screen, comprising an array of variably transparent pixels, coats the optical combiner. Typically, the pixels are liquid crystal display (LCD) pixels.
There is at least one image capturing device, typically two such devices, one for each eye, mounted on the structure. The capturing device is typically a visible spectrum camera that is configured to capture an image of a scene viewed by the user's eye.
A projector, typically a micro-projector, is mounted on the structure. Typically two projectors, one for each eye, are mounted on the structure. The projector is configured to project at least one of a portion of the captured image, as video, and a stored image onto a section of the screen that a processor renders at least partially opaque. The at least partially opaque section is also referred to herein as an occlusion mask, or just as a mask.
The processor is configured to select the location of the section in response to a region of interest in the scene identified by analysis of the captured image. Typically, at least one marker is positioned near the region of interest, and the processor analyzes the captured image to locate the marker and so identify the region of interest. Rendering the section opaque occludes the region of interest from the user's eye.
In addition, the processor determines a dimension of the section, typically, in the case of the section being circular, the diameter of the section. The dimension is determined in response to the pupil diameter.
By setting the dimension of the section according to the pupil diameter, embodiments of the present invention more exactly control the area of the region of interest that is occluded. In addition, because of the finite size of the pupil, there is a region surrounding the region of interest that is partially occluded. In some embodiments the processor operates the micro-projector to overlay relevant portions of the captured image on the partially occluded region, so as to compensate for the partial occlusion.
As stated above, a portion of the captured image may be projected as a video onto the occlusion mask. In some embodiments the captured image portion video corresponds to the occluded region of interest. There is a non-occluded region surrounding the occluded region of interest, and this non-occluded region is visible to the user through the combiner. In embodiments of the invention the video and the visible non-occluded region are typically not in accurate registration, due to slight inevitable movements of the display relative to the user's eye.
In some embodiments a stored image, such as an image of a tool, is overlaid on, and in accurate registration with, the occluded region video.
The inventors have found that registering the stored image with the video, even though the video is not fully registered with the surrounding visible region, provides an acceptable image for the user. The inventors have found that for a non-occluded region that appears to be 50 cm from the user's eye, the video and the non-occluded region may be out of registration by up to 2 cm, while still being acceptable to the user.
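In angular terms, this tolerance corresponds to a gaze-direction error of approximately arctan(2/50), i.e., about 2.3°.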
Thus, in contrast to prior art augmented reality systems, embodiments of the present invention are configured to operate with misalignment between the visible portion of a scene and an augmented reality portion of the scene. However, there is no misalignment between elements within the augmented reality video, i.e., the elements projected onto the occlusion mask.
In some embodiments, the optical combiner may be rotated about an axis by the processor. In the case of two combiners, they may be independently rotated about respective axes. The independent rotations may be used to orient both combiners so that each is orthogonal to the direction of gaze of the user's eyes.
Reference is now made to
System 20 is operated by a medical professional 22, who wears an augmented reality assembly 24, described in more detail below with respect to
System 20 comprises and is under overall control of a processor 26. In one embodiment processor 26 is assumed to be incorporated within a stand-alone computer 28, and the processor typically communicates with other elements of the system, including assembly 24, wirelessly, as is illustrated in
The medical procedure exemplified here is on a patient 30, and during the procedure professional 22 gazes along gaze directions 32 at a region of interest (ROI) 34. ROI 34 typically, but not necessarily, comprises a portion of the patient. In some embodiments one or more ROI acquisition markers 35, comprising marker elements 36, are positioned in, and/or in proximity to, ROI 34, and the functions of such markers are described below. Typically there are at least three marker elements 36 for a given marker 35. In a disclosed embodiment the size of ROI 34 may be predefined by professional 22, for example based on a computerized tomography (CT) image of the patient, and the position of the ROI may also be defined as a predefined distance to the right of, and a predefined distance below, the marker. In an alternative embodiment marker elements 36 of marker 35 define ROI 34 to be a region within a surface having elements 36 in the perimeter of the surface. Typically, a margin in an approximate range of 1-5 cm is added to ROI 34 to compensate for misalignment between a video projection and a directly viewed scene, described in more detail below.
During the procedure professional 22 may use a surgical device 38, such as a surgical knife, to perform part of the procedure. Typically device 38 comprises one or more identifying elements 39 which may be used to track the device.
Thus spectacles 50 comprise planar optical combiners 52, comprising combiners 52A and 52B in front of, respectively, the left and right eyes of professional 22. Optical combiners 52 are mounted on a retaining structure 54 which holds elements of assembly 24, and which is herein assumed to comprise a spectacle frame, so that structure 54 is also referred to herein as frame 54.
In some embodiments, combiner frames 82A and 82B are fixed to retaining structure 54 and vertical retaining rods 84A and 84B attached to the combiner frames support the optical combiners, so that the combiners are able to rotate about vertical axes defined by the rods. Retaining rods 84A and 84B, and thus combiners 52A and 52B, may be rotated independently of each other about their vertical axes by respective motors 86A and 86B, fixed to frames 82A and 82B. Motors 86, typically stepper motors, are controlled by processor 26 so as to rotate their attached combiners to known, typically different, fixed orientations with respect to their respective combiner frames.
Each optical combiner 52 is configured to at least partially transmit elements of a scene through the combiner, so that a portion 56 of patient 30 (
Optical combiners of various types are known in the art. One known type uses a semi-reflective surface which transmits an image from an image source after it has passed through a set of lenses which correct deformations caused by the semi-reflective surface of the combiner. Another known type uses a waveguide which projects the image directly to the eye of the viewer. Herein, by way of example, combiners 52 are assumed to be of the waveguide type.
In one embodiment, combiners 52 comprise LUMUS DK 32 see-through glasses, produced by Lumus Optical of Rechovot, Israel.
Generally similar pixelated variable transparency screens 60A and 60B respectively coat a rear side, i.e., the side away from the eyes of professional 22, of combiners 52A, 52B. Screens 60 are active elements of system 20 and are formed of an array of pixels, the opacity of each of the pixels being controlled by processor 26.
Screens 60 are typically, but not necessarily, liquid crystal displays (LCDs) formed of a rectangular array of liquid crystal pixels. Alternatively, screens 60 are formed of MEMS (microelectromechanical systems). Further alternatively, screens 60 are formed of polymer dispersed liquid crystals (PDLCs). In the following description, by way of example, screens 60 are assumed to be formed of LCDs. LCD pixels can typically be switched between an opaque state, in which approximately 95% of the incoming light is blocked and 5% is transmitted, and a transparent state, in which approximately 60% of the incoming light is blocked and 40% is transmitted. The LCDs then have a transmission contrast ratio of 1:8.
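This normalization of pixel states is used in the occlusion calculations later in this description. The following sketch (Python) illustrates it, assuming a hypothetical linear pixel response between the two states:

```python
# A minimal sketch, assuming a linear pixel response between the two states,
# of how a normalized occlusion value (0 = transparent state, 1 = opaque
# state) maps to the fraction of incoming light blocked by an LCD pixel.
T_TRANSPARENT = 0.40  # fraction of light transmitted in the transparent state
T_OPAQUE = 0.05       # fraction of light transmitted in the opaque state

def blocked_fraction(occlusion: float) -> float:
    """Fraction of incoming light blocked, for normalized occlusion in [0, 1]."""
    transmitted = T_TRANSPARENT + occlusion * (T_OPAQUE - T_TRANSPARENT)
    return 1.0 - transmitted

print(blocked_fraction(0.0))     # 0.60 -> the transparent state
print(blocked_fraction(1.0))     # 0.95 -> the opaque state
print(T_TRANSPARENT / T_OPAQUE)  # 8.0  -> the 1:8 transmission contrast ratio
```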
Fixedly attached to arms of frame 54 are generally similar micro-projectors 64A and 64B. Each micro-projector is located and oriented so as to be able to project onto respective combiner 52A and 52B, a scene, and/or a visual indication, in a form suitable for redirection by the combiners to the left or right eye of professional 22. Micro-projectors 64 are active elements, and the projected scenes/indications are provided to the micro-projectors by processor 26. The projection and redirection are configured so that the images seen by the eyes of professional 22, absent any correcting lenses, appear to be at infinity, due to parallel light coming from the combiners and entering the pupils. In some embodiments display 24 comprises correcting lenses 88A, 88B which redirect light from combiners 52A, 52B so that the images appear to be closer than infinity to the professional's eyes. The power D in diopters of the lenses defines the distance d of the images, according to the formula d=1/D, where d is in meters, and D is a negative number. Lenses 88A, 88B are typically located between the professional's eyes and the respective combiners. For simplicity, lenses 88A, 88B are not shown in other figures of the present application.
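As a numerical illustration of the formula d=1/D, the lens power below is an assumed example value, not taken from the description:

```python
# Apparent image distance for correcting lenses 88A, 88B, from d = 1/D.
# D = -2.0 diopters is an assumed example value; D is negative for these
# lenses, and the image then appears at a distance |d| from the eye.
D = -2.0       # lens power in diopters (assumed example)
d = 1.0 / D    # image distance in meters; the sign indicates a virtual image
print(abs(d))  # 0.5 -> images appear 0.5 m from the professional's eyes
```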
At least one image capturing device 68 is attached to frame 54. In the disclosed embodiment there are two generally similar devices 68A and 68B, respectively aligned to be approximately orthogonal to planar combiners 52A and 52B, so as to be able to capture radiation of respective images of scenes viewed by the left and right eyes of professional 22. Typically, devices 68 comprise cameras configured to capture images of scenes in the visible spectrum. The cameras may use rolling shutters, in which case latency (of projection via micro-projectors 64) may be reduced by processing rows of images rather than complete frames of images. In some embodiments devices 68 may also capture non-visible portions of images, such as portions in the infra-red spectrum. The operation of devices 68 is controlled by processor 26.
In some embodiments of the present invention, assembly 24 comprises a sensor 72 which is configured to capture non-visible images of elements of a scene in front of assembly 24. Typically sensor 72 uses a projector 73 configured to project radiation in the non-visible spectrum detected by the sensor, and has a bandpass filter configured to block visible radiation, such as that projected by surgical lighting. Typically, sensor 72 and projector 73 operate in the near infra-red spectrum.
In some embodiments, assembly 24 comprises a manual and/or electronic control 74 which may be operated by professional 22 to move elements of the assembly in and out of the field of view of the professional. Additionally or alternatively, there may be a button or switch 78 which enables the professional to power active elements of assembly 24, such as the capturing devices and the micro-projectors. In some embodiments switch 78 may be a foot switch. Further additionally or alternatively, assembly 24 may be configured so that it can tilt downwards about a horizontal axis, at an angle up to 40° from the horizontal, so that the professional can look through the assembly when looking down.
Additionally, assembly 24 may comprise a sensor 76, such as an accelerometer, which is configured to measure an inclination of the assembly with respect to the direction of gravity, so measuring the angle of the head of the professional with respect to the vertical. Processor 26 may be configured to use readings from sensor 76 to move elements of assembly 24 in and out of the field of view of the professional, and/or to control whether micro-projectors 64 project images.
Similarly in screen 60B processor 26 has rendered a circular array 80B of the pixels of the screen opaque, while the remaining pixels of the screen are rendered transparent. As for array 80A, array 80B occludes sections of portion 56 from the view of the right eye of professional 22. Thus array 80B is also referred to herein as occluding mask 80B.
In addition to projecting images 90, micro-projectors 64 also project alphanumeric data 92A and 92B onto the non-occluded region of screens 60, as well as markers 96A and 96B onto masks 80A and 80B. Images 90, data 92, and markers 96 are typically stored in database 40, and are provided from the database to micro-projectors 64 by processor 26.
In a mentoring situation images 90, the contents of data 92, and the position of markers 96 are typically under control of a tutor interacting with processor 26 while mentoring professional 22. In some cases the locations of masks 80 may also be provided to processor 26 by the tutor, although typically the locations of the masks depend upon gaze directions 32 of the professional. In a non-mentoring situation, i.e. where professional 22 alone operates system 20, locations of masks 80 are typically automatically set by processor 26, as is described below. Also in a non-mentoring situation, images 90, data 92, and markers 96 may be controlled by professional 22. It will be understood that images 90, data 92 and markers 96 are examples of non-video related visual elements that are seen by professional 22, and that the provision of such elements corresponds to an optic-based augmented reality situation implemented in system 20.
To point to the feature, the tutor interacts with processor 26 so that the processor enhances and emphasizes portions 100A, 100B of the video images acquired by capturing devices 68, the portions corresponding to the region of the chest where the unusual movement is occurring. Micro-projectors 64A, 64B then project portions 100A, 100B onto combiners 52A, 52B. It will be understood that the enhancement of portions 100A, 100B and their projection on the respective combiners is in real-time. The enhancement may take a number of forms. For example, portions 100A, 100B may comprise a wireframe image of the region of the chest having unusual movement, and/or a false-color image of the region. Other suitable methods of real-time enhancement will be apparent to those having ordinary skill in the art, and all such methods are assumed to be within the scope of the present invention.
The professional has made an incision 104 in a portion 106 of patient 30, and ROI 34, defined by marker elements 36, is assumed to be at the location of the incision. In addition, the professional has inserted a lower portion of the distal end of device 38 into the patient so that the lower portion is no longer visible.
Processor 26 has formed mask 80A on combiner 52A so as to occlude ROI 34, and the portion of incision 104 comprised in the ROI. Mask 80A also includes a margin 83, typically corresponding to a margin of approximately 1-5 cm at the ROI. Thus, all elements of the scene outside mask 80A, comprising hand 102 and the proximal end of device 38, are directly visible through combiner 52A by the professional. However, elements of the scene within mask 80A, including a portion of incision 104 and an upper portion of the distal end of device 38 that is outside the patient, are not visible to the professional, since they are occluded by the mask.
Processor 26 overlays on mask 80A a captured image 110 of the ROI and the region corresponding to margin 83, which includes the portion of incision 104 occluded by the mask and which also includes a video image 114 of the upper portion of the distal end of device 38 (outside the patient) that has been captured by image capturing device 68. In addition, the processor overlays on the occlusion mask a stored image 112 corresponding to the lower portion of the distal end of device 38 (within the patient). Stored image 112 is a virtual elongation of image 114 and is retrieved from database 40. The section of the distal end corresponding to image 112 is not visible to capturing device 68.
The processor registers the two overlaid images, image 110 and image 112, with each other, and the registration is possible since by tracking device 38 the processor is aware of the location of the device distal end with respect to the captured image. Thus, there is no misalignment between stored image 112, corresponding to the lower portion of the distal end, and image 114 of the upper portion of the distal end, which is included in captured image 110.
However, there is typically misalignment between the two registered overlaid images 110, 112 and the directly visible portion of scene 101, including the directly visible portion of incision 104, as is illustrated in the figure. The misalignment occurs because while the captured image of the ROI is close to that seen by the professional (in the absence of the occlusion mask), it is not exactly in registration with the viewed scene. The inventors have found that a misalignment of up to 2 cm, in a scene that is 50 cm from the eye of the professional, is acceptable.
A ring 130 surrounding ROI 34 is described in more detail below.
For the first situation, where professional 22 is looking at location 103, the directions of gaze, αR, αL of the professional are shown by lines 103R and 103L. αR, αL are angles that are measured with respect to lines orthogonal to a line connecting eyes 120A, 120B, and their values are given by the following equations:
For the first situation processor 26 rotates combiners 52A and 52B (for clarity the combiners are not shown in the figure for the first situation), within their respective frames 82A and 82B, so that they are orthogonal to lines 103L and 103R. Thus the orientation of the combiners to their frames is given by equations (A).
For the second situation, where professional 22 is looking at location 105, the directions of gaze of the professional are shown by lines 105L and 105R. These directions are respectively changed from the “straight ahead” directions by βL, βR. The values of βL, βR are given by equations (B):
For the second situation processor 26 rotates combiners 52A and 52B, within their respective frames 82A and 82B, so that they are orthogonal to lines 105L and 105R. Thus the orientation of the combiners to their frames is given by equations (B), and these orientations are illustrated in the figure.
γL=βL+αL, γR=βR+αR (C)
From the above equations, as well as from the graphs, it is apparent that the angles made by combiners 52A, 52B with their respective frames are different, as professional 22 gazes at a region of interest. In addition, if the professional changes his/her gaze, the changes of the combiner angles to maintain orthogonality with the gaze directions are also different.
It will be understood that calculations based on equations herein, including equations (A), (B), and (C), assume that combiners 52A, 52B transmit rays that are orthogonal to the combiners. Those having ordinary skill in the art will be able to adapt the calculations, mutatis mutandis, for situations where the combiners transmit non-orthogonal rays.
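Equations (A) and (B) depend on the geometry of the figure, which is not reproduced here. The following sketch (Python) illustrates the kind of calculation involved; the interpupillary distance and the locations of points 103 and 105 are assumed example values, not taken from the figure:

```python
import math

# Sketch of the combiner-rotation calculation: each gaze angle is measured
# from a line orthogonal to the line connecting eyes 120A and 120B, and each
# combiner is rotated so as to be orthogonal to its eye's gaze direction.
IPD = 6.3  # interpupillary distance in cm (assumed example)

def gaze_angles(target_x, target_z):
    """Return (left, right) gaze angles, in degrees, towards a target at
    lateral offset target_x and distance target_z (both in cm)."""
    left = math.degrees(math.atan2(target_x + IPD / 2, target_z))
    right = math.degrees(math.atan2(target_x - IPD / 2, target_z))
    return left, right

# First situation: location 103, assumed straight ahead at 50 cm.
aL, aR = gaze_angles(0.0, 50.0)
# Second situation: location 105, assumed 10 cm to the right at 50 cm.
gL, gR = gaze_angles(10.0, 50.0)
# Per equation (C), beta is the change of angle between the two situations.
bL, bR = gL - aL, gR - aR
print(aL, aR)  # approximately +3.6 and -3.6 degrees: the two angles differ
print(bL, bR)  # the changes of the two combiner angles also differ
```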
The diagram has been drawn assuming that mask 80A″ just completely occludes ROI 34. Thus a ray HB, from an upper edge H of ROI 34 to an upper edge B of pupil 124A touches an upper edge F of mask 80A″. Similarly, a ray GA, from a lower edge G of ROI 34 to a lower edge A of pupil 124A touches a lower edge E of mask 80A″. Rays HB and GA are assumed to cross at an imaginary point J. A line from upper pupil edge B parallel to the x-axis cuts mask 80A″ at K and ROI 34 at M.
In the description below:
p is the apparent diameter of pupil 124A, as measured externally to eye 120A, corresponding to AB; and
d is the diameter of mask 80A″, corresponding to EF; d=d1 for the realistic case of p>0, and d=d0 for the theoretical “pinhole” case of p=0.
In addition,
D is the diameter of ROI 34 (which is occluded by mask 80A″), corresponding to GH;
L is the distance from pupil 124A to ROI 34;
l1 is the distance from pupil 124A to point J; and
l is the distance from pupil 124A to mask 80A″.
In the diagram, pupil edges A and B, mask edges E and F, and ROI edges G and H all lie on the rays through point J, so that, by similar triangles:

p/(2l1)=d/(2(l1+l))=D/(2(l1+L)) (1)

From equation (1),

d=D(l1+l)/(l1+L) (2)

If l1=0, (for the theoretical case of p=0), then

d=d0=D·l/L (3)

If l1>0, for the realistic case of p>0, then

d=d1>D·l/L (4)

In addition, triangles BKF and BMH are similar, so that:

FK/BK=HM/BM (5)

For p>0 (so d=d1) and substituting values of d1, p, l, and L for FK, BK, HM, and BM in equation (5) gives:

(d1−p)/(2l)=(D−p)/(2L) (6)

Equation (6) rearranges to:

d1=p+(D−p)·l/L (7)
Equation (7) gives dimensions of mask 80A″, i.e., its diameter d1, in terms of the diameter D of ROI 34, the distance l of the mask from the pupil, the diameter p of the pupil, and the distance L of the ROI from the pupil.
For typical values of l=2 cm, L=50 cm, p=0.3 cm, and D=15 cm the diameter of mask 80A″ to just give complete occlusion is, from equation (7), approximately 0.9 cm. For the same values but with p=0.15 cm, the mask diameter is approximately 0.7 cm.
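A minimal numerical check of equation (7) (Python), using the typical values quoted above:

```python
def mask_diameter(D, l, L, p):
    """Equation (7): diameter d1 of a mask that just completely occludes an
    ROI of diameter D, for mask distance l, ROI distance L, and pupil
    diameter p (all in consistent units, here cm)."""
    return p + (D - p) * l / L

print(mask_diameter(D=15, l=2, L=50, p=0.3))   # 0.888 -> approximately 0.9 cm
print(mask_diameter(D=15, l=2, L=50, p=0.15))  # 0.744 -> approximately 0.7 cm
```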
While, as described above, mask 80A″ completely occludes ROI 34, there are regions outside ROI 34 that are partly occluded by the mask. The partial occlusion follows from the finite, non-zero diameter of the pupil of the eye, in the example described here pupil 124A, and is described in more detail with reference to
At mask 80A″ the distances of lower ray 150 and of upper ray 160 from the x-axis are respectively f1(l) and f2(l), and the width of the beam between the upper and lower rays is:

f1(l)−f2(l) (8)
From the diagram, partial occlusion occurs if:

f1(l)>d/2>f2(l) (9)

no occlusion occurs if:

f2(l)≥d/2 (10)

and full occlusion, corresponding to the situation described above, occurs if:

f1(l)≤d/2 (11)

From expressions (8) and (9), and inspection of the diagram, the fraction of occlusion F2D at point Q is given by:

F2D=(d/2−f2(l))/(f1(l)−f2(l)) (12)
(The subscript 2D indicates that the fraction considered here is for the two-dimensional case illustrated in
Since ΔATW∼ΔANQ,

f1(l)=p/2+l(R−p/2)/L (13)

Since ΔBSV∼ΔBMQ,

f2(l)=l(R+p/2)/L−p/2 (14)

From equations (13) and (14) the diameter of the cone cross-section from Q at mask 80A″, which is f1(l)−f2(l), is given by:

f1(l)−f2(l)=p(L−l)/L (15)

Substituting equations (14) and (15) into equation (12) gives the following expression for F2D:

F2D=(L(d+p)−l(2R+p))/(2p(L−l)) (16)
Inspection of equation (16) indicates that the fraction of occlusion at point Q is a function of pupil diameter p, and also decreases linearly as R increases.
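The following sketch (Python) evaluates equations (13)-(16) over ring 130, clipping the fraction to [0, 1] so that the fully occluded and non-occluded regimes of expressions (9)-(11) are also covered:

```python
def f1(R, p, l, L):
    """Equation (13): distance of lower ray 150 from the x-axis at the mask."""
    return p / 2 + l * (R - p / 2) / L

def f2(R, p, l, L):
    """Equation (14): distance of upper ray 160 from the x-axis at the mask."""
    return l * (R + p / 2) / L - p / 2

def occlusion_2d(R, d, p, l, L):
    """Equation (16), clipped: two-dimensional fraction of occlusion at a
    point Q a distance R from the center of the ROI."""
    fraction = (d / 2 - f2(R, p, l, L)) / (f1(R, p, l, L) - f2(R, p, l, L))
    return max(0.0, min(1.0, fraction))

d, p, l, L = 0.888, 0.3, 2, 50
for R in (5, 8, 10, 12, 15):
    print(R, round(occlusion_2d(R, d, p, l, L), 3))
# Full occlusion up to roughly R = 8 cm, decreasing monotonically to no
# occlusion at roughly R = 15 cm, as in the graphs described below.
```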
HG is a cross-section of circular ROI 34, so that it will be understood that GQ is a cross-section of a circular, partially occluded ring 130 surrounding ROI 34. As illustrated in
The rays from point Q define a cone emanating from Q, and this cone cuts mask 80A″ in a circle having a diameter V′W′, the diameter being given by equation (15). The cutting of mask 80A″ by the cone of rays from Q is described with reference to
There is a corresponding equation for a three-dimensional fraction of occlusion F3D, given by the following expression:

F3D=A/AL (17)

where A is the area of portion 174, and

AL is the area of circle 170.

F3D may also be written, using the standard expression for the area of intersection of two circles, as:

F3D=[r1²·cos⁻¹((c²+r1²−r2²)/(2cr1))+r2²·cos⁻¹((c²+r2²−r1²)/(2cr2))−½√((r1+r2)²−c²)·√(c²−(r1−r2)²)]/(πr2²) (18)

where r1=d/2 is the radius of mask 80A″, r2=p(L−l)/(2L) is the radius of circle 170 (from equation (15)), and c=lR/L is the distance between the centers of circle 170 and mask 80A″.
From equation (18), F3D is a function of pupil diameter p, and the equation provides numerical values of F3D for selected values of d, R, p, l, and L.
L=50 cm
l=2 cm
p=0.3 cm
D=15 cm
From equation (7) the diameter of the occlusion mask to fully occlude an ROI with diameter D of 15 cm is d=0.888 cm. The graphs of
From equation (15) the diameter of circle 170 is 0.288 cm, so that the value of the area AL of the circle is 0.065144 cm2.
A solid line graph 200 illustrates the full and partial occlusion vs. distance (from the center of the ROI) for the three-dimensional case of equation (18). The measurements of occlusion have been normalized, so that for an LCD screen a full occlusion of 95% is normalized to 1, and a full transparency (of 60% occlusion) is normalized to 0. A broken line graph 204 illustrates the full and partial occlusion vs. distance for the two-dimensional case of equation (16). As is apparent from both graphs, there is full occlusion, for a mask of diameter d=0.894 cm, for a region 208 up to approximately 8 cm from the center of the ROI, and partial occlusion in a region 212 from approximately 8 cm to approximately 15 cm. The fraction of partial occlusion decreases monotonically in region 212.
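A sketch (Python) of the three-dimensional calculation, using equation (18) as reconstructed above, with the mask diameter of graph 200:

```python
import math

def occlusion_3d(R, d, p, l, L):
    """Equation (18): fraction of circle 170 (the beam cross-section at the
    mask) that is covered by mask 80A'' for a point at distance R."""
    r1 = d / 2                  # radius of the occlusion mask
    r2 = p * (L - l) / (2 * L)  # radius of circle 170, from equation (15)
    c = l * R / L               # distance between the two circle centers
    if c >= r1 + r2:
        return 0.0              # circles disjoint: no occlusion
    if c <= abs(r1 - r2):
        return 1.0              # circle 170 lies inside the mask: full occlusion
    a1 = r1**2 * math.acos((c**2 + r1**2 - r2**2) / (2 * c * r1))
    a2 = r2**2 * math.acos((c**2 + r2**2 - r1**2) / (2 * c * r2))
    a3 = 0.5 * math.sqrt((r1 + r2)**2 - c**2) * math.sqrt(c**2 - (r1 - r2)**2)
    return (a1 + a2 - a3) / (math.pi * r2**2)

d, p, l, L = 0.894, 0.3, 2, 50
for R in (5, 8, 10, 12, 15):
    print(R, round(occlusion_3d(R, d, p, l, L), 3))  # 1.0 near the center, 0.0 at ~15 cm
```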
Processor 26 also orients the images from micro-projectors 64, by registering the images projected by the micro-projectors onto combiners 52 with the scene viewed by professional 22 through the combiners. The registration may be accomplished by the professional viewing a scene through combiners 52, together with an image of the same scene as it is captured by devices 68 and projected by the micro-projectors onto the combiners. The professional then adjusts the orientation of the micro-projectors and/or the capturing devices so that the projected image and the viewed scene coincide.
Typically the registration and adjustment of the micro-projectors and the capturing devices is performed for different regions of combiners 52, such as the left and right peripheral regions, the upper and lower peripheral regions, and a central region. In addition, the registration and adjustment may be performed for different scenes according to the distance of the scene from the combiner, such as a scene of relatively near elements, typically up to 1 m from the combiner, and a scene of relatively far elements, typically greater than 1 m from the combiner. The registrations and adjustments of the micro-projectors and the capturing devices are typically different for the different regions of the combiners, as well as for scenes at different distances from the combiners. Processor 26 stores the different registration data acquired during the calibration step for use when the professional is using assembly 24.
During the calibration step the sizes of the pupils of the eyes of professional 22 are measured. In one embodiment professional 22 gazes at a circular object of a known diameter and at a known distance from the professional, and processor 26 presents an occlusion mask on screens 60 to the professional. The professional then adjusts a diameter of the occlusion mask until complete occlusion of the object is achieved. As is apparent from equation (7), the diameter of the completely occluding mask provides a value for the pupil diameter, since d1, l, L and D (terms in equation (7)) are all known.
Alternatively or additionally, the professional may look into a mirror while image capturing devices 68 acquire images of the reflected scene, in this case the professional wearing assembly 24. Processor 26 analyzes the acquired images, by processes that are well known in the art, to identify the pupils of the professional as well as the outlines of combiners 52. The processor then compares the diameters of the pupils with the known dimensions of the combiners, so as to determine values for the pupil diameters.
The measurements of the pupil diameters are taken for different ambient light brightnesses, and the ambient brightness values may be determined from the signal levels of the images acquired by devices 68. Processor 26 stores the values of the pupil diameters, and the corresponding brightness levels.
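A sketch (Python) of the first calibration method: equation (7) is solved for p once the professional has adjusted the mask diameter d1 to just occlude an object of known size; the brightness table at the end is an assumed storage structure:

```python
def pupil_diameter(d1, D, l, L):
    """Solve equation (7) for p: the pupil diameter implied by a mask of
    diameter d1 that just fully occludes an object of diameter D at
    distance L, with the mask at distance l from the pupil."""
    return (d1 * L - D * l) / (L - l)

# With the typical values used above, a just-occluding mask diameter of
# 0.888 cm implies a pupil diameter of 0.3 cm.
print(pupil_diameter(d1=0.888, D=15, l=2, L=50))  # 0.3

# Assumed storage of calibration results: pupil diameter (cm) for each
# ambient brightness level estimated from the captured images.
pupil_for_brightness = {"dim": 0.45, "normal": 0.30, "bright": 0.20}
```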
As stated above, processor 26 is configured to track device 38, using the one or more identifying elements 39 (
In an ROI defining step 302, ROI acquisition marker 35 (
In an imaging step 304, image capturing devices 68 acquire images of the scene in front of assembly 24. Sensor 72, if present, also captures a corresponding image of the scene. Processor 26 analyzes the images to identify marker elements 36, and from the identified elements determines the orientation of ROI 34 with respect to assembly 24, and also the distance of the ROI from the assembly. Even if sensor 72 is not present, it will be understood that having two devices 68 acquiring respective images of the scene simplifies the analysis needed to be performed by the processor to identify elements 36. In addition, having two capturing devices 68 reduces the number of elements 36 required to accurately determine the orientation and distance of the ROI with respect to assembly 24, compared to the number required if only one capturing device 68 is used. With two capturing devices 68 the inventors have found it is sufficient to have one marker with three marker elements to accurately locate the ROI with respect to assembly 24. If sensor 72 is present, its image alone may be sufficient to identify elements 36, although typically processor 26 uses the images from devices 68 to improve the accuracy of the orientation and distance measures of the ROI determined by the sensor.
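By way of illustration of how two capturing devices 68 simplify locating marker elements 36, a minimal stereo-triangulation sketch (Python); the focal length, baseline, and pixel coordinates are assumed example values:

```python
# Depth from a rectified stereo pair: Z = f * b / disparity, where the
# disparity is the difference between a marker element's column
# coordinates in the two images (relative to each image center).
f = 1400.0  # focal length in pixels (assumed example)
b = 12.0    # baseline between devices 68A and 68B in cm (assumed example)

def triangulate(x_left, x_right, y):
    """Return (X, Y, Z) of a marker element, in cm, from its pixel
    coordinates in the left and right images."""
    disparity = x_left - x_right
    Z = f * b / disparity
    return (x_left * Z / f, y * Z / f, Z)

# Three such points are sufficient to determine the orientation and
# distance of ROI 34 with respect to assembly 24.
print(triangulate(120.0, -216.0, 40.0))  # Z = 1400 * 12 / 336 = 50 cm
```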
Processor 26 also analyzes the images acquired by devices 68 in order to determine a measure of the brightness of the scene in front of assembly 24.
In a frame orientation step 305, the processor rotates combiners 52A and 52B with respect to their respective frames so that the combiners are orthogonal to the gaze directions of the professional towards the ROI. The processor uses equations (A), (B) and/or (C) to determine the angles of rotation of the combiners.
In a masking step 306, the processor generates circular occlusion masks 80 in screens 60. The processor, using the orientation of the ROI measured in step 304 and the central adjustment of combiners 52 in step 300, determines positions for the masks that will occlude ROI 34. From the brightness measured in step 304, and from the correspondence between pupil size and brightness stored in initial step 300, the processor estimates a value of the pupil diameter of the professional.
In one embodiment the processor sets the diameter of masks 80 according to equation (7), i.e., inter alia, according to the professional's pupil size, so that the masks fully occlude ROI 34. In this case partially occluded ring 130 surrounds ROI 34, the fraction of partial occlusion within the ring being given by equations (12) and (18).
In some embodiments the processor determines sections of the scene corresponding to partially occluded ring 130, as acquired by devices 68. The processor then configures micro-projectors 64 to overlay video of the acquired sections onto the partially occluded ring, so as to compensate for the partial occlusion. Typically, the processor configures the intensity of the projected video to be the inverse of the fraction of the occlusion.
In an alternative embodiment, rather than setting the diameter of the masks to be according to equation (7), the processor sets the diameter to be reduced from the value determined by the equation. The reduction is typically determined by professional 22. In one embodiment the diameter is set to be 90% of the value determined by equation (7).
In a further alternative embodiment, the processor, using instructions from professional 22, sets the diameter of the masks to be larger than the diameter of equation (7). In one embodiment the diameter is set to be 110% of the value determined by equation (7).
In a mask projection step 308 processor 26 uses micro-projectors 64 to project augmented video onto occlusion masks 80. In the case of the augmented video including two or more types of images being projected onto the masks, processor 26 registers the images with each other. However, the images are not necessarily registered with, and are typically misaligned with, the scene surrounding and outside the masks. Thus, as exemplified by
In a further projection step 310, processor 26 uses micro-projectors 64 to project augmented video onto the partially occluded ring surrounding the masks, and/or the non-occluded section of combiners 52. As in step 308, multiple image types are registered together, but are typically misaligned with the visible scene of the non-occluded section.
Typical images that are projected in steps 308 and 310 include, but are not limited to, those described above with respect to
It will be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application is a continuation of U.S. patent application Ser. No. 16/159,740, filed 15 Oct. 2018, which is a continuation of U.S. patent application Ser. No. 15/896,102, filed 14 Feb. 2018 (now U.S. Pat. No. 10,134,166), which is a continuation of U.S. patent application Ser. No. 15/127,423, filed 20 Sep. 2016 (now U.S. Pat. No. 9,928,629), in the national phase of PCT Patent Application PCT/IB2016/051642, filed 23 Mar. 2016, which claims the benefit of U.K. Patent Application GB1504935.6, filed 24 Mar. 2015, which is incorporated herein by reference.
9877642 | Duret | Jan 2018 | B2 |
9885465 | Nguyen | Feb 2018 | B2 |
9886552 | Dillavou et al. | Feb 2018 | B2 |
9892564 | Cvetko et al. | Feb 2018 | B1 |
9898866 | Fuchs et al. | Feb 2018 | B2 |
9901414 | Lively | Feb 2018 | B2 |
9911187 | Steinle | Mar 2018 | B2 |
9927611 | Rudy | Mar 2018 | B2 |
9928629 | Benishti et al. | Mar 2018 | B2 |
9940750 | Dillavou et al. | Apr 2018 | B2 |
9943374 | Merritt et al. | Apr 2018 | B2 |
9947110 | Haimerl | Apr 2018 | B2 |
9956054 | Aguirre-Valencia | May 2018 | B2 |
9958674 | Border | May 2018 | B2 |
9959629 | Dillavou et al. | May 2018 | B2 |
9965681 | Border et al. | May 2018 | B2 |
9968297 | Connor | May 2018 | B2 |
9980780 | Lang | May 2018 | B2 |
9986228 | Woods | May 2018 | B2 |
D824523 | Paoli et al. | Jul 2018 | S |
10010379 | Gibby et al. | Jul 2018 | B1 |
10013531 | Richards | Jul 2018 | B2 |
10015243 | Kazerani et al. | Jul 2018 | B2 |
10016243 | Esterberg | Jul 2018 | B2 |
10022065 | Yishai et al. | Jul 2018 | B2 |
10022104 | Sell et al. | Jul 2018 | B2 |
10023615 | Bonny | Jul 2018 | B2 |
10026015 | Cavusoglu | Jul 2018 | B2 |
10034713 | Yang et al. | Jul 2018 | B2 |
10046165 | Frewin | Aug 2018 | B2 |
10066816 | Chang | Sep 2018 | B2 |
10073515 | Awdeh | Sep 2018 | B2 |
10080616 | Wilkinson et al. | Sep 2018 | B2 |
10082680 | Chang | Sep 2018 | B2 |
10085709 | Lavallee et al. | Oct 2018 | B2 |
10105187 | Corndorf et al. | Oct 2018 | B2 |
10107483 | Oren | Oct 2018 | B2 |
10108833 | Hong et al. | Oct 2018 | B2 |
10123840 | Dorman | Nov 2018 | B2 |
10130378 | Bryan | Nov 2018 | B2 |
10132483 | Feinbloom | Nov 2018 | B1 |
10134166 | Benishti et al. | Nov 2018 | B2 |
10134194 | Kepner | Nov 2018 | B2 |
10139652 | Windham | Nov 2018 | B2 |
10139920 | Isaacs | Nov 2018 | B2 |
10142496 | Rao | Nov 2018 | B1 |
10151928 | Ushakov | Dec 2018 | B2 |
10154239 | Casas | Dec 2018 | B2 |
10159530 | Lang | Dec 2018 | B2 |
10166079 | McLachlin et al. | Jan 2019 | B2 |
10175507 | Nakamura | Jan 2019 | B2 |
10175753 | Boesen | Jan 2019 | B2 |
10181361 | Dillavou et al. | Jan 2019 | B2 |
10186055 | Takahashi | Jan 2019 | B2 |
10188672 | Wagner | Jan 2019 | B2 |
10194131 | Casas | Jan 2019 | B2 |
10194990 | Amanatullah et al. | Feb 2019 | B2 |
10194993 | Roger et al. | Feb 2019 | B2 |
10195076 | Fateh | Feb 2019 | B2 |
10197803 | Badiali et al. | Feb 2019 | B2 |
10197816 | Waisman | Feb 2019 | B2 |
10207315 | Appleby | Feb 2019 | B2 |
10230719 | Vaugn | Mar 2019 | B2 |
10231893 | Lei | Mar 2019 | B2 |
10235606 | Miao | Mar 2019 | B2 |
10240769 | Braganca | Mar 2019 | B1 |
10247965 | Ton | Apr 2019 | B2 |
10251724 | McLachlin et al. | Apr 2019 | B2 |
10274731 | Maimone | Apr 2019 | B2 |
10278777 | Lang | May 2019 | B1 |
10292768 | Lang | May 2019 | B2 |
10296805 | Yang et al. | May 2019 | B2 |
10319154 | Chakravarthula et al. | Jun 2019 | B1 |
10326975 | Casas | Jun 2019 | B2 |
10339719 | Jagga et al. | Jul 2019 | B2 |
10352543 | Braganca | Jul 2019 | B1 |
10357146 | Fiebel | Jul 2019 | B2 |
10357574 | Hilderbrand | Jul 2019 | B2 |
10366489 | Boettger et al. | Jul 2019 | B2 |
10368947 | Lang | Aug 2019 | B2 |
10368948 | Tripathi | Aug 2019 | B2 |
10382748 | Benishti et al. | Aug 2019 | B2 |
10383654 | Yilmaz et al. | Aug 2019 | B2 |
10386645 | Shousha | Aug 2019 | B2 |
10398514 | Ryan et al. | Sep 2019 | B2 |
10405927 | Lang | Sep 2019 | B1 |
10419655 | Sivan | Sep 2019 | B2 |
10420626 | Tokuda et al. | Sep 2019 | B2 |
10420813 | Newell-Rogers | Sep 2019 | B2 |
10424115 | Ellerbrock | Sep 2019 | B2 |
10426554 | Siewerdsen et al. | Oct 2019 | B2 |
10431008 | Djajadiningrat | Oct 2019 | B2 |
10433814 | Razzaque | Oct 2019 | B2 |
10434335 | Takahashi | Oct 2019 | B2 |
10444514 | Abou Shousha et al. | Oct 2019 | B2 |
10447947 | Liu | Oct 2019 | B2 |
10448003 | Grafenberg | Oct 2019 | B2 |
10449040 | Lashinski | Oct 2019 | B2 |
10453187 | Peterson | Oct 2019 | B2 |
10463434 | Siegler et al. | Nov 2019 | B2 |
10465892 | Feinbloom | Nov 2019 | B1 |
10470732 | Baumgart | Nov 2019 | B2 |
10473314 | Braganca | Nov 2019 | B1 |
10485989 | Jordan | Nov 2019 | B2 |
10488663 | Choi | Nov 2019 | B2 |
D869772 | Gand | Dec 2019 | S |
D870977 | Berggren et al. | Dec 2019 | S |
10499997 | Weinstein et al. | Dec 2019 | B2 |
10504231 | Fiala | Dec 2019 | B2 |
10507066 | DiMaio | Dec 2019 | B2 |
10511822 | Casas | Dec 2019 | B2 |
10517544 | Taguchi | Dec 2019 | B2 |
10537395 | Perez | Jan 2020 | B2 |
10540780 | Cousins | Jan 2020 | B1 |
10543485 | Ismagilov | Jan 2020 | B2 |
10546423 | Jones et al. | Jan 2020 | B2 |
10548557 | Lim | Feb 2020 | B2 |
10555775 | Hoffman | Feb 2020 | B2 |
10568535 | Roberts et al. | Feb 2020 | B2 |
10571696 | Urey et al. | Feb 2020 | B2 |
10571716 | Chapiro | Feb 2020 | B2 |
10573087 | Gallop | Feb 2020 | B2 |
10602114 | Casas | Feb 2020 | B2 |
10577630 | Zhang | Mar 2020 | B2 |
10586400 | Douglas | Mar 2020 | B2 |
10592748 | Cousins | Mar 2020 | B1 |
10595716 | Nazareth | Mar 2020 | B2 |
10601950 | Devam et al. | Mar 2020 | B2 |
10603113 | Lang | Mar 2020 | B2 |
10603133 | Wang et al. | Mar 2020 | B2 |
10606085 | Toyama | Mar 2020 | B2 |
10594998 | Casas | Apr 2020 | B1 |
10610172 | Hummel et al. | Apr 2020 | B2 |
10610179 | Altmann | Apr 2020 | B2 |
10613352 | Knoll | Apr 2020 | B2 |
10617566 | Esmonde | Apr 2020 | B2 |
10620460 | Carabin | Apr 2020 | B2 |
10625099 | Takahashi | Apr 2020 | B2 |
10626473 | Mariani | Apr 2020 | B2 |
10631905 | Asfora et al. | Apr 2020 | B2 |
10631907 | Zucker | Apr 2020 | B2 |
10634331 | Feinbloom | Apr 2020 | B1 |
10638080 | Ovchinnikov | Apr 2020 | B2 |
10646285 | Siemionow et al. | May 2020 | B2 |
10650513 | Penney et al. | May 2020 | B2 |
10650594 | Jones | May 2020 | B2 |
10652525 | Woods | May 2020 | B2 |
10660715 | Dozeman | May 2020 | B2 |
10663738 | Carlvik | May 2020 | B2 |
10682112 | Pizaine | Jun 2020 | B2 |
10682767 | Grafenberg et al. | Jun 2020 | B2 |
10687901 | Thomas | Jun 2020 | B2 |
10691397 | Clements | Jun 2020 | B1 |
10702713 | Mori | Jul 2020 | B2 |
10709398 | Schweizer | Jul 2020 | B2 |
10713801 | Jordan | Jul 2020 | B2 |
10716643 | Justin et al. | Jul 2020 | B2 |
10722733 | Takahashi | Jul 2020 | B2 |
10725535 | Yu | Jul 2020 | B2 |
10731832 | Koo | Aug 2020 | B2 |
10732721 | Clements | Aug 2020 | B1 |
10742949 | Casas | Aug 2020 | B2 |
10743939 | Lang | Aug 2020 | B1 |
10747315 | Tungare | Aug 2020 | B2 |
10777094 | Rao | Sep 2020 | B1 |
10777315 | Zehavi | Sep 2020 | B2 |
10781482 | Gubatayao | Sep 2020 | B2 |
10792110 | Leung et al. | Oct 2020 | B2 |
10799145 | West et al. | Oct 2020 | B2 |
10799296 | Lang | Oct 2020 | B2 |
10799316 | Sela et al. | Oct 2020 | B2 |
10810799 | Tepper et al. | Oct 2020 | B2 |
10818019 | Piat | Oct 2020 | B2 |
10818101 | Gallop et al. | Oct 2020 | B2 |
10818199 | Buras et al. | Oct 2020 | B2 |
10825563 | Gibby et al. | Nov 2020 | B2 |
10831943 | Santarone | Nov 2020 | B2 |
10835296 | Elimelech et al. | Nov 2020 | B2 |
10838206 | Fortin-Deschenes et al. | Nov 2020 | B2 |
10839629 | Jones | Nov 2020 | B2 |
10839956 | Beydoun et al. | Nov 2020 | B2 |
10841556 | Casas | Nov 2020 | B2 |
10842002 | Chang | Nov 2020 | B2 |
10842461 | Johnson et al. | Nov 2020 | B2 |
10849691 | Zucker | Dec 2020 | B2 |
10849693 | Lang | Dec 2020 | B2 |
10849710 | Liu | Dec 2020 | B2 |
10861236 | Geri et al. | Dec 2020 | B2 |
10865220 | Ebetino | Dec 2020 | B2 |
10869517 | Halpern | Dec 2020 | B1 |
10869727 | Yanof et al. | Dec 2020 | B2 |
10872472 | Watola | Dec 2020 | B2 |
10877262 | Luxembourg | Dec 2020 | B1 |
10877296 | Lindsey | Dec 2020 | B2 |
10878639 | Douglas et al. | Dec 2020 | B2 |
10893260 | Trail et al. | Jan 2021 | B2 |
10895742 | Schneider | Jan 2021 | B2 |
10895743 | Dausmann | Jan 2021 | B2 |
10895906 | West et al. | Jan 2021 | B2 |
10898151 | Harding et al. | Jan 2021 | B2 |
10921595 | Rakshit | Feb 2021 | B2 |
10928321 | Rawle | Feb 2021 | B2 |
10928638 | Ninan | Feb 2021 | B2 |
10935815 | Castaneda | Mar 2021 | B1 |
10935816 | Ban | Mar 2021 | B2 |
10936537 | Huston | Mar 2021 | B2 |
10939973 | DiMaio | Mar 2021 | B2 |
10939977 | Messinger et al. | Mar 2021 | B2 |
10941933 | Ferguson | Mar 2021 | B2 |
10946108 | Zhang | Mar 2021 | B2 |
10950338 | Douglas | Mar 2021 | B2 |
10951872 | Casas | Mar 2021 | B2 |
10964095 | Douglas | Mar 2021 | B1 |
10964124 | Douglas | Mar 2021 | B1 |
10966768 | Poulos | Apr 2021 | B2 |
10993754 | Kuntz et al. | May 2021 | B2 |
11000335 | Dorman | May 2021 | B2 |
11006093 | Hegyi | May 2021 | B1 |
11013550 | Rioux et al. | May 2021 | B2 |
11013560 | Lang | May 2021 | B2 |
11013562 | Marti | May 2021 | B2 |
11013573 | Chang | May 2021 | B2 |
11013900 | Malek | May 2021 | B2 |
11019988 | Fiebel | Jun 2021 | B2 |
11027027 | Manning | Jun 2021 | B2 |
11029147 | Abovitz et al. | Jun 2021 | B2 |
11030809 | Wang | Jun 2021 | B2 |
11041173 | Zhang | Jun 2021 | B2 |
11045663 | Mori | Jun 2021 | B2 |
11049293 | Chae | Jun 2021 | B2 |
11049476 | Fuchs et al. | Jun 2021 | B2 |
11050990 | Casas | Jun 2021 | B2 |
11057505 | Dharmatilleke | Jul 2021 | B2 |
11058390 | Douglas | Jul 2021 | B1 |
11061257 | Hakim | Jul 2021 | B1 |
11065062 | Frushour | Jul 2021 | B2 |
11067387 | Marell | Jul 2021 | B2 |
11071497 | Hallack | Jul 2021 | B2 |
11079596 | Hua et al. | Aug 2021 | B2 |
11087039 | Duff | Aug 2021 | B2 |
11090019 | Siemionow et al. | Aug 2021 | B2 |
11097129 | Sakata | Aug 2021 | B2 |
11099376 | Steier | Aug 2021 | B1 |
11103320 | LeBoeuf | Aug 2021 | B2 |
D930162 | Cremer et al. | Sep 2021 | S |
11109762 | Steier | Sep 2021 | B1 |
11122164 | Gigante | Sep 2021 | B2 |
11123604 | Fung | Sep 2021 | B2 |
11129562 | Roberts et al. | Sep 2021 | B2 |
11132055 | Jones et al. | Sep 2021 | B2 |
11135015 | Crawford | Oct 2021 | B2 |
11135016 | Frielinghaus et al. | Oct 2021 | B2 |
11141221 | Hobeika | Oct 2021 | B2 |
11153549 | Casas | Oct 2021 | B2 |
11153555 | Healy et al. | Nov 2021 | B1 |
11163176 | Karafin | Nov 2021 | B2 |
11164324 | Liu | Nov 2021 | B2 |
11166006 | Hegyi | Nov 2021 | B2 |
11172990 | Lang | Nov 2021 | B2 |
11179136 | Kohli | Nov 2021 | B2 |
11180557 | Noelle | Nov 2021 | B2 |
11185891 | Cousins | Nov 2021 | B2 |
11202682 | Staunton | Dec 2021 | B2 |
11207150 | Healy | Dec 2021 | B2 |
11217028 | Jones | Jan 2022 | B2 |
11224763 | Takahashi | Jan 2022 | B2 |
11227417 | Berlinger | Jan 2022 | B2 |
11244508 | Kazanzides et al. | Feb 2022 | B2 |
11253216 | Crawford et al. | Feb 2022 | B2 |
11253323 | Hughes et al. | Feb 2022 | B2 |
11257190 | Mao | Feb 2022 | B2 |
11263772 | Siemionow et al. | Mar 2022 | B2 |
11269401 | West et al. | Mar 2022 | B2 |
11272151 | Casas | Mar 2022 | B2 |
11278359 | Siemionow et al. | Mar 2022 | B2 |
11278413 | Lang | Mar 2022 | B1 |
11280480 | Wilt | Mar 2022 | B2 |
11284846 | Graumann | Mar 2022 | B2 |
11311341 | Lang | Mar 2022 | B2 |
11291521 | Im | Apr 2022 | B2 |
11294167 | Ishimoda | Apr 2022 | B2 |
11297285 | Pierce | Apr 2022 | B2 |
11300252 | Nguyen | Apr 2022 | B2 |
11300790 | Cheng et al. | Apr 2022 | B2 |
11304759 | Kovtun et al. | Apr 2022 | B2 |
11307402 | Steier | Apr 2022 | B2 |
11317973 | Calloway | May 2022 | B2 |
11337763 | Choi | May 2022 | B2 |
11348257 | Lang | May 2022 | B2 |
11350072 | Casas | May 2022 | B1 |
11350965 | Yilmaz et al. | Jun 2022 | B2 |
11351006 | Aferzon | Jun 2022 | B2 |
11360315 | Tu | Jun 2022 | B2 |
11382699 | Wassall | Jul 2022 | B2 |
11382700 | Calloway | Jul 2022 | B2 |
11382712 | Elimelech et al. | Jul 2022 | B2 |
11382713 | Healy | Jul 2022 | B2 |
11389252 | Gera et al. | Jul 2022 | B2 |
11432828 | Lang | Sep 2022 | B1 |
11432931 | Lang | Sep 2022 | B2 |
11452568 | Lang | Sep 2022 | B2 |
11460915 | Frielinghaus | Oct 2022 | B2 |
11461983 | Jones | Oct 2022 | B2 |
11464581 | Calloway | Oct 2022 | B2 |
11483532 | Casas | Oct 2022 | B2 |
11490986 | Ben-Yishai | Nov 2022 | B2 |
20020082498 | Wendt et al. | Jun 2002 | A1 |
20030117393 | Sauer et al. | Jun 2003 | A1 |
20030130576 | Seeley | Jul 2003 | A1 |
20030156144 | Morita | Aug 2003 | A1 |
20030210812 | Khamene et al. | Nov 2003 | A1 |
20030225329 | Rossner et al. | Dec 2003 | A1 |
20040019263 | Jutras et al. | Jan 2004 | A1 |
20040030237 | Lee et al. | Feb 2004 | A1 |
20040138556 | Cosman | Jul 2004 | A1 |
20040238732 | State et al. | Dec 2004 | A1 |
20050017972 | Poole et al. | Jan 2005 | A1 |
20050024586 | Teiwes | Feb 2005 | A1 |
20050119639 | McCombs et al. | Jun 2005 | A1 |
20050203367 | Ahmed et al. | Sep 2005 | A1 |
20050203380 | Sauer et al. | Sep 2005 | A1 |
20050215879 | Chuanggui | Sep 2005 | A1 |
20060134198 | Tawa | Jun 2006 | A1 |
20060176242 | Jaramaz | Aug 2006 | A1 |
20070018975 | Chuanggui et al. | Jan 2007 | A1 |
20070058261 | Sugihara et al. | Mar 2007 | A1 |
20080007645 | McCutchen | Jan 2008 | A1 |
20080035266 | Danziger | Feb 2008 | A1 |
20080085033 | Haven et al. | Apr 2008 | A1 |
20080159612 | Fu | Jul 2008 | A1 |
20080183065 | Goldbach | Jul 2008 | A1 |
20080221625 | Hufner et al. | Sep 2008 | A1 |
20080253527 | Boyden et al. | Oct 2008 | A1 |
20080262812 | Arata et al. | Oct 2008 | A1 |
20090018437 | Cooke | Jan 2009 | A1 |
20090062869 | Claverie et al. | Mar 2009 | A1 |
20090099445 | Burger | Apr 2009 | A1 |
20090036902 | DiMaio et al. | May 2009 | A1 |
20090227847 | Tepper et al. | Sep 2009 | A1 |
20090300540 | Russell | Dec 2009 | A1 |
20100106010 | Rubner et al. | Apr 2010 | A1 |
20100114110 | Taft et al. | May 2010 | A1 |
20100149073 | Chaum et al. | Jun 2010 | A1 |
20100210939 | Hartmann et al. | Aug 2010 | A1 |
20100274124 | Jascob et al. | Oct 2010 | A1 |
20110004259 | Stallings et al. | Jan 2011 | A1 |
20110098553 | Robbins et al. | Apr 2011 | A1 |
20110105895 | Kornblau | May 2011 | A1 |
20110216060 | Weising et al. | Sep 2011 | A1 |
20110245625 | Trovato et al. | Oct 2011 | A1 |
20110254922 | Schaerer et al. | Oct 2011 | A1 |
20110306873 | Shenai et al. | Dec 2011 | A1 |
20120014608 | Watanabe | Jan 2012 | A1 |
20120068913 | Bar-Zeev et al. | Mar 2012 | A1 |
20120078236 | Schoepp | Mar 2012 | A1 |
20120109151 | Maier-Hein et al. | May 2012 | A1 |
20120143050 | Heigl | Jun 2012 | A1 |
20120155064 | Waters | Jun 2012 | A1 |
20120182605 | Hall et al. | Jul 2012 | A1 |
20120201421 | Hartmann et al. | Aug 2012 | A1 |
20120216411 | Wevers et al. | Aug 2012 | A1 |
20120289777 | Chopra et al. | Nov 2012 | A1 |
20120306850 | Balan | Dec 2012 | A1 |
20120320100 | Machida et al. | Dec 2012 | A1 |
20130002928 | Imai | Jan 2013 | A1 |
20130009853 | Hesselink | Jan 2013 | A1 |
20130050258 | Liu et al. | Feb 2013 | A1 |
20130057581 | Meier | Mar 2013 | A1 |
20130083009 | Geisner | Apr 2013 | A1 |
20130106833 | Fun | May 2013 | A1 |
20130135734 | Shafer et al. | May 2013 | A1 |
20130190602 | Liao | Jul 2013 | A1 |
20130209953 | Arlinsky et al. | Aug 2013 | A1 |
20130234914 | Fujimaki | Sep 2013 | A1 |
20130234935 | Griffith | Sep 2013 | A1 |
20130237811 | Mihailescu et al. | Sep 2013 | A1 |
20130249787 | Morimoto | Sep 2013 | A1 |
20130249945 | Kobayashi | Sep 2013 | A1 |
20130265623 | Sugiyama et al. | Oct 2013 | A1 |
20130267838 | Fronk | Oct 2013 | A1 |
20130278635 | Maggiore | Oct 2013 | A1 |
20130300760 | Sugano et al. | Nov 2013 | A1 |
20130342571 | Kinnebrew et al. | Dec 2013 | A1 |
20140031668 | Mobasser et al. | Jan 2014 | A1 |
20140049629 | Siewerdsen et al. | Feb 2014 | A1 |
20140088402 | Xu | Mar 2014 | A1 |
20140088990 | Nawana et al. | Mar 2014 | A1 |
20140104505 | Koenig | Apr 2014 | A1 |
20140114173 | Bar-Tal et al. | Apr 2014 | A1 |
20140142426 | Razzaque et al. | May 2014 | A1 |
20140168261 | Margolis et al. | Jun 2014 | A1 |
20140176661 | Smurro et al. | Jun 2014 | A1 |
20140177023 | Gao et al. | Jun 2014 | A1 |
20140189508 | Granchi et al. | Jul 2014 | A1 |
20140198129 | Liu et al. | Jul 2014 | A1 |
20140240484 | Kodama | Aug 2014 | A1 |
20140243614 | Rothberg et al. | Aug 2014 | A1 |
20140256429 | Kobayashi et al. | Sep 2014 | A1 |
20140266983 | Christensen | Sep 2014 | A1 |
20140268356 | Bolas et al. | Sep 2014 | A1 |
20140270505 | McCarthy | Sep 2014 | A1 |
20140275760 | Lee | Sep 2014 | A1 |
20140285404 | Takano et al. | Sep 2014 | A1 |
20140285429 | Simmons | Sep 2014 | A1 |
20140300632 | Laor | Oct 2014 | A1 |
20140300967 | Tilleman et al. | Oct 2014 | A1 |
20140303491 | Shekhar et al. | Oct 2014 | A1 |
20140320399 | Kim et al. | Oct 2014 | A1 |
20140333899 | Smithwick | Nov 2014 | A1 |
20140336461 | Reiter et al. | Nov 2014 | A1 |
20140340286 | Machida et al. | Nov 2014 | A1 |
20140361956 | Mikhailov | Dec 2014 | A1 |
20150005772 | Anglin et al. | Jan 2015 | A1 |
20150018672 | Blumhofer et al. | Jan 2015 | A1 |
20150070347 | Hofmann | Mar 2015 | A1 |
20150084990 | Laor | Mar 2015 | A1 |
20150150641 | Daon et al. | Jun 2015 | A1 |
20150182293 | Yang et al. | Jul 2015 | A1 |
20150209119 | Theodore et al. | Jul 2015 | A1 |
20150261922 | Nawana et al. | Sep 2015 | A1 |
20150277123 | Chaum et al. | Oct 2015 | A1 |
20150282735 | Rossner | Oct 2015 | A1 |
20150287188 | Gazit et al. | Oct 2015 | A1 |
20150287236 | Winn | Oct 2015 | A1 |
20150297314 | Fowler et al. | Oct 2015 | A1 |
20150305828 | Park | Oct 2015 | A1 |
20150310668 | Ellerbrock | Oct 2015 | A1 |
20150350517 | Duret et al. | Dec 2015 | A1 |
20150351863 | Plassky et al. | Dec 2015 | A1 |
20150363978 | Maimone et al. | Dec 2015 | A1 |
20150366620 | Cameron et al. | Dec 2015 | A1 |
20160022287 | Nehls | Jan 2016 | A1 |
20160030131 | Yang et al. | Feb 2016 | A1 |
20160086380 | Vayser et al. | Mar 2016 | A1 |
20160103318 | Du et al. | Apr 2016 | A1 |
20160125603 | Tanji | May 2016 | A1 |
20160133051 | Aonuma et al. | May 2016 | A1 |
20160143699 | Tanji | May 2016 | A1 |
20160153004 | Zhang | Jun 2016 | A1 |
20160175064 | Steinle et al. | Jun 2016 | A1 |
20160178910 | Giudicelli | Jun 2016 | A1 |
20160191887 | Casas | Jun 2016 | A1 |
20160223822 | Harrison | Aug 2016 | A1 |
20160249989 | Devam et al. | Sep 2016 | A1 |
20160256223 | Haimerl et al. | Sep 2016 | A1 |
20160302870 | Wilkinson et al. | Oct 2016 | A1 |
20160324580 | Esterberg | Nov 2016 | A1 |
20160324583 | Kheradpr et al. | Nov 2016 | A1 |
20160339337 | Ellsworth et al. | Nov 2016 | A1 |
20170027650 | Merck et al. | Feb 2017 | A1 |
20170031163 | Gao et al. | Feb 2017 | A1 |
20170068119 | Antaki | Mar 2017 | A1 |
20170076501 | Jagga | Mar 2017 | A1 |
20170086941 | Marti et al. | Mar 2017 | A1 |
20170112586 | Dhupar | Apr 2017 | A1 |
20170014119 | Capote et al. | Jun 2017 | A1 |
20170164919 | LaVallee et al. | Jun 2017 | A1 |
20170164920 | Lavallee et al. | Jun 2017 | A1 |
20170178375 | Benishti et al. | Jun 2017 | A1 |
20170220224 | Kodali | Aug 2017 | A1 |
20170239015 | Sela et al. | Aug 2017 | A1 |
20170251900 | Hansen et al. | Sep 2017 | A1 |
20170252109 | Yang et al. | Sep 2017 | A1 |
20170258526 | Lang | Sep 2017 | A1 |
20170281283 | Siegler et al. | Oct 2017 | A1 |
20170312032 | Amanatullah et al. | Nov 2017 | A1 |
20170348055 | Salcedo et al. | Dec 2017 | A1 |
20170348061 | Joshi et al. | Dec 2017 | A1 |
20170367766 | Mahfouz | Dec 2017 | A1 |
20170367771 | Tako et al. | Dec 2017 | A1 |
20170372477 | Penne | Dec 2017 | A1 |
20180003981 | Urey | Jan 2018 | A1 |
20180018791 | Guoyi | Jan 2018 | A1 |
20180028266 | Barnes et al. | Feb 2018 | A1 |
20180036884 | Chen et al. | Feb 2018 | A1 |
20180049622 | Ryan et al. | Feb 2018 | A1 |
20180055579 | Daon et al. | Mar 2018 | A1 |
20180078316 | Schaewe et al. | Mar 2018 | A1 |
20180082480 | White et al. | Mar 2018 | A1 |
20180092667 | Heigl et al. | Apr 2018 | A1 |
20180092698 | Chopra et al. | Apr 2018 | A1 |
20180092699 | Finley | Apr 2018 | A1 |
20180116732 | Lin et al. | May 2018 | A1 |
20180117150 | O'Dwyer | May 2018 | A1 |
20180133871 | Farmer | May 2018 | A1 |
20180153626 | Yang et al. | Jun 2018 | A1 |
20180182150 | Benishti et al. | Jun 2018 | A1 |
20180185100 | Weinstein et al. | Jul 2018 | A1 |
20180193097 | McLachlin et al. | Jul 2018 | A1 |
20180200002 | Kostrzewski et al. | Jul 2018 | A1 |
20180247128 | Alvi et al. | Aug 2018 | A1 |
20180262743 | Casas | Sep 2018 | A1 |
20180303558 | Thomas | Oct 2018 | A1 |
20180311011 | Van Beek et al. | Nov 2018 | A1 |
20180317803 | Ben-Yishai et al. | Nov 2018 | A1 |
20180318035 | McLachlin et al. | Nov 2018 | A1 |
20180368898 | DiVincenzo et al. | Dec 2018 | A1 |
20190000372 | Gullotti et al. | Jan 2019 | A1 |
20190000564 | Navab et al. | Jan 2019 | A1 |
20190015163 | Abhari et al. | Jan 2019 | A1 |
20190038362 | Nash et al. | Feb 2019 | A1 |
20190038365 | Soper | Feb 2019 | A1 |
20190043238 | Benishti et al. | Feb 2019 | A1 |
20190046272 | Zoabi et al. | Feb 2019 | A1 |
20190046276 | Inglese et al. | Feb 2019 | A1 |
20190053851 | Siemionow et al. | Feb 2019 | A1 |
20190069971 | Tripathi et al. | Mar 2019 | A1 |
20190080515 | Geri | Mar 2019 | A1 |
20190105116 | Johnson et al. | Apr 2019 | A1 |
20190130792 | Rios | May 2019 | A1 |
20190142519 | Siemionow et al. | May 2019 | A1 |
20190144443 | Jackson | May 2019 | A1 |
20190175228 | Elimelech et al. | Jun 2019 | A1 |
20190192230 | Siemionow et al. | Jun 2019 | A1 |
20190201106 | Siemionow et al. | Jul 2019 | A1 |
20190216537 | Eltorai | Jul 2019 | A1 |
20190254753 | Johnson | Aug 2019 | A1 |
20190273916 | Benishti et al. | Sep 2019 | A1 |
20190333480 | Lang | Oct 2019 | A1 |
20190369717 | Frielinghaus | Dec 2019 | A1 |
20190387351 | Lyren | Dec 2019 | A1 |
20200019364 | Pond | Jan 2020 | A1 |
20200020249 | Jarc et al. | Jan 2020 | A1 |
20200038112 | Amanatullah et al. | Feb 2020 | A1 |
20200078100 | Weinstein et al. | Mar 2020 | A1 |
20200085511 | Oezbek et al. | Mar 2020 | A1 |
20200088997 | Lee | Mar 2020 | A1 |
20200159313 | Gibby et al. | Mar 2020 | A1 |
20200100847 | Siegler et al. | Apr 2020 | A1 |
20200117025 | Sauer | Apr 2020 | A1 |
20200129058 | Li | Apr 2020 | A1 |
20200129136 | Harding et al. | Apr 2020 | A1 |
20200129262 | Verard | Apr 2020 | A1 |
20200129264 | Onativia et al. | Apr 2020 | A1 |
20200133029 | Yonezawa | Apr 2020 | A1 |
20200138518 | Lang | May 2020 | A1 |
20200138618 | Lang | May 2020 | A1 |
20200143594 | Lal et al. | May 2020 | A1 |
20200146546 | Chene | May 2020 | A1 |
20200151507 | Siemionow et al. | May 2020 | A1 |
20200156259 | Morales | May 2020 | A1 |
20200163723 | Wolf et al. | May 2020 | A1 |
20200163739 | Messinger et al. | May 2020 | A1 |
20200184638 | Meglan | Jun 2020 | A1 |
20200186786 | Gibby et al. | Jun 2020 | A1 |
20200188028 | Feiner et al. | Jun 2020 | A1 |
20200188034 | Lequette et al. | Jun 2020 | A1 |
20200201082 | Carabin | Jun 2020 | A1 |
20200229877 | Siemionow et al. | Jul 2020 | A1 |
20200237256 | Farshad et al. | Jul 2020 | A1 |
20200237459 | Racheli et al. | Jul 2020 | A1 |
20200237880 | Kent | Jul 2020 | A1 |
20200242280 | Pavloff et al. | Jul 2020 | A1 |
20200246074 | Lang | Aug 2020 | A1 |
20200246081 | Johnson et al. | Aug 2020 | A1 |
20200265273 | Wei | Aug 2020 | A1 |
20200275988 | Johnson | Sep 2020 | A1 |
20200288075 | Bonin et al. | Sep 2020 | A1 |
20200305980 | Lang | Oct 2020 | A1 |
20200315734 | El Amm | Oct 2020 | A1 |
20200321099 | Holladay et al. | Oct 2020 | A1 |
20200323460 | Busza | Oct 2020 | A1 |
20200327721 | Siemionow et al. | Oct 2020 | A1 |
20200330179 | Ton | Oct 2020 | A1 |
20200337780 | Winkler | Oct 2020 | A1 |
20200341283 | McCracken | Oct 2020 | A1 |
20200352655 | Freese | Nov 2020 | A1 |
20200355927 | Marcellin-Dibon | Nov 2020 | A1 |
20200360091 | Murray et al. | Nov 2020 | A1 |
20200375666 | Murphy | Dec 2020 | A1 |
20200377493 | Heiser | Dec 2020 | A1 |
20200377956 | Vogelstein | Dec 2020 | A1 |
20200388075 | Kazanzides et al. | Dec 2020 | A1 |
20200389425 | Bhatia | Dec 2020 | A1 |
20200390502 | Holthuizen et al. | Dec 2020 | A1 |
20200390503 | Casas et al. | Dec 2020 | A1 |
20200402647 | Domracheva et al. | Dec 2020 | A1 |
20200409306 | Gelman et al. | Dec 2020 | A1 |
20200410687 | Siemionow et al. | Dec 2020 | A1 |
20200413031 | Khani | Dec 2020 | A1 |
20210004956 | Book et al. | Jan 2021 | A1 |
20210009339 | Morrison et al. | Jan 2021 | A1 |
20210015583 | Avisar | Jan 2021 | A1 |
20210022599 | Freeman et al. | Jan 2021 | A1 |
20210022808 | Lang | Jan 2021 | A1 |
20210022811 | Mahfouz | Jan 2021 | A1 |
20210022828 | Elimelech et al. | Jan 2021 | A1 |
20210029804 | Chang | Jan 2021 | A1 |
20210030374 | Takahashi et al. | Feb 2021 | A1 |
20210030511 | Wolf et al. | Feb 2021 | A1 |
20210038339 | Yu | Feb 2021 | A1 |
20210049825 | Wheelwright et al. | Feb 2021 | A1 |
20210052348 | Stifter et al. | Feb 2021 | A1 |
20210065911 | Goel et al. | Mar 2021 | A1 |
20210077195 | Saeidi | Mar 2021 | A1 |
20210077210 | Itkowitz | Mar 2021 | A1 |
20210080751 | Lindsey | Mar 2021 | A1 |
20210090344 | Geri et al. | Mar 2021 | A1 |
20210093391 | Poltaretskyi et al. | Apr 2021 | A1 |
20210093392 | Poltaretskyi et al. | Apr 2021 | A1 |
20210093400 | Quaid et al. | Apr 2021 | A1 |
20210093417 | Liu | Apr 2021 | A1 |
20210104055 | Ni et al. | Apr 2021 | A1 |
20210107923 | Jackson | Apr 2021 | A1 |
20210109349 | Schneider | Apr 2021 | A1 |
20210109373 | Loo | Apr 2021 | A1 |
20210110517 | Flohr | Apr 2021 | A1 |
20210113269 | Vilsmeier | Apr 2021 | A1 |
20210113293 | Silva et al. | Apr 2021 | A9 |
20210121238 | Palushi et al. | Apr 2021 | A1 |
20210137634 | Lang et al. | May 2021 | A1 |
20210141887 | Kim et al. | May 2021 | A1 |
20210150702 | Claessen et al. | May 2021 | A1 |
20210157544 | Denton | May 2021 | A1 |
20210160472 | Casas | May 2021 | A1 |
20210161614 | Elimelech et al. | Jun 2021 | A1 |
20210162287 | | Jun 2021 | A1 |
20210165207 | Peyman | Jun 2021 | A1 |
20210169504 | Brown | Jun 2021 | A1 |
20210169578 | Calloway et al. | Jun 2021 | A1 |
20210169581 | Calloway et al. | Jun 2021 | A1 |
20210169605 | Calloway et al. | Jun 2021 | A1 |
20210186647 | Elimelech et al. | Jun 2021 | A1 |
20210196404 | Wang | Jul 2021 | A1 |
20210223577 | Zheng | Jul 2021 | A1 |
20210227791 | De Oliveira Seixas | Jul 2021 | A1 |
20210235061 | Hegyi | Jul 2021 | A1 |
20210248822 | Choi | Aug 2021 | A1 |
20210282887 | Wiggermann | Sep 2021 | A1 |
20210290046 | Nazareth | Sep 2021 | A1 |
20210290336 | Wang | Sep 2021 | A1 |
20210290394 | Mahfouz | Sep 2021 | A1 |
20210295512 | Knoplioch et al. | Sep 2021 | A1 |
20210298835 | Wang | Sep 2021 | A1 |
20210306599 | Pierce | Sep 2021 | A1 |
20210311322 | Belanger | Oct 2021 | A1 |
20210314502 | Liu | Oct 2021 | A1 |
20210315636 | Akbarian | Oct 2021 | A1 |
20210315662 | Freeman et al. | Oct 2021 | A1 |
20210325684 | Ninan | Oct 2021 | A1 |
20210333561 | Oh | Oct 2021 | A1 |
20210346115 | Dulin et al. | Nov 2021 | A1 |
20210349677 | Baldev | Nov 2021 | A1 |
20210369226 | Siemionow et al. | Dec 2021 | A1 |
20210371413 | Thurston | Dec 2021 | A1 |
20210373333 | Moon | Dec 2021 | A1 |
20210373344 | Loyola | Dec 2021 | A1 |
20210378757 | Bay | Dec 2021 | A1 |
20210386482 | Gera et al. | Dec 2021 | A1 |
20210389590 | Freeman | Dec 2021 | A1 |
20210400247 | Casas | Dec 2021 | A1 |
20210401533 | Im | Dec 2021 | A1 |
20210402255 | Fung | Dec 2021 | A1 |
20210405369 | King | Dec 2021 | A1 |
20220003992 | Ahn | Jan 2022 | A1 |
20220007006 | Healy et al. | Jan 2022 | A1 |
20220008135 | Frielinghaus et al. | Jan 2022 | A1 |
20220038675 | Hegyi | Feb 2022 | A1 |
20220039873 | Harris | Feb 2022 | A1 |
20220051484 | Jones et al. | Feb 2022 | A1 |
20220071712 | Wolf et al. | Mar 2022 | A1 |
20220079675 | Lang | Mar 2022 | A1 |
20220121041 | Hakim | Apr 2022 | A1 |
20220142730 | Wolf et al. | May 2022 | A1 |
20220155861 | Myung | May 2022 | A1 |
20220159227 | Casas | May 2022 | A1 |
20220179209 | Cherukuri | Jun 2022 | A1 |
20220192776 | Gibby et al. | Jun 2022 | A1 |
20220201274 | Achilefu et al. | Jun 2022 | A1 |
20220245400 | Siemionow et al. | Aug 2022 | A1 |
20220133484 | Lang | Sep 2022 | A1 |
20220287676 | Steines et al. | Sep 2022 | A1 |
20220295033 | Casas | Sep 2022 | A1 |
20220304768 | Elimelech et al. | Sep 2022 | A1 |
20220358759 | Cork et al. | Nov 2022 | A1 |
20220405935 | Flossmann et al. | Dec 2022 | A1 |
20230009793 | Gera et al. | Jan 2023 | A1 |
20230027801 | Qian et al. | Jan 2023 | A1 |
20230034189 | Gera et al. | Feb 2023 | A1 |
Number | Date | Country |
---|---|---|
3022448 | Feb 2018 | CA |
3034314 | Feb 2018 | CA |
101379412 | Mar 2009 | CN |
103106348 | May 2013 | CN |
111915696 | Nov 2020 | CN |
112489047 | Mar 2021 | CN |
202004011567 | Nov 2004 | DE |
102004011567 | Sep 2005 | DE |
102014008153 | Oct 2014 | DE |
0933096 | Aug 1999 | EP |
1640750 | Mar 2006 | EP |
1757974 | Feb 2007 | EP |
2134847 | Jun 2015 | EP |
2891966 | Jan 2017 | EP |
3123970 | Feb 2017 | EP |
2654749 | May 2017 | EP |
3216416 | Sep 2017 | EP |
2032039 | Oct 2017 | EP |
3247297 | Nov 2017 | EP |
2030193 | Jul 2018 | EP |
3034607 | Mar 2019 | EP |
2892558 | Apr 2019 | EP |
2635299 | Jul 2019 | EP |
3505050 | Jul 2019 | EP |
3224376 | Aug 2019 | EP |
2875149 | Dec 2019 | EP |
3206583 | Sep 2020 | EP |
2625845 | Mar 2021 | EP |
3076660 | Apr 2021 | EP |
3858280 | Aug 2021 | EP |
3593227 | Sep 2021 | EP |
3789965 | Dec 2021 | EP |
3634294 | Jan 2022 | EP |
3952331 | Feb 2022 | EP |
2507314 | Apr 2014 | GB |
20140120155 | Oct 2014 | KR |
0334705 | Apr 2003 | WO |
2006002559 | Jan 2006 | WO |
2007051304 | May 2007 | WO |
2008103383 | Aug 2008 | WO |
2010067267 | Jun 2010 | WO |
2010074747 | Jul 2010 | WO |
2012101286 | Aug 2012 | WO |
2013112554 | Aug 2013 | WO |
2014024188 | Feb 2014 | WO |
2014037953 | Mar 2014 | WO |
2014113455 | Jul 2014 | WO |
2014125789 | Aug 2014 | WO |
2014167563 | Oct 2014 | WO |
2014174067 | Oct 2014 | WO |
2015058816 | Apr 2015 | WO |
2015061752 | Apr 2015 | WO |
2015109145 | Jul 2015 | WO |
2016151506 | Sep 2016 | WO |
2007115826 | Oct 2017 | WO |
2018052966 | Mar 2018 | WO |
2018073452 | Apr 2018 | WO |
2018200767 | Apr 2018 | WO |
2018206086 | Nov 2018 | WO |
2019083431 | May 2019 | WO |
2019161477 | Aug 2019 | WO |
2019195926 | Oct 2019 | WO |
2019211741 | Nov 2019 | WO |
2019210353 | Nov 2019 | WO |
2020109903 | Jun 2020 | WO |
2020109904 | Jun 2020 | WO |
2021019369 | Feb 2021 | WO |
2021017019 | Feb 2021 | WO |
2021023574 | Feb 2021 | WO |
2021046455 | Mar 2021 | WO |
2021048158 | Mar 2021 | WO |
2021021979 | Apr 2021 | WO |
2021061459 | Apr 2021 | WO |
2021062375 | Apr 2021 | WO |
2021073743 | Apr 2021 | WO |
2021087439 | May 2021 | WO |
2021091980 | May 2021 | WO |
2021255627 | Jun 2021 | WO |
2021112918 | Jun 2021 | WO |
2021130564 | Jul 2021 | WO |
2021137752 | Jul 2021 | WO |
2021141887 | Jul 2021 | WO |
2021145584 | Jul 2021 | WO |
2021154076 | Aug 2021 | WO |
2021188757 | Sep 2021 | WO |
2021183318 | Dec 2021 | WO |
2021257897 | Dec 2021 | WO |
2021258078 | Dec 2021 | WO |
2022009233 | Jan 2022 | WO |
2022053923 | Mar 2022 | WO |
2022079565 | Apr 2022 | WO |
2023281395 | Jan 2023 | WO |
2023007418 | Feb 2023 | WO |
2023021448 | Feb 2023 | WO |
2023021450 | Feb 2023 | WO |
2023021451 | Feb 2023 | WO |
2023026229 | Mar 2023 | WO |
Entry |
---|
US 11,395,705 B2, 09/2022, Lang (withdrawn) |
Jon Fingas, "Fraunhofer iPad app guides liver surgery through augmented reality", Aug. 22, 2013, Engadget.com, URL: https://www.engadget.com/2013/08/22/fraunhofer-ipad-app-guides-liver-surgery/ (Year: 2013). |
European Patent Application # 16767845.7 Office Action dated May 21, 2019. |
Hainich et al., "Near-Eye Displays", Chapter 10 of Displays: Fundamentals and Applications, CRC Press, pp. 439-504, Jul. 5, 2011. |
BRAINLAB, "Image Registration Options Enhanced Visualization Leveraging More Data", pp. 1-4, Feb. 2019. |
International Patent Application # PCT/IB2019/053524 Search Report dated Aug. 14, 2019. |
Liao et al., "3-D Augmented Reality for MRI-Guided Surgery Using Integral Videography Autostereoscopic Image Overlay", IEEE Transactions on Biomedical Engineering, vol. 57, No. 6, pp. 1476-1486, Feb. 17, 2010. |
Sagitov et al., “Comparing Fiducial Marker Systems in the Presence of Occlusion”, International Conference on Mechanical, System and Control Engineering (ICMSC), pp. 1-6, 2017. |
Liu et al., “Marker orientation in fiducial registration”, Medical Imaging 2003: Image Processing, Proceedings of SPIE vol. 5032, pp. 1176-1185, 2003. |
International Application # PCT/IB2019/059770 Search Report dated Mar. 17, 2020. |
International Application # PCT/IB2019/059771 Search Report dated Mar. 1, 2020. |
U.S. Appl. No. 16/419,023 Third party submission dated Jan. 19, 2020. |
U.S. Appl. No. 16/199,281 Office Action dated Jun. 11, 2020. |
International Application # PCT/IB2020/056893 Search Report dated Nov. 9, 2020. |
U.S. Appl. No. 16/200,144 Office Action dated Dec. 28, 2020. |
International Application # PCT/IB2020/060017 Search Report dated Jan. 7, 2021. |
Elimelech et al., U.S. Appl. No. 16/724,297, filed Dec. 22, 2019. |
Wolf et al., U.S. Appl. No. 16/524,258, filed Jul. 29, 2019. |
U.S. Appl. No. 16/724,297 Office Action dated Jan. 26, 2021. |
JP Application # 2021525186 Office Action dated Dec. 1, 2021. |
EP Application # 19796580.9 Search Report dated Dec. 20, 2021. |
International Application # PCT/IB2021/058088 Search Report dated Dec. 20, 2021. |
U.S. Appl. No. 16/200,144 Office Action dated Aug. 19, 2021. |
International Application # PCT/IB2021/055242 Search Report dated Oct. 7, 2021. |
U.S. Appl. No. 16/724,297 Office Action dated Nov. 4, 2021. |
CN Application # 2019800757525 Office Action dated Mar. 1, 2022. |
U.S. Appl. No. 16/200,144 Office Action dated Mar. 15, 2022. |
U.S. Appl. No. 16/524,258 Office Action dated Apr. 11, 2022. |
EP Application # 16767845.7 Office Action dated Apr. 29, 2022. |
Lorensen et al., “Marching Cubes: A High Resolution 3D Surface Construction Algorithm,” ACM SIGGRAPH '87, Computer Graphics, vol. 21, No. 4, pp. 163-169, Jul. 1987. |
Wikipedia, “Marching Cubes,” pp. 1-4, last edited Sep. 4, 2021. |
Milletari et al., "V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation," arXiv:1606.04797v1, pp. 1-11, Jun. 15, 2016. |
Zhang et al., “Medical Volume Rendering Techniques,” Independent Research, Spring 2014, arXiv:1802.07710v1, pp. 1-33, Feb. 21, 2018. |
Van Ooijen et al., “Noninvasive Coronary Imaging Using Electron Beam CT: Surface Rendering Versus Volume Rendering,” Computers in Radiology, AJR, vol. 180, pp. 223-226, Jan. 2003. |
Webster (ed.), “Structured Light Techniques and Applications,” Wiley Encyclopedia of Electrical and Electronics Engineering, pp. 1-24, year 2016. |
Liberadzki et al., “Structured-Light-Based System for Shape Measurement of the Human Body in Motion,” Sensors, vol. 18, pp. 1-19, year 2018. |
Romero, “Volume Ray Casting Techniques and Applications Using General Purpose Computations on Graphics Processing Units,” Thesis/Dissertation Collections, Rochester Institute of Technology, RIT Scholar Works, pp. 1-140, Jun. 2009. |
International Application PCT/IB2022/056986 filed Jul. 28, 2022. |
International Application PCT/IB2022/057733 filed Aug. 18, 2022. |
International Application PCT/IB2022/057735 filed Aug. 18, 2022. |
International Application PCT/IB2022/057736 filed Aug. 18, 2022. |
International Application PCT/IB2022/057965 filed Aug. 25, 2022. |
International Application PCT/IB2022/059030 filed Sep. 23, 2022. |
Gera et al., U.S. Appl. No. 17/388,064, filed Jul. 29, 2021. |
Mitrasinovic et al., "Clinical and surgical applications of smart glasses", Technology and Health Care, vol. 23, pp. 381-401, year 2015. |
Martin-Gonzalez et al., "Head-mounted virtual loupe with sight-based activation for surgical applications", IEEE Symposium on Mixed and Augmented Reality, pp. 207-208, Oct. 19-22, 2009. |
Figl et al., "A fully automated calibration method for an optical see-through head-mounted operating microscope with variable zoom and focus", IEEE Transactions on Medical Imaging, vol. 24, No. 11, pp. 1492-1499, Nov. 2005. |
Medithinq Co. Ltd., “Metascope: world's first wearable scope”, pp. 1-7, Jan. 2023. |
Martin-Gonzalez et al., “Sight-based magnification system for surgical applications”, pp. 26-30, Conference proceedings of Bildverarbeitung für die Medizin, year 2010. |
Burström et al., "Frameless patient tracking with adhesive optical skin markers for augmented reality surgical navigation in spine surgery", Spine, vol. 45, No. 22, pp. 1598-1604, year 2020. |
Suenaga et al., “Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study”, BMC Medical Imaging, pp. 1-11, year 2015. |
Mayfield Clinic, “Spinal Fusion: Lateral Lumbar Interbody Fusion (LLIF)”, pp. 1-6, Jan. 2021. |
Qian et al., “AR-Loupe: Magnified Augmented Reality by Combining an Optical See-Through Head-Mounted Display and a Loupe”, pp. 2550-2562, IEEE Transactions on Visualization and Computer Graphics, vol. 28, No. 7, Jul. 2022. |
Kazanzides et al., “Systems and Methods for Augmented Reality Magnifying Loupe”, case ID 15944, pp. 1-2, Nov. 26, 2020. |
International Application PCT/IB2022/057965 Search Report dated Dec. 15, 2022. |
U.S. Appl. No. 16/524,258 Office Action dated Jan. 24, 2023. |
International Application PCT/IB2022/057733 Search Report dated Jan. 26, 2023. |
European Application 22203956.2 Search Report dated Feb. 9, 2023. |
International Application PCT/IB2022/059030 Search Report dated Feb. 28, 2023. |
Number | Date | Country | |
---|---|---|
20190273916 A1 | Sep 2019 | US |
| Number | Date | Country |
---|---|---|---|
Parent | 16159740 | Oct 2018 | US |
Child | 16419023 | US | |
Parent | 15896102 | Feb 2018 | US |
Child | 16159740 | US | |
Parent | 15127423 | US | |
Child | 15896102 | US |