The present invention relates to a display and, in particular, it concerns a display for providing an image to the eye of a viewer.
For applications such as near-eye displays, a projected image having a large field is desirable. This is typically achieved by injecting a large field image into a waveguide from a single image projector. The waveguide expands the aperture of the projected image, thereby illuminating the eye with a large field image.
However, in order to achieve such aperture expansion, a large projector and/or large optics are typically required, which is disadvantageous for use in near-eye displays and other applications where the display must be small in order to be useable in the desired application. Additionally, the angular dimensions of the field of view from a given waveguide and coupling-out arrangement are limited by geometrical optics considerations such as the range of angles which can be trapped within the waveguide so as to propagate by internal reflection, and avoidance of overlap between an image and its conjugate within the waveguide.
According to the teachings of the present invention there is provided a display for providing an image to an eye of a viewer, the display including: (a) at least two projector assemblies, each projector assembly including: (i) a light-guide optical element (LOE) having a pair of parallel external surfaces, and (ii) an image projector arrangement generating a partial image, the image projector arrangement being deployed to introduce the partial image from the image projector arrangement into the LOE so as to propagate within the LOE by internal reflection from the pair of parallel external surfaces, each projector assembly including a coupling-out arrangement associated with the LOE and configured for coupling out the partial image from the LOE towards the eye of the viewer, wherein the LOE of a first of the projector assemblies is deployed in overlapping relation with the LOE of a second of the projector assemblies such that the first projector assembly projects a first partial image corresponding to a first part of the image, and the second projector assembly projects a second partial image corresponding to a second part of the image, the first and second parts of the image having partial overlap so that the at least two projector assemblies cooperate to display the image to the eye of the viewer; and (b) a controller including at least one processor, the controller being associated with the image projector arrangement of at least the first and second projector assemblies, and configured to reduce a pixel intensity of selected pixels projected by at least one of the first and second image projector arrangements, the selected pixels being in a region of the partial overlap between the first and second parts of the image so as to enhance a perceived uniformity of the image.
According to the teachings of the present invention there is further provided a display for providing an image to an eye of a viewer, the display including: (a) a projector assembly including: (i) a light-guide optical element (LOE) having a pair of parallel external surfaces, and two non-parallel sets of mutually parallel reflective surfaces, the LOE being configured for 2D aperture expansion of an image propagating through it, (ii) at least two image projector arrangements generating at least two partial images corresponding to at least a first part of the image and at least a second part of the image, respectively, the at least two image projector arrangements being deployed to introduce the at least two partial images into the LOE so as to propagate the at least two partial images within the LOE by internal reflection from the pair of parallel external surfaces, the projector assembly including a coupling-out arrangement associated with the LOE and configured for coupling out the partial images from the LOE towards the eye of the viewer, wherein the at least a first part of the image and the at least a second part of the image have partial overlap so that the at least two image projector arrangements cooperate to display the image to the eye of the viewer; and (b) a controller including at least one processor, the controller being associated with the at least two image projector arrangements, and configured to reduce a pixel intensity of selected pixels projected by at least one of the first and second image projector arrangements, the selected pixels being in a region of the partial overlap between the first and second parts of the image so as to enhance a perceived uniformity of the image.
According to the teachings of the present invention there is further provided a method of providing an image to an eye of a viewer, including: generating, by a first projector assembly including a first LOE and a first image projector arrangement, a first partial image corresponding to a first part of the image for coupling out to the viewer; generating, by a second projector assembly including a second LOE and a second image projector arrangement, a second partial image corresponding to a second part of the image for coupling out to the viewer, wherein the first and second LOEs are deployed in overlapping relation such that the first and second parts of the image are coupled out to the viewer having partial overlap so that the projector assemblies cooperate to display the image to the eye of the viewer; determining, by a controller associated with the first and second image projector arrangements, a subset of pixels in a region of the partial overlap; and reducing, by the controller, the intensity of selected pixels in the subset of pixels, the selected pixels being projected by at least one of the first and second image projector arrangements, so as to enhance the perceived uniformity of the image.
According to some aspects of the present invention, the display includes at least a third projector assembly, the at least a third projector assembly including: (i) a LOE having a pair of parallel external surfaces, and (ii) an image projector arrangement generating a third partial image corresponding to a third part of the image and being deployed to introduce the third partial image from the image projector arrangement into the LOE so as to propagate within the LOE by internal reflection from the pair of parallel external surfaces, the at least a third projector assembly including a coupling-out arrangement associated with the LOE and configured for coupling out the third partial image from the LOE towards the eye of the viewer, wherein the LOE of the at least a third projector assembly is deployed in overlapping relation with the LOE of at least one of the first and second projector assemblies such that the at least three projector assemblies cooperate to display the image to the eye of the viewer, wherein the controller is further associated with the image projector arrangement of the at least a third projector assembly and configured to reduce a pixel intensity of selected pixels projected by at least one image projector arrangement of at least one of the projector assemblies, the selected pixels being in a region of partial overlap between at least two parts of the image.
According to some aspects of the present invention, the first partial image and the second partial image share a set of common pixels, and wherein the selected pixels of reduced intensity are a subset of the set of common pixels.
According to some aspects of the present invention, the controller varies the selection of the subset of the set of common pixels responsively to an overlap region adjustment input.
According to some aspects of the present invention, the overlap region adjustment input is derived from a pupil position sensor.
According to some aspects of the present invention, the overlap region adjustment input is derived from a manual user input.
According to some aspects of the present invention, the controller is configured to gradually reduce the intensity of the selected pixels projected by the first projector arrangement across the region of partial overlap, and to gradually increase the intensity of the selected pixels projected by the second projector arrangement across the region of partial overlap.
According to some aspects of the present invention, the second projector assembly includes a second image projector arrangement generating a third partial image corresponding to a third part of the image, and being deployed to introduce the third partial image into the LOE of the second projector assembly such that the first, second and third parts of the image have partial overlap, and wherein the controller is further associated with the second image projector arrangement and configured to reduce a pixel intensity of selected pixels projected by at least one image projector arrangement of at least one of the projector assemblies, the selected pixels being in a region of partial overlap between at least two parts of the image.
According to some aspects of the present invention, the LOEs of the at least two projector assemblies are deployed parallel to one another.
According to some aspects of the present invention, the LOEs of the at least two projector assemblies are deployed non-parallel to one another.
According to some aspects of the present invention, the LOEs are deployed to extend around or partially encompass the viewer or an eye of the viewer, the display further including one or more index-matched media deployed around the viewer between the LOEs, forming an optically smooth transition with the edges of the LOEs.
The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
The present invention provides a display for projecting large field images using small-sized optics, by projecting a plurality of partial, narrow field images that are combined and viewed by the viewer as a single, large field image.
The term “field” as used herein should be understood to refer to the field of view of a projected image.
The term “eye-box” as used herein should be understood to refer to the general area where a pupil is expected to be while viewing an image. It is expected that the actual pupil position within the eye-box will vary across different viewers (e.g. based on interpupillary distance (“IPD”)), and even for a given viewer at different times (e.g. based on eyeball rotation).
The principles and operation of the display according to the present invention may be better understood with reference to the drawings and the accompanying description.
Referring now to the drawings,
Image Projection and Combination
Image projector arrangement 20a is deployed to introduce the partial image into the waveguide so as to propagate the partial image within the waveguide by internal reflection from the pair of parallel external surfaces. The introduction of the partial image into the waveguide is achieved via a suitable optical arrangement, referred to as a coupling-in arrangement, which typically includes a prism with suitably angled surfaces associated with a side edge of the LOE or one of the major surfaces of the LOE and/or one or more coupling-in reflectors which may be located within the LOE or associated with one of the surfaces thereof. Details of the image projector arrangement, including the coupling-in arrangement, are omitted from the schematic drawings for simplicity of presentation of the invention. A coupling-out arrangement 7a (shown as dashed rectangles on the LOE) associated with LOE 28a is deployed to couple-out the partial image from the waveguide towards the eye of the viewer.
In some embodiments, the projector arrangement 20a can be a wide optical arrangement, or may include a distinct optical arrangement for lateral aperture expansion. The coupling-out arrangement 7 is typically implemented as one or more sets of obliquely-angled, mutually-parallel internal partially reflecting surfaces, or as a diffractive optical element, all as is known in the art. The general region of the LOE from which the image illumination is coupled-out towards the eye of the viewer is designated by dashed lines.
Projector assembly 5b includes image projector arrangements 20b1 and 20b2, and LOE 28b. Image projector arrangement 20b1 is configured to generate and project a second partial image corresponding to a second part of the image. Image projector arrangement 20b2 is configured to generate and project a third partial image corresponding to a third part of the image. Image projector arrangements 20b1 and 20b2 are deployed to introduce the second and third partial images, respectively, into LOE 28b so as to propagate the partial images within the LOE by internal reflection from the LOE's pair of parallel external surfaces. Coupling-out arrangements 7b1, 7b2 (shown as dashed rectangles on the LOE) associated with LOE 28b are deployed to couple-out the second and third partial images, respectively, from the waveguide towards the eye of the viewer. Note that coupling-out arrangement 7a is in practice associated with projector assembly 5a, but shown in
In the embodiment shown in
It should be noted that while LOE 28a is shown as being located behind LOE 28b, in principle LOE 28a could alternatively be in front of LOE 28b. Preferably, LOEs 28a and 28b should be as close to one another as possible, though an air gap, or a layer simulating an air gap, is typically required in order to maintain the light guiding properties of the LOE. In some embodiments, if the image projector arrangement is wider than its associated waveguide such that part of the image projector will extend over the side of the LOE, it is preferable to have image projector arrangements 20b1 and 20b2 extend over opposing sides of the LOE.
Preferably, the field and aperture continuity as well as pixel intensity uniformity should be maintained when the viewer's pupil is at different positions in the eye-box.
While it should be readily apparent from
The terms “overlap region”, “region(s) of overlap”, and “region of partial overlap” will now be used to refer to image data that is simultaneously projected by more than one image projector arrangement. As noted, typically only a subset of the pixels within the region of overlap will illuminate the pupil from both projectors at any given time (the other pixels reaching the eye from only one projector, while light from the other falls to the left or right of the pupil).
Referring now to
Referring now to
By contrast, the opposite is true when the pupil is at pupil position 15b, where for pixel 1002F only light ray 1002b illuminates the pupil, while for pixel 2002F both light rays 2002a and 2002b illuminate the pupil.
Thus, for pupil position 15a, the “selected pixels” within the region of overlap preferably include pixel 1002F but not 2002F. For pupil position 15b, the selected pixels within the region of overlap preferably include pixel 2002F but not 1002F.
Note that at both of pupil positions 15a and 15b, neither pixel 1000F nor pixel 2000F is included in the overlapping region, because each of these pixels originates from only one image projector arrangement.
This demonstrates that, although the overlapping regions of the image are fixed according to the configuration of the projector assemblies, typically only a subset of pixels within the overlapping region will illuminate the pupil from two projectors at a given time, depending on the viewer's pupil position.
Pixel Intensity Reduction
It should be appreciated that light rays that reach the pupil from two sources will produce pixels having nearly twice the intensity compared to other pixels produced from one source, leading to a perceived non-uniformity in the viewed image. To address this non-uniformity, it is desirable to reduce the intensity of these pixels. However, as already pointed out above, the number of projector arrangements from which illumination arrives at the viewer's pupil for various pixels in the region of overlap between the partial images will vary according to the pupil position across the eye-box. An intensity correction according to an aspect of the present invention is therefore preferably performed only on a selected subset of the pixels within the region of overlap of the partial images, as will now be detailed.
Therefore, according to some embodiments, the pixel intensity of selected pixels in regions of overlap is reduced (e.g. via a controller, as will be further detailed below) so as to enhance the perceived uniformity of the image when viewed by a viewer.
However, if the viewer's eye is repositioned to pupil position 15b (see
Therefore, according to some embodiments, the controller may vary the subset of pixels for which the intensity is reduced based on an overlap region adjustment input, e.g. based on the viewer's anticipated or known pupil position. In some embodiments, the overlap region adjustment input may be derived automatically, e.g. via a pupil sensor. In some embodiments, the overlap region adjustment input may be derived by manual input from the user. For example, a test image can be displayed to the user with overlapping parts. The user can be asked to look at various parts of the image and provide input to reduce the intensity of selected pixels, such as by actuating a knob or lever coupled to the controller, when the image appears uniform. Alternatively, the user can provide feedback to adjustments made by the controller, for example during a calibration process. The controller receiving such feedback can vary the subset of pixels for intensity reduction until a best approximation for a uniform perceived image is achieved.
By way of example,
It should be noted that pupil position changes when the viewer looks in different directions, i.e., at different parts of the projected image due to rotation of the eye about its center of rotation. Typically, the sensitivity of the human eye to variations in image intensity is much greater in the central region of view, while a person is much more tolerant of intensity variations in their peripheral vision. Accordingly, it is typically sufficient to perform an adjustment to optimize the region of intensity correction for each “seam” (region of overlap) for the pupil position which corresponds to an eye direction looking towards that seam. Thus, for example, the aforementioned manual user adjustment may advantageously be performed as part of a software-guided calibration process in which the user is first instructed to look at a projected calibration image spanning a first seam, e.g., to the left, and to make the manual adjustment until that calibration image appears uniform, and then to look at a projected calibration image spanning a second seam, e.g., to the right, and to make the manual adjustment until that calibration image appears uniform. Those settings may then be used continuously for subsequent projection of images, independent of the instantaneous pupil position, with the understanding that the seam regions of the field of view will be at high quality while the user is looking at them with her central vision, and may be somewhat non-uniform in the peripheral vision.
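A minimal sketch of how the per-seam settings from such a software-guided calibration might be captured and stored follows; the function names, the seam identifiers, and the idea of a single numeric offset per seam are all hypothetical illustrations, not taken from the description above.

```python
# Hypothetical sketch: capture one stored adjustment per seam during a
# guided calibration pass, then reuse those settings for all later frames.

def calibrate_seams(seam_ids, get_user_offset):
    """For each seam, display a calibration image spanning that seam, let
    the user adjust (e.g. via a knob coupled to the controller) until the
    image appears uniform, and store the resulting setting."""
    settings = {}
    for seam_id in seam_ids:
        # get_user_offset is assumed to block until the user confirms
        # uniformity, then return the chosen adjustment for that seam.
        settings[seam_id] = get_user_offset(seam_id)
    return settings

# Example: the user settled on shifting the corrected band 3 columns left
# at the left seam and 2 columns right at the right seam (invented values).
settings = calibrate_seams(["left", "right"],
                           lambda seam: {"left": -3, "right": 2}[seam])
```

Once stored, such settings could be applied to every subsequent frame without consulting the instantaneous pupil position, consistent with the peripheral-vision tolerance noted above.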
In some embodiments a pupil sensor can be deployed to dynamically detect eyeball rotation (e.g. as a function of deviation from a predetermined rotation center). Based on the detected eyeball rotation, the controller can determine the subset of pixels to be intensity reduced and make appropriate adjustments, providing full-field uniformity optimization for each instantaneous position of the pupil.
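As an illustration only, the dependence of the doubly-illuminated subset on pupil position might be modeled as a band of pixel columns that slides across the fixed overlap region as the pupil moves; the linear shift model, names, and parameter values below are assumptions for the sketch, not part of the described display.

```python
# Hypothetical model: within the fixed overlap region, the columns lit by
# both projectors form a band whose center shifts with lateral pupil offset.

def selected_pixels(overlap, band_width, pupil_offset_mm, gain=2.0):
    """overlap: (start, end) pixel columns of the fixed overlap region.
    Returns the columns assumed to be illuminated by both projectors for a
    pupil offset (mm from eye-box center); gain converts mm to columns."""
    start, end = overlap
    center = (start + end) // 2 + int(round(gain * pupil_offset_mm))
    lo = max(start, center - band_width // 2)   # clamp to the overlap region
    hi = min(end, lo + band_width)
    return list(range(lo, hi))

# Centered pupil: band sits mid-overlap; offset pupil: band slides sideways.
centered = selected_pixels((100, 120), 6, 0.0)
shifted = selected_pixels((100, 120), 6, 5.0)
```

Under this toy model, the controller would recompute `selected_pixels` from each pupil-sensor reading and reduce intensity only for the returned columns.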
In some embodiments, the display can include a separate waveguide for each image projector arrangement. In a particularly preferred embodiment, the display includes three projector arrangements, and three corresponding waveguides, as shown in
This embodiment can be further extended to multiple light guide panels encompassing any desired angle around the observer, and optionally replicated in two dimensions to provide an overall concave display, which could be extended to form a viewing dome or the like.
Projector arrangements 24, 26 project images at two different angles into LOE 28. The light from both projector arrangements is first reflected by facets 30 (thereby expanding aperture in one dimension, e.g. vertically) and subsequently reflected by facets 32 outward toward the observer while simultaneously expanding the aperture in the other dimension, e.g. horizontally. Each projector arrangement generates a partial image, which is then coupled-out to the viewer such that the viewer sees a combined image. Note that the region of overlap between the partial images may be a side-by-side horizontal arrangement as shown in
Projector arrangement 38 is oriented to reflect primarily from facets 32, while projector arrangement 40 is oriented for reflection primarily from facets 30. Light from both projector arrangements 38 and 40 experiences some back-and-forth reflection between the perpendicular sets of facets 30, 32, causing aperture expansion in both the vertical and horizontal dimensions.
Each projector assembly 5 includes at least one image projector arrangement 20, and at least one LOE 28 having a pair of parallel external surfaces. Image projector arrangement 20 is configured to generate and project a partial image and is deployed so as to introduce the partial image into LOE 28. LOE 28 is configured to propagate the partial image within the LOE by internal reflection from the pair of parallel external surfaces. In some embodiments, each projector assembly includes a coupling-out arrangement 7 associated with LOE 28 and configured for coupling-out the partial image from the LOE towards the eye of the viewer.
In some embodiments, the LOEs 28 of respective projector assemblies are deployed in overlapping relation with one another such that each projector assembly projects a respective partial image corresponding to a respective part of the image to be displayed to the viewer. The respective parts of the image have partial overlap so that the two or more projector assemblies cooperate to display the image to the viewer.
Controller 74 is associated with the image projector arrangements of each projector assembly. Controller 74 includes at least one processor 76 associated with a memory 78. Processor 76, in combination with associated memory 78, is configured to execute one or more functional modules stored in memory 78 for controlling display 70, including, e.g. reducing a pixel intensity of selected pixels projected by at least one image projector arrangement, the selected pixels being in a region of partial overlap between parts of the image, so as to enhance the perceived uniformity of the image displayed to the viewer.
In some embodiments, the controller may be configured to vary the pixel intensities of selected pixels in the region of overlap taking into account any variance in the projector arrangements' pixel intensities projected across the field and the viewer's pupil position within the eye-box.
In some embodiments, the controller may be configured to gradually reduce the intensity of the selected pixels projected by one projector arrangement across the region of partial overlap, and to gradually increase the intensity of the selected pixels projected by the second projector arrangement across the region of partial overlap.
In some embodiments, controller 74 may be coupled to a user input device (not shown) configured for providing user input to controller 74, for example as described above with reference to
In some embodiments, display 70 further includes a pupil sensor 72 configured to detect a current pupil position of the viewer, and to update the controller 74 with data indicative of current pupil position.
In some embodiments, controller 74 is configured to determine a subset of common pixels between the partial images, based on the data obtained from the pupil sensor or based on a user input, for which image illumination is arriving at the pupil from two projectors simultaneously, and to implement intensity reduction for that subset of pixels. In some embodiments, the controller is further configured to vary the selection of the subset of common pixels in response to an overlap region adjustment input. The overlap region adjustment input can be derived from the pupil sensor, or from manual user input.
In some embodiments, the controller can be configured to obtain calibration data in cooperation with pupil sensor 72 and store the obtained calibration data in memory 78, and determine the appropriate overlap region adjustment for every pupil position based on the stored calibration data.
At step 86, the pupil sensor detects the pupil position of the viewer's eye.
At step 88, at each different pupil position, determine a subset of pixels in the overlapping region of the image displayed to the viewer.
At step 90, the intensity of pixels within the determined subset is reduced so as to enhance the uniformity of the image displayed to the eye of the viewer. This intensity reduction is typically performed by modifying the image data sent to the projector, reducing the pixel intensity values for the relevant subset of pixels which are sent to both projectors. For example, a pixel with RGB values of (200,80,168) from the region of perceived overlap could be sent to both projectors as if the pixel data were a dimmer pixel of the same color, such as (100,40,84), assuming an ideal linear response of the projector. In practice, the correction may need to be calibrated according to the specific hardware properties of the projector assemblies. Additionally, as described above, the output intensity of the different projector assemblies is typically not uniform across the field, and the intensity correction should preferably take these non-uniformities into account.
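The halving arithmetic above can be sketched as follows, assuming the ideal linear projector response mentioned; the frame representation and function name are illustrative only.

```python
# Sketch of step 90 under an assumed ideal linear projector response:
# pixels sent to both projectors get their RGB values halved so the two
# contributions sum to the intensity of a singly-projected pixel.

def halve_shared_pixels(frame, shared):
    """frame: dict mapping (x, y) -> (r, g, b) values 0-255.
    shared: set of coordinates in the doubly-illuminated subset of the
    overlap region. Returns a corrected copy of the frame."""
    out = dict(frame)
    for xy in shared:
        r, g, b = out[xy]
        out[xy] = (r // 2, g // 2, b // 2)  # 50% contributed by each projector
    return out

frame = {(0, 0): (200, 80, 168), (1, 0): (50, 50, 50)}
corrected = halve_shared_pixels(frame, {(0, 0)})
# (200,80,168) becomes (100,40,84), matching the example above; the pixel
# outside the shared subset is left untouched.
```

A real implementation would replace the naive halving with a calibrated, per-projector correction curve, as the text notes.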
Although the intensity reduction profile has been illustrated herein as a step function, with 50% intensity being contributed by each projector in the region of perceived overlap, it should be noted that the subdivision of intensity between the two projectors need not be equal for any given pixel, and that the smoothness of the resulting image will typically be greatly enhanced by use of a linear tapering, or an otherwise smoothed transition profile.
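A linear taper of this kind can be sketched as complementary per-column weights; the discretization into one weight per overlap column is an illustrative assumption.

```python
# Sketch of a linearly tapered transition profile across the overlap region:
# projector A's weight ramps from 1 to 0 over the N overlap columns while
# projector B's ramps from 0 to 1, so the weights always sum to 1 and the
# seam is smoothed rather than a 50/50 step.

def taper_weights(n_columns):
    """Return per-column (weights_a, weights_b) across the overlap region."""
    weights_a = [1.0 - i / (n_columns - 1) for i in range(n_columns)]
    weights_b = [1.0 - w for w in weights_a]  # complementary ramp
    return weights_a, weights_b

wa, wb = taper_weights(5)
# wa = [1.0, 0.75, 0.5, 0.25, 0.0]; wb = [0.0, 0.25, 0.5, 0.75, 1.0]
```

Each projector would then scale its copy of an overlap-column pixel by its weight, generalizing the equal-split example to a smoothed profile.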
It should be appreciated by those skilled in the art that the displays provided herein may be implemented both in virtual reality and in augmented reality applications (i.e. where virtual display elements are combined with a direct view of the real world).
It will be appreciated that the above descriptions are intended only to serve as examples, and that many other embodiments are possible within the scope of the present invention as defined in the appended claims.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IB2019/054062 | 5/16/2019 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/220386 | 11/21/2019 | WO | A |
Number | Date | Country | |
---|---|---|---|
20210055561 A1 | Feb 2021 | US |
Number | Date | Country | |
---|---|---|---|
62750269 | Oct 2018 | US | |
62672635 | May 2018 | US |