The present disclosure relates generally to a white balance apparatus for an endoscope having a plurality of cameras. Further provided are methods of manufacturing the apparatus and methods of using the same.
An endoscope is a medical device used to image an anatomical site (e.g. an anatomical/body cavity, a hollow organ). Unlike some other medical imaging devices, the endoscope is inserted into the anatomical site (e.g. through small incisions made on the skin of the patient). An endoscope can be employed not only to inspect an anatomical site and organs therein (and diagnose a medical condition in the anatomical site) but also as a visual aid in surgical procedures. Medical procedures involving endoscopy include laparoscopy, arthroscopy, cystoscopy, ureterostomy, and hysterectomy.
White balance (WB) is the process of removing unrealistic color casts created as an artifact by digital imaging units (cameras), so that objects which appear white under different light sources or conditions are rendered white in the captured image/video. The white balance process takes into account, inter alia, the color temperature of the light source, and therefore improves the quality of the acquired image/video under a wide range of lighting conditions and light sources.
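For illustration only, the sketch below shows one simple way per-channel gains may be estimated from an image of a neutral (white) reference and then applied to later frames; the RGB assumption, the function names, and the gray-world-style normalization are introduced here for the example and are not taken from the disclosure.

```python
# Minimal sketch (not the disclosed implementation): estimate per-channel white
# balance gains from a neutral reference frame, then apply them to other frames.
import numpy as np


def estimate_wb_gains(reference_frame: np.ndarray) -> np.ndarray:
    """Gains that map the reference frame's mean color to neutral gray (RGB assumed)."""
    means = reference_frame.reshape(-1, 3).mean(axis=0)   # mean R, G, B over the white target
    return means.mean() / np.maximum(means, 1e-6)         # scale each channel toward the overall mean


def apply_wb(frame: np.ndarray, gains: np.ndarray) -> np.ndarray:
    """Apply previously estimated gains to a new frame."""
    return np.clip(frame * gains, 0.0, 1.0)


# Example: a synthetic white target imaged with a bluish cast.
white_target = np.full((120, 160, 3), 0.8) * np.array([0.90, 1.00, 1.20])
balanced = apply_wb(white_target, estimate_wb_gains(white_target))   # channel means become (nearly) equal
```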
The white balance process is particularly important for endoscopic procedures, where the images and/or video streams obtained from internal organs need to be as clear as possible; color adjustment is therefore critical for proper visualization.
There is thus a need in the art for apparatuses and methods for performing an accurate white balance for the cameras of an endoscope, in particular for endoscope systems having more than one camera, wherein each of the cameras is placed in a different orientation at the distal tip of the endoscope.
Aspects of the disclosure, according to some embodiments thereof, relate to an advantageous white balance apparatus for use with an endoscope having a plurality of cameras at the endoscope distal tip. The disclosed white balance apparatus allows an efficient, accurate, and simultaneous white balance procedure for the plurality of cameras of the endoscope. The disclosed white balance apparatus is advantageous as it is accurate, cost efficient, customizable, and easy to use. It provides means for simultaneous white balance calibration of the plurality of cameras of the endoscope, thereby ensuring that optimal images and videos are obtained during a surgical operation. The various apparatuses disclosed herein may be disposable or reusable and may advantageously be sterilizable.
According to some embodiments, there is provided a white balance apparatus for a distal tip of an endoscope which includes at least two cameras. The white balance apparatus includes a shell at least partially enclosing an internal space, the shell including an opening configured to allow the insertion of the distal tip into the internal space, such that each of the cameras faces a portion of the internal wall of the shell at a predefined distance therefrom, to thereby allow simultaneous white balancing of the at least two cameras.
According to some embodiments, there is provided a system for white balance of at least two cameras located at a tip of an endoscope. The system includes a white balance apparatus having a shell at least partially enclosing an internal space, the shell including an opening configured to allow the insertion of the endoscope tip into the internal space, such that each of said cameras faces a portion of the internal wall of the shell at a predefined distance therefrom; and a main control unit configured to receive image data from each of the cameras while the tip is placed/located within the apparatus, and to execute white balance processing on the images.
According to some embodiments, there is provided a method for performing white balance for images obtained simultaneously from at least two cameras located at an endoscope tip, the method including: inserting the endoscope tip into a white balance apparatus having a shell at least partially enclosing an internal space, the shell including an opening configured to allow the insertion of the endoscope tip into said internal space, such that each of the cameras faces a portion of the internal wall of the shell at a predefined distance therefrom; verifying the positioning of the endoscope tip within the inner space of the apparatus, such that each of the cameras is placed at a predetermined distance from the internal wall of the shell of the apparatus; obtaining one or more images from the cameras; and processing the one or more images to perform white balance using a processor of a main control unit.
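By way of illustration only, the following is a minimal, self-contained sketch of such a workflow; the Camera and WhiteBalanceApparatus classes, their method names, and the gray-world-style gain estimate are assumptions introduced for this sketch and are not part of the disclosed apparatus or method.

```python
# Hypothetical workflow sketch; all class and method names are placeholders.
import numpy as np


class Camera:
    def __init__(self, name: str):
        self.name = name

    def capture_frame(self) -> np.ndarray:
        # Stand-in for a sensor read-out: a slightly color-cast view of the white wall.
        return np.full((120, 160, 3), 0.8) * np.array([0.95, 1.00, 1.10])


class WhiteBalanceApparatus:
    def __init__(self, tip_inserted: bool = True):
        self.tip_inserted = tip_inserted

    def tip_fully_inserted(self) -> bool:
        # Stand-in for the indicator (e.g. marking, rim, groove and/or sensor).
        return self.tip_inserted


def simultaneous_white_balance(cameras, apparatus):
    """Verify positioning, then derive per-camera gains from one reference frame each."""
    if not apparatus.tip_fully_inserted():
        raise RuntimeError("distal tip is not at the predefined distance from the internal wall")
    gains = {}
    for cam in cameras:                                  # each camera faces its own wall portion
        frame = cam.capture_frame()
        means = frame.reshape(-1, 3).mean(axis=0)
        gains[cam.name] = means.mean() / means           # per-channel gains toward neutral gray
    return gains


gains = simultaneous_white_balance(
    [Camera("front"), Camera("side_1"), Camera("side_2")], WhiteBalanceApparatus())
```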
According to some embodiments, there is provided a white balance apparatus for a distal tip of an endoscope, said distal tip including at least two cameras. The white balance apparatus includes an external shell at least partially enclosing an internal space, the external shell including an opening configured to allow the insertion of the distal tip into said internal space, such that each of said at least two cameras faces a portion of an internal shell surface of the internal space at a predefined distance therefrom, to thereby allow simultaneous white balancing of said at least two cameras.
According to some embodiments, the opening further includes an indicator (apparatus indicator) configured to indicate the extent of insertion of the distal tip into the internal space of the apparatus.
According to some embodiments, the indicator may include a marking, a rim, a groove and/or a sensor.
According to some embodiments, the indicator may be configured to facilitate the positioning of the distal tip within the internal space, such that each of said at least two cameras is placed at the predetermined distance from the internal shell surface of the apparatus.
According to some embodiments, external (ambient) light is essentially prevented from entering the internal space when the distal tip is positioned within the internal space.
According to some embodiments, the internal space may have a cylindrical, round, or spherical shape.
According to some embodiments, the apparatus is substantially spherical and comprises a semi-flexible structure.
According to some embodiments, the apparatus may be made of thermoplastic elastomer (TPE) and/or a thermoset elastomer.
According to some embodiments, the apparatus may be made of silicone. According to some embodiments, the silicone may include 40 Shore silicone.
According to some embodiments, the apparatus may be constructed by press molding, injection molding and/or machining.
According to some embodiments, the apparatus may be substantially cylindrical or tube-like and comprises at least partially rigid walls. According to some embodiments, the apparatus may further include an outer shell made of or including glass and/or transparent plastic, and an external coating. According to some embodiments, the external coating is white. According to some embodiments, the external coating includes polytetrafluoroethylene (PTFE). According to some embodiments, the thickness of the shell may be determined based on the light focus distance.
According to some embodiments, the predetermined distance between each of the at least two cameras and the internal shell surface of the apparatus is a working distance of each of the at least two cameras.
According to some embodiments, a distance between an outer window of each of the at least two cameras and the internal shell surface of the apparatus may be in the range of about 1-150 mm.
According to some embodiments, the distance between the outer window of each of the at least two cameras and the internal shell surface of the apparatus may be in the range of about 30-60 mm.
According to some embodiments, each of the at least two cameras may be associated with at least one illumination component.
According to some embodiments, the at least two cameras include a front camera and a first side-camera.
According to some embodiments, the at least two cameras further include a second side-camera, wherein the first side-camera and the second side-camera are positioned on opposite sides of the endoscope tip, and wherein the first side-camera is positioned distally relative to the second side-camera.
According to some embodiments, the at least two cameras may provide at least about 270 degrees horizontal field-of-view (FOV) of a target area within an anatomical cavity into which the endoscope is inserted, after a white balance calibration is performed utilizing the white balance apparatus.
According to some embodiments, the at least one illumination component may be or may include a discrete light source.
According to some embodiments, each of the at least two cameras may include a sensor configured to be associated with a main control unit having a suitable white balance circuit for executing a white balance processing of images obtained by the at least two cameras, while the distal tip of the endoscope is placed within the white balance apparatus.
According to some embodiments, there is provided a system for white balance of at least two cameras located at a distal tip of an endoscope, the system includes: a white balance apparatus including an external shell at least partially enclosing an internal space, the external shell including an opening configured to allow the insertion of the distal tip into said internal space, such that each of said at least two cameras faces a portion of an internal shell surface of the internal space at a predefined distance therefrom; and a main control unit configured to receive image data from each of the at least two cameras while placed within the apparatus, and execute a white balance processing on said images.
According to some embodiments, the system may further include a display configured to display one or more images and/or parameters related to the white balance processing.
According to some embodiments, there is provided a method for performing white balance for images obtained simultaneously from at least two cameras located at a distal tip of an endoscope, the method including one or more of the steps of: inserting the distal tip into a white balance apparatus as disclosed herein; verifying the positioning of the distal tip within the internal space of the apparatus, such that each of the at least two cameras is placed at a predetermined distance from an internal shell surface of the apparatus; obtaining one or more images from the cameras; and processing the one or more images to perform white balance using a processor of a main control unit.
According to some embodiments, the method may further include storing white balance parameters in memory.
According to some embodiments, the white balance may be performed for a predetermined period of time. According to some embodiments, the period of time may be about 1-10 seconds.
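As an illustration only of performing the process over such a time window, the sketch below averages reference frames captured over an assumed window before gains are estimated; the frame rate, the window length, and the synthetic frames are assumptions for this example.

```python
# Illustration only: temporally average reference frames over a short, fixed window
# (assumed frame rate and duration) to reduce sensor noise before estimating gains.
import numpy as np


def averaged_reference(frames_per_second: int = 30, seconds: float = 2.0) -> np.ndarray:
    frames = [
        np.random.normal(loc=0.8, scale=0.02, size=(120, 160, 3))  # stand-in for live capture
        for _ in range(int(frames_per_second * seconds))
    ]
    return np.mean(frames, axis=0)  # averaged frame used as the white reference


reference = averaged_reference()
```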
Certain embodiments of the present disclosure may include some, all, or none of the above advantages. One or more other technical advantages may be readily apparent to those skilled in the art from the figures, descriptions, and claims included herein. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
Aspects of the disclosure may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. Disclosed embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Some embodiments of the disclosure are described herein with reference to the accompanying figures. The description, together with the figures, makes apparent to a person having ordinary skill in the art how some embodiments may be practiced. The figures are for the purpose of illustrative description and no attempt is made to show structural details of an embodiment in more detail than is necessary for a fundamental understanding of the disclosure. For the sake of clarity, some objects depicted in the figures are not to scale.
In the figures:
The principles, uses, and implementations of the teachings herein may be better understood with reference to the accompanying description and figures. Upon perusal of the description and figures present herein, one skilled in the art will be able to implement the teachings herein without undue effort or experimentation. In the figures, same reference numerals refer to same parts throughout.
According to some embodiments, there is provided herein an advantageous white balance apparatus for an endoscope having two or more cameras at a distal end (tip) thereof. The advantageous white balance apparatus allows an efficient and accurate white balance processing, preferably simultaneously, for all cameras of the endoscope, despite differences between the cameras, such as each camera facing a different direction of view and having a different field of view.
Reference is now made to
The handle 104 may include a user control interface 138 configured to allow a user to control endoscope 100 functions. User control interface 138 may be functionally associated with cameras 120 and illumination components 122 via an electronic coupling between shaft 102 and handle 104. According to some embodiments, user control interface 138 may allow a user, for example, to control zoom, focus, multifocal views, record/stop recording, freeze frame functions, etc., of cameras 120 and/or to adjust the light intensity provided by illumination components 122.
Each of cameras 120 may include a sensor, such as a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, and a camera lens (e.g. an extreme wide-angle lens) or a lens assembly. Cameras 120 may be configured to provide a continuous/panoramic/surround field-of-view (FOV), as elaborated on below in the description of
Reference is now made to
As shown in
Main control unit 210 may include a user interface 212 (e.g. buttons and/or knobs, a touch panel, a touch screen) configured to allow a user to operate main control unit 210 and/or may allow control thereof using one or more input devices 214, e.g. an external user control interface connectable thereto such as a keyboard, a mouse, a portable computer, and/or even a mobile computational device e.g. a smartphone or a tablet. According to some embodiments, input devices 214 may include a voice controller. According to some embodiments, main control unit 210 may further be configured to partially or even fully operate at least two cameras 120 and illumination components 122 (shown in
According to some embodiments, endoscope 100 is functionally associated with main control unit 210 via a utility cable 142 (shown in
Monitor 220 is configured to display images and, in particular, to display multifocal stream videos captured by at least two cameras 120, and may be connected to main control unit 210 by a cable (e.g. a video cable) or wirelessly. According to some embodiments, monitor 220 may be configured to display thereon information regarding the operation of endoscope 100, as specified above. According to some embodiments, monitor 220, or a part thereof, may function as a touch screen. According to some such embodiments, the touch screen may be used to operate main control unit 210. According to some embodiments, images/videos from different cameras (from at least two cameras 120), or from different lens assemblies of the different cameras, may be displayed separately (e.g. side-by-side, picture on picture, in an equal aspect ratio, in unequal aspect ratios, in multiple copies of one or more of the video streams, and the like) on monitor 220, and/or may be presented as a single panoramic/surround, optionally multi-focal or 3D, image/video. According to some embodiments, user interface 212 and/or input devices 214 and/or user control interface 138 are configured to allow switching between images/videos corresponding to different fields of view (of different cameras) and/or different fields of view (obtained from different lens assemblies of one or more sensors). For example, according to some embodiments wherein at least two cameras 120 include a front camera 120a, a first side camera 120b, and a second side camera 120c, the switching may include: switching from footage captured by one or more lens assemblies of front camera 120a to footage captured by one or more lens assemblies of first side camera 120b; switching from footage captured by one or more lens assemblies of front camera 120a to footage captured by one or more lens assemblies of second side camera 120c; or switching from panoramic/surround video(s) generated from the footage of all of cameras 120a, 120b, and 120c to footage captured by one of cameras 120a, 120b, or 120c. Cameras 120a, 120b, and 120c are depicted together in
According to some embodiments, main control unit 210 may be associated with a plurality of monitors, such as monitor 220, thereby allowing different videos and images to be displayed on each. For example, main control unit 210 may be associated with four monitors, such as to allow displaying videos from each of cameras 120a, 120b, 120c on three of the monitors, respectively, and a panoramic video (corresponding to the combination of the three videos) on the fourth monitor, which may be wider than the other three. Main control unit 210 may further be used to calibrate one or more of the endoscope cameras, for example to facilitate white balance calibration utilizing the white balance apparatuses disclosed herein.
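As a purely hypothetical illustration of the switching between individual camera feeds and a combined panoramic view described above, the sketch below routes a display among named sources; the DisplayRouter class and the view names are placeholders introduced for this example and do not reflect the disclosed user interface or main control unit.

```python
# Hypothetical sketch only: routing the displayed source among camera feeds.
# DisplayRouter and the view names are placeholders, not part of the disclosure.
from typing import Callable, Dict


class DisplayRouter:
    def __init__(self, sources: Dict[str, Callable[[], str]]):
        self.sources = sources
        self.current = next(iter(sources))   # default to the first registered view

    def switch_to(self, name: str) -> str:
        if name not in self.sources:
            raise KeyError(f"unknown view: {name}")
        self.current = name
        return self.sources[name]()          # fetch the frame/stream for the selected view


router = DisplayRouter({
    "front": lambda: "frame from front camera 120a",
    "side_1": lambda: "frame from first side camera 120b",
    "side_2": lambda: "frame from second side camera 120c",
    "panoramic": lambda: "stitched surround view combining all three cameras",
})
print(router.switch_to("panoramic"))
```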
The field-of-view (FOV) provided by endoscope 100 is the combination of the respective FOVs provided by each of at least two cameras 120. At least two cameras 120 may be configured to provide a continuous and consistent FOV, or at least a continuous and consistent horizontal FOV (HFOV), as explained below.
Reference is now made to
In some embodiments, the combined HFOV is formed by a front HFOV 310a, a first side-HFOV 310b, and a second side-HFOV 310c of front camera 120a, first side-camera 120b, and second side-camera 120c, respectively. Each of HFOVs 310a, 310b, and 310c lies on the xy-plane. HFOV 310a is positioned between HFOVs 310b and 310c and overlaps with each. A first overlap area 320ab corresponds to an area whereon HFOVs 310a and 310b overlap. In other words, first overlap area 320ab is defined by the intersection of the xy-plane with the overlap region (volume) of the FOVs of front camera 120a and first side-camera 120b. Similarly, a second overlap area 320ac corresponds to an area whereon HFOVs 310a and 310c overlap. A first intersection point 330ab is defined as the point in first overlap area 320ab which is closest to front camera 120a. It is noted that first intersection point 330ab also corresponds to the point in first overlap area 320ab which is closest to first side-camera 120b. Similarly, a second intersection point 330ac is defined as the point in second overlap area 320ac which is closest to front camera 120a. It is noted that second intersection point 330ac also corresponds to the point in second overlap area 320ac which is closest to second side-camera 120c.
According to some embodiments, the combined HFOV spans between about 220 degrees to about 270 degrees, between about 240 degrees to about 300 degrees, or between about 240 degrees to about 340 degrees. Each possibility corresponds to separate embodiments. According to some embodiments, the combined HFOV spans at least about 270 degrees. According to some embodiments, for example, each of HFOVs 310a, 310b, and 310c may measure between about 85 degrees to about 120 degrees, between about 90 degrees to about 110 degrees, or between about 95 degrees to about 120 degrees. Each possibility corresponds to separate embodiments.
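As a back-of-the-envelope illustration only, the combined HFOV can be estimated as the sum of the three individual HFOVs minus the two overlap areas described above; the 110-degree HFOVs and 15-degree overlaps below are assumed example values, not values taken from the disclosure.

```python
# Assumed example values (not from the disclosure): three 110-degree HFOVs with
# two 15-degree overlap regions between the front camera and each side camera.
hfov_front = hfov_side_1 = hfov_side_2 = 110.0   # degrees, within the 85-120 degree range above
overlap_front_side_1 = 15.0                      # overlap corresponding to area 320ab (assumed)
overlap_front_side_2 = 15.0                      # overlap corresponding to area 320ac (assumed)

combined_hfov = (hfov_front + hfov_side_1 + hfov_side_2
                 - overlap_front_side_1 - overlap_front_side_2)
print(combined_hfov)   # 300.0 degrees, i.e. within the "at least about 270 degrees" span
```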
According to some embodiments, shaft 102 may measure between about 100 millimeters and about 500 millimeters in length, and shaft body 106 may have a diameter measuring between about 2.5 millimeters and about 15 millimeters. According to some embodiments, front camera 120a may be offset relative to a longitudinal axis A, which centrally extends along the length of shaft 102. According to some embodiments, the distance between second side camera 120c and front surface 146 is greater than the distance between first side camera 120b and front surface 146.
According to some embodiments, front camera 120a may be offset relative to the longitudinal axis A by up to about 0.05 millimeters, up to about 0.1 millimeters, up to about 0.5 millimeters, up to about 1.0 millimeters, up to about 1.5 millimeters, up to about 5.0 millimeters, or up to about 7.0 millimeters. Each possibility corresponds to separate embodiments. According to some embodiments, for example, front camera 120a may be offset relative to the longitudinal axis A by between about 0.05 millimeters to about 0.1 millimeters, about 0.5 millimeters to about 1.5 millimeters, about 1.0 millimeter to about 5.0 millimeters, about 1.5 millimeters to about 5.0 millimeters, or about 1.0 millimeters to about 7.0 millimeters. Each possibility corresponds to separate embodiments. According to some embodiments, first side-camera 120b may be positioned at a distance of up to about 1.0 millimeters, up to about 5.0 millimeters, or up to about 15.0 millimeters from front surface 146. Each possibility corresponds to separate embodiments. According to some embodiments, second side-camera 120c may be positioned at a distance of up to about 1.0 millimeters, up to about 5.0 millimeters, up to about 15.0 millimeters, or up to about 25.0 millimeters from front surface 146, such as to optionally be positioned farther from front surface 146 than first-side-camera 120b. Each possibility corresponds to separate embodiments. According to some embodiments, the positioning of cameras 120 on shaft distal section 112 is selected such as to minimize the space occupied by cameras 120 and reduce the diameter of shaft distal section 112, while affording a continuous and consistent HFOV of at least about 270 degrees.
According to some embodiments, each of cameras 120 is associated with a respective illumination component from illumination components 122, which is configured to illuminate the FOV of the camera. Thus, according to some embodiments, front camera 120a may be associated with a respective front illumination component (not numbered), first side-camera 120b may be associated with a respective first side-illumination component, and second side-camera 120c may be associated with a respective second side-illumination component.
According to some embodiments, not depicted in the figures, cameras 120 include only two cameras, both of which are side cameras with fisheye lenses. In such embodiments, shaft distal section 112 may taper in the distal section, such that the cameras provide a continuous HFOV. According to some embodiments, not depicted in the figures, cameras 120 include only two cameras: a front camera and a side camera.
Reference is now made to
According to some embodiments, each of endoscopes 100, and 400 may be (i) directly maneuvered by a user through the manipulation of handle 104, as well as (ii) indirectly maneuvered, via robotics, e.g. using a robotic arm or other suitable gripping means configured to allow manipulation of handle 104.
Reference is now made to
Reference is now made to
In some embodiments, the distance between each of the cameras and the internal shell surface is equal. In some embodiments, the distance between the cameras and the internal shell is not equal. Illustrated in
According to some embodiments, front camera 552A provides a field of view 554A, side camera 552C provides a field of view 554C, and the first side camera (not shown) provides a field of view 554B. Fields of view 554A, 554B, and 554C correspond, respectively, to front HFOV 310a, first side-HFOV 310b, and second side-HFOV 310c of
Reference is now made to
Reference is now made to
According to some embodiments, the white balance apparatus as disclosed in
In some embodiments, the white balance apparatus may have a semi-flexible or flexible structure.
In some embodiments, the white balance apparatus may be constructed of various suitable materials, such as, for example, thermoplastic elastomer (TPE) and/or a thermoset elastomer. In some embodiments, the apparatus may be constructed of silicone, or may contain at least some silicone.
In some exemplary embodiments, the silicone may be 40 Shore silicone, or the like.
In some embodiments, the white balance apparatus may be constructed by press molding, using a suitable mold configured to form an outer shell with an opening, enclosing an internal space. In some embodiments, the white balance apparatus may be constructed by injection molding. In some embodiments, the white balance apparatus may be constructed using machine processing (machining). In some embodiments, the white balance apparatus may be manufactured by 3D-printing.
Reference is now made to
Reference is now made to
According to some embodiments, the correct positioning of the endoscope tip within the white balance apparatus is dictated by a distance of each of the plurality of cameras on the endoscope tip from an internal surface of the apparatus. In some embodiments, the distance is determined based on the working distance of each of the plurality of cameras. Thus, according to some exemplary embodiments, the distance between an outer window/cover/lens of a camera and the internal surface of the white balance apparatus is in the range of about 1-150 millimeters, or any subranges thereof. In some embodiments, the distance is in the range of about 15-125 millimeters. In some embodiments, the distance is in the range of about 1-100 millimeters. In some embodiments, the distance is in the range of about 2-50 millimeters. In some embodiments, the distance is in the range of about 1-30 millimeters. In some embodiments, the distance between each of the cameras and the internal space/surface is equal. In some embodiments, the distance between the cameras and the internal surface/space is not equal.
According to some embodiments, the apparatus is disposable. In some embodiments, the apparatus is reusable. In some embodiments, the apparatus is sterilizable. In some embodiments, the apparatus is autoclavable.
In some embodiments, the white balance processing is performed by a processing unit, for example, a processing unit located in a main control system (either as part of a main processing unit, as a circuit of the processing unit, or as a dedicated processing unit).
In some embodiments, the white balance processing includes obtaining images/video streams from a plurality of cameras within a distal tip of an endoscope, while the distal tip is correctly placed/situated within the white balance apparatus, whose internal surface is used as a color reference, typically white. The obtained reference images are then transferred/conveyed to the processing unit for further processing. The white balance processing of the obtained reference images is carried out, for example, by assigning values, parameters, or any suitable coefficients to the obtained images, such that when color images are to be provided (for example, when the endoscope is inserted into a body cavity), suitable adjustment of intensity, color temperature, hue, and the like, may be performed to adjust one or more or each of the colors (depending on the optical sensors used, for example, RGB: red (R), green (G), and blue (B); CMYK: cyan (C), magenta (M), yellow (Y), and black (K); and/or YCMG: yellow (Ye), cyan (Cy), magenta (Mg), and green (G)), and the adjustment is applied to the images/video signals generated by the cameras. In some embodiments, the white balance process (i.e., obtaining images from the cameras at the distal tip of the endoscope while the tip is correctly positioned within the white balance apparatus) may be performed under different lighting conditions (for example, as dictated by the intensity of light generated by the illumination components). In some embodiments, the white balance process may be performed for a desired length of time, which may be predetermined or determined according to the progress of the processing. The time length may be, for example, in the range of about 0.2-20 seconds, or any subranges thereof.
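The sketch below illustrates one possible form of such processing under stated assumptions: reference frames captured inside the apparatus are reduced to per-channel correction coefficients, optionally one set per camera and per illumination level, and those coefficients are later applied to live frames. The RGB sensor assumption, the dictionary layout, and the two illumination levels are examples only, not the disclosed implementation.

```python
# Illustrative sketch under assumptions (RGB sensor, two illumination levels);
# not the disclosed white balance circuit or algorithm.
import numpy as np


def coefficients_from_reference(reference: np.ndarray) -> np.ndarray:
    """Reduce a reference frame of the white internal surface to per-channel gains."""
    means = reference.reshape(-1, 3).mean(axis=0)
    return means.mean() / np.maximum(means, 1e-6)


# Synthetic reference frames standing in for images captured inside the apparatus,
# one per assumed illumination level of the front camera:
calibration = {
    ("front", "low_light"): coefficients_from_reference(
        np.full((64, 64, 3), 0.3) * np.array([0.90, 1.00, 1.15])),
    ("front", "high_light"): coefficients_from_reference(
        np.full((64, 64, 3), 0.8) * np.array([0.92, 1.00, 1.10])),
}


def correct_live_frame(frame: np.ndarray, camera: str, light_level: str) -> np.ndarray:
    """Apply the stored coefficients to a frame acquired during the procedure."""
    return np.clip(frame * calibration[(camera, light_level)], 0.0, 1.0)


balanced = correct_live_frame(np.random.rand(64, 64, 3), "front", "high_light")
```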
According to some embodiments, there is provided a method for performing white balance of an endoscope having a plurality of cameras. In some embodiments, the method is for performing white balance for images obtained simultaneously from at least two cameras located at a distal tip of an endoscope, the method includes: inserting the distal tip into the white balance apparatus as disclosed herein; verifying the positioning of the distal tip within the internal space of the apparatus, such that each of the cameras is placed at a predetermined distance from the internal shell surface of the internal space of the white balance apparatus; obtaining one or more images from the cameras; and processing the one or more images to perform white balance using a processor on a main control unit.
According to some embodiments, the method may include further data manipulation, calculation or processing. In some embodiments, the method may further include determining the white balance parameters or values (for each of the cameras and/or for a combination of cameras) and optionally storing at least some of the parameters in memory. According to some embodiments, the white balance process may be performed for a suitable period of time, such as, in the range of about 1-10 seconds, 2-5 seconds, or the like.
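As a minimal illustration of storing such parameters, the sketch below keeps per-camera gains in a JSON file; the gain representation, the file name, and the file format are assumptions for this example and are not specified by the disclosure.

```python
# Assumed persistence format (JSON, per-camera RGB gains); illustration only.
import json

gains_by_camera = {
    "front": [1.05, 1.00, 0.93],
    "side_1": [1.07, 1.00, 0.90],
    "side_2": [1.02, 1.00, 0.95],
}

with open("white_balance_parameters.json", "w") as fh:
    json.dump(gains_by_camera, fh, indent=2)      # store after the white balance process

with open("white_balance_parameters.json") as fh:
    restored = json.load(fh)                      # reload, e.g. before a later procedure
```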
Reference is now made to
According to some embodiments, the method may further include presenting or displaying the image(s) obtained by the cameras during the white balance process and/or information/data/parameter related thereto.
In the description and claims of the application, the words “include” and “have”, and forms thereof, are not limited to members in a list with which the words may be associated.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In case of conflict, the patent specification, including definitions, governs. As used herein, the indefinite articles “a” and “an” mean “at least one” or “one or more” unless the context clearly dictates otherwise.
Unless specifically stated otherwise, as apparent from the disclosure, it is appreciated that, according to some embodiments, terms such as “processing”, “computing”, “calculating”, “determining”, “estimating”, “assessing”, “gauging” or the like, may refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data, represented as physical (e.g. electronic) quantities within the computing system's registers and/or memories, into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
Embodiments of the present disclosure may include apparatuses for performing the operations herein. The apparatuses may be specially constructed for the desired purposes or may include a general-purpose computer(s) selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus. In some embodiments, a computer of the apparatuses may include FPGAs, microcontrollers, DSPs, and video ICs.
The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method(s). The desired structure(s) for a variety of these systems appear from the description below. In addition, embodiments of the present disclosure are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present disclosure as described herein.
As used herein, the term “about” may be used to specify a value of a quantity or parameter (e.g. the length of an element) to within a continuous range of values in the neighborhood of (and including) a given (stated) value. According to some embodiments, “about” may specify the value of a parameter to be between 99% and 101% of the given value. In such embodiments, for example, the statement “the length of the element is equal to about 1 millimeter” is equivalent to the statement “the length of the element is between 0.99 millimeters and 1.01 millimeters”.
As used herein, according to some embodiments, the terms “substantially” and “about” may be interchangeable.
It is appreciated that certain features of the disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the disclosure. No feature described in the context of an embodiment is to be considered an essential feature of that embodiment, unless explicitly specified as such.
Although steps of methods according to some embodiments may be described in a specific sequence, methods of the disclosure may include some or all of the described steps carried out in a different order. A method of the disclosure may include a few of the steps described or all of the steps described. No particular step in a disclosed method is to be considered an essential step of that method, unless explicitly specified as such.
Although the disclosure is described in conjunction with specific embodiments thereof, it is evident that numerous alternatives, modifications and variations that are apparent to those skilled in the art may exist. Accordingly, the disclosure embraces all such alternatives, modifications and variations that fall within the scope of the appended claims. It is to be understood that the disclosure is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth herein. Other embodiments may be practiced, and an embodiment may be carried out in various ways.
The phraseology and terminology employed herein are for descriptive purpose and should not be regarded as limiting. Citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the disclosure. Section headings are used herein to ease understanding of the specification and should not be construed as necessarily limiting.
Filing Document: PCT/IL2021/050786; Filing Date: Jun. 28, 2021; Country: WO.
Related Application Number: 63050849; Date: Jul. 2020; Country: US.