Exemplary embodiments of this disclosure relate generally to methods, apparatuses, and computer program products for providing remote fluorophore illumination for eye tracking to minimize undesirable stray light in a field of view of a camera(s).
With standard photonic integrated circuit systems in which the eye illumination and the camera spectral bandwidth may be the same, stray light leaking from waveguides may contaminate an image of the eye, thus reducing the contrast ratio of the eye image.
In view of the foregoing drawbacks, it may be beneficial to provide an efficient and reliable mechanism for improving waveguides, coatings and structures to prevent and/or reduce undesirable stray light in a camera’s field of view.
Exemplary embodiments are described for providing remote phosphor illumination in eye tracking applications to prevent and/or minimize undesirable stray light within a camera’s field of view.
The exemplary embodiments may provide fluorophores such as, for example, Stokes phosphors (e.g., a quantum dot(s) and/or nanocrystal(s), etc.). The Stokes phosphors may be placed at a terminus and at a focus of eye tracking optics (e.g., glint lenses) of glasses (e.g., augmented reality/virtual reality glasses) to move illumination wavelengths out of a user’s vision and to significantly reduce stray light within a camera’s field of view. In some example embodiments, the eye tracking optics may include, but are not limited to, glint lenses which may be utilized to detect glints in a type(s) of eye tracking system(s). Furthermore, by placing the Stokes phosphors at the terminus and at the focus of the eye tracking optics, there may be an increase in the contrast ratio of a glint signal, thereby allowing a faster signal response with lower error incidence.
Some exemplary embodiments may also utilize anti-Stokes phosphors to shift illumination wavelengths to an eye safe region.
In one example embodiment, a device for eye tracking is provided. The device may include at least one camera and one or more illumination sources. The device may further include one or more processors and a memory including computer program code instructions. The memory and computer program code instructions are configured to, with at least one of the processors, cause the device to at least perform operations including detecting illumination comprising a first wavelength emitted from the one or more illumination sources. The illumination may propagate along at least one waveguide to at least one termination node associated with the at least one waveguide. The memory and computer program code are also configured to, with the processor, cause the device to detect the illumination propagating to a remote fluorophore located at the at least one termination node. The memory and computer program code are also configured to, with the processor, cause the device to determine that the remote fluorophore shifted the first wavelength to a second wavelength such that the illumination comprises the second wavelength.
In another example embodiment, a method for eye tracking is provided. The method may include detecting illumination comprising a first wavelength emitted from one or more illumination sources. The illumination may propagate along at least one waveguide to at least one termination node associated with the at least one waveguide. The method may further include detecting the illumination propagating to a remote fluorophore located at the at least one termination node. The method may further include determining that the remote fluorophore shifted the first wavelength to a second wavelength such that the illumination comprises the second wavelength.
In yet another example embodiment, a computer program product for eye tracking is provided. The computer program product includes at least one computer-readable storage medium having computer-executable program code instructions stored therein. The computer-executable program code instructions may include program code instructions configured to detect illumination comprising a first wavelength emitted from one or more illumination sources. The illumination may propagate along at least one waveguide to at least one termination node associated with the at least one waveguide. The computer program product may further include program code instructions configured to detect the illumination propagating to a remote fluorophore located at the at least one termination node. The computer-executable program code instructions may further include program code instructions configured to determine that the remote fluorophore shifted the first wavelength to a second wavelength such that the illumination comprises the second wavelength.
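For purposes of illustration and not of limitation, the example method described above may be summarized, under assumed wavelength values, in the following Python-style sketch. The function names and the 460 nm/980 nm values are hypothetical assumptions and are not the claimed implementation.

```python
# Minimal sketch of the example method: detect source illumination at a first
# wavelength, let it propagate to a remote fluorophore at a waveguide
# termination node, and determine that the fluorophore shifted it to a second
# wavelength. All names and values here are illustrative assumptions.

SOURCE_WAVELENGTH_NM = 460.0    # hypothetical first wavelength (lambda-1)
EMISSION_WAVELENGTH_NM = 980.0  # hypothetical second wavelength (lambda-2)


def detect_source_illumination(measured_nm: float) -> bool:
    """Detect illumination comprising the first wavelength from the source."""
    return abs(measured_nm - SOURCE_WAVELENGTH_NM) < 10.0


def fluorophore_shift(wavelength_nm: float) -> float:
    """Model the remote fluorophore at the termination node shifting the
    first wavelength to the second (Stokes-shifted) wavelength."""
    if detect_source_illumination(wavelength_nm):
        return EMISSION_WAVELENGTH_NM
    return wavelength_nm  # unrecognized light is left unmodeled here


def wavelength_was_shifted(emitted_nm: float) -> bool:
    """Determine that the illumination now comprises the second wavelength."""
    return abs(emitted_nm - EMISSION_WAVELENGTH_NM) < 10.0


if __name__ == "__main__":
    emitted = fluorophore_shift(SOURCE_WAVELENGTH_NM)
    print(f"Emission after fluorophore: {emitted:.0f} nm, "
          f"shift detected: {wavelength_was_shifted(emitted)}")
```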
Additional advantages will be set forth in part in the description which follows or may be learned by practice. The advantages will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive, as claimed.
The summary, as well as the following detailed description, is further understood when read in conjunction with the appended drawings. For the purpose of illustrating the disclosed subject matter, there are shown in the drawings exemplary embodiments of the disclosed subject matter; however, the disclosed subject matter is not limited to the specific methods, compositions, and devices disclosed. In addition, the drawings are not necessarily drawn to scale. In the drawings:
The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like reference numerals refer to like elements throughout. As used herein, the terms “data,” “content,” “information” and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the invention. Moreover, the term “exemplary”, as used herein, is not provided to convey any qualitative assessment, but instead merely to convey an illustration of an example. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the invention.
As defined herein a “computer-readable storage medium,” which refers to a non-transitory, physical or tangible storage medium (e.g., volatile or non-volatile memory device), may be differentiated from a “computer-readable transmission medium,” which refers to an electromagnetic signal.
As referred to herein, glint(s) or glint image(s) may refer to detection of intended light reflected at an angle from a surface of one or more eyes. As referred to herein, a glint signal may be any point-like response from an eye(s) caused by an energy input. Examples of energy inputs may be any form of time-, space-, frequency-, phase-, and/or polarization-modulated light or sound. Additionally, glint signals may result from broad area illumination in which the nature of the field of view of a receiving eye tracking system may allow detection of point-like responses from surface pixels or volume voxels of an eye(s) (e.g., a combination of an eye detection system with desired artifacts on the surfaces/layers of an eye(s) or within the volume of the eye(s)). This combination of illumination and detection fields of view, coupled with desired artifacts on the layers/volumes of an eye(s), may result in point-like responses from an eye(s), for example, glints.
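For purposes of illustration and not of limitation, one simple way to extract point-like responses (e.g., glints) from an eye image is thresholding followed by local-maximum selection. The sketch below uses a hypothetical grayscale image and threshold and is not tied to any particular eye tracking system described herein.

```python
# Illustrative sketch only: find point-like responses ("glints") in a small
# grayscale eye image by thresholding and keeping local maxima. The image,
# threshold, and neighborhood size are hypothetical.

def find_glints(image, threshold=200):
    """Return (row, col, value) for bright pixels that are local maxima."""
    rows, cols = len(image), len(image[0])
    glints = []
    for r in range(rows):
        for c in range(cols):
            v = image[r][c]
            if v < threshold:
                continue
            neighbors = [
                image[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c)
            ]
            if all(v >= n for n in neighbors):
                glints.append((r, c, v))
    return glints


if __name__ == "__main__":
    eye_image = [
        [10, 12, 11, 10],
        [11, 250, 13, 10],   # bright point-like response (a "glint")
        [10, 12, 220, 11],
        [10, 11, 12, 10],
    ]
    print(find_glints(eye_image))
```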
As referred to herein, a fluorophore(s) may be any particle(s) that fluoresces.
It is to be understood that the methods and systems described herein are not limited to specific methods, specific components, or to particular implementations. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
As referred to herein, a fluorophore(s) may be any material that takes in photons at a wavelength 1 (also referred to herein as wavelength λ1) and emits photons at a wavelength 2 (also referred to herein as wavelength λ2), with the conversion (e.g., from wavelength λ1 to wavelength λ2) occurring due to quantum energy level shifts within the material arising from the fluorophore’s physical and/or chemical make-up. In some exemplary embodiments, a fluorophore(s) may be a phosphor, a fluorescent nanocrystal, a fluorescent quantum dot or any other suitable fluorophore(s). The material of a fluorophore(s) may be composed of organic or inorganic compounds.
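As a worked illustration of the wavelength λ1 to wavelength λ2 conversion, the photon energy relation E = hc/λ shows that a shift from a shorter wavelength to a longer wavelength emits a lower-energy photon, with the difference given up within the fluorophore material. The 460 nm and 980 nm values below are examples only.

```python
# Worked example (illustrative values): photon energy E = h*c / wavelength,
# showing that a Stokes shift from lambda-1 to lambda-2 emits lower energy.

H_PLANCK = 6.62607015e-34   # J*s
C_LIGHT = 2.99792458e8      # m/s
EV = 1.602176634e-19        # J per electron-volt


def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given wavelength in nanometers."""
    return H_PLANCK * C_LIGHT / (wavelength_nm * 1e-9) / EV


if __name__ == "__main__":
    lam1, lam2 = 460.0, 980.0   # example lambda-1 and lambda-2
    e1, e2 = photon_energy_ev(lam1), photon_energy_ev(lam2)
    print(f"lambda-1 = {lam1} nm -> {e1:.2f} eV")
    print(f"lambda-2 = {lam2} nm -> {e2:.2f} eV")
    print(f"energy given up inside the fluorophore: {e1 - e2:.2f} eV per photon")
```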
By placing a remote fluorophore(s), such as a Stokes phosphor (e.g., a remote phosphor) or an anti-Stokes phosphor, in the form of a fluorophore(s) (e.g., a quantum dot (QD), a nanocrystal, etc.) at a terminus of waveguides and at a focus of lenses of a head-mounted display (e.g., glasses), the exemplary embodiments may move the transported illumination wavelengths to a waveband outside of human vision and out of band for a camera such as, for example, a near infrared (NIR) camera, which may be utilized for detection of a glint image. The transported bluish/ultraviolet (UV) wavelengths may be invisible to the camera, being out of a spectral range of the camera, and upon striking the remote fluorophore may be converted, at the point of use, to a wavelength (e.g., 980 nm) that is safe for a user’s vision. This may significantly reduce/minimize stray light in a field of view of the camera and may increase a contrast ratio of a glint signal, allowing for a faster response with lower error incidence.
Since the waveguides may be outside of the keep-out zones of lenses, any wavelength in the blue to near infrared band may be utilized by the exemplary embodiments as long as that band is out of the spectral range of the camera. In this regard, blue light wavelengths may be utilized by the exemplary embodiments. In some exemplary embodiments, 780 nanometer (nm) or 840 nm or similar wavelengths generated by illumination sources may be utilized with fluorophores such as, for example, quantum dots to shift a wavelength to 980 nm, as an example, for illumination emission to detect a glint image.
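For purposes of illustration and not of limitation, the wavelength-selection constraint described above may be expressed as a simple band check: the transported illumination wavelength may lie anywhere in the blue to near infrared band provided it falls outside the camera's spectral range. The band edges in the sketch below are assumptions rather than the characteristics of any particular camera.

```python
# Illustrative sketch only: check that a candidate transport wavelength is
# outside a camera's spectral band while remaining in the usable transport
# band. The band edges below are assumptions, not real camera specifications.

CAMERA_BAND_NM = (900.0, 1000.0)     # hypothetical NIR camera passband
TRANSPORT_BAND_NM = (400.0, 900.0)   # hypothetical blue-to-NIR transport band


def in_band(wavelength_nm: float, band: tuple) -> bool:
    low, high = band
    return low <= wavelength_nm <= high


def transport_wavelength_ok(wavelength_nm: float) -> bool:
    """Valid if it can be transported but is invisible to the camera."""
    return in_band(wavelength_nm, TRANSPORT_BAND_NM) and not in_band(
        wavelength_nm, CAMERA_BAND_NM)


if __name__ == "__main__":
    for candidate in (460.0, 780.0, 840.0, 980.0):
        print(f"{candidate:.0f} nm -> transport ok: "
              f"{transport_wavelength_ok(candidate)}")
```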
Some exemplary embodiments may utilize anti-Stokes fluorophores (e.g., anti-Stokes phosphors) which may allow an illumination wavelength to be shifted to any wavelength greater than 1250 nm (e.g., an eye safe region) while still allowing for the illumination wavelength emission for detecting a glint image to be in the 980 nm band that a camera may view without any potential eye safety issues.
A Stokes fluorophore (also referred to herein as a Stokes phosphor) may absorb radiation (e.g., in the form of photons) at a wavelength such as, for example, wavelength λ1 and may emit a lower energy (e.g., longer wavelength) at a wavelength such as, for example, wavelength λ2. This may, for example, be enacted in the material of the Stokes fluorophore by way of a quantum mechanical exchange: an incoming photon (e.g., an excitation source) causes a bound electron in a lower state to rise to a higher energy state, which may have a fast decay time to a lower energy state that may not be a ground state, and as such the material may emit a lower energy (e.g., longer wavelength, such as wavelength λ2) photon.
Some exemplary embodiments may utilize anti-Stokes fluorophores. An anti-Stokes fluorophore may be similar to a Stokes fluorophore in energy states, but the anti-Stokes fluorophore may include a series of subbands or defect bands going from a lower energy state to a higher energy state. Each of the subbands may have a long decay time such that energy within an eye tracking system (e.g., a head-mounted display having an eye tracking camera) may build up by absorbing photons of lower energy at a wavelength such as, for example, a wavelength λ3. In an instance in which electrons attain enough energy to pass into the higher energy state, which may also have a short decay time with a direct path to an energy state lower than where the electrons first started, this electron state may emit a photon at a shorter wavelength such as, for example, wavelength λ2. In some exemplary embodiments, wavelength λ2 may be a desired wavelength for eye tracking systems/applications.
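As a worked illustration of the anti-Stokes (upconversion) case, the sketch below counts how many lower-energy photons at an example wavelength λ3 must be absorbed, through the long-decay subbands, to supply the energy of one emitted photon at wavelength λ2. The 1550 nm and 980 nm values are examples only.

```python
# Illustrative arithmetic only: how many lambda-3 photons (e.g., > 1250 nm)
# are needed, at minimum, to supply the energy of one lambda-2 photon
# (e.g., 980 nm) in an anti-Stokes (upconversion) fluorophore.
import math


def photon_energy_ev(wavelength_nm: float) -> float:
    return 1239.841984 / wavelength_nm   # hc in eV*nm divided by wavelength


if __name__ == "__main__":
    lam2, lam3 = 980.0, 1550.0   # example emission / absorption wavelengths
    e2, e3 = photon_energy_ev(lam2), photon_energy_ev(lam3)
    needed = math.ceil(e2 / e3)
    print(f"one {lam2:.0f} nm photon: {e2:.2f} eV")
    print(f"one {lam3:.0f} nm photon: {e3:.2f} eV")
    print(f"at least {needed} absorbed photon(s) per emitted photon")
```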
As described more fully below, by utilizing Stokes fluorophores and/or anti-Stokes fluorophores, a source illumination such as, for example, light having a wavelength λ1 and/or wavelength λ3 may not be detected by an eye tracking camera because this source illumination may be either filtered out by optical wavelength filters in front of a photodetection surface associated with the eye tracking camera or may be above an absorption spectral band or below the absorption spectral band of detector elements associated with the eye tracking camera. As such, the signal to noise ratio and/or the contrast ratio of the eye tracking camera may be improved due to a lack of ambient noise being present in eye tracking systems that emit and detect source illumination having wavelength λ1 and/or wavelength λ3.
In some examples, the head-mounted display 100 may be implemented in the form of augmented-reality glasses. Accordingly, the waveguide 108 may be at least partially transparent to visible light to allow the user to view a real-world environment through the waveguide 108.
To assemble the head-mounted display 100, the three subprojectors 106A, 106B, and 106C may be initially assembled with each other (e.g., three light sources mounted to a common substrate, three collimating lenses aligned on the three light sources) to form the light projector 106 as a unit. The light projector 106 may include one or more projector fiducial marks 116, which may be used in optically aligning (e.g., positioning, orienting, securing) the light projector 106 with the enclosure 102. In some examples, the enclosure 102 may likewise include one or more frame fiducial marks 118 to assist in the optical alignment of the light projector 106 with the enclosure 102.
Optical alignment of the light projector 106 relative to the enclosure 102 may involve viewing the light projector 106 and/or enclosure 102 during placement of the light projector 106 in or on the enclosure 102 with one or more cameras, which may be used to identify the location and orientation of the projector fiducial mark(s) 116 relative to the location and orientation of the frame fiducial mark(s) 118. The projector fiducial mark(s) 116 on both sides of the enclosure 102 may be used to balance the frame into a computer aided design (CAD)-nominal position. The projector fiducial mark(s) 116 and the frame fiducial mark(s) 118 are each shown in
The alignment cameras 424 may be used during assembly of the head-mounted display 400 to optically align the light projector 406 with the frame 402 and/or to optically align the waveguide 408 with the light projector 406. For example, the alignment cameras 424 may be used to detect the location and/or orientation of a fiducial mark (e.g., the projector fiducial marks 116, the frame fiducial marks 118, etc.), a physical component or feature, a reflective material, etc. In additional examples, the alignment cameras 424 may be used to detect a location and/or orientation of a projected pattern (e.g., the projected pattern 302). This detected information may be used to adjust a position and/or orientation of the light projector 406 relative to the frame 402 or of the waveguide 408 relative to the light projector 406 and/or frame 402.
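For purposes of illustration and not of limitation, aligning a projector fiducial mark to a frame fiducial mark observed by an alignment camera may be reduced to estimating a translation and rotation between corresponding image points. The coordinates in the sketch below are hypothetical camera-pixel positions, and the sketch is not the assembly procedure itself.

```python
# Illustrative sketch only: estimate the translation and rotation that bring
# observed projector fiducial marks onto frame fiducial marks seen by an
# alignment camera. Coordinates are hypothetical camera-pixel positions, and
# the centroid-difference translation is a small-rotation approximation.
import math


def estimate_offset_and_rotation(projector_pts, frame_pts):
    """2D alignment estimate from two corresponding fiducial point pairs."""
    (px1, py1), (px2, py2) = projector_pts
    (fx1, fy1), (fx2, fy2) = frame_pts
    # Rotation from the angle between the two inter-mark vectors.
    ang_p = math.atan2(py2 - py1, px2 - px1)
    ang_f = math.atan2(fy2 - fy1, fx2 - fx1)
    rotation_deg = math.degrees(ang_f - ang_p)
    # Translation from the difference of the centroids.
    dx = (fx1 + fx2) / 2 - (px1 + px2) / 2
    dy = (fy1 + fy2) / 2 - (py1 + py2) / 2
    return dx, dy, rotation_deg


if __name__ == "__main__":
    projector_marks = [(100.0, 50.0), (300.0, 52.0)]
    frame_marks = [(104.0, 60.0), (304.0, 66.0)]
    print(estimate_offset_and_rotation(projector_marks, frame_marks))
```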
As shown in
The frame 402 and the light projector 406 may be substantially aligned. For example, the frame 402 and the light projector 406 may be aligned such that, when viewed by a camera, a projected pattern produced by the light projector 406 and a camera target (e.g., projected pattern 302 and camera target 304 in
One of the cameras 516 may be a forward-facing camera capturing images and/or videos of the environment that a user wearing the HMD 510 may view. The HMD 510 may include an eye tracking system to track the vergence movement of the user wearing the HMD 510. In one exemplary embodiment, the camera(s) 518 may be the eye tracking system. In some exemplary embodiments, the camera(s) 518 may be one camera configured to view at least one eye of a user to capture a glint image(s) (e.g., and/or glint signals). In some other exemplary embodiments, the camera(s) 518 may include multiple cameras viewing each of the eyes of a user to enhance the capture of a glint image(s) (e.g., and/or glint signals). The HMD 510 may include a microphone of the audio device 506 to capture voice input from the user. The augmented reality system 500 may further include a controller 504 comprising a trackpad and one or more buttons. The controller 504 may receive inputs from users and relay the inputs to the computing device 508. The controller 504 may also provide haptic feedback to one or more users. The computing device 508 may be connected to the HMD 510 and the controller 504 through cables or wireless connections. The computing device 508 may control the HMD 510 and the controller 504 to provide the augmented reality content to and receive inputs from one or more users. In some example embodiments, the controller 504 may be a standalone controller or integrated within the HMD 510. The computing device 508 may be a standalone host computer device, an on-board computer device integrated with the HMD 510, a mobile device, or any other hardware platform capable of providing artificial reality content to and receiving inputs from users. In some exemplary embodiments, the HMD 510 may include an artificial reality system/virtual reality system.
Referring now to
Referring now to
The output coupler 41A of
In the example of
Referring to
In some exemplary embodiments, the PIC layer 100A may be embodied within, or associated with, a head-mounted display (e.g., HMD 510) which may include an eye tracking system to track the vergence movement of a user wearing the HMD. In one exemplary embodiment, a camera (e.g., camera(s) 518) may be the eye tracking system. For example, the camera (e.g., camera(s) 518) may track movement and/or gaze of a user’s eyes. Consider, for example, an instance in which the camera is tracking one or more eyes of a user. In this regard, the illumination sources 50A (e.g., LEDs, lasers) may emit light to be directed towards an eye(s) in which the light may be utilized as an eye tracking beam. Consider, for example, that the light emitted by one or more of the illumination sources 50A has a wavelength λ1. In some example embodiments, for purposes of illustration and not of limitation, a wavelength associated with wavelength λ1 may, but need not, be 460 nm. Other suitable examples of wavelength λ1 (e.g., 780 nm, 840 nm) may be possible in some exemplary embodiments. In some examples, one or more of the illumination sources 50A may emit in a blue/ultraviolet region of the spectrum and/or in a near infrared region of the spectrum. In an instance in which anti-Stokes phosphors are utilized, the anti-Stokes phosphors may allow an illumination wavelength to be shifted to any wavelength (e.g., wavelength λ3) greater than 1250 nm (e.g., an eye safe region) while still allowing for the illumination wavelength emission for detecting a glint image to be in the 980 nm band (e.g., wavelength λ2) that a camera may view without any potential eye safety issues. A remote fluorophore (e.g., remote fluorophore 40A) located at a PIC waveguide (e.g., PIC waveguide 30A) may convert the wavelength λ1 to a desired wavelength that may be beneficial for eye tracking, as described more fully below.
For example, the illumination sources 50A, of the source illumination carrier 10A, may be configured to facilitate emission of light into a PIC waveguide such as, for example, PIC waveguide 30A. The light (e.g., an illumination source having wavelength λ1) may travel/propagate to a termination node (e.g., termination node 36A, termination node 35A) of the PIC waveguide. For example, the light may travel/propagate to the termination node 36A. As shown in the cross section 37A, of
In the above example, the remote fluorophore 40A (e.g., a Stokes fluorophore (e.g., a quantum dot)) may absorb radiation (e.g., in the form of photons) at a wavelength such as, for example, wavelength λ1 (e.g., 460 nm) and may emit a lower energy (e.g., longer wavelength) at a wavelength such as, for example, wavelength λ2 (e.g., 980 nm). This may, for example, be enacted in the material of the remote fluorophore (e.g., a Stokes fluorophore) by way of a quantum mechanical exchange: an incoming photon (e.g., an excitation source) causes a bound electron in a lower state to rise to a higher energy state, which may have a fast decay time to a lower energy state that may not be a ground state, and as such the material may emit a lower energy (e.g., longer wavelength) photon. In some example embodiments, the wavelength selectivity associated with an excitation wavelength (e.g., 460 nm, etc.) may be attained by structuring a quantum dot (e.g., resonant coatings), and/or adding compounds that may negate the effects of undesired wavelengths such as, for example, minimizing defects and traps associated with an electronic structure of the quantum dot to negate undesired wavelengths. The size of the quantum dot may determine emission wavelengths such as the desired emission wavelength (e.g., 980 nm). The defects and traps, described above, may be electronic and/or quantum mechanical structures within a material (e.g., a quantum dot). Defects and traps may cause a change in the transition of an excited electron/hole to reach a ground state. In some instances, defects and/or traps may be initially created to alter the time or energy level of an excited electron(s) on its way to a ground (e.g., neutral) state. As an example pertaining to lasers, a stimulated emission may be caused by a trap that is initially put into a quantum mechanical structure of the lasing media by the addition of dopant materials that may perturb the energy states to form a trap(s). The electrons that are excited may get trapped in this state until a threshold is reached, at which point the trap level is released all at once, producing a population inversion that may allow the material to lase. In examples such as fluorescent materials (e.g., a quantum dot), a defect may cause a change in an emission wavelength, with some of the energy going into heat in a matrix (e.g., associated with phonons or long wave photons).
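For purposes of illustration and not of limitation, the size dependence of quantum dot emission noted above is often approximated with a simplified Brus (effective-mass) model, in which quantum confinement raises the emission energy as the dot shrinks. The material parameters in the sketch below are illustrative placeholders and do not correspond to a particular quantum dot chemistry.

```python
# Illustrative sketch of size-dependent quantum dot emission using a simplified
# Brus (effective-mass approximation) model: emission energy rises as the dot
# radius shrinks. Material parameters are hypothetical placeholders.
import math

HBAR = 1.054571817e-34     # J*s
M_E = 9.1093837015e-31     # electron rest mass, kg
EV = 1.602176634e-19       # J per eV

BULK_GAP_EV = 0.8          # hypothetical bulk band gap
EFF_MASS_E = 0.1 * M_E     # hypothetical electron effective mass
EFF_MASS_H = 0.3 * M_E     # hypothetical hole effective mass


def emission_wavelength_nm(radius_nm: float) -> float:
    """Approximate emission wavelength for a dot of the given radius."""
    r = radius_nm * 1e-9
    confinement_j = (HBAR ** 2 * math.pi ** 2 / (2 * r ** 2)) * (
        1 / EFF_MASS_E + 1 / EFF_MASS_H)
    energy_ev = BULK_GAP_EV + confinement_j / EV
    return 1239.841984 / energy_ev


if __name__ == "__main__":
    for radius in (2.0, 3.0, 4.0, 5.0):
        print(f"radius {radius:.1f} nm -> "
              f"~{emission_wavelength_nm(radius):.0f} nm emission")
```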
In response to the remote fluorophore 40A converting/shifting the light from wavelength λ1 (e.g., 460 nm) to wavelength λ2 (e.g., 980 nm), the output coupler 41A may react to the light having wavelength λ2 and may direct the light out of the PIC waveguide (e.g., PIC waveguide 30A) along a termination node emission 42A path normal to a surface of the PIC layer 5A at a termination node (e.g., termination node 36A). The termination node emission 42A may shape the light having wavelength λ2 from the output coupler 41A and may emit the light having wavelength λ2 towards an eye(s) of a user (e.g., a user wearing HMD 510) as an eye tracking beam. In this example, the termination node emission 42A may be associated with light having wavelength λ2 (e.g., 980 nm), whereas the light from one or more illumination sources 50A may be associated with wavelength λ1 (e.g., 460 nm). For purposes of illustration and not of limitation, the camera (e.g., camera(s) 518) associated with the HMD may only be capable of detecting light associated with wavelength λ2 (e.g., an eye safe wavelength). In other words, the light associated with wavelength λ1 emitted by one or more of the illumination sources 50A may be invisible (e.g., undetectable) to the camera. The camera may be unable to detect any light having a wavelength band that is outside of the spectral range of the camera. As such, even in an instance in which stray light having wavelength λ1 may leak from a PIC waveguide (e.g., PIC waveguide 30A), the stray light may be undetectable by the camera because it may be outside of the spectral range of the camera. Since the stray light may be outside of the spectral range of the camera, the stray light may not degrade a signal to noise ratio (SNR) and/or a contrast ratio associated with the camera. Furthermore, as described above, the light having wavelength λ2 that is directed, by the termination node emission 42A, to an eye(s) of a user as an eye tracking beam may be safe for eyes.
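For purposes of illustration and not of limitation, the contrast-ratio benefit described above may be modeled as the glint signal divided by the background seen by the camera, where stray light at the source wavelength contributes to the background only when it falls inside the camera's spectral band. All signal levels and band edges in the sketch below are hypothetical.

```python
# Illustrative sketch only: contrast ratio of a glint as seen by the camera,
# where stray light contributes to the background only if it lies inside the
# camera's spectral band. Signal levels are arbitrary, hypothetical units.

CAMERA_BAND_NM = (900.0, 1000.0)   # hypothetical camera passband


def contrast_ratio(glint_signal, background, stray_signal, stray_wavelength_nm):
    low, high = CAMERA_BAND_NM
    visible_stray = stray_signal if low <= stray_wavelength_nm <= high else 0.0
    return glint_signal / (background + visible_stray)


if __name__ == "__main__":
    # Stray leakage at the camera wavelength (e.g., 980 nm) vs. at the
    # out-of-band source wavelength (e.g., 460 nm).
    print("in-band stray:    ", contrast_ratio(100.0, 5.0, 20.0, 980.0))
    print("out-of-band stray:", contrast_ratio(100.0, 5.0, 20.0, 460.0))
```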
In some alternative exemplary embodiments, the remote fluorophore 40A may be a remote phosphor such as an anti-Stokes phosphor which may allow light emitted from one or more illumination sources 50A at a wavelength λ3 or greater (e.g., greater than 1250 nm) to be shifted by the remote fluorophore 40A, in the PIC waveguide at a termination node, to be in the wavelength λ2 band (e.g., 980 nm) that the camera (e.g., camera(s) 518) may be able to detect. The wavelength λ3 may be in an eye safe region. The remote fluorophore 40A as an anti-Stokes phosphor may be in the PIC waveguide (e.g., PIC waveguide 30A) at a termination node (e.g., termination node 36A, termination node 35A) in a same manner as described above regarding a Stokes phosphor as the remote fluorophore 40A.
The anti-Stokes phosphor may be similar to the Stokes phosphor in energy states, but the anti-Stokes phosphor may include a series of subbands or defect bands going from a lower energy state to a higher energy state. Each of the subbands may have a long decay time such that energy within an eye tracking system (e.g., HMD 510) may build up by absorbing photons of lower energy at wavelength λ3. In an instance in which electrons attain enough energy to pass into the higher energy state, which also may have a short decay time with a direct path to an energy state lower than where the electrons first started, that electron state may emit a photon at wavelength λ2 (e.g., a shorter wavelength) and wavelength λ2 may be a desired wavelength for eye tracking associated with the camera (e.g., camera(s) 518). The illumination (e.g., light) emitted from the illumination sources 50A having wavelength λ1 and wavelength λ3 may not be detectable by the camera since these wavelengths may be outside of the spectral range of the camera (e.g., camera(s) 518). As such, the signal to noise ratio and/or the contrast ratio of the camera (e.g., camera(s) 518) may be improved due to a lack of ambient noise being present in the camera and/or associated with an HMD as an eye tracking system.
At operation 902, a device (e.g., HMD 510) may detect the illumination propagating to a remote fluorophore (e.g., remote fluorophore 40A) located at the termination node. At operation 904, a device (e.g., HMD 510) may determine that the remote fluorophore shifted the first wavelength (e.g., wavelength λ1) to a second wavelength (e.g., wavelength λ2) such that the illumination comprises the second wavelength.
Optionally, at operation 906, a device (e.g., HMD 510) may detect that the illumination comprising the second wavelength is directed out of the termination node and emitted, towards at least one eye of a user, as an eye tracking beam. The illumination comprising the second wavelength may be directed out of the termination node by an output coupler (e.g., output coupler 41A) based on a termination node emission (e.g., termination node emission 42A). The illumination comprising the second wavelength (e.g., 980 nm) may be safe for the at least one eye of the user. The first wavelength (e.g., 460 nm) may be harmful to the at least one eye of the user.
The device (e.g., HMD 510) may include at least one photonics integrated circuit layer (e.g., PIC layer 100A) including a plurality (e.g., an array) of PIC waveguides (e.g., PIC waveguides 25A) configured to transport the illumination including the first wavelength or other illumination comprising a third wavelength (e.g., wavelength λ3). The device (e.g., HMD 510) may determine that the remote fluorophore (e.g., remote fluorophore 40A) shifted the third wavelength (e.g., wavelength λ3 (e.g., greater than 1250 nm)) to the second wavelength (e.g., wavelength λ2 (e.g., 980 nm)) such that the other illumination comprises the second wavelength.
The foregoing description of the embodiments has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the patent rights to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
Some portions of this description describe the embodiments in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times, to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
Embodiments also may relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Embodiments also may relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the patent rights be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the patent rights, which is set forth in the following claims.