This application claims the benefit under 35 USC 119(a) of Indian Patent Application No. 3509/CHE/2014, filed on Jul. 17, 2014, in the Indian Patent Office, and Korean Patent Application No. 10-2015-0073810, filed on May 27, 2015, in the Korean Intellectual Property Office, the entire disclosures of which are incorporated herein by reference for all purposes.
1. Field
The following description relates to a healthcare system, and more particularly, to an imaging system for recording an image of an eye of a user.
2. Description of Related Art
In general, some hand-held optical adaptors include a function of capturing an image of a user's anatomy, for example, the skin, an eye, or an ear. Some hand-held optical adaptors include an interchangeable instrument available for a variety of medical examinations to capture an image. Some optical adaptors are designed to be used with an image capturing device having camera features and functions. An optical adaptor may be attached to an image capturing device by an outer housing facility of the optical adaptor, on a side of which an eye of a user may be placed for examination.
In rural areas, many persons suffer from infections of, for example, an eye, an ear, or the skin. Eye conditions, such as cataracts, may be cured or prevented if they are detected early. Because expensive optical adaptors are unavailable and experts are scarce in rural areas, it is difficult to detect such infections early. However, an imaging system including an optical adaptor attached to an electronic device can capture images of an affected eye of a person using differential transmission holography, optical fluorescence, or an array of lenses capturing reflected light. An optical adaptor attached to a smartphone having a camera lens and a display system captures a low-resolution image, since the optical resolution of the camera lens is low.
Captured images are sent over an existing wireless network to a location remote from the user, such as a hospital or laboratory, where experts use the images for diagnosis and provide the user with necessary preventive measures. This procedure consumes a relatively large amount of time since the images are sent to the remote location for diagnosis, and also incurs an increased standby time until the images are reviewed by the experts. A hand-held processing device, such as a phone, or a remote server may be selected as a processing unit based on image resolution and complexity.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one general aspect, there is provided an imaging system for using an optical device with an electronic device for diagnostic imaging. The imaging system may include a controller configured to capture a series of holograms by powering a light source of the optical device to illuminate an object, wherein light from the light source is collimated onto the object through an aperture of the optical device. The controller may be configured to extract an interference pattern of the object from the series of holograms, wherein the interference pattern is produced by interference between a reflected beam from the object and a reference beam formed by a diffraction mirror of the optical device. The controller may be further configured to record at least one image of the object based on the interference pattern. The imaging system may include a data storage configured to store the at least one image.
When recording the at least one image of the object based on the interference pattern, the imaging system may be configured to: obtain a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern, wherein a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach; and obtain the at least one image of the object by obtaining a Fourier transform of the frequency spectrum, wherein the at least one image is a low-resolution image.
When recording the at least one image of the object based on the interference pattern, the imaging system may be configured to: obtain a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern, wherein a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach; and obtain the at least one image of the object by obtaining an inverse Fourier transform of the frequency spectrum, wherein the at least one image is a high-resolution image.
The light may be partially reflected and partially transmitted by a beam splitter.
The light may be split by the beam splitter into an incident beam and the reference beam, and the incident beam may pass through a phase plate and be reflected from the object.
The reference beam may be formed by the light source.
The optical device may include an adaptor. The adaptor may include a housing facility including a proximal end and a distal end, the housing facility being configured to removably attach to the electronic device at the proximal end. The proximal end may be configured to surround an imaging sensor of the electronic device and the distal end is configured to be fixed on or near the object using a head strap.
The imaging system may be further configured to display the recorded at least one image on the electronic device and authenticate the recorded at least one image by comparing the recorded at least one image to at least one pre-stored image of the object.
In another general aspect, there is provided a method of operating an optical device. The method may include capturing a series of holograms by powering a light source associated with the optical device to illuminate an object, wherein light from the light source is collimated onto the object through an aperture. The method may include extracting an interference pattern of the object from the series of holograms, wherein the interference pattern is produced by interference between a reflected beam from the object and a reference beam formed by a diffraction mirror associated with the optical device. The method may further include recording at least one image of the object based on the interference pattern, and storing the at least one image in a data storage of an electronic device.
The recording of the at least one image may include: obtaining a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern, wherein a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach; and obtaining the at least one image of the object by obtaining a Fourier transform of the frequency spectrum, wherein the at least one image is a low-resolution image.
The recording of the at least one image may include: obtaining a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern, wherein a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach; and obtaining the at least one image of the object by obtaining an inverse Fourier transform of the frequency spectrum, wherein the at least one image is a high-resolution image.
The light may be partially reflected and partially transmitted by a beam splitter.
The light may be split by the beam splitter into an incident beam and the reference beam, and the incident beam may pass through a phase plate and be reflected from the object.
The reference beam may be formed by the light source.
The method may include displaying the recorded at least one image on the electronic device and authenticating the recorded at least one image by comparing the recorded at least one image to at least one pre-stored image of the object.
In another general aspect, an imaging system for recording at least one image of an object includes a housing facility including a light source, an aperture, a diffraction mirror, a head strap, a display screen, a data storage, and a controller. The housing facility may include a proximal end and a distal end, and may be configured to attach to the display screen at the proximal end. The controller may be configured to capture a series of holograms by powering the light source to illuminate the object, wherein light from the light source is collimated onto the object through the aperture. The controller may be configured to extract an interference pattern of the object from the series of holograms, wherein the interference pattern is produced by interference between a reflected beam from the object and a reference beam formed by the diffraction mirror. The controller may be configured to record at least one image of the object based on the interference pattern, and store the at least one image in the data storage.
When recording the at least one image of the object based on the interference pattern, the controller may be further configured to: obtain a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern, wherein a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach; and obtain the at least one image of the object by obtaining a Fourier transform of the frequency spectrum, wherein the at least one image is a low-resolution image.
When recording the at least one image of the object based on the interference pattern, the controller may be further configured to: obtain a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern, wherein a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach; and obtain the at least one image of the object by obtaining an inverse Fourier transform of the frequency spectrum, wherein the at least one image is a high-resolution image.
The light may be partially reflected and partially transmitted by a beam splitter.
The controller may be further configured to display the recorded at least one image on the display screen and authenticate the recorded at least one image by comparing the recorded at least one image to at least one pre-stored image of the object.
In yet another general aspect, an imaging adaptor may include a housing configured to attach to an image sensor of an electronic device, and configured to be fixed to or near an object. The imaging adaptor may be configured to emit light towards the object, capture a series of holograms generated by light reflected from the object, and generate an interference pattern from the series of holograms. The interference pattern may be configured to be processed to record at least one image of the object.
The electronic device may be a smartphone.
The object may be an eye.
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, the same reference numerals refer to the same elements. The drawings may not be to scale, and the relative size, proportions, and depiction of elements in the drawings may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. However, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be apparent to one of ordinary skill in the art. The sequences of operations described herein are merely examples, and are not limited to those set forth herein, but may be changed as will be apparent to one of ordinary skill in the art, with the exception of operations necessarily occurring in a certain order. Also, descriptions of functions and constructions that are well known to one of ordinary skill in the art may be omitted for increased clarity and conciseness.
The features described herein may be embodied in different forms, and are not to be construed as being limited to the examples described herein. Rather, the examples described herein have been provided so that this disclosure will be thorough and complete, and will convey the full scope of the disclosure to one of ordinary skill in the art.
The examples herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting examples that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the examples herein. Also, the various examples described herein are not necessarily mutually exclusive, as some examples can be combined with one or more other examples to form new examples. The term “or” as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the examples herein can be practiced and to further enable those skilled in the art to practice the examples herein. Accordingly, the examples should not be construed as limiting the scope of the examples herein.
The examples herein disclose an imaging system and method for recording an image of an object. The imaging system includes a housing facility, an imaging sensor, a light source configured to make light partially coherent, a phase plate configured to generate a donut-shaped illumination, a beam-splitter cube configured to partially reflect and transmit the light, a diffraction mirror configured to form a reference beam, a head strap configured to fix the optical adapter on the object, and a rechargeable battery pack. The housing facility is attached to the display screen at a proximal end and extends from the proximal end to a distal end.
The method includes powering the light source to emit the light toward the object. The light from the light source is collimated onto the object through a pinhole aperture. Further, the method includes extracting an interference pattern of the object based on the emitted light. The interference pattern is obtained by interference between a reflected beam from the object and the reference beam. Further, the method includes obtaining a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern. A high frequency portion in the frequency spectrum is recovered using an iterative restoration approach. Further, the method includes obtaining the image of the object by obtaining a Fourier transform of the frequency spectrum. Further, the method includes recording the image in the imaging system for diagnosis.
The method and system disclosed herein are simple and robust, enabling a low-cost, hand-held optical adapter capable of imaging an eye, an ear, or a throat noninvasively and diagnosing conditions. The optical adapter also reads microscopic information for verification of its authenticity. The optical adapter fitting includes the housing facility that is attachable to the electronic device, and the housing facility contains a light transmission guide configured to focus the light from a partially coherent light emitting diode (LED) light source and to direct the light onto the object being viewed. The light transmission guide includes several components: a pinhole aperture configured to collimate light from the light source to make the light partially coherent, a beam-splitter cube configured to partially reflect and transmit the light, a diffraction mirror configured to form a reference beam, and a phase plate configured to generate a donut-shaped illumination.
The imaging system disclosed herein is a low-cost, hand-held optical adapter for imaging an eye, an ear, or a throat, and for diagnosing existing or developing conditions using a consumer camera. Using iterative methods, a high-resolution image is reconstructed from a relatively low-resolution image, and the optical adapter is designed to capture a wide-angle scene through a narrow-angle lens system. Captured images are obtained at low cost but are comparable in image quality to those from expensive diagnostic equipment. The object is imaged noninvasively, and no ionizing radiation is used. Also, the proposed method and system may be implemented using existing optical components and do not require extensive setup and instrumentation.
Hereinafter, examples will be described with reference to the accompanying drawings.
In an example, the object 102 refers to, for example, an eye, an ear, a throat, skin, currency, or a document. However, the object 102 is not limited to the aforementioned examples. The object 102 is positioned on the optical adaptor 104 to noninvasively image a subject of the object 102 for the purpose of diagnosing and evaluating the object 102. For example, the optical adaptor 104 may be fixed to an eye corresponding to the object 102 to noninvasively image a retina, for example, the subject, of the eye in order to diagnose and evaluate the eye.
In an example, the optical adaptor 104 may be attached to the electronic device 106 to perform many examinations that are currently performed by standard ophthalmoscopes in order to view the retina of the user, and captures images of the retina of the user.
In an example, the optical adaptor 104 is attached to the electronic device 106 through a snap-fit connection, a sliding connection, or other mechanisms for fixing the optical adaptor 104 to the electronic device 106. The optical adaptor 104 may be removably attached to the electronic device 106, allowing the optical adaptor 104 to be attached when an optical system is in use, and detached when the optical system is not in use. Other types of fixed and removable attachment methods and mechanisms may be used to fix the optical adaptor 104 to the electronic device 106, in addition to the examples provided herein. The optical adaptor 104 is removably attached to the electronic device 106 at a proximal end of the optical adapter 104 and extends from the proximal end to a distal end of the optical adapter 104. The proximal end of the optical adaptor 104 surrounds an imaging sensor in the electronic device 106. The distal end of the optical adaptor 104 is fixed to or positioned on the object 102 to be imaged, diagnosed, and evaluated. The optical adaptor 104 emits the light toward the object 102 to generate an interference pattern of the object 102. The captured image is received by the electronic device 106 using the imaging sensor.
In an example, the electronic device 106 described herein may be, without being limited, for example, a laptop, a desktop computer, a mobile phone, a smartphone, a personal digital assistant (PDA), a tablet, a phablet, a consumer electronic device, or other electronic devices.
The electronic device 106 is attached to the optical adaptor 104. The electronic device 106 may be configured to take a photo of an interference pattern captured at a focus of the imaging sensor. The interference pattern is generated by the optical adaptor 104 by emitting the light toward the object 102. The electronic device 106 may be configured to obtain a frequency spectrum of the object 102 by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern. The electronic device 106 may be configured to obtain an image of the object 102 by obtaining a Fourier transform of the frequency spectrum.
In an example, extracting and processing of the interference pattern may be performed by the electronic device 106 to reconstruct a low-resolution image. In another example, to reconstruct a high-resolution image, the electronic device 106 may be configured to transmit the captured interference pattern to the server 108 in order for the server 108 to extract and process spectrum data. The electronic device 106 includes an interface suitable for directly or indirectly communicating with the server 108 and other various devices.
In an example, the server 108 described herein may be, without being limited, for example, a gateway device, a router, a hub, a computer, or a laptop. The server 108 may be configured to receive the interference pattern from the electronic device 106. The server 108 may be configured to extract the frequency spectrum of the object 102 obtained by the Fresnel transform of the amplitude and the phase retrieved from the interference pattern to record an image of the object 102 in the server 108. The server 108 may be configured to transmit the processed and reconstructed high-resolution image to the electronic device 106 in order to display the reconstructed image and thereby diagnose existing or developing conditions.
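The device-versus-server split described above can be sketched as a simple routing rule. The function and its names below are illustrative assumptions for this sketch, not part of the disclosed system.

```python
def choose_processor(requested_resolution: str) -> str:
    """Hypothetical routing rule: the electronic device reconstructs
    low-resolution images locally, while the interference pattern for a
    high-resolution image is transmitted to the server for processing."""
    if requested_resolution == "low":
        return "electronic_device"
    if requested_resolution == "high":
        return "server"
    raise ValueError("requested_resolution must be 'low' or 'high'")
```

In this sketch the reconstructed high-resolution image would then be returned to the electronic device for display, mirroring the round trip described above.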
Conventional systems may not perform noninvasive imaging of an eye or an ear without ionizing radiation, since the optical resolution of an integrated consumer mobile camera is low and is inapplicable to medical application fields. Unlike the conventional systems, an optical adaptor and an electronic device combined with a backend computation operation replace an expensive high-resolution lens system, without moving parts, by using a lensless holography method to computationally reconstruct an image from a light interference pattern.
The housing facility 105 is removably attached to the electronic device 106 at a proximal end and extends from the proximal end 105a to a distal end 105b. The head strap 212 is provided at the distal end to fix the optical adapter 104 on the object 102.
The light source 202 emits light to illuminate the object 102. In an example, the light source 202 may be an LED or a laser (light amplification by stimulated emission of radiation). For example, an LED may provide adequate brightness and intensity to effectively illuminate the object 102 if focused properly. The light source 202 may be configured to direct the light only to an interior side of the housing facility 105 of the optical adapter. The rechargeable battery pack 214 is used in association with the light source 202 to power the light source 202. The pinhole aperture 204 may be configured to collimate the light from the light source 202 to make the light partially coherent.
The phase plate 206 generates a donut-shaped illumination. The beam-splitter cube 208 partially reflects and partially transmits the light emitted from the light source 202. Herein, the statement that light is partially reflected and partially transmitted, or vice versa, indicates that a portion of the light is reflected and another portion of the light is transmitted.
Further, the partially coherent light is split by the beam-splitter cube 208 into an incident beam and a reference beam. The incident beam is partially transmitted and partially reflected by the beam-splitter cube 208. The incident beam passes through the phase plate 206 and then is reflected from the object 102. The diffraction mirror 210 reflects the incident beam partially reflected by the beam-splitter cube 208. The partially coherent light is reflected back from the object 102 to the imaging sensor of the electronic device 106.
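A donut-shaped illumination of the kind the phase plate 206 produces is commonly generated with a spiral (vortex) phase profile. The numpy sketch below is an illustrative assumption about such a plate, showing that the far-field intensity of a charge-1 vortex beam vanishes on the optical axis, which is what suppresses the central reflection.

```python
import numpy as np

N = 128
x = np.arange(N) - N // 2
X, Y = np.meshgrid(x, x, indexing="ij")

# Gaussian input beam passing through a spiral phase plate exp(i*phi),
# where phi is the azimuthal angle (topological charge 1).
beam = np.exp(-(X**2 + Y**2) / (2 * 20.0**2))
vortex = beam * np.exp(1j * np.arctan2(Y, X))

# Far-field pattern (Fraunhofer approximation via an FFT): a ring ("donut")
# whose on-axis intensity is nearly zero, since the DC component of the
# charge-1 vortex field cancels by symmetry.
intensity = np.abs(np.fft.fftshift(np.fft.fft2(vortex)))**2
on_axis = intensity[N // 2, N // 2]
assert on_axis < 1e-3 * intensity.max()
```

The beam waist and grid size are arbitrary; any smooth amplitude with the spiral phase yields the same on-axis null.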
The following distance notations are used in the optical path calculations below.
Zl denotes a distance between the light source 202 and a center of the beam-splitter cube 208.
Zr denotes a distance between the center of the beam-splitter cube 208 and the diffraction mirror 210.
Zs denotes a distance between a specimen, or object, and the center of the beam-splitter cube 208.
Zd denotes a distance between the imaging sensor and the center of the beam-splitter cube 208.
Based on the above notations, distances traversed by the reference beam and the reflected beam are calculated as follows:
Distance traversed by the reference beam=Zl+2Zr+Zd
Distance traversed by the reflected beam=Zl+2Zs+Zd
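The two path lengths above differ only in the mirror and specimen legs, so their difference is 2(Zr−Zs). A minimal sketch (variable names are illustrative):

```python
def path_lengths(zl, zr, zs, zd):
    """Optical path lengths from the distance notations above.

    Reference beam: source -> splitter -> diffraction mirror -> splitter -> sensor.
    Reflected beam: source -> splitter -> object -> splitter -> sensor.
    """
    reference = zl + 2 * zr + zd
    reflected = zl + 2 * zs + zd
    return reference, reflected

# The optical path difference between the beams is 2*(zs - zr); for the
# beams to interfere, partially coherent light requires this difference
# to stay within the source's coherence length (an assumption of this
# sketch, not stated in the text above).
ref, refl = path_lengths(zl=10.0, zr=15.0, zs=14.0, zd=20.0)
assert refl - ref == 2 * (14.0 - 15.0)
```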
The electronic device 106 captures and processes an image of the object 102 by extracting the interference pattern. An example operation of the electronic device 106 for capturing and processing an image of the object will now be described.
For example, in a scenario in which an ear of a user, such as a patient, is to be imaged, the proximal end 105a of the optical adapter 104 is fixed to a smartphone, surrounding the imaging sensor on the smartphone 106. The distal end 105b of the optical adapter 104 is fixed to the ear of the user through the head strap 212. The LED light source 202 in the optical adapter 104 is activated to emit light beams for illuminating the ear of the user.
The light emitted from the LED light source 202 passes through the pinhole aperture 204 to collimate the light, in order to make the light partially coherent. The partially coherent light is split by the beam-splitter cube 208 into the incident beam and the reference beam. The incident beam is partially transmitted and partially reflected by the beam-splitter cube 208. The incident beam passes through the phase plate 206 and is emitted to illuminate the ear of the user. The phase plate 206 generates the donut-shaped illumination to reduce the reflection from the ear. The incident beam is reflected back from the ear to the imaging sensor of the smartphone 106 along with the reference beam reflected by the diffraction mirror 210. The imaging sensor of the smartphone 106 receives an interference pattern of the ear. That is, the incident beam reflected back from the ear interferes with the reference beam reflected back from the diffraction mirror 210.
The smartphone 106 extracts a frequency spectrum of the ear by obtaining a Fresnel transform of an amplitude and a phase recovered from the interference pattern. When the image is a low-resolution image, the smartphone 106 reconstructs the image of the ear by obtaining a Fourier transform of the frequency spectrum. When the image is a high-resolution image, the smartphone 106 transmits the interference pattern to the server 108 to extract and process spectrum data, in order to record the image of the ear.
Referring to the drawings, the electronic device 106 includes an imaging sensor 302, a controller 304, a communicator 306, a display screen 308, and a data storage 310.
In an example, the imaging sensor 302 described herein may be, without being limited, for example, a charge-coupled device (CCD) imaging sensor or a complementary metal-oxide-semiconductor (CMOS) imaging sensor.
The controller 304 is configured to extract an interference pattern of an object from a series of holograms. The interference pattern is obtained by interference between a reflected beam from the object and a reference beam from a diffraction mirror. The controller 304 may be configured to extract the interference pattern prior to determining a calibration factor in the electronic device 106 by the imaging sensor 302 in a housing facility. The controller 304 may be configured to obtain a frequency spectrum of the object by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern. A high frequency portion in the frequency spectrum may be recovered using an iterative restoration approach. The controller 304 may be configured to obtain an image of the object by obtaining a Fourier transform of the frequency spectrum.
In an example, the image may be a low-resolution image. The controller 304 may be configured to record the image of the object in the data storage 310. The controller 304 may include, for example, a visual dimension system.
The communicator 306 may be configured to transfer captured data to the server 108 in order for the server 108 to extract the frequency spectrum of the interference pattern and process the interference pattern in order to reconstruct the image of the object. Further, the display screen 308 may be configured to display the reconstructed image to diagnose existing or developing conditions. The data storage 310 may be configured to store various images of the object 102. The data storage 310 may be configured to store reconstructed images of the object 102 to diagnose existing or developing conditions. The data storage 310 may be configured to store control instructions to perform various operations in a system.
A beam splitter 508 partially reflects and transmits the partially coherent LED light. The beam splitter 508 splits the partially coherent LED light into an incident beam and a reference beam. The reference beam is marked with a notation “A”. A diffraction mirror 510 reflects the reference beam received from the beam splitter 508. The transmitted incident beam “B” passes through a phase plate 506 and is then emitted toward the eye to be studied. The incident beam “B” passes through the phase plate 506 to generate a donut-shaped illumination, in order to avoid pupil reflections of the eye. An object beam marked with a notation “C” and reflected from the eye or retina interferes with the reflected reference beam “A” from the diffraction mirror 510, thereby generating an interference pattern.
The interference pattern is collected by an imaging sensor (not shown) of an electronic device and transmitted to a controller 304 included in the electronic device or a server (not shown). A frequency spectrum of the object is obtained by a Fresnel transform of an amplitude and a phase of the interference pattern. An image of the object is reconstructed by obtaining a Fourier transform of the frequency spectrum of the object to display the reconstructed image on a display screen (not shown) to diagnose existing or developing conditions.
Hereinafter, a process of reconstructing a low-resolution image from an interference pattern in an electronic device and a process of reconstructing a high-resolution image in a server will be described.
When an object pattern on a CCD imaging sensor of an electronic device is s(u, v) and a reference beam pattern is r(u, v), an interference pattern e(u, v) between the object pattern and the reference beam pattern is expressed by Equation 1.
e(u,v)=|s(u,v)|2+|r(u,v)|2+s(u,v)r*(u,v)+s*(u,v)r(u,v) [Equation 1]
In Equation 1, the reference beam pattern is given by Equation 2.
r(u,v)=r0 exp(j2πu sin θ/λ) [Equation 2]
In Equation 2, r0 denotes a known constant amplitude, λ denotes a wavelength of the light used, and θ denotes an angle of the reference beam, such that θmax≈λ/(2Δu) for a sampling interval Δu. The terms |s(u,v)|2 and |r(u,v)|2 denote DC terms, while s*(u,v)r(u,v) is a twin image. An object complex field s(u, v) is reconstructed from e(u, v) by suppressing the DC terms and the twin image. Equation 3 is obtained using a Bayesian framework to minimize a cost function.
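As a sanity check on Equation 1, the recorded hologram intensity can be simulated with numpy. The synthetic object field and the off-axis plane-wave reference below are illustrative assumptions; the point is that the four terms of Equation 1 sum exactly to the sensor intensity |s+r|2.

```python
import numpy as np

N = 64
u = np.arange(N)
U, V = np.meshgrid(u, u, indexing="ij")

# Synthetic object field s(u, v): arbitrary amplitude and phase (illustrative).
rng = np.random.default_rng(0)
s = rng.random((N, N)) * np.exp(1j * rng.uniform(0, 2 * np.pi, (N, N)))

# Off-axis plane-wave reference r(u, v): constant amplitude r0, tilt angle theta.
r0, wavelength, theta, du = 1.0, 0.5e-6, 0.01, 1.0e-6
r = r0 * np.exp(1j * 2 * np.pi * np.sin(theta) * U * du / wavelength)

# Equation 1: two DC terms, the cross term s r*, and the twin-image term s* r.
e = np.abs(s)**2 + np.abs(r)**2 + s * np.conj(r) + np.conj(s) * r

# The four terms together equal the intensity |s + r|^2 the sensor records.
assert np.allclose(e, np.abs(s + r)**2)
```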
J(s(u,v)|e(u,v))=½∥e(u,v)−(|s(u,v)|2+|r(u,v)|2+s(u,v)r*(u,v)+s*(u,v)r(u,v))∥22+λJ(s(u,v)) [Equation 3]
In Equation 3, the cost function to be minimized is J(s(u,v)|e(u,v)), and prior knowledge on the complex spectrum to be estimated is given by J(s(u,v)). The parameter λ used here denotes a tradeoff parameter and is not to be confused with the wavelength of the light. The prior knowledge is defined as Equation 4.
J(s(u,v))=∥∇s(u,v)∥1 [Equation 4]
An iterative solution to estimate s(u, v) is obtained using a simple gradient descent, as expressed by Equation 5.
ŝ(u,v)n+1=ŝ(u,v)n−α∇J(ŝ(u,v)n|e(u,v)) [Equation 5]
In Equation 5, ∇J(ŝ(u,v)n|e(u, v)) denotes a gradient of the cost function when no statistical prior knowledge is introduced. The gradient is expressed by Equation 6.
∇J(s(u,v)|e(u,v))=−[e(u,v)−(|s(u,v)|2+|r(u,v)|2+s(u,v)r*(u,v)+s*(u,v)r(u,v))]×(s(u,v)+r(u,v)) [Equation 6]
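Equations 5 and 6 amount to gradient descent on the data-fidelity term. The following numpy sketch implements them under the assumption of a known reference field; the step size, iteration count, and synthetic data are arbitrary choices for illustration.

```python
import numpy as np

def grad_J(s_hat, e, r):
    """Gradient of the cost with no statistical prior (Equation 6)."""
    model = (np.abs(s_hat)**2 + np.abs(r)**2
             + s_hat * np.conj(r) + np.conj(s_hat) * r)
    return -(e - model) * (s_hat + r)

def reconstruct(e, r, alpha=0.02, n_iters=100):
    """Iterative estimate of the object field s(u, v) (Equation 5)."""
    s_hat = np.zeros_like(r)
    for _ in range(n_iters):
        s_hat = s_hat - alpha * grad_J(s_hat, e, r)
    return s_hat

# Synthetic check: a weak object field over a unit reference beam.
rng = np.random.default_rng(1)
r = np.ones((16, 16), dtype=complex)
s_true = 0.1 * np.exp(1j * rng.uniform(0, 2 * np.pi, (16, 16)))
e = np.abs(s_true + r)**2
s_hat = reconstruct(e, r)
cost0 = 0.5 * np.linalg.norm(e - np.abs(r)**2)**2         # cost at s_hat = 0
cost = 0.5 * np.linalg.norm(e - np.abs(s_hat + r)**2)**2  # cost after descent
assert cost < cost0
```

Note that Equation 6 is the gradient of Equation 3's data term with respect to the conjugate field, so a small enough step size α guarantees the cost decreases; the prior term of Equation 4 is omitted here for brevity.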
A constraint or filter h(u, v) is added as a convolution, as expressed by Equation 7.
ŝ(u,v)newn+1=ŝ(u,v)n+1⊛h(u,v) [Equation 7]
In Equation 7, the filter h(u, v) is similar to a low pass filter, and a spread of the filter h(u, v) is limited by the estimated bandwidth B. At a subsequent iteration, ŝ(u,v)newn+1 is used as the new estimate. A value of α is directly selected or estimated using a line-search algorithm. Once ŝ(u,v) is estimated, an image of the specimen is obtained by back-propagating the complex function through convolution with a Fresnel impulse response, as expressed by Equation 8.
In Equation 8, Δx and Δu denote the sampling intervals in the imaged space and the inverse space of the imaged space, respectively. The sampling intervals are related by the magnification, as expressed by Equation 9.
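The back-propagation step can be sketched using the transfer-function form of the Fresnel kernel, applied in the Fourier domain, which is numerically equivalent to convolution with a Fresnel impulse response. The function name and the parameter values used in the sketch are illustrative assumptions, not taken from the description.

```python
import numpy as np

def fresnel_backpropagate(s_hat, wavelength, z, du):
    """Back-propagate a complex field s_hat(u,v) over a distance z
    by multiplying its spectrum with the Fresnel transfer function.
    'du' is the sampling interval in the field plane (illustrative name)."""
    n0, n1 = s_hat.shape
    fu = np.fft.fftfreq(n0, d=du)                # spatial frequencies along u
    fv = np.fft.fftfreq(n1, d=du)                # spatial frequencies along v
    FU, FV = np.meshgrid(fu, fv, indexing="ij")
    # Fresnel transfer function: H = exp(-j*pi*lambda*z*(fu^2 + fv^2))
    H = np.exp(-1j * np.pi * wavelength * z * (FU**2 + FV**2))
    return np.fft.ifft2(np.fft.fft2(s_hat) * H)
```

Because |H| = 1, propagating by z and then by −z recovers the original field, which gives a simple self-consistency check for the sketch.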
In operation 1006, the method includes extracting an interference pattern of the object 102 from the series of holograms. The partially coherent light from the light source 202 is split by the beam-splitter cube 208 into an incident beam and a reference beam. The incident beam passes through the phase plate 206 and is reflected from a surface of the object 102. The incident beam is partially transmitted and partially reflected by the beam-splitter cube 208. The interference pattern is obtained by interference between the beam reflected from the object 102 and the reference beam. The controller 304 extracts the interference pattern of the object 102 from the series of holograms. Further, the controller 304 extracts the interference pattern prior to determining a calibration factor in the electronic device 106. For example, a camera of a smartphone may receive an interference pattern of the skin, an eye, or another object 102, by interference between an incident beam reflected back from the object 102 and a reference beam reflected back from the diffraction mirror 210.
In operation 1008, a frequency spectrum of the object 102 is obtained by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern. More specifically, a high frequency portion in the frequency spectrum is recovered using an iterative restoration approach, and the controller 304 obtains a frequency spectrum of the object 102 by obtaining a Fresnel transform of an amplitude and a phase retrieved from the interference pattern. For example, the smartphone 106 may extract the frequency spectrum of the object 102 by obtaining the Fresnel transform of the amplitude and the phase recovered from the interference pattern.
In operation 1010, an image of the object 102 is obtained by obtaining a Fourier transform of the frequency spectrum. In an example, the image may be a low-resolution image. In another example, the image may be a high-resolution image. More specifically, when reconstructing the low-resolution image, the controller 304 obtains the image of the object 102 by obtaining the Fourier transform of the frequency spectrum. When reconstructing the high-resolution image, the controller 304 obtains the image of the object 102 by obtaining an inverse Fourier transform of the frequency spectrum. For example, the smartphone 106 reconstructs the image of the object 102 by the Fourier transform of the frequency spectrum when the image is a low-resolution image, and transmits the interference pattern to the server 108 for extracting spectrum data when the image is a high-resolution image.
In operation 1012, the reconstructed image of the object 102 is recorded in the data storage 310 of the electronic device 106. The data storage 310 records the image of the object 102 in the electronic device 106. For example, the smartphone 106 processes the interference pattern to record the image of the object 102 in a data storage 310 of the smartphone 106.
In operation 1014, the recorded image is displayed on the electronic device 106. The display screen 308 displays the recorded image on the electronic device 106. For example, the recorded image of the object 102 is displayed on the smartphone 106.
In operation 1016, the image is authenticated by comparing the recorded image to a pre-stored image of the object 102. The controller 304 authenticates the image by comparing the recorded image to the stored image of the object 102.
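Operation 1016 does not specify a comparison metric. One minimal sketch of such an authentication step, assuming normalized cross-correlation of the two intensity images and a hypothetical acceptance threshold, is:

```python
import numpy as np

def authenticate(recorded, stored, threshold=0.9):
    """Compare a recorded image against a pre-stored image using
    normalized cross-correlation of zero-mean intensities.
    The 0.9 threshold is a hypothetical acceptance level."""
    a = recorded - recorded.mean()
    b = stored - stored.mean()
    score = float((a * b).sum() /
                  (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return score >= threshold, score
```

A score near 1 indicates that the recorded image closely matches the pre-stored image of the object; in practice the metric and threshold would depend on the imaging conditions.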
For example, an emergency room physician may use an optical adapter attached to an electronic device to view an eye of a user, for example, a patient. The optical adapter records images of the eye and transmits the images to the electronic device. The electronic device obtains a frequency spectrum of the eye by obtaining a Fresnel transform of an amplitude and a phase recovered from the captured image. The electronic device reconstructs the image of the eye by obtaining a Fourier transform of the frequency spectrum. The reconstructed image is stored in a data storage of the electronic device to diagnose the eye of the user, for example, the patient. Such a diagnosis is referred to as a coarse level diagnosis.
In another example, a medical practitioner may operate an imaging system while examining an eye of a user, for example, a patient, to capture images of the eye. In this example, the captured images may be transmitted to an electronic device or a server to process and reconstruct an image of the eye and thereby diagnose the eye of the user, for example, the patient. Such a diagnosis is referred to as a detailed level diagnosis.
Various actions, acts, blocks, operations, and the like of
The examples disclosed herein may be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements.
The apparatuses, units, modules, devices, and other components illustrated in
The methods illustrated in
Instructions or software to control a processor or computer to implement the hardware components and perform the methods as described above are written as computer programs, code segments, instructions or any combination thereof, for individually or collectively instructing or configuring the processor or computer to operate as a machine or special-purpose computer to perform the operations performed by the hardware components and the methods as described above. In one example, the instructions or software include machine code that is directly executed by the processor or computer, such as machine code produced by a compiler. In another example, the instructions or software include higher-level code that is executed by the processor or computer using an interpreter. Programmers of ordinary skill in the art can readily write the instructions or software based on the block diagrams and the flow charts illustrated in the drawings and the corresponding descriptions in the specification, which disclose algorithms for performing the operations performed by the hardware components and the methods as described above.
The instructions or software to control a processor or computer to implement the hardware components and perform the methods as described above, and any associated data, data files, and data structures, are recorded, stored, or fixed in or on one or more non-transitory computer-readable storage media. Examples of a non-transitory computer-readable storage medium include read-only memory (ROM), random-access memory (RAM), flash memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, solid-state disks, and any device known to one of ordinary skill in the art that is capable of storing the instructions or software and any associated data, data files, and data structures in a non-transitory manner and providing the instructions or software and any associated data, data files, and data structures to a processor or computer so that the processor or computer can execute the instructions. In one example, the instructions or software and any associated data, data files, and data structures are distributed over network-coupled computer systems so that the instructions and software and any associated data, data files, and data structures are stored, accessed, and executed in a distributed fashion by the processor or computer.
While this disclosure includes specific examples, it will be apparent to one of ordinary skill in the art that various changes in form and details may be made in these examples without departing from the spirit and scope of the claims and their equivalents. The examples described herein are to be considered in a descriptive sense only, and not for purposes of limitation. Descriptions of features or aspects in each example are to be considered as being applicable to similar features or aspects in other examples. Suitable results may be achieved if the described techniques are performed in a different order, and/or if components in a described system, architecture, device, or circuit are combined in a different manner, and/or replaced or supplemented by other components or their equivalents. Therefore, the scope of the disclosure is defined not by the detailed description, but by the claims and their equivalents, and all variations within the scope of the claims and their equivalents are to be construed as being included in the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
3509/CHE/2014 | Jul 2014 | IN | national |
10-2015-0073810 | May 2015 | KR | national |