An embodiment of an image capture device comprises an image axis and a gyroscope operable to indicate the orientation of the image axis.
An embodiment of a capsule endoscopy system comprises an imaging capsule and an external unit. The imaging capsule may include an image capture device having an image axis and a gyroscope operable to indicate the orientation of the image axis. The external unit may include a gyroscope operable to indicate an orientation of a subject and a harness wearable by the subject, the harness being operable to align the gyroscope with an axis of the subject. The imaging capsule may send an image to the external unit for processing and display, and the external unit may calculate the orientation of the image axis relative to the subject's body.
For example, in such an embodiment, the imaging capsule may be ingested and may capture images of a subject's gastrointestinal system, and the external unit may determine the orientation of the imaging capsule's image axis relative to the subject's body.
The present disclosure is presented by way of at least one non-limiting exemplary embodiment, illustrated in the accompanying drawings in which like references denote similar elements, and in which:
Endoscopy, or internal examination of a living subject such as a human, may be performed with an endoscope that is inserted into a body opening (e.g., mouth or anus) and that allows a physician to internally view a body cavity (e.g., esophagus, stomach, colon, or intestine) that is accessible via the opening. Examination of the gastrointestinal tract (“GI tract”), for example, includes inserting the endoscope into the mouth, down the esophagus, and into the stomach and/or intestines. Similarly, examination of the colon (e.g., a colonoscopy), for example, includes inserting the endoscope through the anus into the colon.
Unfortunately, such a procedure may be invasive and uncomfortable for a subject, and may necessitate general anesthesia. Moreover, such a procedure may require sterile endoscopy equipment and a sterile environment. Accordingly, an endoscopy procedure is generally performed in a hospital setting, which may increase the cost of such a procedure.
Compared to conventional endoscopy as discussed above, the endoscopy system 105 described herein is non-invasive because a subject 100 need only swallow the imaging capsule 110 and wear the external unit 115 as the imaging capsule travels through his/her GI tract 140. Therefore, no anesthesia is believed to be required in most cases, and imaging via the endoscopy system 105 need not be performed in a sterile hospital setting, or even at a doctor's office. In fact, once the subject 100 swallows the imaging capsule 110, the subject may move about normally as the imaging capsule captures images of the subject's GI tract 140. This may significantly reduce the cost of endoscopy procedures and may significantly reduce the discomfort and inconvenience of the subject 100.
The imaging capsule 110 may assume numerous orientations relative to the subject 100 while traveling through the GI tract 140, such that the image axis 145 may be pointing in any direction at any given time. Therefore, images captured by the imaging capsule 110 may be taken from numerous orientations within the GI tract. As described further herein, because a physician may want to know the orientation of each image relative to the GI tract 140 for purposes of analysis and diagnosis, the external unit 115 and imaging capsule 110 may be operable to indicate, for each image, the orientation of the imaging capsule 110 relative to a frame of reference of the subject 100. For example, for images of the subject's stomach, a doctor may wish to know if the image is of, e.g., the back of the stomach, the front of the stomach, the top of the stomach, or the bottom of the stomach.
The external unit 115 is coupled to the subject 100 with a harness 210, which may be a belt or strap of a suitable material that encircles the subject 100 and maintains an axis 245 of the frame of reference of the external unit 115 in alignment with an axis 250 of the subject's frame of reference regardless of how the subject 100 may move. That is, the harness 210 maintains the unit's axis 245 approximately parallel to or approximately co-linear with the subject axis 250. For example, the subject 100 in
Additionally,
The imaging module chip 410 includes a processor 420, a gyroscope 430, a wireless transceiver module 440, a light source 450, a power source 460, a lens assembly 470, and a pixel array 480. The focal axis of the lens assembly 470 and the array axis normal to the center of the pixel array 480 are approximately aligned along the image axis 145. That is, the pixel array 480 is operable to capture an image of an object toward which the image axis 145 points.
The shell 405 may be formed of any suitable material, and may be any suitable size and shape. For example, in an embodiment, the shell 405 may be operable to be ingested and to pass through the gastrointestinal tract of the subject 100 (
The imaging module chip 410 may be an integrated circuit, a hybrid integrated circuit, a micro-electro-mechanical system (MEMS), or any suitable system. Furthermore, as discussed above, the components of the imaging module chip 410 may be disposed on a single IC die or on multiple IC dies. Additionally, the imaging module chip 410 may include more or fewer components than are described herein, and such components may be configured in any suitable arrangement.
The processor 420 may be any suitable processor, processing system, controller, or module, and may be programmable to control one or more of the other components of the imaging capsule 110. Furthermore, the processor 420 may perform image processing on images captured by the pixel array 480 before the images are transmitted to the external unit 115 (
The gyroscope 430 may be any suitable device operable to indicate a degree of rotation about one or more coordinate axes of the gyroscope's frame of reference. For example, the gyroscope 430 may be operable to detect “yaw”, “pitch”, and “roll” (i.e., rotation) about coordinate X, Y, and Z axes, respectively. Examples of gyroscopes suitable for the gyroscope 430 include the STMicroelectronics L3G4200DH and the L3G4200D. In an embodiment, there may be a plurality of gyroscopes 430.
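Because a rate gyroscope such as the L3G4200D reports angular velocity rather than absolute angle, the rotation about each axis is typically obtained by integrating rate samples over time. The following is a minimal sketch of that integration, not the capsule's actual firmware; the function name and sample data are hypothetical, and a practical design would add drift compensation and a proper attitude filter:

```python
def integrate_rates(rate_samples, dt):
    """Accumulate angular-rate samples (deg/s) about the X, Y, and Z axes
    into rotation angles (deg) using simple rectangular integration.

    rate_samples: iterable of (wx, wy, wz) tuples, one per sample period.
    dt: sample period in seconds.
    Returns the list of accumulated (x, y, z) angles after each sample.
    """
    angles = [0.0, 0.0, 0.0]
    history = []
    for sample in rate_samples:
        angles = [a + w * dt for a, w in zip(angles, sample)]
        history.append(tuple(angles))
    return history
```

For example, a constant 10 deg/s rotation about X sampled at 10 Hz for one second accumulates to approximately a 10° rotation about X.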
The wireless module 440 may be any suitable device that is operable to send and receive wireless communications. For example, the wireless module 440 may be operable to send to the external unit 115 (
The light source 450 may be any suitable device (e.g., one or more light-emitting diodes) operable to provide illumination to aid in capturing images. For example, the light source may be operable to provide sufficient illumination while in the gastrointestinal tract of the subject 100 such that the pixel array 480 may capture an image. The light source 450 may provide continuous illumination, or may provide flash illumination as is suitable for the application, for example, under the control of the processor 420. Additionally, the intensity of illumination may be modified, e.g., by the processor 420 (the light source 450, or the imaging capsule 110, may include an intensity sensor (not shown in
The power source 460 may be any suitable source of power such as a battery, and may provide power to one or more components of the imaging capsule 110. The power source 460 may be recharged via a wired technique, or may be recharged wirelessly (e.g., via RF energy). In an embodiment, there may be a plurality of power sources 460.
The lens assembly 470 may be operable to focus, or otherwise modify, electromagnetic energy (e.g., visible light) such that the energy may be sensed by the pixel array 480 to capture an image. Collectively, the lens assembly 470 and pixel array 480 may constitute an image-capture apparatus, and may be arranged as a single imaging module, assembly, or unit. As discussed above, the normal to the center of the pixel array 480 and the focal axis of the lens assembly 470 are approximately aligned along the image axis 145, which “points” in the direction of an object (or portion of an object) whose image the pixel array may capture. The lens assembly 470 may be any suitable type of imaging lens assembly, such as a macro lens, process lens, fisheye lens, or stereoscopic lens.
In an embodiment, the pixel array 480 and lens assembly 470 may be operable to capture images in various regions of the electromagnetic spectrum, including the infrared, ultraviolet, and visible regions. In an embodiment, the pixel array 480, the lens assembly 470, or both may be separate from the imaging module chip 410. Additionally, in an embodiment, the lens assembly 470 may be omitted. In an embodiment, there may be a plurality of pixel arrays 480 and/or lens assemblies 470.
The processor 520 may be any suitable processor, processing system, controller, or module, and may be programmable to control one or more of the other components of the imaging capsule 110. Furthermore, the processor 520 may perform image processing on images captured by the pixel array 480.
The gyroscope 530 may be any suitable device operable to indicate a degree of rotation about one or more coordinate axes of the gyroscope's frame of reference. For example, the gyroscope 530 may be operable to detect “yaw”, “pitch”, and “roll” (i.e., rotation) about coordinate X, Y, and Z axes, respectively.
The wireless module 540 may be operable to send and receive wireless communications. For example, the wireless module 540 may be operable to receive from the imaging capsule 110 (
The computer 510 may be any suitable computing device (e.g., a laptop or desktop computer) that is directly or wirelessly coupled with the external unit 115, and may be operable to program the external unit 115, obtain stored data from the external unit 115, process data obtained from the external unit 115, and the like. The computer 510 may also be operable to program the processor 420 of the imaging capsule 110 (
In an embodiment, the endoscopy system 105 described herein may also be used to capture images within a non-human subject 100. Additionally, the endoscopy system 105 or components thereof may be used to capture images within non-living systems, such as systems of pipes, a moving body of water, or the like.
As depicted in
In an embodiment, the external unit axis 245 (
Although the XBODY, YBODY, and ZBODY axes are depicted as having specific orientations relative to the body of the subject 100, in another embodiment, the XBODY, YBODY, and ZBODY axes may have different orientations relative to the subject, and need not be aligned with a plane, the spine 605, or other part of the body. Therefore, the alignments of the XBODY, YBODY, and ZBODY axes shown in
The ZNBODY orientation represents an orientation of the ZBODY axis (
The ZNCAP orientation represents an orientation of the ZCAP axis (
ZNCAP and ZNBODY may be defined within the terrestrial coordinate system 800 by spherical coordinates relative to the earth XEARTH YEARTH ZEARTH coordinate system 800. For example, θBODY and ϕBODY are depicted in
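Spherical angles such as θBODY and ϕBODY can be converted to a Cartesian unit vector in the XEARTH YEARTH ZEARTH frame. The sketch below assumes the physics convention (θ measured down from the Z axis, ϕ measured from the X axis in the X-Y plane), which may differ from the convention used in the figures:

```python
import math

def spherical_to_unit_vector(theta_deg, phi_deg):
    """Convert spherical angles (degrees) to a Cartesian unit vector.

    theta_deg: polar angle measured from the Z axis.
    phi_deg: azimuthal angle measured from the X axis in the X-Y plane.
    """
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

For example, θ = 0° yields a vector along the Z axis (straight up in the earth frame), and θ = 90°, ϕ = 0° yields a vector along the X axis.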
Because ZNCAP changes direction relative to the terrestrial coordinate system 800 as the imaging capsule 110 moves through the gastrointestinal tract capturing images, knowing the orientation of ZNCAP relative to ZNBODY may be important when interpreting the images captured by the imaging capsule 110. For example, for a given image or a series of images, it may be important to determine whether the image axis 145 is pointing toward the back, legs, head, or front of the subject 100 so that the images may be properly interpreted or so that images may be combined.
Given that the ZNCAP and ZNBODY orientations may be both continuously and independently changing relative to each other over time, the orientation of ZNCAP relative to the body coordinate system 600 may be calculated by synchronizing or calibrating the frame of reference of the external unit 115 and the frame of reference of the imaging capsule 110 (
For example, a doctor may initially synchronize or calibrate the external unit 115 and imaging capsule 110 by having the subject 100 stand while wearing the external unit coincident with or parallel to the gravitational force of earth G⃗ and ZBODY, while the doctor holds the imaging capsule parallel with the gravitational force of earth (e.g., away from the ground), as depicted in
For example, presuming that the external unit 115 and imaging capsule 110 are initially synchronized or calibrated having the orientations depicted in
Accordingly, as the subject 100 and external unit 115 change orientation, and as the imaging capsule 110 changes orientation within the subject 100 while capturing images, the orientation of the image axis 145 may be determined relative to the subject coordinate system 600 (
For example, assume that the external unit 115 and imaging capsule 110 are initially synchronized or calibrated having the initial orientations depicted in
To determine the normalized rotation (RNNORMAL) and normalized orientation (ONNORMAL) of the image axis 145 (i.e., the orientation of the image axis relative to the body coordinate system 600 frame of reference), one may use the following equation: R1(φ, θ, ψ)CAP − R1(φ, θ, ψ)BODY = R1(φ, θ, ψ)NORMAL (i.e., R1φCAP − R1φBODY = R1φNORMAL; R1θCAP − R1θBODY = R1θNORMAL; R1ψCAP − R1ψBODY = R1ψNORMAL). Returning to the example above, (φ, θ, ψ)NORMAL may be calculated as follows: R1(−45°, 45°, −90°)CAP − R1(−90°, 0°, 90°)BODY = R1(45°, 45°, −180°)NORMAL. As depicted in
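The component-wise subtraction above can be sketched as follows. This mirrors the simplified per-angle model described here; composing general 3-D orientations is not strictly component-wise in Euler angles, and a production implementation would use rotation matrices or quaternions. The function name and the wrap to [−180°, 180°) are assumptions:

```python
def normalized_rotation(cap_angles, body_angles):
    """Subtract body Euler angles from capsule Euler angles component-wise
    (phi, theta, psi, in degrees), wrapping each result to [-180, 180)."""
    def wrap(angle):
        # Map any angle into the half-open interval [-180, 180) degrees.
        return (angle + 180.0) % 360.0 - 180.0
    return tuple(wrap(c - b) for c, b in zip(cap_angles, body_angles))
```

With the example values, `normalized_rotation((-45.0, 45.0, -90.0), (-90.0, 0.0, 90.0))` yields `(45.0, 45.0, -180.0)`.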
Images captured by the imaging capsule 110 may be associated with a given time so that the image orientation (i.e., the orientation of the image axis 145) may be determined at a number of discrete times. For example, I1 (image 1) may be associated with Z1CAP and Z1BODY, and a determination of O1NORMAL would therefore be an indication of the normalized orientation of I1 relative to the body of the subject 100 and the body coordinate system 600 frame of reference (
Images and corresponding data may be captured at various suitable intervals. For example, images and corresponding data may be captured every second, every tenth of a second, or in bursts of five images captured a tenth of a second apart, with one second between bursts.
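The pairing of images with capture times, along with a burst-style capture schedule, can be modeled as below. The record fields, function names, and the interpretation of "one second between bursts" (measured from the last image of one burst to the first image of the next) are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ImageRecord:
    """One captured image plus the data needed to normalize its orientation."""
    timestamp: float                          # seconds since calibration
    image_id: int
    cap_angles: Tuple[float, float, float]    # capsule (phi, theta, psi), deg
    body_angles: Tuple[float, float, float]   # body (phi, theta, psi), deg

def burst_schedule(n_bursts: int, images_per_burst: int = 5,
                   intra_s: float = 0.1, inter_s: float = 1.0) -> List[float]:
    """Capture times for bursts: images_per_burst images spaced intra_s
    apart, with inter_s between the last image of one burst and the
    first image of the next."""
    period = (images_per_burst - 1) * intra_s + inter_s
    return [b * period + i * intra_s
            for b in range(n_bursts)
            for i in range(images_per_burst)]
```

Each captured image would then be stored as an `ImageRecord` at its scheduled time, so the normalized orientation can later be computed per image from the capsule and body angles logged at that instant.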
From the foregoing it will be appreciated that, although specific embodiments have been described herein for purposes of illustration, various modifications may be made without deviating from the spirit and scope of the disclosure. Furthermore, where an alternative is disclosed for a particular embodiment, this alternative may also apply to other embodiments even if not specifically stated.
Number | Date | Country | Kind |
---|---|---|---|
201010603106.2 | Dec 2010 | CN | national |
This application is a continuation of U.S. patent application Ser. No. 13/329,293, filed on Dec. 18, 2011, which claims priority to Chinese Patent Application No. 201010603106.2, filed on Dec. 17, 2010, which applications are hereby incorporated herein by reference.
Entry |
---|
L3G4200D, “MEMS Motion Sensor: Three-Axis Digital Output Gyroscope,” Feb. 2010, Doc ID 17116 Rev. 1, www.st.com, 24 pages. |
L3G4200DH, “MEMS Motion Sensor: Three-Axis Digital Output Gyroscope,” Apr. 2010, Doc ID 17300 Rev 1, www.st.com, 28 pages. |
Number | Date | Country | |
---|---|---|---|
20170307373 A1 | Oct 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13329293 | Dec 2011 | US |
Child | 15645371 | US |