The present invention relates to a system and method for displaying an image stream captured by an in vivo imaging device; more specifically, to a system and method for displaying an image stream in a consolidated manner.
Devices and methods for sensing of passages or cavities within a body, and for gathering information (e.g., image data, pH data, temperature information, pressure information), are known in the art. Such known devices may include, inter alia, swallowable imaging devices that may be autonomous and may travel through the gastrointestinal (GI) tract passively by, for example, natural peristaltic motion. Images captured by such devices may be transmitted, for example, by wireless transmission to an external receiving/recording device, and may be subsequently displayed, typically as an image stream. Sometimes the image stream may be displayed as a movie.
According to some embodiments of the present invention, an image stream may be constructed that may present a progression of images captured along a body lumen. According to one embodiment of the present invention, the rate of change of scenery in the image stream may be regulated. According to another embodiment of the present invention, the image stream may display all image information captured while reducing/eliminating redundancy. The image stream, according to an embodiment of the invention, may reduce viewing time, increase the field of view of an image frame and provide more comfortable viewing, for example by providing a steady view through a body lumen without redundant information and without losing any new information.
According to some embodiments of the present invention, consolidated image frames may be deformed to a standard shape and size. According to other embodiments of the present invention, a defined central region of the consolidated image may be maintained as is, without deformation. According to yet other embodiments of the present invention, the streaming rate of display may be regulated based on the content of the current image frame displayed.
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn accurately or to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity, or several physical components may be included in one functional block or element. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well known features may be omitted or simplified in order not to obscure the present invention.
The device, system and method of the present invention may be used with an imaging system or device such as that shown in
An autonomous in-vivo imaging device that may passively traverse, for example, the GI tract may advance/progress through one or more body lumens of the GI tract in an orderly, substantially predictable, and/or smooth fashion and/or at a regulated/nearly consistent rate, and/or at a steady pace. As such, an imaging device capturing images at a fixed rate may produce an image stream that may display the entire body lumen in a steady stream or at steady periodic intervals of the body lumen. An example of such a lumen may be the small intestine, where the body lumen walls may substantially hug the in-vivo imaging device so as to maintain its orientation with respect to the body lumen wall, and the smooth periodic peristaltic pressure waves may advance the imaging device downstream in a steady, predictable fashion. Other body lumens in addition to the small intestine may promote smooth advancement of an in-vivo imaging device.
In other body lumens, the progression of the device may be chaotic/random/difficult to predict and may not facilitate collection of data in an orderly fashion, or may not result in an orderly scanning of the entire body lumen and/or of periodic intervals of the body lumen. For example, an in-vivo device may linger in one or more locations or sections of the body lumen for an extended period of time and subsequently rapidly progress through another section of the body lumen. Image data that may be captured at a fixed periodic rate may be redundant in the section where the device may have lingered and may be sparse in the section where the device may have progressed rapidly. Examples of such lumens may be the colon and/or the stomach and/or other voluminous body lumens. The colon and stomach may generally have a larger diameter than, for example, the small intestine, and may not facilitate the device continuously maintaining its orientation with respect to the body lumen wall. In addition, the peristaltic motion of the colon and/or stomach may be erratic and/or may occur at much lower frequencies. In other examples, stagnation and changes in orientation of the in-vivo device may also occur in the small intestine and/or in the esophagus. Other body lumens may result in unpredictable and/or unsteady advancement of an imaging device.
Reviewing an image stream captured from an in-vivo device passively traversing through a body lumen, e.g. a colon and/or stomach, may be cumbersome and time consuming due to the numerous repetitions of image data in certain sections and the fast progression in other sections. For example, in one section of the body lumen a large or redundant amount of image data may be captured, while in another section little data may be captured. In addition, while viewing such an image stream it may be difficult to estimate the progression rate through the body lumen or to correlate between the frame capture rate and the advancement through the body lumen due to the erratic nature of the progression.
It may be helpful to consolidate two or more images in sections where the in-vivo device may have lingered while maintaining, for example, original image frames in areas where there was a significant advancement of the in-vivo imaging device. According to one embodiment of the present invention, a revised image stream consisting partly of consolidated image frames and partly of original image frames may be constructed. In another embodiment of the present invention, a revised image stream may consist of a stream of consolidated image data where each consolidated image frame in the revised image stream and/or the consolidated image stream may be a consolidation of two or more images.
According to some embodiments of the present invention, a revised image stream may be constructed that may shorten the visualization time required to review the image stream and that may show a smoother and/or more steady progression of images captured along a body lumen. According to one embodiment of the present invention, the rate of change of scenery in the image stream may be regulated. According to another embodiment of the present invention, the revised image stream may display all image information captured while reducing/eliminating redundancy. The revised image stream may reduce viewing time and provide more comfortable viewing, for example by providing a steady view through a body lumen without redundant information and without losing any new information.
According to one embodiment of the present invention, consecutive frames of an image stream may be checked for potentially overlapping areas. An image frame, e.g. image frame n+1, may be compared to, for example, image frame n, where image frame n may be referred to as a reference image frame. In some examples, transformations, for example translation transformations, scaling transformations, shear transformations, affine transformations, projective transformations, and/or other polynomial transformations that may represent variations between views of a scene in an image stream, may be used to facilitate a comparison. If overlap is identified, the images may be merged. Subsequently, image frame n+2 may be compared to the current merged image for an overlapping area. This process may continue until the subsequent image frame does not share any overlapping areas with the merged image frame. At that point a new reference image frame may be defined and the process may be repeated to form a second consolidated image frame in the revised image stream. In other examples, the reference image frame may be compared to a previous image frame and/or to image frames that were captured a number of frames before or after the reference image frame. Other suitable methods of consolidating image frames in an image stream may be used. Registration and merging of images may be performed by methods known in the art. In some embodiments of the present invention, an image stream may contain on the order of 10,000 frames and, for example, on the order of 100 frames may be consolidated and/or merged into one image. In other embodiments of the present invention, the image stream may be of a different length and the number of images merged into one may differ. By way of example, a revised image stream may have a first consolidated image frame that may be a result of a consolidation of 50 images, a second consolidated image frame in the revised image stream may be a result of a consolidation of 200 images, and a third image frame in the revised image stream may be a single frame from the original image stream. During display of the revised image stream, the streaming rate may be constant or may be variable according to one or more parameters, e.g. image parameters. According to one embodiment, the consolidation is done in processing unit 14.
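By way of non-limiting illustration only, the following is a minimal sketch of such a consolidation loop, assuming grayscale frames of equal size and using OpenCV phase correlation as a simple stand-in for the registration/transformation step described above; the helper names, the averaging merge, and the overlap threshold are illustrative assumptions and not the specific method of the described system.

```python
# A minimal, illustrative sketch of the consolidation loop described above.
# It assumes grayscale frames of equal size; a real implementation may use
# affine/projective registration and proper image mosaicking instead.
import cv2
import numpy as np

def estimate_shift(reference, frame):
    """Estimate the (dx, dy) translation of `frame` relative to `reference`."""
    (dx, dy), _response = cv2.phaseCorrelate(np.float32(reference),
                                             np.float32(frame))
    return dx, dy

def consolidate_stream(frames, min_overlap=0.25):
    """Group consecutive overlapping frames into consolidated images."""
    consolidated = []
    merged = frames[0].astype(np.float32)   # current consolidated (merged) image
    h, w = frames[0].shape[:2]
    for frame in frames[1:]:
        dx, dy = estimate_shift(merged, frame)
        overlaps = (abs(dx) < (1.0 - min_overlap) * w and
                    abs(dy) < (1.0 - min_overlap) * h)
        if overlaps:
            # Placeholder merge: average the new frame into the merged image.
            # A real system would warp the frame by the estimated transform
            # and blend it into an enlarged canvas.
            merged = 0.5 * merged + 0.5 * np.float32(frame)
        else:
            # No overlap: close the current consolidated frame and start a
            # new one, with the current frame as the new reference.
            consolidated.append(np.uint8(merged))
            merged = frame.astype(np.float32)
    consolidated.append(np.uint8(merged))
    return consolidated
```

In this sketch, each element of the returned list corresponds to one frame of the revised image stream, so lingering produces a few heavily merged frames while rapid advancement passes frames through largely unchanged.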
Reference is made to
In some embodiments of the present invention, it may be desired to deform the shape of each of the consolidated image frames to a uniform shape and size. Since each of the consolidated image frames in the image stream may have a different shape, it may be awkward, distracting and/or uncomfortable to view an image stream, typically a moving image stream, where some of the frames have different sizes and shapes. Reference is now made to
According to some embodiments of the present invention, deformation of the image data outside of the central area 120 may be performed radially from central point 70. Reference is now made to
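By way of non-limiting illustration, the sketch below remaps a consolidated image of arbitrary size onto a standard-size frame while leaving a central region undeformed, roughly in the spirit of central area 120 about central point 70; the function name, the output size, the central radius r0, and the linear radial stretch are illustrative assumptions rather than the specific deformation of the described system.

```python
# Illustrative radial standardization: pixels within radius r0 of the center
# are copied 1:1 (no deformation); pixels outside r0 are pulled in by a
# linear radial stretch so the whole consolidated image fits the output frame.
import cv2
import numpy as np

def radial_standardize(consolidated, out_size=512, r0=100):
    h_in, w_in = consolidated.shape[:2]
    cy_in, cx_in = (h_in - 1) / 2.0, (w_in - 1) / 2.0
    cy_out = cx_out = (out_size - 1) / 2.0

    r_out_max = out_size / 2.0
    r_in_max = max(h_in, w_in) / 2.0

    # Output pixel grid expressed as radius/direction about the output center.
    ys, xs = np.indices((out_size, out_size), dtype=np.float32)
    dx, dy = xs - cx_out, ys - cy_out
    r = np.hypot(dx, dy)
    with np.errstate(invalid="ignore", divide="ignore"):
        ux = np.where(r > 0, dx / r, 0)
        uy = np.where(r > 0, dy / r, 0)

    # Inside r0: identity mapping. Outside r0: stretch the remaining annulus
    # of the output to cover the input's full radius.
    scale = (r_in_max - r0) / max(r_out_max - r0, 1e-6)
    r_src = np.where(r <= r0, r, r0 + (r - r0) * scale)

    map_x = (cx_in + r_src * ux).astype(np.float32)
    map_y = (cy_in + r_src * uy).astype(np.float32)
    return cv2.remap(consolidated, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)
```

The result is that every frame of the revised image stream has the same outer dimensions while the defined central region is shown without deformation.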
According to some embodiments, a method for shortening a movie of in vivo images may be provided. According to an embodiment of the invention, a movie may be constructed of image frames taken by an autonomous in vivo imaging device (such as the capsule described above). The movie may include one or more revised frames that include a combination of original frames. The movie may be displayed, such as on a monitor as in
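By way of non-limiting illustration, the variable streaming rate mentioned above may, for example, be realized by assigning each revised frame a display duration derived from an image parameter; in the sketch below the parameter is assumed to be the number of original frames merged into each revised frame, which is only one possible choice, and the timing constants are illustrative.

```python
# Illustrative variable display rate: frames consolidated from many source
# images are shown longer, up to a cap, so heavily merged content is easier
# to review while single original frames stream by quickly.
def display_durations(merge_counts, base_ms=50, per_source_ms=2, max_ms=400):
    """Return per-frame display times in milliseconds."""
    return [min(base_ms + per_source_ms * n, max_ms) for n in merge_counts]

# Example: a single original frame vs. frames consolidated from 50 or 200 images.
print(display_durations([1, 50, 200]))  # [52, 150, 400]
```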
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined only by the claims which follow.
This application claims the benefit of U.S. Provisional Application Ser. No. 60/680,526, filed on May 13, 2005.
Number | Date | Country
---|---|---
60680526 | May 2005 | US