Method and devices for displaying images for viewing with varying accommodation

Information

  • Patent Grant
  • Patent Number
    7,108,378
  • Date Filed
    Tuesday, October 11, 2005
  • Date Issued
    Tuesday, September 19, 2006
Abstract
Light (22) is projected from an image projector (20), in response to an image information signal (18) and is controllably refracted in response to a first control signal (26), for projecting refracted light (27) onto a screen (28). The screened light is for providing viewable images of varying size occupying all or part of the screen. The viewable images are refracted in response to a second control signal (64) for viewing the images of increasingly smaller size with correspondingly increasing magnification.
Description
BACKGROUND OF INVENTION

1. Technical Field


The present invention relates to acquiring images and, more particularly, to displaying such images for viewing with varying accommodation.


2. Discussion of Related Art


The granularity of a given static man-made image of a real object is only as good as that of the imaging technology used to acquire and present it. Closer inspection with a magnifying glass or other aid to eyesight does not ultimately reveal any deeper granularity but only the limitations of the imaging technology used. This is not usually a problem for the enjoyment of images in books, movies, and other conventional media.


On the other hand, the granularity of real objects is unlimited as far as the human eye is concerned. Considering the eye itself, with increased focus, more detailed granularity of objects is always revealed. Moreover, with technological aids to the eye, e.g., the magnifying glass, the optical microscope, the electron microscope and other scientific tools, smaller details are always revealed.


A recent motion picture or video innovation provides successive images to the eye at varying apparent distances for viewing with correspondingly varying focus (accommodation) of the eye. With increased magnification merely at the viewer's end, however, no increased level of granularity is available for inspection, because of the limitations of man-made imaging technology. If the magnification is also increased at the camera end, there will be increased granularity, but the viewer with increased accommodation will focus on only a small part of the entire field of view presented. The effect is reduced granularity. Therefore, the verisimilitude of the imagery under increased magnification is a problem.


SUMMARY OF INVENTION

An object of the present invention is to provide images of objects in a scene at varying distances that provide increased granularity for close-up viewing.


According to a first aspect of the present invention, a method is provided comprising the steps of projecting light from an image projector, in response to an image information signal, controllably refracting said projected light, in response to a first control signal, for projecting refracted light for providing viewable images of varying extent, and controllably refracting said viewable images in response to a second control signal for viewing said images of increasingly smaller extent with correspondingly increasing magnification.


According to a second aspect of the present invention, a device is provided, comprising a projector, responsive to an image information signal, for providing first light rays, a first optic, responsive to the first light rays and to a first control signal, for providing second light rays, a screen, responsive to the second light rays, for providing third light rays indicative of images of varying size, and a second optic, responsive to the third light rays and to a second control signal, for providing fourth light rays for viewing.


According to a third aspect of the invention, a device is provided, comprising an image projector for projecting light in response to an image information signal, a first optic for controllably refracting said projected light, in response to a first control signal, for providing light rays of varying extent, and a second optic for controllably refracting said light rays in response to a second control signal for providing light rays of increasingly smaller extent at correspondingly decreasing focal length.


These and other objects, features, and advantages of the present invention will become more apparent in light of a detailed description of a best mode embodiment thereof which follows, as illustrated in the accompanying drawing.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a system including a device for showing images to an eye of a user.



FIG. 2 shows a video camera collecting images of a scene illuminated by a light source with an eye of a cameraman shown using an eyepiece to view the scene being photographed and having a sensed eye signal encoded along with video information.



FIG. 3 shows light rays projected to form an image that fills or almost fills the entire area or extent of a screen.



FIG. 4 shows light rays projected to form an image that only partially fills the entire extent of the screen.



FIG. 5 shows light rays projected to form an image that only fills a small extent of the entire extent of the screen.



FIG. 6 is similar to FIG. 1, except the control signal is sensed not from the eye of the cameraman, but from the eye of the user.





DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS


FIG. 1 shows a device 10 for showing images to an eye 12 of a user. A video signal is received on a line 14 by a control 16. The video signal contains image information which is decoded and provided on a signal line 18 to, for instance, an image projector 20, which projects images with first light rays 22 to a first optic 24. The optic 24 may, for instance, be a lens that is under the control of a control signal on a line 26 from the control 16. The control signal on the line 26 is decoded by the control 16 from the video signal on the line 14. The optic 24 refracts or otherwise bends the light rays 22 into second light rays 27 that are projected onto a translucent screen 28 to form images of different sizes, i.e., that fill the screen 28 to a greater or lesser extent as shown in FIGS. 3–5, as discussed below. The images on the screen 28 are projected as third light rays 29. It should be realized that the screen need not be translucent but could be reflective. The signal on the line 14 can be provided in many different ways. It should first of all be realized that the signal on the line 14 need not be a single line (which implies some form of multiplexing) but could be two or more signal lines.
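As a minimal, non-authoritative sketch of the decode-and-project flow just described, the control 16 can be thought of as splitting each incoming frame into its image information and its control value; the names VideoDataFrame, show, and set_extent below are hypothetical and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class VideoDataFrame:
    """One frame of the (possibly multiplexed) signal on the line 14."""
    pixels: bytes          # image information, decoded onto the line 18
    control_value: float   # value decoded onto the control signal line 26

def run_control_16(frames: Iterable[VideoDataFrame], projector, first_optic) -> None:
    """Sketch of the control 16: for each incoming frame, drive the image
    projector 20 with the image information and the optic 24 with the
    decoded control value, so the image fills the screen 28 to a greater
    or lesser extent."""
    for frame in frames:
        projector.show(frame.pixels)                 # signal line 18
        first_optic.set_extent(frame.control_value)  # control signal, line 26
```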


Secondly, as a first example of a way to provide the signal on the line 14, referring to FIG. 2, a video camera 30 is shown collecting images of a scene 32 illuminated by a light source 34. An eye 36 of a cameraman is shown using an eyepiece 38 for viewing the scene being photographed. A sensor 40, such as an eye accommodation sensor, senses the accommodation of the eye 36. The sensor 40 provides a sensed signal on a line 42 to an optic control 44. The optic control 44 provides a camera optic control signal on a signal line 46 to a motorized camera optic 48. The optic control 44 causes the motorized optic 48 to focus on the scene 32 at differing focal lengths according to changes in the accommodation of the eye 36 as detected by the sensor 40. The optic 48 casts rays 50 onto an image sensor 52 that provides a video image signal on a line 53 to a combiner 54. The combiner 54 combines, e.g., in a time-division-multiplexed way, the image signal on the line 53 with the signal on the line 42 to form the video data signal on the line 14 (see FIG. 1). As explained above, the signal on the line 42 could instead be provided in parallel on a separate signal line alongside the signal on the line 14; in that case, the signal on the line 14 would carry only video information.
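The camera end of FIG. 2 can be sketched in the same hedged spirit; the method names read, focus_at, and capture are assumed interfaces, used only to show the order of operations described above.

```python
def camera_loop(eye_sensor, motorized_optic, image_sensor):
    """Sketch of the FIG. 2 camera end: the sensed accommodation of the
    eye 36 sets the focal length of the optic 48, and each captured frame
    is paired (time-division multiplexed) with that sensed value to form
    the video data signal on the line 14."""
    while True:
        accommodation = eye_sensor.read()         # sensed signal, line 42
        motorized_optic.focus_at(accommodation)   # camera optic control, line 46
        frame = image_sensor.capture()            # video image signal, line 53
        yield (frame, accommodation)              # combined video data signal, line 14
```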


Referring now back to FIG. 1, it will be realized from the foregoing that the control signal on the line 26 changes the projected light rays 22 by means of the optic 24 according to changes detected in the cameraman's eye 36 of FIG. 2 by the sensor 40. In other words, the signal on the line 42 from the sensor 40 is not only used to control the optic 48, but is encoded in the signal on the line 14 along with the image information and used also to control the optic 24. The nature of the change in the projected light rays 22 is manifested by the manner in which the light rays 27 are projected onto the screen 28. Examples are shown in FIGS. 3–5. If the eye 36 of FIG. 2 is detected by the sensor 40 viewing the scene 32 with a long focal distance, such as infinity, the optic control 44 causes the optic 48 to focus at a correspondingly long distance. The optic 48 focuses the scene 32 at infinity and projects the details of the scene with a wide field of view onto the image sensor 52. Consequently, the available sensor 52 pixels are spread over a relatively wide field of view. In other words, the granularity of the image is spread over a wide field of view. FIG. 3 shows the light rays 27 projected to form an image 54 that fills or almost fills the entire area or extent of the screen 28.


If the eye 36 of FIG. 2 is detected by the sensor 40 viewing the scene 32 with an intermediate focal distance, the optic control 44 causes the optic 48 to focus at a correspondingly intermediate distance. The optic 48 focuses the scene 32 at the intermediate distance and projects the details of the scene with an intermediate field of view onto the image sensor 52. Consequently, the granularity, i.e., the available sensor pixels, is spread over an intermediate field of view that is less in extent than the relatively wide field of view 54 of FIG. 3. FIG. 4 shows the light rays 27 projected to form an image 56 that only partially fills the entire extent of the screen 28.


If the eye 36 of FIG. 2 is detected by the sensor 40 viewing the scene 32 with a short focal distance, the optic control 44 causes the optic 48 to focus at a correspondingly short distance. The optic 48 focuses the scene 32 at that short distance and projects the details of the scene with a narrow field of view onto the image sensor 52. FIG. 5 shows the light rays 27 projected to form an image 58 that fills only a small part of the entire extent of the screen 28. Consequently, the available sensor pixels are spread over a relatively narrow field of view 58 that is less in extent than the intermediate field of view 56 of FIG. 4. Particular objects within the narrowed field of view 58 of FIG. 5 can therefore be viewed with greater granularity than the same objects in the intermediate field of view 56 of FIG. 4, and with greater granularity still than in the wide field of view 54 of FIG. 3.
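The effect described for FIGS. 3–5 can be made concrete with a rough calculation; the pixel count and field-of-view angles below are arbitrary assumptions, chosen only to show that a fixed number of sensor pixels spread over a narrower field of view yields finer angular granularity.

```python
PIXELS = 1920  # assumed horizontal pixel count of the image sensor 52

for label, fov_degrees in [("wide (FIG. 3)", 60.0),
                           ("intermediate (FIG. 4)", 30.0),
                           ("narrow (FIG. 5)", 10.0)]:
    degrees_per_pixel = fov_degrees / PIXELS  # scene angle covered by one pixel
    print(f"{label:22s}{degrees_per_pixel:.4f} deg/pixel")

# With these assumed numbers, the narrow field of view resolves the same
# objects with roughly six times finer angular granularity than the wide
# field of view, using the same total number of pixels.
```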


Referring back to FIG. 1, as explained above, the light rays 27 are projected onto the screen 28 with different fields-of-view, areas or extents 54, 56, 58, all of which have the same total number of pixels. The advantage of this approach is that with the aid of an optic 60, the field of view of the eye 12 of the viewer can be fully occupied with all of these pixels even though the accommodation of the eye 12 changes. The total number of pixels can be spread over the full used extent of the retina in all cases by a combination of changes in the focal length of the optic 60 and the accommodation of the eye 12. When the optic 48 of the camera 30 of FIG. 2 focuses in on a detail of the wider scene 32, it increases the granularity of the imaged scene in that area. At the same time, the optic 24 causes the size of the image to be reduced on the screen 28. In other words, when the focal length of the optic 48 is shortened to capture a narrowed field of view of the scene with increased magnification, the granularity of that smaller portion of the imaged scene increases as manifested in a smaller area on the screen 28. The focal length of the optic 60 is controlled by a control signal on a line 64 from the control 16 to allow the eye 12 to accommodate, i.e., to focus closer onto the scene with increased granularity in a smaller area of interest. In other words, at the same time that the control signal on the line 26 causes the optic 24 to reduce the extent to which the screen 28 is filled by imagery (see FIG. 4), the signal on the line 64 causes the optic 60 to reduce the field of view provided for the eye 12, e.g., by increasing its magnification. Thus, the optic 60 refracts the rays 29 to provide fourth light rays 65 in such a way that the eye 12 must change its accommodation so as to bring the image into focus. This causes the field of view of the eye 12 to be fully occupied with an up-close image.
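A minimal sketch of this coordination, assuming both control signals are derived from a single sensed accommodation distance; the particular mapping is hypothetical and is intended only to show the two optics moving together so that the eye's field of view stays filled by the same total number of pixels.

```python
def drive_display_optics(accommodation_m: float,
                         near_m: float = 0.25,
                         far_m: float = 100.0) -> tuple[float, float]:
    """Map a sensed accommodation distance (metres) to (a) the fraction of
    the screen 28 filled via the optic 24 (control signal on the line 26)
    and (b) the magnification of the viewing optic 60 (signal on the
    line 64). Near focus -> small on-screen image but high viewing
    magnification, so the eye 12 always sees a fully occupied field of view."""
    d = min(max(accommodation_m, near_m), far_m)   # clamp to a sensible range
    t = (d - near_m) / (far_m - near_m)            # 0 at nearest focus, 1 at farthest
    screen_fill = 0.1 + 0.9 * t                    # extent of the screen filled, line 26
    magnification = 1.0 / screen_fill              # viewing magnification, line 64
    return screen_fill, magnification
```

Under these assumed constants, drive_display_optics(0.5) yields a small screen fill with a correspondingly high magnification, while drive_display_optics(100.0) gives a full-screen image at unity magnification.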


If the cameraman's eye 36 changes to a long view of the scene 32, as explained above, the image 54 (FIG. 3) fills the screen 28 because the control signal on the line 26 causes the optic 24 to expand the extent to which the screen 28 is filled by imagery. At the same time, the signal on the line 64 causes the optic 60 to expand the field of view provided for the eye 12, e.g., by reducing its magnification or increasing its focal length. The eye 12 changes its accommodation accordingly. In other words, when the control signal on the line 26 causes the optic 24 to increase the extent to which the screen 28 is filled by imagery, as in FIG. 3, the signal on the line 64 causes the optic 60 to increase the field of view provided for the eye 12 even further, e.g., by decreasing its magnification even more.



FIG. 6 shows a variation of FIG. 1 in which, instead of sensing the cameraman's eye, the eye of the user is sensed. As shown, an eye sensor 70 senses an eye 72 of a user, for instance by sensing accommodation. A control signal having a magnitude indicative thereof is provided on a line 74 to a control 76, which provides it on a line 78 to a network 80. The network provides the control signal on a line 82 to a camera 84. An input/output device 86 is responsive to the control signal on the line 82 and provides it on a line 88 to an optic control 90, which controls a lens 92 by means of a signal on a line 94. In this way, the camera's magnification is controlled remotely by the user's eye 72 instead of locally by a cameraman's eye.
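One hedged way to model this control path is a simple network round trip; the socket transport and JSON message format below are assumptions, since the disclosure requires only that the control signal reach the camera 84 over the network 80.

```python
import json
import socket

def send_eye_control(accommodation: float, host: str, port: int) -> None:
    """User end: forward the value sensed by the eye sensor 70 (line 74)
    through the control 76 and the network 80 toward the camera 84."""
    message = json.dumps({"accommodation": accommodation}).encode()
    with socket.create_connection((host, port), timeout=1.0) as conn:
        conn.sendall(message)

def receive_eye_control(listen_port: int, optic_control) -> None:
    """Camera end: the input/output device 86 receives the value (line 88)
    and the optic control 90 adjusts the lens 92 accordingly (line 94)."""
    with socket.create_server(("", listen_port)) as server:
        conn, _ = server.accept()
        with conn:
            value = json.loads(conn.recv(1024))["accommodation"]
            optic_control.set_focal_length(value)
```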


At the same time, light rays 96 focused by the lens 92 and cast upon an image sensor 98 are converted to an electrical signal on a line 100 and provided to the input/output device 86 for transmission on the line 82 via the network and the line 78 to the control 76. An image signal is provided on a line 102 to a projector 20 for projecting light rays 22 to an optic 24 for projection as rays 27 onto a translucent screen 28. A backlit image formed on the screen 28 provides light rays 29 to an optic 60 that projects rays 118 to the eye 72 of the user. Besides controlling the lens 92, the signal on the line 74 is used by the control 76 to provide optic control signals on lines 120, 122 for controlling the optics 108, 116 in providing images with differing fields of view at different sizes as shown in FIGS. 3–5.


It should be realized that the sensed property of the user's eye need not be accommodation. For instance, eye direction could be sensed. Moreover, it should also be realized that the control signal need not be derived from a sensed eye at all. It could be provided, for instance, by a mouse or other control device employed by the user, by the cameraman, by a director, or by someone else to control magnification of the imagery. Likewise, the imagery need not be obtained from a camera, but could be generated from a computer or a drawing, painting, or hand-drawn animation. Although a real image is shown projected onto a screen, the optics 108, 116 could be arranged to present a virtual image to the eye of the user without any intermediate screen. It should also be realized that the imagery could be provided as stereoscopic images.

Claims
  • 1. Apparatus for collecting video images of a scene illuminated by a light source, comprising: an optic control, responsive to a sensed eye signal indicative of changes in a sensed eye, for providing a camera optic control signal; a camera optic, responsive to the camera optic control signal, for focusing on the scene at differing focal lengths according to the changes in the sensed eye so as to project light with details of said scene over a widest field of view when focusing at infinity and over correspondingly narrower fields of view when focusing at various distances closer than infinity; an image sensor with available pixels for sensing said scene, responsive to light projected from said camera optic with details of said scene over said widest field of view and over said correspondingly narrower fields of view spread over the available pixels, for providing a video image signal; and a combiner, responsive to said video image signal and to said sensed eye signal, for providing a video data signal with said video images collected by said apparatus formed for display with a largest size corresponding to said widest field of view and with decreasing sizes corresponding to increasingly narrower fields of view and with said sensed eye signal formed for control of said video images for said display such that particular objects in said scene are viewable with increasing granularity with decreasing focal length.
  • 2. The apparatus of claim 1, further comprising an eye sensor for sensing an eye of a user of the apparatus and for providing said sensed eye signal.
  • 3. The apparatus of claim 2, wherein said eye sensor is for sensing accommodation of said eye of said user.
  • 4. The apparatus of claim 2, wherein said eye sensor is for sensing said eye of said user while said user is viewing objects in said scene and while said apparatus is collecting images of said objects in said scene.
  • 5. The apparatus of claim 1, further comprising: a control, responsive to said video data signal, for providing an image information signal, a first control signal, and a second control signal; an image projector for projecting light in response to said image information signal; a first optic for controllably refracting said projected light in response to said first control signal, for providing light rays of varying extent; and a second optic for controllably refracting said light rays in response to a second control signal for providing light rays for forming said video images with increasingly smaller extent at correspondingly decreasing focal length.
  • 6. The apparatus of claim 5, wherein said video data signal is transmitted over a network.
  • 7. The apparatus of claim 5, wherein said eye sensor is for sensing an eye of a user viewing said video images.
  • 8. The apparatus of claim 7, wherein said video data signal is transmitted over a network.
  • 9. The apparatus of claim 7, wherein said eye sensor is for sensing accommodation of said eye of said user viewing said video images.
  • 10. The apparatus of claim 9, wherein said video data signal is transmitted over a network.
  • 11. The apparatus of claim 1, further comprising: a control, responsive to said video data signal, for providing an image information signal, a first control signal, and a second control signal; a projector, responsive to said image information signal, for providing first light rays; a first optic, responsive to the first light rays and to said first control signal, for providing second light rays; a screen, responsive to the second light rays, for providing third light rays indicative of images of varying size; and a second optic, responsive to the third light rays and to said second control signal, for providing fourth light rays as said video images for viewing.
  • 12. The apparatus of claim 11, wherein said video data signal is transmitted over a network.
  • 13. The apparatus of claim 11, wherein said eye sensor is for sensing an eye of a user viewing said video images.
  • 14. The apparatus of claim 13, wherein said eye sensor is for sensing accommodation of said eye of said user viewing said video images.
  • 15. The apparatus of claim 13, wherein said video data signal is transmitted over a network.
  • 16. The apparatus of claim 13, wherein said eye sensor is for sensing accommodation of said eye of said user viewing said video images.
  • 17. The apparatus of claim 16, wherein said video data signal is transmitted over a network.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 10/057,489, filed on Jan. 23, 2002, now U.S. Pat. No. 6,953,249, and claims priority from U.S. Provisional Application No. 60/264,812, filed Jan. 29, 2001.

US Referenced Citations (96)
Number Name Date Kind
2955156 Heilig Oct 1960 A
3507988 Holmes Apr 1970 A
3542457 Balding et al. Nov 1970 A
4028725 Lewis Jun 1977 A
4072971 Kuboshima Feb 1978 A
4189744 Stern Feb 1980 A
4199785 McCullough et al. Apr 1980 A
4513317 Ruoff, Jr. Apr 1985 A
4618231 Genco et al. Oct 1986 A
4672438 Plante et al. Jun 1987 A
4757380 Smets et al. Jul 1988 A
4828381 Shindo May 1989 A
5103306 Weiman et al. Apr 1992 A
5175616 Milgram et al. Dec 1992 A
5245371 Nagano et al. Sep 1993 A
5252950 Saunders et al. Oct 1993 A
5262807 Shindo Nov 1993 A
5291234 Shindo et al. Mar 1994 A
5296888 Yamada Mar 1994 A
5341181 Godard Aug 1994 A
5363241 Hegg et al. Nov 1994 A
5365370 Hudgins Nov 1994 A
5374820 Haaksman Dec 1994 A
5422653 Maguire, Jr. Jun 1995 A
5422700 Suda et al. Jun 1995 A
5448411 Morooka Sep 1995 A
5455654 Suzuki Oct 1995 A
5467104 Furness, III et al. Nov 1995 A
5499138 Iba Mar 1996 A
5515130 Tsukahara et al. May 1996 A
5526082 Hirano Jun 1996 A
5534918 Torii et al. Jul 1996 A
5543887 Akashi Aug 1996 A
5546224 Yokota Aug 1996 A
5563670 Tenmyo Oct 1996 A
5579165 Michel et al. Nov 1996 A
5583795 Smyth Dec 1996 A
5589908 Irie Dec 1996 A
5625765 Ellenby et al. Apr 1997 A
5635947 Iwamoto Jun 1997 A
5644324 Maguire, Jr. Jul 1997 A
5715384 Oshima et al. Feb 1998 A
5734421 Maguire, Jr. Mar 1998 A
5737012 Tabata et al. Apr 1998 A
5857121 Arai et al. Jan 1999 A
5909240 Hori Jun 1999 A
5973845 Hildebrand et al. Oct 1999 A
5978015 Ishibashi et al. Nov 1999 A
6061103 Okamura et al. May 2000 A
6078349 Molloy Jun 2000 A
6084557 Ishida et al. Jul 2000 A
6094182 Maguire, Jr. Jul 2000 A
6133941 Ono Oct 2000 A
6134048 Kato et al. Oct 2000 A
6177952 Tabata et al. Jan 2001 B1
6181371 Maguire, Jr. Jan 2001 B1
6215532 Takagi et al. Apr 2001 B1
6246382 Maguire, Jr. Jun 2001 B1
6246437 Kaneda Jun 2001 B1
6252989 Geisler et al. Jun 2001 B1
6271895 Takagi et al. Aug 2001 B1
6281862 Tidwell et al. Aug 2001 B1
6307589 Maquire, Jr. Oct 2001 B1
6359601 Maguire, Jr. Mar 2002 B1
6388707 Suda May 2002 B1
6393056 Talluri et al. May 2002 B1
6407724 Waldern et al. Jun 2002 B1
6414709 Palm et al. Jul 2002 B1
6417867 Hallberg Jul 2002 B1
6445365 Taniguchi et al. Sep 2002 B1
6449309 Tabata Sep 2002 B1
6469683 Suyama et al. Oct 2002 B1
6478425 Trajkovic et al. Nov 2002 B1
6504535 Edmark Jan 2003 B1
6507359 Muramoto et al. Jan 2003 B1
6512892 Montgomery et al. Jan 2003 B1
6546208 Costales Apr 2003 B1
6558050 Ishibashi May 2003 B1
6568809 Trajkovic et al. May 2003 B1
6580448 Stuttler Jun 2003 B1
6580563 Finney Jun 2003 B1
6603443 Hildebrand et al. Aug 2003 B1
6614426 Kakeya Sep 2003 B1
6714174 Suyama et al. Mar 2004 B1
6741250 Furlan et al. May 2004 B1
6762794 Ogino Jul 2004 B1
6778150 Maguire, Jr. Aug 2004 B1
6953249 Maguire, Jr. Oct 2005 B1
6972733 Maguire, Jr. Dec 2005 B1
20010043163 Waldern et al. Nov 2001 A1
20020064314 Comaniciu et al. May 2002 A1
20020109701 Deering Aug 2002 A1
20020158888 Kitsutaka Oct 2002 A1
20030197933 Sudo et al. Oct 2003 A1
20040108971 Waldern et al. Jun 2004 A1
20040150728 Ogino Aug 2004 A1
Foreign Referenced Citations (10)
Number Date Country
0 350 957 Jan 1990 EP
0 655 640 May 1995 EP
1 083 454 Mar 2001 EP
2 226 923 Jul 1990 GB
2 272 124 May 1994 GB
3-292093 Dec 1991 JP
WO 9930508 Jun 1999 WO
WO 0051345 Aug 2000 WO
WO 0195016 Dec 2001 WO
WO 03079272 Sep 2003 WO
Provisional Applications (1)
Number Date Country
60264812 Jan 2001 US
Continuations (1)
Number Date Country
Parent 10057489 Jan 2002 US
Child 11248898 US