1. Field of the Invention
The present invention relates to electronic image acquisition and, more particularly, to a visual communication system employing, inter alia, visual observation, data-capturing, data-transferring and/or data-reproducing features.
2. Description of the Related Art
Communication display surfaces, such as whiteboards, blackboards and flipcharts, are commonly used to convey visual information. The visual information conveyed on these display surfaces may take a number of different forms, including, without limitation, hand-written notes, images projected onto a display surface, and tangible items secured to a display surface, such as documents, adhesive notes and photographs. Depending on where the display surface is located or positioned, information may be displayed and then manipulated, removed, stored, and/or transferred from the display surface to accommodate several users. In the past, if the information on a display surface was to be saved, it was commonly transcribed by hand before removal. For these and other reasons, it is desirable to provide a means for conveniently capturing and using information and images associated with a display surface.
Summary of the Invention

A visual communication system that is suitable for use in connection with a display surface is disclosed. The system includes a camera or other data-capturing device that can be positioned remotely from the display surface. A control unit receives image data from the camera or data-capturing device and processes the image data to create an electronic image of the display surface. The invention takes the form of numerous embodiments associated with the aforementioned system and methods pertaining to the system.
Brief Description of the Drawings

Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, wherein:
one of the figures is a schematic that generally illustrates another embodiment of a camera.
Detailed Description

Referring now to the drawings, illustrative embodiments of the present invention are shown in detail. Although the drawings represent some embodiments of the invention, the drawings are not necessarily to scale and certain features may be exaggerated to better illustrate and explain the present invention. Further, the embodiments set forth herein are not intended to be exhaustive or otherwise limit or restrict the invention to the precise configurations shown in the drawings and disclosed in the following detailed description.
Referring to
While display surface 24 is shown in
The illustrated embodiments of the system 22 include a camera 25 or other visual data-capturing device. It is understood, however, that the system 22 is not necessarily limited to a single camera and may instead include a plurality of cameras that work independently or in combination. Camera 25 may be of either a scanning or non-scanning (e.g., instant-capture) character and can be deployed selectively with respect to display surface 24 and the associated displayed information. The camera 25 may comprise an integral unit or several components working in combination. Moreover, if desired for certain embodiments, the camera may also include a projection device or elements of a projection device and thus may both form (or partially form) and capture elements of an image upon and from a display surface 24. If desired, a light bulb may be included to provide surface illumination.
An embodiment of a camera 25 is shown schematically in
In a particular embodiment, camera 25 includes a scanning system 35 having a selectively operable drive unit 36, such as a stepper motor, and a sled 38 that may be moved under precision control by drive unit 36 through a drive mechanism 37 disposed between or about the drive unit 36 and sled 38 (for example, as shown in
While certain embodiments of a camera 25 are schematically shown in the corresponding figures, the camera should not be so limited. For example, the drive mechanism 37 illustrated in
In a particular embodiment, sled 38 is moveably mounted in housing 31 between a pair of stationary guide tracks 42, 44 for movement linearly and/or reciprocatingly behind lens 34. A stabilizer mechanism 45 may be provided to inhibit wobbling of sled 38 as it moves on guide tracks 42, 44. In the configuration shown in
In one mode of operation, camera 25 scans display surface 24, particularly the area within which information is displayed on display surface 24. In an embodiment, the optical-to-electronic imager 40 scans the display surface one line or row at a time in a vertical or horizontal manner, much like a document scanner or photocopier. As display surface 24 is scanned, the optical-to-electronic imager 40 (such as a CCD) converts an optical image of the display surface into an electronic data stream. In an alternative mode of operation, a fixed-area optical-to-electronic imager 40, such as a fixed-area CCD, can capture a two-dimensional image of the display surface 24 at once.
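By way of illustration only, the line-by-line scanning mode might be sketched as follows. This is a minimal sketch under assumed interfaces: the step_sled() and read_line() driver calls are hypothetical placeholders, since the disclosure does not specify the imager's software interface.

```python
# Minimal sketch: assemble a 2-D image of the display surface one scan line
# at a time; step_sled() and read_line() are hypothetical placeholders.
import numpy as np

def step_sled() -> None:
    """Advance the sled one scan position (stand-in for the drive unit)."""
    ...

def read_line(width: int) -> np.ndarray:
    """Read one row of pixel intensities from the imager (stand-in)."""
    return np.zeros(width, dtype=np.uint8)

def scan_surface(rows: int, width: int) -> np.ndarray:
    """Capture the display surface row by row, like a document scanner."""
    image = np.empty((rows, width), dtype=np.uint8)
    for r in range(rows):
        image[r, :] = read_line(width)  # convert one optical line to data
        step_sled()                     # move the imager to the next line
    return image
```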
Referring again to
In the illustrative embodiment shown in
In an embodiment, system 22 also includes at least one control unit 54 in communication with camera 25 and at least one user interface 56 also in communication with the control unit 54. While one or more elements of a control unit 54 may be conveniently located within outrigger arm 50, as shown in
Referring to an embodiment of a configuration shown in
In a particular embodiment of the invention, storage media interface 66 may include a floppy diskette drive and/or CD-R drive provided within user interface 56. The drive can write image files, for example, to a standard IBM-compatible PC 3.5″ formatted HD diskette and/or a writable CD-ROM. The number of images that may be stored on the diskette or CD-ROM varies depending on the size of the image file. While storage media interface 66 is shown in
While control unit 54 may communicate with camera 25 and peripheral devices using suitable communication cables, it is also possible for control unit 54 to communicate with other devices, including camera 25, wirelessly using Bluetooth, Wi-Fi, or another suitable wireless communication protocol. A network interface controller may also be included in communication interface 62 to connect imaging system 22 to an appropriate network, such as a local area network. Control unit 54 may also be configured as a web server, allowing a predetermined number of images stored in memory 60 to be accessed from anywhere on the network using a standard web browser or other suitable web client application. When so configured, control unit 54 may support fixed IP addressing or the DHCP protocol. Access to the network may also be provided by a suitable communications cable or wirelessly using a wireless communication device.
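By way of illustration only, the web-server configuration might resemble the following minimal sketch, which serves previously stored image files over HTTP using Python's standard library; the "images" directory and port 8080 are assumptions, not part of the disclosure.

```python
# Minimal sketch: expose stored captures to any standard web browser on the
# network; the directory name and port are illustrative assumptions.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = partial(SimpleHTTPRequestHandler, directory="images")
HTTPServer(("", 8080), handler).serve_forever()
```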
In response to a user's request to acquire an image of display surface 24, raw image data may be acquired by camera 25 and conveyed to control unit 54, particularly a controller or CPU 58. The controller or CPU 58 transforms or converts the raw image data into electronic images of display surface 24 in a desired file format, such as a TIFF, JPEG, GIF, or PNG file. Depending on the user input provided, controller or CPU 58 may transmit the electronic image file to one or more peripheral devices, send the electronic image file over the communications network, and/or store it in an internal memory for later retrieval (e.g., through a computer network such as the Web).
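The conversion step might be sketched as follows, assuming the raw data arrives as an 8-bit grayscale array; Pillow's encoder is used here only as a stand-in for whatever encoding the control unit actually performs.

```python
# Minimal sketch: convert raw image data into a standard file format; the
# grayscale assumption and the Pillow encoder are illustrative stand-ins.
import numpy as np
from PIL import Image

def encode_capture(raw: np.ndarray, path: str, fmt: str = "PNG") -> None:
    """Write a raw 2-D intensity array as a TIFF/JPEG/GIF/PNG file."""
    Image.fromarray(raw).save(path, format=fmt)

encode_capture(np.zeros((480, 640), dtype=np.uint8), "capture.png")
```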
In a particular embodiment of the invention, user interface 56 is configured to receive a single user input to initiate imaging of display surface 24 and distribution of an electronic image. In a certain configuration, user interface 56 includes at least one action button 68 that may be selectively actuated by a user to acquire an image and to distribute the image in a predetermined manner. The term “action button,” as used herein, generically describes an input device that receives a user command and electronically transfers the user command to control unit 54. Suitable action buttons include, but are not limited to, touch sensors, electromechanical switches and proximity switches. The action button may be accompanied by a visual indicator, such as an LED, which provides the user with a visual indication that an action button has been activated. Action buttons 68 conveniently allow a user to acquire and distribute an image of display surface 24 in a single step, rather than a series of steps. In addition to or in place of one or more action buttons, the system 22 may include voice recognition software and may receive instructions, such as action-button-type instructions, provided and effectuated by a user's voice command.
In the embodiment shown in
In a similar configuration, a second action button can, for instance, initiate acquisition of the image data and distribution of the image to a storage media interface device when activated by a user. For example, by activation of the second action button, the acquired image file may be sent for recording on a floppy diskette or other digital storage medium contained in storage media interface 66.
In still another configuration, a third action button can initiate acquisition of the image data and distribution of the image to a memory when activated by a user. For example, by activation of the third action button, the acquired image file can be transferred to a network server for later distribution to a local or wide area communications network or the Internet. Alternatively, the acquired image file may be immediately transferred across a local or wide area network to a computer server, where it can be distributed across a wider network or the Internet. Moreover, for some embodiments of the invention, the web server may be built into the system, for example, as part of the control unit or camera.
The system is not limited to the action buttons previously described and may include a plurality of action buttons. Moreover, the system can also provide a great deal of flexibility in that each of the action buttons may, if desired, be programmable to perform a specific series of instructions that are preset by a user for single-action actuation. In this way, a limited number of action buttons can be set or reset to perform a series of commonly anticipated operations, yet with the convenience of a single actuation. For example, depressing a “printer” button three times and the “save” (to diskette) button once could be configured to produce three copies of the captured material on a printer and one copy of the image on a diskette.
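By way of illustration only, such single-action dispatch might be sketched as follows; the print_copies(), save_to_diskette() and upload_to_server() routines are hypothetical placeholders, since the disclosure does not specify the control unit's software interface.

```python
# Minimal sketch: each action button is preset to one distribution routine,
# so pressing "print" three times and "save" once yields three printed
# copies and one diskette copy; all three routines are placeholders.
from typing import Callable, Dict

def print_copies(image: bytes) -> None:      # stand-in peripheral call
    ...

def save_to_diskette(image: bytes) -> None:  # stand-in storage call
    ...

def upload_to_server(image: bytes) -> None:  # stand-in network call
    ...

ACTIONS: Dict[str, Callable[[bytes], None]] = {
    "print": print_copies,
    "save": save_to_diskette,
    "network": upload_to_server,
}

def on_button_presses(presses: Dict[str, int], image_file: bytes) -> None:
    """Execute each preset action once per button press."""
    for button_id, count in presses.items():
        for _ in range(count):
            ACTIONS[button_id](image_file)

on_button_presses({"print": 3, "save": 1}, b"...image bytes...")
```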
Optionally, system 22 may also include an information communications device 70, such as a loudspeaker or visual display screen, appropriately communicating with control unit 54 to provide information concerning operation of system 22. In a particular embodiment, information communications device 70 is a loudspeaker that audibly announces, by way of confirmation, that a specific activity requested by the user is in fact being performed, or has been performed. For example, when a user requests a printed image of display surface 24, the information communications device 70 may announce “One Copy.” Likewise, when the image has been photographed, the information communications device 70 (e.g., a speaker) may announce “Captured.” Similarly, when the image is printed, the communications device may announce “Copy Ready.” Alternatively or in combination with an audible message, operational information may be communicated using an LCD screen, for example. While an illustrative embodiment of information communications device 70 is shown in
System 22 can also include a means of calibrating the system to capture and/or process the image data and modify or “correct” the processed image to enhance, among other things, the quality and/or size of the image. Also, when system 22 is installed or positioned adjacent to display surface 24, the system may be calibrated to determine an appropriate boundary of the display surface. The enablement of such calibration, and other related adjustment considerations, will be further appreciated with reference to
In an embodiment, command codes and other information can be uploaded into control unit 54 using calibration images that may be written, adhered or projected onto display surface 24.
In one possible arrangement, one or more boundary and/or system calibration images 72, 74 are contained on sheets (e.g., sheets of paper) and are adhered at one or more designated or predetermined locations on display surface 24. For example, as shown in
In certain embodiments of the system, one or more system calibration images 74 (see, e.g., the image illustrated in
A system calibration image 74 may be used to transmit an encoded command to the camera. In a number of embodiments, the system will automatically, or otherwise when activated or instructed, search for a system calibration image in a preset or anticipated location. If such an image is present, the system will capture and use the information obtained from the system calibration image to configure the system or its operation. If no such system calibration image is located, the system will typically employ preset or predetermined default system parameters.
For example, without limitation, a user might initiate a function, such as pressing a “print” button. The system 22 will then search for calibration sheets. If a calibration sheet is found, for instance in an anticipated location, the system will print an image of the display surface in accordance with the instructions derived from the calibration images, typically excluding the calibration sheets themselves from the printout. If no calibration sheets are detected, the system may simply execute a default instruction, such as printing the entire display surface without discrimination.
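This lookup flow might be sketched as follows; the find_calibration_sheet(), decode_instructions() and print_region() helpers are hypothetical placeholders, since the disclosure does not define these interfaces or a payload format.

```python
# Minimal sketch: use calibration-sheet instructions when a sheet is found
# in the anticipated location, otherwise fall back to the default behavior.
FULL_SURFACE = None  # sentinel meaning "the entire display surface"

def find_calibration_sheet(frame):
    """Look for a calibration sheet in the preset location (placeholder)."""
    return None

def decode_instructions(sheet) -> dict:
    """Decode the sheet's encoded settings (placeholder)."""
    return {}

def print_region(frame, region, exclude=None) -> None:
    """Send the selected region to the printer (placeholder)."""
    ...

def handle_print(frame):
    sheet = find_calibration_sheet(frame)
    if sheet is not None:
        params = decode_instructions(sheet)           # use encoded settings
        region = params.get("boundary", FULL_SURFACE)
        print_region(frame, region, exclude=sheet)    # omit the sheet itself
    else:
        print_region(frame, FULL_SURFACE)             # default instruction
```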
In the illustrated embodiment of
In the absence of a formal calibration of system 22, such as the use of one or more calibration images, control unit 54 may include a means of self-calibrating the system 22. In a representative embodiment, control unit 54 may self-calibrate at power-up of the system or may be configured to perform a calibration periodically. Essentially, the self-calibration technique operates on the principle of a display surface having a display area encompassed by a border. The camera scans the display surface and discovers the boundaries of the display area by detecting a change in contrast between the colors of the display area and those of the border. An exemplary method for self-calibrating system 22 includes, with reference to the flow diagram of
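The contrast-based boundary search might be sketched as follows, assuming the scan is available as an 8-bit grayscale array and that the display area is lighter than its border; the fixed threshold is an assumption, not from the disclosure.

```python
# Minimal sketch: find the display area as the bounding box of pixels that
# contrast with a darker border; the threshold of 128 is an assumption.
import numpy as np

def find_display_bounds(scan: np.ndarray, thresh: int = 128):
    """Return (top, bottom, left, right) of the bright display area."""
    bright = scan > thresh
    rows = np.where(bright.any(axis=1))[0]
    cols = np.where(bright.any(axis=0))[0]
    if rows.size == 0 or cols.size == 0:
        return None  # no display area detected; keep default calibration
    return rows[0], rows[-1], cols[0], cols[-1]
```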
Not all communication display surfaces have a white surface; some have a more traditional “blackboard” or colored surface. To account for various display surface colors, one or more calibration images may be encoded with imaging data that identifies the type of display surface being used. For example,
The encoded information displayable on calibration image 74 is not necessarily limited to the properties of display surface 24. Encoded information can be contained in calibration images that, when scanned by camera 25 and processed by control unit 54, perform specified tasks relative to operation of system 22. Such encoded information may include, for example, an IP address of a printer or WEP encryption settings for a wireless communication device in control unit 54. The encoded information may also include, among other things, a software upgrade relative to operation of system 22 and specific control/adjustment information, including information for enhancing the quality of the image, such as by correcting keystone and camera lens distortion. The information may also place elements of the system in a factory service state.
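As an illustration only, applying settings decoded from a calibration image might look like the following; the "key=value;key=value" payload format and the example keys are assumptions, since the disclosure says only that the image carries encoded commands such as a printer IP address or WEP settings.

```python
# Minimal sketch: turn a decoded command string into a settings dictionary;
# the payload format and keys are illustrative assumptions.
def parse_payload(payload: str) -> dict:
    return dict(item.split("=", 1) for item in payload.split(";") if "=" in item)

settings = parse_payload("printer_ip=192.168.0.42;wep_key=ABCDEF0123")
print(settings)  # {'printer_ip': '192.168.0.42', 'wep_key': 'ABCDEF0123'}
```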
In a setting where there are multiple display surfaces 24 and a corresponding visual communication system 22 assigned to each display surface, each system 22 may be assigned an individual, computer-readable “network” address. The addresses allow, among other things, a remote network user to determine the location and operating status of each system 22 included in the network. Such address-based discovery requests can be made, for example, for the purpose of scheduling time on the corresponding display surfaces 24 and the like.
This feature will be appreciated with reference to
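By way of illustration only, such discovery might be sketched as a simple UDP broadcast exchange; the port, message format, and broadcast transport are assumptions, since the disclosure specifies only that each system has a network address and can report its location and operating status.

```python
# Minimal sketch: broadcast a discovery request and collect replies from any
# listening systems; port and message format are illustrative assumptions.
import socket

def discover(port: int = 50000, timeout: float = 2.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    sock.sendto(b"DISCOVER", ("255.255.255.255", port))
    replies = []
    try:
        while True:
            data, addr = sock.recvfrom(1024)  # (status payload, address)
            replies.append((addr, data))
    except socket.timeout:
        pass
    return replies
```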
It will be appreciated that system 22 can be easily installed and used with existing display surfaces to capture and convey various forms of visual information. System 22 can be self-contained and does not require any external electronics, such as a personal computer, to operate. Setup of system 22 is simple, and most common functions require virtually no training to operate. The scanning feature of camera 25 produces superior clarity, even in the far corners of the image, and is easily configurable to accommodate various communication display surfaces.
The present invention has been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes for carrying out the invention. It should be understood by those skilled in the art that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention without departing from the spirit and scope of the invention as defined in the following claims. It is intended that the following claims define the scope of the invention and that the method and apparatus within the scope of these claims and their equivalents be covered thereby. This description of the invention should be understood to include all novel and non-obvious combinations of elements described herein, and claims may be presented in this or a later application to any novel and non-obvious combination of these elements. Moreover, the foregoing embodiments are illustrative, and no single feature or element is essential to all possible combinations that may be claimed in this or a later application.
This application is a Divisional application of U.S. Ser. No. 10/449,336, with a filing date of May 30, 2003, which application is hereby incorporated by reference in its entirety. This application claims priority to U.S. provisional patent applications 60/385,059, 60/385,114, 60/385,062, 60/385,060 and 60/385,061, each individually filed on Jun. 2, 2002, which are incorporated herein by reference in their entirety.