This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2007-224349 filed Aug. 30, 2007.
1. Technical Field
The present invention relates to an information processing device and a remote communicating system, and more particularly, to an information processing device that is connected to an overview image capturing device that captures an overview image of an object and an enlarged image capturing device that captures an enlarged image of the object, and is also connected to a remote control device that remote-controls at least the enlarged image capturing device, and a remote communicating system that includes the information processing device.
2. Related Art
There has been known a remote diagnosis system that includes a server (a computer, for example) connected to a video camera and a projector, and a client (a computer, for example) located at a remote location and connected to the server via a network. With the remote diagnosis system, a diagnosis object existing on the server side is diagnosed from the client side.
According to an aspect of the invention, there is provided an information processing device that is connected to an overview image capturing device that captures an overview image of an object and an enlarged image capturing device that captures an enlarged image of the object, and is connected to a remote control device that remote-controls at least the enlarged image capturing device. This information processing device includes: an encoding unit that encodes the overview image and the enlarged image; a transmitting unit that transmits the images encoded by the encoding unit to the remote control device; and a switching unit that switches an encoding method to be utilized by the encoding unit and a transmission method to be utilized by the transmitting unit among a first mode, a second mode, and a third mode. When there is not a continuous change in the overview image captured by the overview image capturing device, the overview image is transmitted in the first mode. When there is a continuous change in the overview image, the overview image is transmitted in the second mode. The enlarged image captured by the enlarged image capturing device is transmitted in the third mode.
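As a rough illustration only, the switching among the three modes can be modeled as a small lookup table plus a selection rule. The Python sketch below is not part of the claimed device; the mode names and the select_mode function are illustrative stand-ins for the encoding unit, transmitting unit, and switching unit described above, with the codec/protocol pairs taken from the exemplary embodiment described later.

```python
from dataclasses import dataclass
from enum import Enum, auto


class TransmissionMode(Enum):
    FIRST = auto()   # still overview image
    SECOND = auto()  # moving overview image
    THIRD = auto()   # enlarged image


@dataclass(frozen=True)
class ModeSettings:
    codec: str      # encoding method used by the encoding unit
    protocol: str   # transmission method used by the transmitting unit


# Codec/protocol pairs as used in the exemplary embodiment described later.
MODE_TABLE = {
    TransmissionMode.FIRST: ModeSettings("JPEG (low compression rate)", "TCP"),
    TransmissionMode.SECOND: ModeSettings("MPEG2", "UDP"),
    TransmissionMode.THIRD: ModeSettings("H.264", "TCP"),
}


def select_mode(is_enlarged_image: bool, overview_has_movement: bool) -> TransmissionMode:
    """Choose the transmission mode for the next image to be transmitted."""
    if is_enlarged_image:
        return TransmissionMode.THIRD
    return TransmissionMode.SECOND if overview_has_movement else TransmissionMode.FIRST
```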
An exemplary embodiment of the present invention will be described in detail below, based on the attached figures.
Referring now to the drawings, a remote diagnosis system 100 in accordance with an exemplary embodiment of the present invention is described. The remote diagnosis system 100 includes a server 1 (an information processing device) and clients 2 and 2′ (remote control devices) that are located at places remote from the server 1.
The server 1 and the client 2 are connected to an intranet 23. The client 2′ is connected to an intranet 23′. The intranet 23 is connected to the Internet 3 via a firewall 25, and the intranet 23′ is connected to the Internet 3 via a firewall 25′.
A projector 4 (a projecting device), a video camera 5 (an overview image capturing device), and an enlarging camera 6 (an enlarged image capturing device) are connected to the server 1.
Based on a control command from the server 1, the projector 4 emits light beams or projects an annotation image or the like onto a diagnosis object 7 through a half mirror 8. An annotation image is an image of any type, such as a line, a character, a symbol, a figure, a color, or a font.
The video camera 5 captures a reflected image of the diagnosis object 7 through the half mirror 8, and outputs the captured image (an overview image) to the server 1. The enlarging camera 6 is a video camera having the panning/tilting/zooming function that can capture an enlarged partial image of the diagnosis object 7, and outputs the captured image (an enlarged image) to the server 1.
A display 21 and an input interface 24 such as a mouse are connected to the client 2. The display 21 displays an overview image and an enlarged image in windows 22a and 22b that are separate from each other. A display 21′ and an input interface 24′ are connected to the client 2′. The display 21′ displays the overview image and the enlarged image, which are the same as the images displayed on the display 21, in windows 22a and 22b that are separate from each other. The client 2 (2′) may be formed with a personal computer integrated with the display 21 (21′).
Buttons such as a pen button, a text button, an erase button, and a zoom button, and icons representing line types and color types are displayed in each of the windows 22a. An image captured by the video camera 5 (an overview image) is displayed in a display area 23a in the window 22a.
In each window 22a with the above arrangement, the pen button is clicked with the input interface 24 (or 24′) connected to the client 2 (or 2′), so as to draw a figure or the like on the diagnosis object 7 through the movement of the mouse pointer. The information about the figure (or more accurately, the coordinates (x, y) representing the figure in the display area 23a) is then output from the client 2 to the server 1. The server 1 converts the information about the figure into the information about the coordinates in the projector 4, and outputs the converted information to the projector 4. Based on the converted information about the figure, the projector 4 projects the figure onto the diagnosis object 7. Since the captured image is displayed in the display area 23a, the coordinates (x, y) in the captured image match the coordinates (x, y) in the display area 23a.
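Where the display area 23a and the projector raster are simply scaled versions of each other, the coordinate conversion mentioned above reduces to a proportional mapping. The sketch below assumes exactly that and ignores any correction for the half mirror 8, lens distortion, or camera/projector calibration, all of which a real system might need.

```python
def display_to_projector(x: int, y: int,
                         display_size: tuple[int, int],
                         projector_size: tuple[int, int]) -> tuple[int, int]:
    """Map a point (x, y) in the display area 23a onto projector coordinates.

    display_size and projector_size are (width, height) in pixels.
    A plain proportional mapping is assumed here for illustration.
    """
    dw, dh = display_size
    pw, ph = projector_size
    return round(x * pw / dw), round(y * ph / dh)


# Example: a point drawn at (320, 240) in a 640x480 display area maps to
# (512, 384) on a 1024x768 projector raster.
print(display_to_projector(320, 240, (640, 480), (1024, 768)))
```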
In each window 22a, the zoom button is clicked with the input interface 24 (24′) connected to the client 2 (2′), so as to designate a part of the diagnosis object 7 (for example, the part surrounded by the dotted lines in the drawings) to be enlarged.
Referring now to the functional structure of the server 1, the server 1 includes an image input unit 41, a movement detecting unit 42, an image processor 43, an image output unit 44, a communication controller 45, a controller 46, and an operation performing unit 47.
The image input unit 41 converts image signals that are input from the video camera 5 and the enlarging camera 6 into digital data. The movement detecting unit 42 determines whether there is a continuous change (movement) in an image that is input through the image input unit 41, and notifies the controller 46 of the determination result.
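The embodiment does not prescribe how the movement detecting unit 42 decides that an image has a continuous change; a common approach, assumed here purely for illustration, is to difference consecutive frames and count the pixels whose change exceeds a small threshold. Both thresholds in the sketch are arbitrary example values.

```python
import numpy as np


def has_movement(prev_frame: np.ndarray,
                 curr_frame: np.ndarray,
                 pixel_delta: int = 16,
                 changed_pixels: int = 1_000) -> bool:
    """Report movement when enough pixels differ between consecutive frames.

    prev_frame / curr_frame: 8-bit grayscale frames of identical shape.
    pixel_delta: per-pixel difference counted as a change (example value).
    changed_pixels: number of changed pixels regarded as movement (example value).
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return int(np.count_nonzero(diff > pixel_delta)) >= changed_pixels
```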
Under the control of the controller 46, the image processor 43 processes (compresses) an image that is input through the image input unit 41. More specifically, the image processor 43 has an encoding switching unit 61 that switches the encoding (compression) method to be used, among JPEG (with a low compression rate), MPEG2, and H.264 in this exemplary embodiment, in accordance with the image transmission mode set by the controller 46.
Referring back to the server 1, the image output unit 44 outputs an image such as an annotation image to the projector 4, based on instruction information that is input to the operation performing unit 47.
The communication controller 45 transmits the image data processed by the image processor 43 to the client 2 or 2′, and receives information about user operations from the client 2 or 2′. The operation performing unit 47 performs processing in accordance with the received operation information, and the controller 46 controls the respective units of the server 1.
Referring now to the functional structure of the client 2 (2′), the client 2 (2′) includes an image display 51, an image processor 52, an operation input unit 53, a communication controller 54, and an operation performing unit 55.
The image display 51 displays an image that is transmitted from the server 1 on the display 21 (21′). The image processor 52 processes the image transmitted from the server 1 into display image data, in accordance with an encoding method and a transmission method (a transmission protocol). The operation input unit 53 receives the information about an operation that is input through the input interface 24 (24′) by a user. The operation input unit 53 then notifies the communication controller 54 of the operation information. The communication controller 54 receives the image data that is transmitted from the server 1, and transmits the information about the user operation transmitted from the operation input unit 53 to the server 1 (to the operation performing unit 47 described above).
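On the client side, receiving a still image that the server 1 has transmitted as JPEG data over TCP (as in the first mode described later) can be as simple as reading one JPEG frame from the socket and decoding it for the image display 51. The 4-byte length prefix used for framing in the following sketch is an assumption of this illustration, not something defined by the embodiment.

```python
import io
import socket
import struct

from PIL import Image  # Pillow


def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed while receiving image data")
        buf.extend(chunk)
    return bytes(buf)


def receive_still_image(sock: socket.socket) -> Image.Image:
    """Receive one length-prefixed JPEG frame and decode it for display."""
    (length,) = struct.unpack("!I", recv_exact(sock, 4))
    jpeg_bytes = recv_exact(sock, length)
    return Image.open(io.BytesIO(jpeg_bytes))
```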
Referring now to the operations of the remote diagnosis system 100, image transmission from the server 1 to the client 2 or 2′ is described below, separately for overview images and enlarged images.
Referring first to the flowchart for overview image transmission, the operation to be performed when an overview image captured by the video camera 5 is transmitted from the server 1 to the client 2 or 2′ is described.
In step S10, the operation for transmitting an overview image captured by the video camera 5 is started. In step S14, the controller 46 determines whether the overview image (overview image data) has already been transmitted to the client 2 or 2′.
If the determination result of step S14 is negative, the operation moves on to step S16, and the controller 46 sets the image transmission mode to the “first mode” (selecting JPEG (with a low compression rate) as the compression method of the image processor 43, and selecting TCP as the communication protocol of the communication controller 45). The communication controller 45 then transmits the image data compressed by JPEG (with a low compression rate) to the client 2 or 2′ by TCP, and the operation moves on to step S18. Upon receipt of the image data through the communication controller 54, the client 2 or 2′ sends the image data to the image processor 52. The image data is decoded by the image processor 52, and is then sent to the image display 51. The image display 51 displays the decoded overview image in the display area 23a in the window 22a of the display 21 (21′).
If the determination result of step S14 is positive (or in a case where overview image (data) has already been sent), step S16 is skipped, and the operation moves on to step S18.
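A minimal server-side counterpart of step S16 could look like the sketch below: the current overview frame is compressed as a high-quality JPEG and written to an already-connected TCP socket. The quality value standing in for the "low compression rate", and the 4-byte length prefix, are assumptions of this sketch.

```python
import io
import socket
import struct

from PIL import Image  # Pillow


def send_overview_still(sock: socket.socket, frame: Image.Image, quality: int = 90) -> None:
    """First mode: transmit one overview frame as a high-quality JPEG over TCP.

    quality=90 is an example value standing in for the "low compression rate".
    """
    buf = io.BytesIO()
    frame.save(buf, format="JPEG", quality=quality)
    payload = buf.getvalue()
    sock.sendall(struct.pack("!I", len(payload)) + payload)
```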
In step S18, the movement detecting unit 42 determines whether the overview image has movement. When the overview image has movement (when there is a continuous change in a predetermined number or more of pixels, for example), the determination result of step S18 becomes positive, and the operation moves on to step S20. In step S20, the image transmission mode is changed to the “second mode” (switching the compression method of the image processor 43 to MPEG2, and switching the communication protocol of the communication controller 45 to UDP).
In step S22, the communication controller 45 transmits the image data obtained by the image processor 43 compressing the overview image by MPEG2, to the client 2 or 2′ by UDP. After that, image data transmission is continued (step S24) until the movement ends. When the movement ends, the operation moves on to step S26. Upon receipt of the image data through the communication controller 54, the client 2 or 2′ sends the overview image data to the image processor 52. The image data is decoded by the image processor 52, and is sent to the image display 51. The image display 51 displays the decoded overview image (a moving image) in the display area 23a in the window 22a of the display 21 (21′).
In step S26, the image transmission mode is changed back to the first mode (JPEG (with a low compression rate) and TCP), and the operation moves on to step S28. In step S28, the overview image observed when the movement ends (the image data processed by the image processor 43) is transmitted to the client 2 or 2′ through the communication controller 45 in the first mode, and the operation for overview image transmission comes to an end.
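Taken together, steps S10 through S28 behave like a small state machine: the overview image is sent once in the first mode, the second mode is used while movement continues, and the first mode is restored (with one final still frame) when movement ends. The sketch below captures only that control flow; the encoder and sender callables are stand-ins, not an actual JPEG or MPEG2 implementation.

```python
from typing import Any, Callable, Iterable


def transmit_overview(frames: Iterable[Any],
                      detect_movement: Callable[[Any], bool],
                      send_first_mode: Callable[[Any], None],
                      send_second_mode: Callable[[Any], None]) -> None:
    """Illustrative control flow of steps S10-S28 for a run of overview frames."""
    sent_once = False       # step S14: has the overview image been sent already?
    in_second_mode = False
    last_frame = None
    for frame in frames:
        last_frame = frame
        if not sent_once:
            send_first_mode(frame)      # step S16: initial still image
            sent_once = True
            continue
        if detect_movement(frame):
            in_second_mode = True       # step S20: switch to the second mode
            send_second_mode(frame)     # steps S22/S24: stream while moving
        elif in_second_mode:
            in_second_mode = False      # step S26: back to the first mode
            send_first_mode(frame)      # step S28: still image after movement ends
    if in_second_mode and last_frame is not None:
        send_first_mode(last_frame)     # movement was still ongoing when frames ran out
```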
Referring now to the flowchart for enlarged image transmission, the operation to be performed when an enlarged image captured by the enlarging camera 6 is transmitted from the server 1 to the client 2 or 2′ is described.
In step S30, when the user of the client 2 (2′) starts using the enlarging camera 6 with the input interface 24 (24′), the controller 46 sets the image transmission mode to the “third mode” (selecting H.264 as the compression method of the image processor 43, and selecting TCP as the communication protocol of the communication controller 45).
In step S34, the communication controller 45 transmits the image data obtained by the image processor 43 compressing an enlarged image by H.264 to the communication controller 54 of the client 2 or 2′. Upon receipt of the enlarged image data through the communication controller 54, the client 2 or 2′ sends the enlarged image data to the image processor 52. The image data is decoded by the image processor 52, and is sent to the image display 51. The image display 51 displays the decoded enlarged image in the display area 23a in the window 22a of the display 21 (21′).
In step S36, the controller 46 determines whether the use of the enlarging camera 6 has been cancelled, that is, whether the user of the client 2 (2′) has pressed the use end button for the enlarging camera 6 with the input interface 24 (24′). The communication controller 45 continues enlarged image data transmission until the determination result of step S36 becomes positive. When the determination result becomes positive, the communication controller 45 stops the enlarged image data transmission, and the operation moves on to step S38. In step S38, the image transmission mode is changed to the “first mode”.
In step S40, whether there is a change in the overview image while the enlarging camera 6 is being used is determined. If the determination result is positive, the overview image (overview image data) is transmitted in the first mode to the client 2 or 2′ in step S42, and the operation for enlarged image transmission comes to an end.
If the determination result of step S40 is negative, step S42 is skipped, and the operation for enlarged image transmission comes to an end.
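The enlarged-image path of steps S30 through S42 can be sketched the same way: stream enlarged frames in the third mode while the enlarging camera 6 is in use, and on cancellation fall back to the first mode, retransmitting the overview image if it changed in the meantime. The callables below are placeholders, not an actual H.264 encoder or network layer.

```python
from typing import Any, Callable, Iterable


def transmit_enlarged(enlarged_frames: Iterable[Any],
                      usage_cancelled: Callable[[], bool],
                      send_third_mode: Callable[[Any], None],
                      overview_changed: Callable[[], bool],
                      send_overview_first_mode: Callable[[], None]) -> None:
    """Illustrative control flow of steps S30-S42 while the enlarging camera is used."""
    for frame in enlarged_frames:
        send_third_mode(frame)          # steps S32/S34: H.264 over TCP
        if usage_cancelled():           # step S36: use end button pressed
            break
    # Step S38: back to the first mode; steps S40/S42: catch up on the overview
    # image if it changed while the enlarging camera was being used.
    if overview_changed():
        send_overview_first_mode()
```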
As described so far in detail, in the remote diagnosis system of this exemplary embodiment, image transmission is performed by an encoding method and a transmission method that are selected from the first mode in which an overview image is transmitted when the overview image does not have movement (a continuous change), the second mode in which an overview image is transmitted when the overview image has movement, and the third mode in which an enlarged image captured by the enlarging camera 6 is transmitted. In this manner, an encoding method and a transmission method that are suitable for each image can be selected, and image transmission can be performed in accordance with the needs of the user and the communication speed between the server and the client. Especially, in a case where an overview image has movement, there is a high probability that the diagnosis object 7 is moving or the video camera 5 is performing a panning/tilting/zooming operation in this exemplary embodiment. In such a case, the overview image is highly likely an image that is important to the user observing overview images, and smooth, real-time transmission should be given priority. In a case where an overview image does not have movement, on the other hand, real-time transmission is relatively less important, and image quality and reliable delivery should be given priority. With those facts being taken into consideration, a suitable encoding method and transmission method should be selected (MPEG2 and UDP should be selected in the former case, and JPEG (with a low compression rate) and TCP should be selected in the latter case, for example). Thus, appropriate image transmission can be performed.
In this exemplary embodiment, the encoding method of the third mode is an encoding method (H.264) by which an image having higher overall image quality than the image quality achieved by the encoding method (MPEG2) of the second mode can be obtained, and the transmission method (transmission protocol) of the third mode is a transmission method (TCP) with higher reliability than the transmission method (UDP) of the second mode. Accordingly, an enlarged image that is highly likely an important image to the user observing images (the user diagnosing the diagnosis object 7) can be effectively transmitted.
In the above described exemplary embodiment, an operation to transmit only a part (a selected spot) of an overview image designated by the user with high image quality may also be performed, as described below.
More specifically, in the client 2 (2′), the user draws a figure surrounding a part of an overview image with the use of the input interface 24 (24′). When the user issues an instruction to enlarge and transmit the part (the selected spot) surrounded by the figure, the instruction information that is input through the operation input unit 53 and the operation performing unit 55 of the client 2 (2′) is transmitted from the communication controller 54 to the operation performing unit 47 of the server 1. Therefore, in step S44, the server 1 receives the instruction information at the operation performing unit 47.
In step S46, based on the instruction information input to the operation performing unit 47, the image output unit 44 projects a figure (corresponding to the selected spot) on the diagnosis object 7 through the projector 4.
In step S48, the image processor 43 recognizes and extracts the selected spot from an image that is input to the image input unit 41.
In step S50, the controller 46 sets the image transmission mode to the “first mode”. In step S52, the image processor 43 compresses only the extracted part of the image by JPEG (with a low compression rate), and the communication controller 45 transmits the compressed data to the client 2 or 2′ by TCP. The entire operation then comes to an end.
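Steps S48 through S52 amount to cropping the selected spot out of the overview frame and compressing only that crop at high quality. The Pillow-based sketch below assumes that the selected spot has already been reduced to a bounding-box rectangle in image coordinates; how the figure drawn by the user is recognized is outside its scope.

```python
import io

from PIL import Image  # Pillow


def encode_selected_spot(frame: Image.Image,
                         box: tuple[int, int, int, int],
                         quality: int = 90) -> bytes:
    """Crop the selected spot from an overview frame and JPEG-encode only it.

    box is (left, upper, right, lower) in image coordinates, assumed here to be
    the bounding box of the figure drawn by the user.  quality=90 is an example
    value standing in for the "low compression rate" of the first mode.
    """
    crop = frame.crop(box)
    buf = io.BytesIO()
    crop.save(buf, format="JPEG", quality=quality)
    return buf.getvalue()
```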
In the above described manner, the user can display a part to be specifically observed (the part to be diagnosed) on the display 21 (21′) with high image quality simply by drawing a figure surrounding the part to be observed. In the above described exemplary embodiment, the server 1 transmits only the part surrounded by a figure to the client 2 or 2′. However, the present invention is not limited to that, and it is possible to transmit an entire overview image, with the part surrounded by a figure having high image quality, and the other part having low image quality, for example.
In the above described exemplary embodiment, enlarged image data transmission is always performed in the third mode (H.264 being set as the compression method, TCP being set as the transmission method). However, the present invention is not limited to that, and the server 1 may transmit an enlarged image as a still image to the client 2 or 2′, when the enlarged image does not have movement (where the user of the client 2 or 2′ does not control the enlarging camera 6 to perform a panning/tilting/zooming operation, for example). In such a case, the enlarged image data can be transmitted in the same mode (by the same encoding method and transmission method) as the first mode. Also, when an enlarged image has movement, the enlarged image may be transmitted with low image quality, and an enlarged image may be transmitted only when the enlarged image does not have movement.
In the above described exemplary embodiment, a user may select an encoding method and a transmission method for each mode. Especially, for enlarged image data in the third mode, the function (a setting unit) for setting an encoding method and a transmission method in accordance with the needs of the user (priorities being put on movement or image quality, for example) may be provided in the clients 2 and 2′ and the server 1.
Although not specifically mentioned in the description of the above exemplary embodiment, overview image transmission from the server 1 to the client 2 or 2′ may be stopped (suspended) while a user is using the enlarging camera 6. Alternatively, it is possible to put priority on enlarged image transmission (by degrading the image quality or lowering the communication speed for overview image transmission), though overview image transmission is performed concurrently with the enlarged image transmission.
In the above described exemplary embodiment, the communication path is supposedly in good condition. However, there may be cases where the condition of the communication path is not good. In such cases, different compression methods (compression methods with higher compression rates, for example) may be used for the respective modes in accordance with the condition of the communication path.
In a case where an overview image does not have movement in the above described exemplary embodiment, the overview image is not transmitted before a change is caused in the overview image (see steps S16 and S28 described above).
In the above described exemplary embodiment, movement is detected from an entire overview image, and a transmission method is set in accordance with whether there is movement. However, the present invention is not limited to that. For example, an overview image may be divided into smaller areas, and movement detection is performed for each of the areas. Based on the detection results, an encoding method and a transmission method for each area may be set.
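Per-area detection can reuse the frame-difference idea block by block. The sketch below divides the frame into a fixed grid and reports, for each cell, whether it moved, so that an encoding method and a transmission method could then be chosen cell by cell; the grid size and thresholds are arbitrary example values.

```python
import numpy as np


def movement_map(prev_frame: np.ndarray,
                 curr_frame: np.ndarray,
                 grid: tuple[int, int] = (4, 4),
                 pixel_delta: int = 16,
                 changed_ratio: float = 0.05) -> np.ndarray:
    """Return a boolean grid marking which areas of the overview image moved.

    The frame is split into grid[0] x grid[1] rectangular areas; an area is
    marked as moving when more than changed_ratio of its pixels changed by
    more than pixel_delta.  All parameter values are illustrative.
    """
    rows, cols = grid
    h, w = prev_frame.shape[:2]
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16)) > pixel_delta
    moved = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            cell = diff[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            moved[r, c] = cell.mean() > changed_ratio
    return moved
```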
In the above described exemplary embodiment, there are two clients (the clients 2 and 2′). However, the present invention is not limited to that arrangement, and there may be three or more clients. Also, the network structure of the present invention is not limited to the network structure described above.
The encoding methods and transmission methods described above are merely examples, and other encoding methods and transmission methods may be used.
The above described exemplary embodiment is merely an example of the present invention. However, the present invention is not limited to that, and various changes and modifications may be made to it without departing from the scope of the invention.