The present disclosure relates to an image display system, an information processing apparatus, and an image display method.
Conventionally, a display device which performs adjustment according to a supplied image when displaying an image is known in the art. For example, there is known a method for performing a display-related adjustment based on attributes of image data supplied from a mobile terminal, in order to eliminate the necessity of manually operated adjustment or preliminary registration. For example, see Japanese Unexamined Patent Application Publication No. 2013-003327.
Further, there is known a video signal processing method in which, when a video signal input source is switched to another input source, a display adjustment value is switched to a specific display adjustment value according to an external device, which eliminates the necessity of user's manual adjustment operations. For example, see Japanese Unexamined Patent Application Publication No. 2008-033138.
Further, there is known a method in which when content data is displayed, a time to display a setting content data is made to be consistent with a time to actually display the content data. For example, see Japanese Unexamined Patent Application Publication No. 2015-055827.
Further, there is known a method of generating an omnidirectional image by an imaging device, in which an inclination of the imaging device to a vertical direction is detected and a conversion table used for image processing is corrected based on the inclination, to generate the omnidirectional image in which the vertical direction is properly consistent with the inclination of the imaging device. For example, see Japanese Unexamined Patent Application Publication No. 2013-214947.
In one aspect, the present disclosure provides an image display system which is capable of displaying one of a plurality of wide view images at intervals of a predetermined time based on input parameters.
In one embodiment, the present disclosure provides an image display system which displays a display image and includes at least one display device and at least one information processing apparatus connected to the display device, the information processing apparatus including a processor configured to implement an input unit configured to receive image data items and parameters related to the display image, a determination unit configured to determine areas of an image indicated by the image data items, which areas are displayed by the display device as partial images of the display image, based on the parameters, and a transmission unit configured to transmit data indicating the areas to the display device, wherein the display device is configured to display one of the areas determined by the determination unit at intervals of a predetermined time.
The image display system according to one embodiment is capable of displaying one of a plurality of wide view images at intervals of a predetermined time based on input parameters.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as claimed.
A description will be given of embodiments with reference to the accompanying drawings.
An overall configuration of the image display system according to the first embodiment is explained.
Image data D1 is input to the PC 11. For example, the image data D1 may be image data indicating an omnidirectional image which is taken by an omnidirectional camera 3 with a field of view covering all directions around a user 200. After the image data D1 is input to the PC 11, the PC 11 displays an image on each of the projectors 1A, 1B, 1C, and 1D based on the image data D1, so that a combined image in which the images displayed by the projectors are combined together (hereinafter called a display image) is displayed on a screen 2.
Note that the image data D1 is not restricted to image data indicating still pictures, and it may be image data indicating motion pictures.
It is assumed that optical axes of the four projectors are placed in mutually different directions as illustrated in
In the following, a horizontal direction (equivalent to a depth direction in
For example, as illustrated in
First, the plan view of the display image illustrated in
Thus, the image portions displayed by the three projectors cover the 120-degree Yaw angle ranges, and the image display system 1 is capable of displaying a display image which covers the 360-degree Yaw angle range in the horizontal direction.
Next, the side view of the display image illustrated in
Thus, the image portions displayed by the projectors cover the 60-degree Pitch angle ranges, and the image display system 1 is capable of displaying a display image which covers the 180-degree Pitch angle range in the vertical direction.
Note that the image portions displayed by the projectors do not have to be equal in size. Note that the screen 2 may be a display screen or the like.
Note that the number of display devices included in the image display system 1 may not be restricted to four, and a different number of display devices may be included in the image display system 1. Note that the information processing apparatus included in the image display system 1 may not be restricted to the PC 11, and the information processing apparatus may be any of a server, a mobile PC, a smart phone, and a tablet. Note that the information processing apparatus may be replaced with an information processing system including a plurality of information processing apparatuses, and the information processing system may include a PC and a tablet.
It is preferable that the screen 2 has a hemispherical shape as illustrated. Namely, it is preferable that an object where a display image is displayed is an object having a hemispherical shape as illustrated. In the present embodiment, the dome-shaped screen 2 has a hemispherical shape, and the image display system 1 is capable of displaying a display image which covers the 360-degree Yaw angle range in the horizontal direction when viewed from the center of the hemisphere as illustrated. However, the screen 2 may not be restricted to the screen having the hemispherical shape, and the screen 2 may have a different shape.
The omnidirectional camera 3 generates the image data D1 indicating an omnidirectional image. For example, in response to an operation by the user 200, the omnidirectional camera 3 captures an image D2 (captured image D2) using the first lens 3H1 and an image D3 (captured image D3) using the second lens 3H2 simultaneously, each of the images D2 and D3 covering 180 degrees in the horizontal direction as illustrated in
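The way the two captured images together cover the full horizontal range can be pictured with the following sketch. This is a minimal illustration only; the assignment of the front hemisphere to the first lens 3H1 is an assumption introduced here, not a detail taken from the embodiment.

```python
def lens_for_yaw(yaw_deg):
    """Return which captured image covers a given yaw angle (degrees).

    Assumes (hypothetically) that the first lens 3H1 captures the front
    hemisphere and the second lens 3H2 captures the rear hemisphere; the
    two 180-degree captures D2 and D3 are later combined into the
    omnidirectional image data D1.
    """
    yaw = yaw_deg % 360
    if yaw <= 90 or yaw >= 270:
        return "D2"  # captured image through the first lens 3H1
    return "D3"      # captured image through the second lens 3H2
```

Because each capture spans 180 degrees, every yaw angle falls into exactly one of the two hemispheres, which is why the combined image data D1 is omnidirectional.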
The CPU 11H1 is a processor configured to perform various processes and processing of various data and control overall operations of hardware elements of the PC 11. Note that the CPU 11H1 may include an arithmetic unit or a control unit configured to support the operations of the CPU 11H1, and the CPU 11H1 may be implemented by a plurality of units.
The storage device 11H2 is configured to store data, programs, and setting values. The storage device 11H2 serves as a memory of the CPU 11H1. Note that the storage device 11H2 may include an auxiliary storage device such as a hard disk drive.
The input interface 11H3 is an interface configured to receive data, such as the image data D1, and operations by the user 200. Specifically, the input interface 11H3 is implemented by a connector and an external device connected to the PC 11 via the connector. Note that the input interface 11H3 may utilize a network or radio communication to receive the data and the operations.
The input device 11H4 is a device configured to receive command-based operations and data. Specifically, the input device 11H4 is implemented by a keyboard, a mouse, etc.
The output interface 11H5 is an interface configured to transmit data from the PC 11 to the projector. Specifically, the output interface 11H5 is implemented by a connector and an external device connected to the PC 11 via the connector. Note that the output interface 11H5 may utilize a network or radio communication to transmit the data to the projector.
The output device 11H6 is a device configured to output data. Specifically, the output device 11H6 is implemented by a display device.
Note that the input device 11H4 and the output device 11H6 may be implemented by a touch-panel display in which an input device and an output device are integrated. Alternatively, the input device 11H4 and the output device 11H6 may be implemented by another information processing apparatus, such as a smart phone or a tablet.
The input interface 1AH1 is an interface configured to input data or signals from the PC 11 to the projector. Specifically, the input interface 1AH1 is implemented by a connector, a driver, and a dedicated integrated circuit (IC).
The output device 1AH2 is implemented by optical components, such as lenses, and a light source. The output device 1AH2 is configured to display an image based on the input data or signals.
The storage device 1AH3 is configured to store data, programs, and setting values. The storage device 1AH3 is implemented by a main storage device, such as a memory, an auxiliary storage device such as a hard disk drive, or a combination of the main and auxiliary storage devices.
The CPU 1AH4 is a processor configured to perform various processes and processing of various data and control overall operations of hardware elements of the projector. Note that the CPU 1AH4 may include an arithmetic unit or a control unit configured to support the operations of the CPU 1AH4, and the CPU 1AH4 may be implemented by a plurality of units.
The input device 1AH5 is a device configured to input command-based operations and data. Specifically, the input device 1AH5 is implemented by a switch, a keyboard, and a mouse.
Each of the projectors 1A, 1B, 1C, and 1D is configured to use the input interface 1AH1 to input data or signals based on image data through a network, radio communication such as near field communication (NFC), or a combination thereof, and display an image. Note that each projector may use a recording medium, such as a universal serial bus (USB) memory, to input the data.
As illustrated in
In step S02, the PC 11 displays a list of display images to the user 200. Note that the processing of step S02 is repeatedly performed until an operation to select a display image is performed by the user 200.
In step S03, the PC 11 receives parameters input by the user 200. For example, the PC 11 displays a graphical user interface (GUI), such as a setting screen, and receives the parameters in response to a user's input operation to the setting screen. Note that the parameters may be input in the form of data or commands.
In step S04, the PC 11 receives a display instruction input by the user 200. For example, the operation to input the display instruction may be an operation of pressing a start button or the like on the PC 11 by the user 200.
In step S05, the PC 11 generates setting data based on the received parameters. The setting data is to be output to the projectors 1A through 1D.
In step S06, the PC 11 outputs the setting data generated based on the parameters at the step S05, to each of the projectors 1A through 1D.
In step S07, each of the projectors 1A through 1D stores the setting data output from the PC 11 at the step S06.
In step S08, the PC 11 outputs display data items for indicating the display image selected by the user 200 at the step S02, to the projectors 1A through 1D, respectively.
In step S09, the projectors 1A through 1D store the display data items output from the PC 11 at the step S08, respectively.
The processing of steps S08 and S09 is repeatedly performed until all the display data items are output and stored.
In step S10, the PC 11 receives a display start instruction input by the user 200 for starting displaying based on the setting data. In response to the display start instruction, the PC 11 outputs to each of the projectors 1A through 1D a message indicating that the uploading is completed, or a message indicating that the displaying is started.
In step S11, each of the projectors 1A through 1D verifies the setting data stored at the step S07. For example, the verification is made by determining whether the setting data conforms to a predetermined format. When the setting data does not conform to the predetermined format as a result of the verification, each of the projectors 1A through 1D performs an error process. Note that this error process may be a process which displays an error message.
In step S12, the PC 11 controls the projectors 1A through 1D to display the images according to the setting data based on the parameters PAR stored at step S07, so that the display image is switched at intervals of a predetermined time.
Note that the sequence of the above steps S01 to S12 is not restricted to the sequence illustrated in
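The upload portion of the sequence (steps S05 through S11) can be sketched as follows. This is a hedged illustration: the Projector stand-in class and the dict-based setting-data format are assumptions introduced here, not the actual data format of the embodiment.

```python
class Projector:
    """Minimal stand-in for one of the projectors 1A through 1D."""
    def __init__(self, name):
        self.name = name
        self.setting_data = None
        self.display_data = []

    def store_setting(self, data):       # step S07
        self.setting_data = data

    def store_display_item(self, item):  # step S09
        self.display_data.append(item)

    def verify_setting(self):            # step S11
        # Verification here just checks an assumed format (a dict with
        # an "interval" key); the embodiment only requires conformance
        # to a predetermined format.
        return isinstance(self.setting_data, dict) and "interval" in self.setting_data


def upload(projectors, parameters, display_items):
    setting_data = dict(parameters)                     # step S05
    for p in projectors:                                # steps S06-S07
        p.store_setting(setting_data)
    for item in display_items:                          # steps S08-S09
        for p in projectors:
            p.store_display_item(item)
    return all(p.verify_setting() for p in projectors)  # step S11
```

If verification fails for any projector, the embodiment performs an error process, such as displaying an error message, instead of starting the display.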
For example, as illustrated in
Further, as illustrated in
For example, the tablet 4 displays a first operation screen PN1 illustrated in
Note that the images may be input from the external device, such as the omnidirectional camera 3 (
The third operation screen PN3 may be a guide screen for connecting the tablet 4 (or the information processing apparatus 11) to the omnidirectional camera 3 as illustrated in
The fourth operation screen PN4 is displayed in list form, similar to the second operation screen PN2 illustrated in
When a thumbnail image SImg1 of the first selection image in the fifth operation screen PN5 is pressed, the tablet 4 displays a preview image Img1 of the first selection image.
Alternatively, in the fifth operation screen PN5 illustrated in
Next, various examples in which the parameters are input using the operation screens will be described.
For example, some of the parameters in the step S03 of the overall process of
In addition, a horizontal direction parameter indicating one of horizontal directions in which a display image is rotated, and a horizontal rotation speed parameter indicating a rotational speed for rotating the display image in the horizontal direction may be input. Further, a vertical direction parameter indicating one of vertical directions in which a display image is rotated, and a vertical rotation speed parameter indicating a rotational speed for rotating the display image in the vertical direction may be input.
In the following, an example in which the horizontal direction parameter, the horizontal rotation speed parameter, the vertical direction parameter, and the vertical rotation speed parameter are set up by an administrator of the image display system 1 will be described. Specifically, when a right-hand lower portion BTN3 of the second operation screen PN2 illustrated in
The eighth operation screen PN8 is a screen for causing the administrator to enter a password of the administrator as illustrated in
The ninth operation screen PN9 is an example of a setting of administrator screen. For example, the password of the administrator may be changed using the ninth operation screen PN9. Specifically, when a password change button BTN4 in the ninth operation screen PN9 is pressed, the tablet 4 displays a tenth operation screen PN10 as illustrated in
A new password may be entered using the tenth operation screen PN10. When the new password is entered, the password of the administrator is changed to the new password.
On the other hand, when a display image selection button BTN5 in the ninth operation screen PN9 illustrated in
The twelfth operation screen PN12 is an operation screen used to input the horizontal direction parameter, the horizontal rotation speed parameter, the vertical direction parameter, and the vertical rotation speed parameter. For example, the horizontal direction parameter is input using horizontal direction setup buttons BTN6 included in the twelfth operation screen PN12, and the vertical direction parameter is input using vertical direction setup buttons BTN7 included in the twelfth operation screen PN12. Further, the horizontal rotation speed parameter and the vertical rotation speed parameter are input using rotational speed setup buttons BTN8 included in the twelfth operation screen PN12. Further, the setting value for the predetermined time of the interval at which the image data is switched is input using a scroll bar SBA included in the twelfth operation screen PN12.
Furthermore, all the parameters, including the above-described parameters, may be listed in Table 1 below. In the following, a list of data items including the parameters listed in the Table 1 below will be called a play list. Note that the play list is not required to include all the parameters listed in the Table 1 below, and some of the parameters listed in the Table 1 below may be omitted from the play list. When some of the parameters are omitted, predetermined initial values may be used for such parameters. Further, repeated reproduction in which several display images are switched at intervals of the predetermined time may be set up.
In addition, each of the parameters may be set up for each of the display images, and each of the parameters may be uniformly set up for all or several of the display images. Note that when the display images are motion pictures, the playback time is set up for each of the display images and each of the parameters may be set up based on the playback time.
Further, the method of inputting the parameters is not restricted to the inputting of the parameters using the GUIs. The parameters may be input using commands, text, numerical values, data, or a combination thereof.
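The text-based input path can be sketched as a small parser; the key names and default values below are hypothetical placeholders, and the embodiment only states that omitted parameters may fall back to predetermined initial values.

```python
# Hypothetical predetermined initial values; the actual defaults are
# not specified by the embodiment.
DEFAULTS = {"interval": 10, "direction": "right", "speed": 1.0}

def parse_parameters(text):
    """Parse 'key=value' lines into a parameter dict, filling omitted
    parameters with predetermined initial values."""
    params = dict(DEFAULTS)
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, value = line.split("=", 1)
        params[key.strip()] = value.strip()
    return params
```

A parameter that appears in the text overrides its initial value, while any omitted parameter keeps the predetermined one.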
In the Table 1 above, the parameter indicated by “No. 1” is an example of a parameter indicating version information.
In the Table 1 above, the parameter indicated by “No. 2” is an example of a parameter to designate the order of images being displayed as display images. Specifically, when several images are selected as illustrated in
In the Table 1 above, the parameter indicated by “No. 3” is an example of a contents-list parameter to designate an arrangement of display image settings.
In the Table 1 above, the parameter indicated by “No. 4” is an example of the time parameter to designate the predetermined time of the interval for switching the display images.
In the Table 1 above, the parameter indicated by “No. 5” is an example of the effect parameter to designate the effect at the time of switching the display images. Specifically, the effect parameter is set to one of the values “0” through “6”. For example, if the effect parameter is set to “0”, a fade-in effect is set up at a time of changing the current image to the following image. For example, the fade-in effect may be an effect in which the currently displayed image is darkened gradually to an invisible level, or an effect in which the following image is brightened gradually, or a combination of the two effects.
Further, if the effect parameter is set to “1” or “2”, a push effect is set up in which the currently displayed image is changed to the following image in a manner that the currently displayed image is pushed out. Note that a left or right direction in which the image is pushed out by the push effect is designated by setting the effect parameter to “1” or “2”.
Further, if the effect parameter is set to “3” or “4”, a wipe effect is set up in which the currently displayed image is gradually replaced with the following image. Note that a left or right direction in which the image is replaced by the wipe effect is designated by setting the effect parameter to “3” or “4”.
In the Table 1 above, the parameter indicated by “No. 6” denotes a storage destination of image data. The storage destination is expressed by a path.
In the Table 1 above, the parameter indicated by “No. 7” is an example of a horizontal position parameter which sets up a horizontal direction angle and designates a horizontal position of an area in which a display image is displayed.
In the Table 1 above, the parameter indicated by “No. 8” is an example of a vertical position parameter which sets up a vertical direction angle and designates a vertical position of an area in which a display image is displayed.
In the Table 1 above, the parameter indicated by “No. 9” is an example of a field angle parameter which designates a range in which a display image is displayed by setting up an enlargement or reduction (scaling) rate of the display image.
Namely, when each of the parameters “No. 7” through “No. 9” is input, the area in which the display image is first displayed is designated.
In the Table 1 above, the parameter indicated by “No. 10” is an example of a horizontal direction parameter indicating an orientation of horizontal directions in which a display image is rotated in the horizontal direction.
In the Table 1 above, the parameter indicated by “No. 11” is an example of a vertical direction parameter indicating an orientation of vertical directions in which a display image is rotated in the vertical direction.
In the Table 1 above, the parameter indicated by “No. 12” is an example of the brightness parameter which sets up a brightness of a display image.
In the Table 1 above, the parameter indicated by “No. 13” is an example of the contrast parameter which sets up a contrast of a display image.
Note that the parameters may include a switching condition parameter to set up the switching condition. Note that the parameters may further include a vertical rotation speed parameter indicating a speed of rotation in a vertical direction, and a horizontal rotation speed parameter indicating a speed of rotation in a horizontal direction.
Further, the switching condition is not restricted to a switching condition related to the horizontal direction. For example, the switching condition may be a switching condition related to the vertical direction. Moreover, the switching condition may be a combination of the switching condition related to the vertical direction and the switching condition related to the horizontal direction.
If a user inputs to the tablet 4 the parameters as illustrated in the Table 1 above using the operation screens illustrated in
The parameter indicated by “No. 1” in the Table 1 above is input as the first parameter “PAR1” in the play list PLS.
The parameter indicated by “No. 2” in the Table 1 above is input as the second parameter “PAR2” in the play list PLS.
The parameter indicated by “No. 4” in the Table 1 above is input as the fourth parameter “PAR4” in the play list PLS.
The parameter indicated by “No. 5” in the Table 1 above is input as the fifth parameter “PAR5” in the play list PLS.
The parameter indicated by “No. 6” in the Table 1 above is input as the sixth parameter “PAR6” in the play list PLS.
The parameter indicated by “No. 7” in the Table 1 above is input as the seventh parameter “PAR7” in the play list PLS.
The parameter indicated by “No. 8” in the Table 1 above is input as the eighth parameter “PAR8” in the play list PLS.
The parameter indicated by “No. 9” in the Table 1 above is input as the ninth parameter “PAR9” in the play list PLS.
The parameter indicated by “No. 10” in the Table 1 above is input as the tenth parameter “PAR10” in the play list PLS.
The parameter indicated by “No. 11” in the Table 1 above is input as the eleventh parameter “PAR11” in the play list PLS.
The parameter indicated by “No. 12” in the Table 1 above is input as the twelfth parameter “PAR12” in the play list PLS.
The parameter indicated by “No. 13” in the Table 1 above is input as the thirteenth parameter “PAR13” in the play list PLS.
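Taken together, the play list PLS can be pictured as a simple key-value mapping. The concrete values below are illustrative placeholders only, not values from the embodiment; the contents-list parameter (No. 3) is not listed above as a PLS entry and is therefore omitted here.

```python
# A hypothetical play list PLS; keys follow the PAR1..PAR13 naming
# above, and every value is an illustrative placeholder.
play_list = {
    "PAR1": "1.0",        # No. 1: version information
    "PAR2": [1, 2, 3],    # No. 2: display order of the images
    "PAR4": 10,           # No. 4: interval for switching display images
    "PAR5": 0,            # No. 5: effect ("0" selects the fade-in effect)
    "PAR6": "images",     # No. 6: storage destination path (placeholder)
    "PAR7": 0.0,          # No. 7: horizontal position (angle)
    "PAR8": 0.0,          # No. 8: vertical position (angle)
    "PAR9": 1.0,          # No. 9: field angle / scaling rate
    "PAR10": "right",     # No. 10: horizontal rotation direction
    "PAR11": "up",        # No. 11: vertical rotation direction
    "PAR12": 50,          # No. 12: brightness
    "PAR13": 50,          # No. 13: contrast
}
```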
First, the horizontal direction processing will be described. When the horizontal position parameter and the field angle parameter are input by the play list PLS (
Similarly, based on the horizontal position parameter and the field angle parameter, the PC 11 determines that the first projector 1A (
Further, based on the horizontal position parameter and the field angle parameter, the PC 11 determines that the fourth projector 1D (
The partial images indicating the first area ARA1, the second area ARA2, and the third area ARA3 based on the image data D1 are displayed by the projectors 1C, 1A, and 1D, respectively, and the image display system 1 is able to output the display image covering 360 degrees in the horizontal direction around a viewpoint PS indicated in
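The determination of the three horizontal areas can be sketched as an even split of the 360-degree Yaw range. The even division among the projectors 1C, 1A, and 1D matches the case described above, but the function below is a simplified sketch rather than the actual determination logic of the determination unit.

```python
def horizontal_areas(start_yaw, num_projectors=3, span=360):
    """Split the horizontal span into equal yaw ranges, one per projector.

    Returns (start, end) yaw pairs in degrees, wrapped to [0, 360).
    With three projectors, each range covers 120 degrees, as in the
    embodiment's horizontal layout.
    """
    width = span / num_projectors
    return [((start_yaw + i * width) % 360,
             (start_yaw + (i + 1) * width) % 360)
            for i in range(num_projectors)]
```

The horizontal position parameter shifts `start_yaw`, while the field angle parameter would scale the width of each range; here the width is simply the even share of the full span.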
Here, suppose that setting to rotate the display image in a first direction DIR1 as indicated in
Similar to the change illustrated in
When the three areas illustrated in the upper portion of
Note that the positions of the first area ARA1, the second area ARA2, and the third area ARA3 in the horizontal direction (the X coordinates thereof) as illustrated in
Further, the range of each of the first area ARA1, the second area ARA2, and the third area ARA3 (the number of pixels or the amount of space of each area) as illustrated in
Further, the first direction DIR1 in which the first area ARA1, the second area ARA2, and the third area ARA3 are changed as illustrated in
Further, the frequency of changing the first area ARA1, the second area ARA2, and the third area ARA3 and the amount of a rotational angle or the predetermined period for changing these areas as illustrated in
Further, if a relatively great amount of the rotational angle for changing the areas in the first direction DIR1 as illustrated in
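The rotation can be pictured as shifting each area's horizontal offset by a fixed amount per predetermined interval, wrapping around the image width; the pixel-shift model below is an assumption for illustration. A small step per interval yields a slow, smooth rotation, while a large step makes the displayed motion correspondingly faster and coarser, as described above.

```python
def area_positions(x0, image_width, step_px, n_steps):
    """Successive X offsets of a display area, one per predetermined
    interval, wrapping around the width of the image data D1."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append((xs[-1] + step_px) % image_width)
    return xs
```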
Next, the vertical direction processing will be described.
If the vertical position parameter and the field angle parameter are input by the play list PLS (
Similarly, based on the vertical position parameter and the field angle parameter, the PC 11 determines that the second projector 1B (
The partial images indicating the fourth area ARA4 and the fifth area ARA5 are displayed by the projectors 1A, 1C, 1D and the projector 1B, respectively, and it is possible for the image display system 1 to output the display image covering 180 degrees in the vertical direction from a viewpoint PS indicated in
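Analogously to the horizontal case, determining the two vertical areas can be sketched as splitting the 180-degree Pitch range at a boundary derived from the parameters; the function below and its split value are assumptions for illustration, not the actual determination logic.

```python
def vertical_areas(pitch_start, split_pitch, span=180):
    """Split the vertical span into two pitch ranges: ARA4 for the
    projectors 1A, 1C, and 1D, and ARA5 for the projector 1B.

    The split boundary would follow from the vertical position
    parameter and the field angle parameter.
    """
    return {"ARA4": (pitch_start, split_pitch),
            "ARA5": (split_pitch, pitch_start + span)}
```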
Here, suppose that setting to rotate the display image in a third direction DIR3 indicated in
Similar to the change illustrated in
When the two areas illustrated in the left portion of
Note that the positions of the fourth area ARA4 and the fifth area ARA5 in the vertical direction (the Y coordinates thereof) as illustrated in
Further, the range of each of the fourth area ARA4 and the fifth area ARA5 (the number of pixels or the amount of space of each area) as illustrated in
Further, the third direction DIR3 in which the fourth area ARA4 and the fifth area ARA5 are changed as illustrated in
Further, the frequency of changing the fourth area ARA4 and the fifth area ARA5 and the amount of the rotational angle or the predetermined period for changing these areas as illustrated in
Note that combining the horizontal direction rotation and the vertical direction rotation may allow the rotation of the display image in an oblique direction.
The input unit 1F1 is configured to receive the image data D1 and the parameters PAR related to a display image. Note that the input unit 1F1 may be implemented by the input interface 11H3 (
The determination unit 1F2 is configured to determine areas of an image indicated by the image data D1, which are displayed by the display devices (the projectors 1A through 1D) as partial images of the display image, based on the parameters PAR received by the input unit 1F1. Note that the determination unit 1F2 may be implemented by the CPU 11H1 (
The change unit 1F3 is configured to change the areas at intervals of the predetermined time based on the parameters PAR received by the input unit 1F1, so that the display image is changed. Note that the change unit 1F3 may be implemented by the CPU 11H1 (
The above units represent functions and units of the image display system 1 implemented by any of the elements and devices illustrated in
When the areas which are displayed by the display devices are determined based on the parameters PAR received by the input unit 1F1, the image display system 1 is able to display the display image by combining the partial images output by the display devices. The areas are determined by the determination unit 1F2 based on the parameters. Then, the change unit 1F3 changes the areas at intervals of the predetermined time based on the parameters. Similar to the examples of
Further, the direction of rotation of the display image or the rotational speed of the display image may be set up by the parameters PAR.
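The rotational speed set up by the parameters PAR can be related to the per-interval shift applied by the change unit 1F3 through a simple conversion; the degrees-per-second unit and the function below are assumptions for illustration.

```python
def step_per_interval(speed_deg_per_sec, interval_sec, image_width_px):
    """Convert an assumed rotational speed parameter into the pixel
    shift the change unit 1F3 would apply at each predetermined
    interval, given the width of the image in pixels."""
    degrees = speed_deg_per_sec * interval_sec
    return round(degrees / 360 * image_width_px)
```

For example, at 6 degrees per second and a 10-second interval, each interval advances the areas by 60 degrees' worth of pixels.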
Next, an overall process by an image display system 1 according to a second embodiment is explained. In one aspect, the second embodiment provides an image display system which is capable of displaying, when displaying a wide view image such as an omnidirectional image, a user's desired area of the wide view image. The image display system 1 according to the second embodiment may be implemented by the image display system 1 according to the first embodiment. In the following, an example in which the image display system 1 which is essentially the same as the above-described image display system 1 of the first embodiment is utilized will be described. Hence, a description of a hardware configuration of the image display system 1 according to the second embodiment will be omitted.
As illustrated in
In step S02, the image display system 1 waits for an operation input by a user. When the operation input by the user is received, the image display system 1 goes to step S03.
In step S03, the image display system 1 determines whether the received operation is a vertical reduction operation to reduce the display image vertically. When it is determined that the received operation is the vertical reduction operation (YES in step S03), the image display system 1 goes to step S04. On the other hand, when it is determined that the received operation is not the vertical reduction operation (NO in step S03), the image display system 1 goes to step S05.
In step S04, the image display system 1 partially or fully reduces the image indicated by the image data and displays the reduced image.
In step S05, the image display system 1 determines whether the received operation is a rotation operation to rotate the display image. When it is determined that the received image is the rotation operation (YES in step S05), the image display system goes to step S06. On the other hand, when it is determined that the received operation is not the rotation operation (NO in step S05), the image display system 1 terminates the overall process of
In step S06, the image display system 1 determines whether the image is reduced vertically. When it is determined that the image is reduced vertically (YES in step S06), the image display system 1 goes to step S07. On the other hand, when it is determined that the image is not reduced vertically (NO in step S06), the image display system 1 goes to step S08.
In step S07, the image display system 1 partially or fully performs nonmagnification of the image indicated by the image data. Note that the image display system 1 may be configured to enable the user to set up whether to perform the nonmagnification of the image.
In step S08, the image display system 1 rotates the display image and displays the rotated image.
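The decision flow of steps S03 through S08 can be sketched as a small dispatcher; the operation names, the state keys, and the one-degree rotation step are hypothetical placeholders introduced for this sketch.

```python
def handle_operation(op, state):
    """Dispatch one user operation per the flow of steps S03 to S08.

    `state` is a hypothetical dict with keys "reduced" (whether the
    image is currently reduced vertically) and "yaw" (current rotation
    angle in degrees).
    """
    if op == "vertical_reduce":      # steps S03-S04: reduce and display
        state["reduced"] = True
    elif op == "rotate":             # step S05: rotation operation
        if state["reduced"]:         # steps S06-S07: nonmagnify first
            state["reduced"] = False
        state["yaw"] = (state["yaw"] + 1) % 360  # step S08: rotate
    return state
```

The key point the sketch preserves is the ordering: when a rotation operation arrives while the image is reduced, the nonmagnification of step S07 happens before the rotation of step S08.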
If the first image Img1 as illustrated in the left portion of
To avoid this, the user performs an operation to change the area displayed in the display image. For example, the user performs an operation to reduce the image in the vertical direction (Y-axis direction). Next, when the vertical reduction operation is received (YES in step S03 in the process of
In this example, the image display system 1 generates a nonmagnified image Img3 in which the vertically reduced portion thereof is partially or fully nonmagnified. For example, the nonmagnified image Img3 is generated to have a magnification rate that is the same as that of the first image Img1 by resetting the reduction state of the reduced image Img2 illustrated in
Note that the nonmagnification is not restricted to the process which converts the reduced image Img2 to have the magnification rate that is the same as that of the first image Img1. For example, the nonmagnification may be a process which converts the reduced image Img2 to have a magnification rate such that the image indicated by the predetermined pixels PIX (
Further, the nonmagnification process may be performed based on the received image data. Specifically, when a reduced image is generated, the received image data (i.e., the image data indicating the image before the reduction process is performed) is copied and stored. Subsequently, the nonmagnified image Img3 may be generated by using the stored image data indicating the image before the reduction process is performed. Namely, the image display system 1 retains the image data with the original scaling rate, nonmagnified, in performing the reduction process. In this case, because the image data with the original scaling rate is retained, the image display system 1 is able to generate the nonmagnified image Img3 even after the reduction process is performed.
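A minimal sketch of this copy-and-store approach, assuming image data represented as nested Python lists; the class name is hypothetical and not from the disclosure.

```python
import copy

class OriginalImageStore:
    """Retains a copy of the image data at the original scaling rate so
    that the nonmagnified image Img3 can be regenerated after reduction."""

    def __init__(self):
        self._original = None

    def on_reduce(self, image_data):
        # Copy the received image data before the reduction is applied,
        # so later changes to the working image cannot affect the copy.
        self._original = copy.deepcopy(image_data)

    def nonmagnified(self):
        # Regenerate the image at the original scaling rate.
        return copy.deepcopy(self._original)
```

A deep copy is used so that in-place edits to the working image during reduction or rotation leave the stored original intact.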
Subsequently, the image display system 1 displays a display image based on the nonmagnified image Img3 as illustrated in the middle of
As illustrated in
When the display image is displayed based on the nonmagnified image Img3 in response to reception of the rotation operation as illustrated in
If the display image is displayed based on the reduced image Img2, the image indicated by the predetermined pixels PIX appears in the output range OUT as illustrated in
Next, an image display system 1 according to a third embodiment may be implemented by the image display system 1 according to the second embodiment. In the following, an example in which the image display system which is essentially the same as the above-described image display system of the second embodiment is utilized will be described. Hence, a description of a hardware configuration of the image display system 1 according to the third embodiment will be omitted and only the difference between the third embodiment and the second embodiment will be described. Namely, an overall process performed by the image display system 1 according to the third embodiment differs from the overall process performed by the image display system according to the second embodiment.
In step S20, the image display system 1 determines whether the image indicated by the predetermined pixels is included in the display area. When it is determined that the image indicated by the predetermined pixels is included in the display area (YES in step S20), the image display system 1 goes to step S21. On the other hand, when it is determined that the image indicated by the predetermined pixels is not included in the display area (NO in step S20), the image display system 1 goes to step S08.
In step S21, the image display system 1 determines whether all of the predetermined pixels are included in the display area. When it is determined that all of the predetermined pixels are included in the display area (YES in step S21), the image display system 1 goes to step S07. On the other hand, when it is determined that all of the predetermined pixels are not included in the display area (NO in step S21), the image display system 1 goes to step S22.
In step S22, the image display system 1 determines whether some of the predetermined pixels are included in the display area. When it is determined that some of the predetermined pixels are included in the display area (YES in step S22), the image display system 1 goes to step S23. On the other hand, when it is determined that some of the predetermined pixels are not included in the display area (NO in step S22), the image display system 1 goes to step S08.
In step S23, the image display system 1 changes the reduction rate.
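The branching in steps S20 through S23 can be sketched as a set comparison between the predetermined pixels and the display area. The function name and the returned labels below are illustrative only.

```python
def decide_action(predetermined_pixels, display_area):
    # Both arguments are sets of (x, y) pixel coordinates.
    inside = predetermined_pixels & display_area
    if not inside:                           # step S20: none included
        return "rotate"                      # -> step S08
    if inside == predetermined_pixels:       # step S21: all included
        return "nonmagnify"                  # -> step S07
    return "change_reduction_rate"           # step S22 -> step S23
```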
For example, suppose that a reduced image is displayed at step S04 of the overall process of
When the partial image PIXP appears in the output range OUT, the image display system 1 determines that some of the predetermined pixels are included in the display area (YES in step S22 of
In the example illustrated in the left portion of
As illustrated in the right portion of
After the reduction rate is changed, the image display system 1 is able to display the display image such that the partial image PIXP hardly appears in the output range.
An image display system 1 according to a fourth embodiment may be implemented by the image display system 1 according to the second embodiment. In the following, an example in which the image display system which is essentially the same as the above-described image display system 1 of the second embodiment is utilized will be described. Hence, a description of a hardware configuration of the image display system 1 according to the fourth embodiment will be omitted and only the difference between the fourth embodiment and the second embodiment will be described. Namely, an overall process performed by the image display system 1 according to the fourth embodiment differs from the overall process performed by the image display system 1 according to the second embodiment.
In step S30, the image display system 1 stores the reduction rate.
In step S31, the image display system 1 stores the rotational angle.
In step S32, the image display system 1 rotates the image based on the rotational angle.
After the image is rotated at step S32, in step S33, the image display system 1 reduces partially or fully the image in the direction toward the position of the screen top and displays the reduced image as the display image.
In this example, the image display system 1 reduces the first image after the rotation in a direction PHD toward the highest position (top) PH (step S33 of
After the image is reduced in the direction PHD toward the position of the screen top, the image indicated by the predetermined pixels PIX is situated at a position immediately under the screen top PH. Namely, the image indicated by the predetermined pixels PIX hardly appears in the output range and the image display system 1 is able to display the display image such that the image indicated by the predetermined pixels PIX hardly appears in the output range. Further, the reduced image is generated and the image display system 1 is able to display a user's desired area of a wide view image.
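Steps S32 and S33 can be sketched as a rotation followed by a vertical reduction anchored at the screen top. As before, the image is modeled as rows of pixels and horizontal rotation as a circular shift; the helper name is hypothetical.

```python
def rotate_then_reduce(img, shift, rate):
    # Step S32: rotate the 360-degree image by circularly shifting each row.
    rotated = [row[shift:] + row[:shift] for row in img]
    # Step S33: reduce vertically toward the screen top; subsampling with
    # slicing keeps the retained rows anchored at the top of the image.
    return rotated[::round(1 / rate)]
```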
The input unit 1F1 is configured to receive the image data D1 and an operation OPR to change the area of the first image Img1 indicated by the image data D1. Note that the input unit 1F1 may be implemented by the input interface 11H3 (
The reduction unit 1F5 is configured to generate a reduced image Img2 by reducing in size partially or fully an image, such as the first image Img1 indicated by the image data D1. Note that the reduction unit 1F5 may be implemented by the CPU 11H1 (
The nonmagnification unit 1F6 is configured to generate, when the reduced image Img2 is generated and the operation OPR is received, a nonmagnified image Img3 based on the image data D1 or by nonmagnification of some or all of a portion of the reduced image Img2. Note that the nonmagnification unit 1F6 may be implemented by the CPU 11H1 (
The display unit 1F4 is configured to display a display image based on the nonmagnified image Img3. Note that the display unit 1F4 may be implemented by any of the first projector 1A (
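The cooperation of the units 1F1, 1F5, 1F6, and 1F4 described above can be wired together as follows. This is an illustrative composition only: the functions stand in for the respective units, images are rows of pixels, and the fixed reduction rate is an assumption.

```python
def unit_1f5_reduce(img, rate):
    # 1F5: generate the reduced image Img2 by row subsampling.
    return img[::round(1 / rate)]

def unit_1f6_nonmagnify(original):
    # 1F6: regenerate the nonmagnified image Img3 from the original data.
    return [row[:] for row in original]

def run_pipeline(image_data, operations, display):
    # 1F1: receive the image data D1 and the operations OPR.
    img, reduced = image_data, False
    for op in operations:
        if op == "reduce":
            img, reduced = unit_1f5_reduce(image_data, 0.5), True
        elif op == "rotate" and reduced:
            img, reduced = unit_1f6_nonmagnify(image_data), False
    display(img)  # 1F4: display the resulting image
    return img
```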
When image data indicating an omnidirectional image covering 360 degrees in the horizontal direction is received by the input unit 1F1, the image display system 1 displays a display image on an object having a hemispherical shape, such as the screen 2 illustrated in
When the reduced image Img2 is generated and the rotation operation is received, there may be a case in which an image indicated by predetermined pixels is displayed if the display image is displayed based on the reduced image Img2 after the rotation. In such a case, the image display system 1 causes the nonmagnification unit 1F6 to generate the nonmagnified image Img3. Then, the image display system 1 displays the display image based on the nonmagnified image Img3, and the image display system 1 is able to prevent the image indicated by the predetermined pixels from being displayed.
Hence, the image display system 1 is able to display, when displaying a wide view image such as an omnidirectional image, a user's desired area of the wide view image.
Note that all or some of the image display processes according to the present disclosure may be implemented by computer programs described in any of the legacy programming languages, such as Assembler and C language, object-oriented programming languages, such as Java, or a combination thereof. The programs are computer programs for causing a computer, such as an information processing apparatus or an information processing apparatus included in an image display system, to execute the image display processes.
The programs may be stored in a computer-readable recording medium, such as a read-only memory (ROM) or electrically erasable programmable ROM (EEPROM), and may be distributed with the recording medium. Note that examples of the recording medium include an erasable programmable ROM (EPROM), a flash memory, a flexible disk, an optical disc, a secure digital (SD) card, and a magneto-optic (MO) disc. In addition, the programs may be distributed through an electric telecommunication line.
Further, the image display system according to the present disclosure may include a plurality of information processing apparatuses which are connected with one another via a network, and all or some of the above processes may be performed by the plurality of information processing apparatuses simultaneously, in a distributed manner, or redundantly. In addition, the above processes may be performed by a device other than the above-described devices in the image display system.
The image display system according to the present disclosure is not limited to the above-described embodiments, and variations and modifications may be made without departing from the scope of the present disclosure.
The present application is based upon and claims the benefit of priority of Japanese Patent Application No. 2015-160511, filed on Aug. 17, 2015, and Japanese Patent Application No. 2015-160512, filed on Aug. 17, 2015, the contents of which are incorporated herein by reference in their entirety.
The present application additionally includes the following numbered clauses.
1. An image display system which displays a display image and includes at least one display device and at least one information processing apparatus connected to the display device, the information processing apparatus comprising a processor configured to implement
an input unit configured to receive image data and an operation to change an area of an image indicated by the image data, which area is displayed by the display device as a partial image of the display image,
a reduction unit configured to reduce partially or fully the image indicated by the image data and generate a reduced image,
a nonmagnification unit configured to generate, when the reduced image is generated and the operation is received, a nonmagnified image based on the image data or by nonmagnification of some or all of a portion of the reduced image, and
a transmission unit configured to transmit data indicating the nonmagnified image to the display device,
wherein the display device is configured to display the area based on the nonmagnified image.
2. An information processing apparatus connected to at least one display device which displays a display image, the information processing apparatus comprising a processor configured to implement
an input unit configured to receive image data and an operation to change an area of an image indicated by the image data, which area is displayed by the display device as a partial image of the display image,
a reduction unit configured to reduce partially or fully the image indicated by the image data and generate a reduced image,
a nonmagnification unit configured to generate, when the reduced image is generated and the operation is received, a nonmagnified image based on the image data or by nonmagnification of some or all of a portion of the reduced image, and
a transmission unit configured to transmit data indicating the nonmagnified image to the display device.
3. The information processing apparatus according to clause 2, wherein the image data indicates an image with a field angle of 360 degrees in a horizontal direction.
4. The information processing apparatus according to clause 2 or 3, wherein the reduction and the nonmagnification are performed for the image in a vertical direction.
5. The information processing apparatus according to any of clauses 2 to 4, wherein the operation includes an operation to change the area in a vertical direction.
6. The information processing apparatus according to any of clauses 2 to 5, wherein, when predetermined pixels are included in the area changed by the operation, the nonmagnification unit is configured to generate the nonmagnified image.
7. The information processing apparatus according to any of clauses 2 to 5, wherein, when predetermined pixels are included in the area changed by the operation, the reduction unit is configured to change a reduction rate at which the reduced image is generated.
8. The information processing apparatus according to any of clauses 2 to 7, wherein the reduction unit is configured to reduce the image toward a highest position of the area changed by the operation.
9. An image display method performed by an image display system which displays a display image and includes at least one display device and at least one information processing apparatus connected to the display device, the image display method comprising
receiving, by the information processing apparatus, image data and an operation to change an area of an image indicated by the image data, which area is displayed by the display device as a partial image of the display image,
reducing partially or fully, by the information processing apparatus, the image indicated by the image data to generate a reduced image,
generating, by the information processing apparatus, when the reduced image is generated and the operation is received, a nonmagnified image based on the image data or by nonmagnification of some or all of a portion of the reduced image,
transmitting, by the information processing apparatus, data indicating the nonmagnified image to the display device, and
displaying, by the display device, the area based on the nonmagnified image.
10. A non-transitory computer-readable recording medium storing a program which when executed by a computer causes the computer to execute an image display method, the computer displaying a display image and including at least one display device and at least one information processing apparatus connected to the display device, the image display method comprising
receiving, by the information processing apparatus, image data and an operation to change an area of an image indicated by the image data, which area is displayed by the display device as a partial image of the display image,
reducing partially or fully, by the information processing apparatus, the image indicated by the image data to generate a reduced image,
generating, by the information processing apparatus, when the reduced image is generated and the operation is received, a nonmagnified image based on the image data or by nonmagnification of some or all of a portion of the reduced image,
transmitting, by the information processing apparatus, data indicating the nonmagnified image to the display device, and
displaying, by the display device, the area based on the nonmagnified image.
1 image display system
11 PC
2 screen
3 omnidirectional camera
D1 image data
PAR parameters
4 tablet
Number | Date | Country | Kind |
---|---|---|---|
2015-160511 | Aug 2015 | JP | national |
2015-160512 | Aug 2015 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2016/003713 | 8/10/2016 | WO | 00 |