This application claims the benefit of Japanese Application No. 2014-60534 filed in Japan on Mar. 24, 2014, the contents of which are incorporated herein by this reference.
1. Field of the Invention
The present invention relates to a display apparatus and a display method of displaying images from a plurality of image pickup sections.
2. Description of the Related Art
Recently, portable apparatuses equipped with a photographing function (photographing apparatuses), such as digital cameras, have become widespread. Some photographing apparatuses of this kind are provided with a display section and equipped with a function of displaying a photographed image. Some display a menu screen on the display section to facilitate operation of the photographing apparatus. Such a display section is often provided on the back of the portable apparatus body, and a user can perform a photographing operation while checking a through image displayed on the display section on the back when photographing.
Such a display apparatus adopted in a photographing apparatus is capable of displaying another image within an image by image processing or displaying two images in different areas simultaneously. As such an apparatus that acquires two images, Japanese Patent Application Laid-Open Publication No. 2009-147824 proposes an apparatus which, while acquiring video of a target object being aimed at, simultaneously acquires the state around the target object, and records both image pickup data in which the predetermined target object is aimed at and image pickup data showing the state around the target object.
A display apparatus according to the present invention is a display apparatus capable of switching and displaying, for a same object, a first picked-up image obtained by performing image pickup from a first point of view and a second picked-up image obtained by performing image pickup from a second point of view different from the first point of view, the display apparatus including: a communication section performing signal transmission with each of a first image pickup section obtaining the first picked-up image and a second image pickup section performing image pickup from an angle different from an angle of an optical axis of the first image pickup section to obtain the second picked-up image; an instruction inputting section inputting an instruction to select an image to be a display target between the first picked-up image obtained by the first image pickup section and the second picked-up image obtained by the second image pickup section; and a display control section switching display of the first picked-up image and display of the second picked-up image, causing the display of the first picked-up image and the display of the second picked-up image to cooperate with each other based on the instruction of the instruction inputting section.
A display method according to the present invention is a display method of switching and displaying, for a same object, a first picked-up image obtained by performing image pickup from a first point of view and a second picked-up image obtained by performing image pickup from a second point of view different from the first point of view, the method including: performing signal transmission with each of a first image pickup section obtaining the first picked-up image and a second image pickup section performing image pickup from an angle different from an angle of an optical axis of the first image pickup section to obtain the second picked-up image; in accordance with an instruction to select one image of the first and second picked-up images as an image to be a display target, receiving the first picked-up image from the first image pickup section and displaying the one image; and in accordance with an instruction to select another image of the first and second picked-up images as an image to be a display target during the one image being displayed, receiving the second picked-up image from the second image pickup section and displaying the one image and the other image simultaneously, and, after that, displaying only the other image.
The above and other objects, features and advantages of the invention will become more clearly understood from the following description referring to the accompanying drawings.
Embodiments of the present invention will be described in detail below with reference to drawings.
In
The camera control section 1 can be configured with a processor such as a CPU not shown and may operate in accordance with a program stored in a memory not shown. The camera control section 1 is provided with a display control section 1b. The display control section 1b can display picked-up images picked up by the image pickup sections 2 and 3 separately or simultaneously. In the present embodiment, since there is a need to observe a same object from different angles, the control adjusting section 1a is adapted to be able to perform angle-of-view control for causing the image pickup sections 2 and 3 to pick up images of a same object, to collect information and issue instructions with regard to display, to cause a warning to be displayed to a user, and to perform photographing control of photographing timings of the image pickup sections 2 and 3 and the like, so that display switching between the images picked up by the image pickup sections 2 and 3 can be performed smoothly. Of course, the objects of the image pickup sections 2 and 3 do not have to be a same object.
In
A tablet PC 21 corresponds to the camera control section 1 in
As shown in
In
The lens style camera 11L is provided with an image pickup section 12L in which an optical system 12aL is housed in a barrel 35L, and the lens style camera 11R is provided with an image pickup section 12R in which an optical system 12aR is housed in a barrel 35R. The optical systems 12aL and 12aR have focus lenses movable to set a focused state by focusing, zoom lenses for changing magnification in the focused state, and the like in the barrels 35L and 35R, respectively. The optical systems 12a have mechanism sections not shown for driving the lenses and diaphragms.
The image pickup sections 12 are provided with image pickup devices not shown, which are configured with CCD or CMOS sensors, so that object images are led onto image pickup surfaces of the image pickup devices by the optical systems 12a. Control sections 13 corresponding to the photographing condition setting sections 2a and 3a and the photographing timing setting sections 2b and 3b in
The control sections 13 of the lens style cameras 11 are configured with CPUs or the like, and the control sections 13 control each section of the lens style cameras 11 based on signals from the tablet PC 21 to be described later. Photographing control sections 13b generate focus signals, zoom signals and diaphragm control signals to perform driving control of focusing, zooming and the diaphragms of the optical systems 12a. The photographing control sections 13b provide driving signals to the image pickup devices to control image pickup of an object at a predetermined photographing timing. Thereby, photographing timings of the control sections 13 at a time of photographing of a movie and at a time of photographing a still image are controlled. Angle-of-view control sections 13a are adapted to be able to control the image pickup sections 12 to adjust photographing angles of view by a photographing angle of view being specified from the tablet PC 21.
The control sections 13 are given picked-up images from the image pickup sections 12 and can give the picked-up images to recording sections 16 to record them after performing predetermined image signal processing, for example, color adjustment processing, matrix conversion processing, noise removal processing and other various kinds of signal processing. For example, IC memories can be adopted as the recording sections 16. The control sections 13 are adapted to be able to transfer the picked-up images to the tablet PC 21 via the communication sections 15.
The control sections 13 are also adapted to be able to transfer information about the lenses such as lens states of the zoom lens, the focus lens and the like and a diaphragm state to the tablet PC 21 via the communication sections 15. The information about the lenses includes information about distances in an optical axis direction, such as a point of focus and a range of a depth of field. The control sections 13 are adapted to transmit information about a photographing timing also to the tablet PC 21.
The communication sections 15 can communicate with the camera communication section 22 provided in the tablet PC 21 via a predetermined transmission line. As the transmission line, various wired and wireless transmission lines, for example, a USB (universal serial bus) cable or a wireless LAN transmission line such as WiFi (wireless fidelity) can be adopted. The control sections 13 are adapted so that, when communication with the tablet PC 21 is established, photographing is controlled by a control section 25 of the tablet PC 21, and the control sections 13 can transfer various kinds of information about picked-up images and photographing to the tablet PC 21.
The tablet PC 21 has the control section 25 configured, for example, with a CPU, and the control section 25 controls each section of the tablet PC 21. The control section 25 outputs a driving signal for the image pickup devices to the control sections 13 of the lens style cameras 11L and 11R via the camera communication section 22 and receives picked-up images from the lens style cameras 11L and 11R. The control section 25 performs predetermined signal processing, for example, color adjustment processing, matrix conversion processing, noise removal processing, and other various kinds of signal processing for the picked-up images read out.
An operation section 26 is also arranged on the tablet PC 21. The operation section 26 is configured with various operation sections such as switches, keys and a software keyboard provided on the tablet PC 21, which are not shown, and is adapted to generate an operation signal based on a user operation and output the operation signal to the control section 25. The control section 25 controls each section based on the operation signal.
The control section 25 can perform processing related to recording and reproduction of a picked-up image. For example, the control section 25 can perform compression processing of a signal-processed photographed image and give the compressed image to a recording section 24 to cause the recording section 24 to record the compressed image. As the recording section 24, various recording media such as an IC memory can be adopted, for example, and the recording section 24 can record image information, voice information and the like to a recording medium.
The display control section 27 executes various kinds of processing related to display. The display control section 27 can be given a signal-processed photographed image from the control section 25 and give the photographed image to the display section 28. The display section 28 has the display screen 28a such as an LCD, and displays the image given from the display control section 27. The display control section 27 is also adapted to be able to cause various menu displays and the like to be displayed on the display screen 28a of the display section 28. The control section 25 can read out a picked-up image recorded in the recording section 24 and perform expansion processing thereof. The display control section 27 can reproduce the recorded image by giving the expansion-processed picked-up image to the display section 28.
A touch panel not shown is provided on the display screen 28a of the display section 28. The touch panel can generate an operation signal corresponding to a position on the display screen 28a which the user points at with a finger. The operation signal is provided to the control section 25. Thereby, the control section 25 is adapted to, when the user touches the display screen 28a or slides the finger on the display screen 28a, be able to detect various operations, such as a position the user touches, an operation of bringing fingers close to each other and then separating the fingers (a pinch operation), a slide operation and a position reached by the slide operation, a slide direction and a period of touching, and execute processing corresponding to a user operation. For example, switching between screens is performed by a touch operation.
Note that the display screen 28a is arranged, for example, such that it occupies substantially a whole area of a front of the case 21a of the tablet PC 21, and the user can check picked-up images displayed on the display screen 28a of the display section 28 at a time of photographing by the lens style cameras 11 and perform a photographing operation while checking the picked-up images.
The tablet PC 21 also has a trimming processing section 29. The trimming processing section 29 is adapted to, when a trimming range is specified by the control section 25, perform trimming of picked-up images from the lens style cameras 11L and 11R and output them.
In comparison, in an example of
In the present embodiment, it is enabled to photograph a same object with the two lens style cameras 11L and 11R and to perform photographing while smoothly switching between picked-up images from the lens style cameras 11L and 11R with different points of view. In the present embodiment, in order to realize such smooth switching between picked-up images, photographing support is performed so that the user can reliably photograph a common object with each of the cameras 11. For example, the control section 25 is adapted to judge whether the photographing angle of view of each of the cameras 11 is appropriate or not and to control the display control section 27 to cause a result of the judgment to be displayed as display for the photographing support.
It is assumed that flowers 41a and 41b, which are objects, are positioned on the optical axis of the lens style camera 11R. It is assumed that the lens style camera 11L is arranged with its optical axis inclined by an inclination θ1 relative to the optical axis of the lens style camera 11R. The flower 41b is positioned within the photographing ranges of both lens style cameras 11L and 11R and can be photographed simultaneously by the lens style cameras 11L and 11R. On the other hand, the flower 41a is positioned outside the photographing range of the lens style camera 11L and cannot be photographed by the lens style camera 11L. A limit position of the photographing range of the lens style camera 11L is defined by a distance Dmin from the lens style camera, and the distance Dmin is determined as follows.
A distance Dcls to a point of intersection between the optical axes of the lens style cameras 11L and 11R is given by:
Dcls=B/tan θ1
where a distance between the lens style cameras 11L and 11R is denoted by B.
If the angle of view of the lens style camera 11L is denoted by θc2, the following is obtained:
Dmin·tan(θ1+θc2)=B
Therefore, the limit distance Dmin is given by equation (1) below:
Dmin=B/tan(θ1+θc2) (1)
The inclination θ1 and the distance B in the above equation (1) are fixed and known values which are determined by the attaching devices 32L and 32R attached to the lens style cameras 11L and 11R. The angle of view θc2 is a value which changes according to a zoom operation of the lens style camera 11L. However, it is a value according to control of an angle-of-view control section 13aL of the lens style camera 11L and is a value which the control section 25 can grasp.
The control section 25 can recognize a distance to an object based on information given from the lens style camera 11R. When the distance to the object is smaller than Dmin obtained by the above equation (1), the control section 25 can cause a display to be displayed indicating that it is not possible to photograph the object by one camera. The control section 25 may be adapted to, when it is possible to make Dmin smaller than the distance to the object by a zoom operation, display an indication to that effect. Further, the control section 25 may be adapted to, when a distance Dmin smaller than the distance to the object can be obtained by a zoom operation, compulsorily control the angle of view of the lens style camera 11L to enable the lens style camera 11L to photograph the object.
Note that distances from the lens style cameras 11L and 11R are determined based on lens principal points. However, even if distances from lens surfaces or lens attached surfaces are determined, errors are relatively small and can be ignored. As for the distance B also, it does not matter even if lengths of the lenses are ignored.
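The geometry described above can be sketched in code. The following is a minimal illustration of equation (1) together with the expression for Dcls; the function and parameter names are assumptions for illustration and are not part of the specification, angles are taken in radians, and θc2 is used exactly as it appears in equation (1).

```python
import math

def coverage_limits(baseline_b, theta1, theta_c2):
    """Compute Dcls (distance to the intersection of the two optical
    axes) and Dmin (limit distance of the shared photographing range)
    from the distance B between the cameras, the inclination theta1 of
    one optical axis, and the angle of view theta_c2.
    Names are illustrative; angles are in radians."""
    d_cls = baseline_b / math.tan(theta1)             # Dcls = B / tan(theta1)
    d_min = baseline_b / math.tan(theta1 + theta_c2)  # equation (1)
    return d_cls, d_min
```

For example, with B = 0.1, θ1 = 0.1 and θc2 = 0.2, the limit distance Dmin comes out smaller than Dcls, consistent with the description that an object nearer than Dmin falls outside the range of the camera 11L.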
Next, an operation of the embodiment configured as described above will be described with reference to
Now, it is assumed that the user is going to photograph a common object by the two cameras 11L and 11R, as shown in
At step S21, the control sections 13 of the cameras 11L and 11R judge whether a power source has been turned on or not. When the power source is turned on, the control sections 13 judge whether a photographing mode has been specified or not (step S22). When the photographing mode has been specified, the control sections 13 control the image pickup sections 12 to pick up an image of the object. Picked-up images obtained by the image pickup sections 12 are taken in by the control sections 13 to obtain through images (step S23). At step S24, the control sections 13 acquire point-of-focus information Dp1 and angle-of-view information.
On the other hand, at step S1, the control section 25 of the tablet PC 21 judges whether a two-camera cooperation mode has been specified or not. When the two-camera cooperation mode is specified, the control section 25 judges whether it is a time to start cooperation photographing or not at step S2. If it is the time to start the cooperation photographing, the control section 25 performs camera communication with each of the cameras 11L and 11R to judge the functions and performance of each of the cameras 11L and 11R and specify an angle of view of each of the cameras 11L and 11R at step S3. Note that the angle of view of each of the cameras 11L and 11R may be set to a maximum angle of view at initialization. At step S4, the control section 25 requests a through image from a camera determined in advance between the cameras 11L and 11R or a camera specified by the user and displays a received image. At this time, the frame rates of the respective cameras and the timings of the respective frames are adjusted to be the same so that the cameras operate as if they were a single camera. Of course, a photographing start timing and a frame rate may be specified by the tablet PC 21.
Now, it is assumed that, for example, the object is positioned on the optical axis of the camera 11R as shown in
At step S25, the control sections 13 of the cameras 11 judge whether there is a communication request or not. When the through image communication request occurs from the tablet PC 21, the camera 11R which receives the request transmits an acquired through image to the tablet PC 21 via the communication section 15R (step S26).
The control section 25 of the tablet PC 21 gives the through image received at step S4 to the display control section 27 to cause the display control section 27 to display the through image.
Next, the control section 25 requests transmission of the point-of-focus information Dp1, the angle-of-view information and the like. At step S27, each of the cameras 11L and 11R transmits the point-of-focus information Dp1 and the angle-of-view information. When receiving the point-of-focus information Dp1, the control section 25 judges whether it is possible to pick up images of a same object with the two cameras or not at step S6.
That is, by calculating the above equation (1) using the information acquired from each of the cameras 11L and 11R, the control section 25 judges whether or not the point-of-focus information Dp1 is smaller than Dcls and equal to or larger than Dmin and whether or not the point-of-focus information Dp1 is smaller than Dmin. The control section 25 also judges whether the point-of-focus information Dp1 corresponds to Dcls or not.
If the point-of-focus information Dp1 corresponds to Dcls, it means that the object is positioned on the optical axes of both of the cameras 11L and 11R. Therefore, at step S7, the control section 25 causes an OK display, indicating that the picked-up images of both of the cameras 11L and 11R can be displayed at the center of the screen, to be displayed on the display screen 28a.
If the point-of-focus information Dp1 is smaller than Dcls and is equal to or larger than Dmin, the control section 25 causes the display control section 27 to display a display (warning 1) indicating that, though images of the object are picked up by both of the cameras 11L and 11R, the object of one camera 11L is not displayed at the center of the screen. If the point-of-focus information Dp1 is smaller than Dmin, the control section 25 causes the display control section 27 to display a display (warning 2) indicating that the object is not picked up by the camera 11L.
In a case of displaying the warning 1, the control section 25 may further perform trimming according to the point-of-focus information Dp1 at step S7 to perform control so that the object is displayed at the center of the screen.
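The judgment at steps S6 and S7 can be summarized as a simple classification of the point-of-focus information Dp1 against Dcls and Dmin. The sketch below is illustrative only: the function name, the tolerance used for the "corresponds to Dcls" test, and the handling of the case Dp1 greater than Dcls (not detailed in the description) are all assumptions.

```python
def judge_coverage(dp1, d_cls, d_min, tol=1e-6):
    """Classify the point of focus Dp1 (illustrative sketch of the
    judgment of steps S6/S7; names and tolerance are assumptions)."""
    if abs(dp1 - d_cls) <= tol:
        return "OK"         # object lies on both optical axes: OK display at center
    if d_min <= dp1 < d_cls:
        return "warning 1"  # picked up by both cameras, but off-center for camera 11L
    if dp1 < d_min:
        return "warning 2"  # object is not picked up by camera 11L at all
    return "OK"             # dp1 > d_cls: assumed displayable (case not detailed)
```

In the warning 1 case, the description notes that trimming according to Dp1 can additionally be performed so that the object is displayed at the center of the screen.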
Next, the control section 25 judges whether a camera switching operation has been made or not at step S8.
When the camera switching operation is performed, a through image is requested from the other camera different from the camera which has outputted the through image displayed currently (step S9). That is, in this case, a through image is requested from the camera 11L.
In response to the through image request, the camera 11L transmits the image being picked up to the tablet PC 21 as a through image. When receiving the through image, the tablet PC 21 causes the through image to be displayed on the display screen 28a by the display control section 27.
By controlling the angle of view of the camera 11L, the control section 25 can cause the sizes of picked-up images 55b and 56b from the cameras 11R and 11L to be almost the same, as shown in
An upper side of
It is assumed that, in the camera 11R, the optical image of the flower 41 is formed within a range of a length l1 on an image pickup surface of the image pickup device 61R. It is assumed that, in the camera 11L, the optical image of the flower 41 is formed within a range of a length l2 on an image pickup surface of the image pickup device 61L. The length l1 is given by l1=H×F1/D. The length l2 is given by l2=H×F2/D. Therefore, by adjusting the angle of view so that the optical image of the flower 41 is formed within a range of F2/F1 of the length S of the image pickup device 61L, the sizes of the images obtained by the cameras 11R and 11L become the same. By such a scheme, a feeling of discontinuity given at a time of photographing one object is eliminated, and the user can make a confirmation easily and effortlessly. In a case of performing photographing of a movie, camera shake is prevented, and a smooth transition effect is obtained.
Next, the control section 25 displays a picked-up image 57 from the camera 11L on a whole area of the display screen 28a as shown in
At next step S11, the control section 25 judges whether a photographing instruction has been issued or not. If a photographing instruction has been given, a photographing request is issued to the camera which has outputted the picked-up image displayed currently. In this case, the photographing request is issued to the camera 11L. When detecting that communication for photographing has been done, at step S28, a control section 13L of the camera 11L performs photographing and transfers a picked-up image to the tablet PC 21 at step S29. The control section 25 of the tablet PC 21 gives the picked-up image transferred from the camera 11L to the recording section 24 to cause the recording section 24 to record the picked-up image.
Note that, when receiving a control signal for angle-of-view adjustment from the control section 25 of the tablet PC 21, the cameras 11 cause the process to transition from step S30 to step S31 and perform zoom processing for adjustment to a specified angle of view.
As described above, in the present embodiment, it is possible to, in a case of photographing a same object by two image pickup sections with different points of view, perform photographing support, for example, presenting information for obtaining a suitable angle of view to a user or automatically adjusting an angle of view and trimming so that the user can certainly photograph the common object with each image pickup section. It is also possible to perform photographing while smoothly switching between picked-up images from the image pickup sections with different points of view.
There may be a case where WiFi or the like is adopted for communication between the lens style cameras 11L and 11R and the tablet PC 21. When the communication section of the tablet PC 21 can secure only a communication line of one system, it is necessary for the tablet PC 21 to perform time-division communication with each of the lens style cameras 11L and 11R. In this case, it is conceivable that, at a time of switching from a picked-up image of one camera to a picked-up image of the other camera, a period during which an image is not displayed may occur or discontinuity may occur in an image, as shown in
At step S41 in
In order to perform communication between the tablet PC 21 and Cameras 1 and 2, a communication establishment processing period is set at a beginning of a communication period in
At a time of the transition, the control section 25 of the tablet PC 21 causes the process to transition to step S2. In the present embodiment, at the time of starting the cooperation photographing, communications with both cameras are performed at next step S42 to perform judgment about functions and angles of view and to acquire photographing timing information from the camera from which a picked-up image is currently being received.
Now, it is assumed that a state of displaying a picked-up image from Camera 1, between Cameras 1 and 2, transitions to a state of displaying picked-up images of both of Cameras 1 and 2 as shown in
At step S5, the control section 13 of Camera 1 transmits the information about a photographing timing together with an angle of view and specification information. At step S36, the control section 13 transmits a through image and information for timing adjustment in response to the through image transmission request.
At step S43, the control section 25 of the tablet PC 21 requests Camera 1 to transmit point-of-focus information. Camera 1 transmits point-of-focus information Dp1. The control section 25 of the tablet PC 21 receives the point-of-focus information Dp1 and transmits information about a focusing position to Camera 2 using the point-of-focus information Dp1.
The control section 25 can control image pickup by Camera 2 based on the various pieces of information about image pickup by Camera 1 obtained through steps S42, S4 and S43, and smooth camera switching is enabled. When camera switching occurs at step S8, the control section 25 judges whether immediate change is performed or not at step S44. The immediate change means that, at a time of camera switching, the state in
The control section 25 gives a switching signal to the camera communication section 22 to switch between communication with Camera 1 and communication with Camera 2. As shown in
During transition, each camera holds a picked-up image and transmits the picked-up image in a time period shorter than a photographing time period. Thereby, even in a case of performing time-division transmission of image data of both of Cameras 1 and 2, all picked-up images of Cameras 1 and 2 can be transferred. Note that, though
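The holding-and-transfer idea can be modeled with a small timing sketch. This is an illustrative model with assumed names and units, not the exact protocol of the embodiment: it only expresses that, when one frame can be transmitted in less than half the photographing period, the frames held by Cameras 1 and 2 fit alternately on a single time-division line without loss.

```python
def interleave_schedule(n_frames, frame_period, tx_time):
    """Sketch of time-division transfer over one communication line.
    Each camera holds its picked-up image and transmits it in a time
    shorter than the photographing period; the two transmissions are
    interleaved within each frame period (illustrative model)."""
    assert 2 * tx_time <= frame_period, "line too slow for both cameras"
    schedule = []
    t = 0.0
    for i in range(n_frames):
        # (camera, frame index, transmission start, transmission end)
        schedule.append(("cam1", i, t, t + tx_time))
        schedule.append(("cam2", i, t + tx_time, t + 2 * tx_time))
        t += frame_period
    return schedule
```

For instance, at 30 frames per second a per-frame transmission time of 10 ms satisfies the condition, and the resulting transmission slots never overlap.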
As shown in
In the information communication in
Similarly to the first embodiment, it is possible to display a picked-up image from Camera 1 and a picked-up image from Camera 2 with similar sizes by controlling angles of view and trimming, and smooth transition is possible.
As shown in
In a case where immediate switching from display of the picked-up image of Camera 1 to display of a picked-up image of Camera 2 is performed, the control section 25 instructs the communication section 22 to start communication with Camera 2. After the predetermined communication establishment processing period after the start, communication with Camera 2 is started, and a picked-up image of Camera 2 is transferred. In this case, a period occurs during which a picked-up image is transferred neither from Camera 1 nor from Camera 2 as shown in
As described above, in the present embodiment, effects similar to those of the first embodiment can be obtained. In addition, when a tablet PC and a camera can perform only one-to-one communication, the tablet PC performs time-division communication with the two cameras and controls image pickup by transferring information about photographing of one camera to the other camera; thereby, it is possible to continuously display picked-up images from the respective cameras and to smoothly switch the images. Though an image pickup timing of one camera is adjusted to that of the other camera here, the tablet PC may specify a timing to both cameras to control and synchronize the timings of both cameras. In the case of repeatedly displaying a picked-up image until display of a picked-up image from Camera 2 becomes possible, switching between displays may be performed by fading in/fading out the displays. Without such detailed control, a feeling of discontinuity is given during slow-motion or fast-forward display.
The present embodiment is an example in which a smartphone having an internal camera and a lens style camera attached to a case of the smartphone are adopted as two image pickup sections. Similarly to the first embodiment, the present embodiment makes it possible to perform cooperative display, for example, as shown in
In
In
In the present embodiment, the control section 83 of the internal camera 81 is adapted to give and receive information about image pickup to and from the control section 75. That is, the control section 75 can acquire the information about image pickup of the internal camera 81 without performing communication by the camera communication section 72, and, therefore, the camera communication section 72 has to perform communication only with the lens style camera 11.
Next, an operation of the embodiment configured as described above will be described with reference to
Now, it is assumed that a state of displaying a picked-up image from Camera 1, between Cameras 1 and 2, transitions to a state of displaying picked-up images of both of Cameras 1 and 2 as shown in
The control section 75 of the smartphone 71 displays a picked-up image of the internal camera 81, which is Camera 1. In this case, the control section 75 has specified a photographing timing and the like to the internal camera 81, which is Camera 1, or has acquired photographing timing information directly from the internal camera 81, and the camera communication section 72 is not necessary for giving and receiving information to and from the internal camera 81. Therefore, the camera communication section 72 can be used exclusively for communication with the lens style camera 11. The control section 75 provides the photographing timing information and the like about the internal camera 81 to the lens style camera 11 via the camera communication section 72 (step S52). At step S4, the control section 75 requests a through image from the external lens style camera 11.
Next, the control section 75 performs judgment of a degree of similarity between the picked-up image from Camera 1 and the picked-up image from Camera 2 at step S53 and generates a warning according to the degree of similarity at step S54. By judging the degree of similarity, it is possible to judge whether or not an object is photographed by the two cameras in similar photographing states. If the degree of similarity is high, it can be judged that smooth switching from Camera 1 to Camera 2 is possible. On the contrary, if the degree of similarity is low, it can be judged that switching from Camera 1 to Camera 2 is not performed smoothly. Therefore, by generating a warning according to the degree of similarity, the control section 75 can perform operation support for smooth switching from the picked-up image of Camera 1 to the picked-up image of Camera 2 for the user. Note that, in the present embodiment also, methods similar to those of steps S6 and S7 of each embodiment described above may be used. Such image pickup section switching makes it possible to easily perform varied photographing and confirmation.
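The description does not fix how the degree of similarity at step S53 is computed. As one hedged illustration only, a normalized gray-level histogram intersection could serve as such a measure; all function names and the threshold value below are assumptions, not part of the embodiment.

```python
def histogram_similarity(img1, img2, bins=16):
    """Illustrative degree-of-similarity measure: normalized gray-level
    histogram intersection of two images given as nested lists of pixel
    values in [0, 255]. Returns a value in [0, 1]."""
    def hist(img):
        h = [0] * bins
        n = 0
        for row in img:
            for v in row:
                h[min(v * bins // 256, bins - 1)] += 1
                n += 1
        return [c / n for c in h]
    h1, h2 = hist(img1), hist(img2)
    return sum(min(a, b) for a, b in zip(h1, h2))

def similarity_warning(score, threshold=0.5):
    """Return a warning string when smooth switching is judged unlikely
    (the threshold is an assumed value)."""
    if score >= threshold:
        return None
    return "warning: views differ; switching may not be smooth"
```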
As described above, in the present embodiment, the camera communication section 72 needs to communicate only with the lens style camera 11, so the smartphone 71 and the lens style camera 11 can communicate with each other continuously.
As shown in the figure, in this case, the information about the photographing timing of the internal camera 81 and the like is provided to the lens style camera 11, so that a picked-up image from the lens style camera 11 can be acquired and displayed continuously after a picked-up image from the internal camera 81.
Other operations, such as adjusting the photographing timings of the respective frames so that they coincide, are similar to those of the second embodiment.
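One way to align the two cameras' frames in time can be sketched as follows. This is a minimal assumption-laden sketch, not the second embodiment's actual procedure: it simply pairs each Camera 1 frame timestamp with the nearest Camera 2 frame timestamp.

```python
def align_frames(ts_cam1, ts_cam2):
    """Pair each Camera 1 frame timestamp (ms) with the nearest Camera 2 timestamp."""
    return [(t1, min(ts_cam2, key=lambda t2: abs(t2 - t1)))
            for t1 in ts_cam1]

# Camera 1 frames at ~30 fps; Camera 2 frames slightly offset in time
pairs = align_frames([0, 33, 66], [5, 30, 70])
```

Pairing by nearest timestamp keeps the displayed frames of the two cameras in step even when their photographing timings are offset.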
Further, though the description in each of the embodiments of the present invention has used a digital camera as the apparatus for photographing, the camera may also be a digital single-lens reflex camera, a compact digital camera, or a camera for movies such as a video camera or a movie camera. An internal camera built into a portable information terminal (PDA: personal digital assistant), such as a mobile phone or a smartphone, is of course also possible, as are industrial and medical optical apparatuses such as an endoscope and a microscope, a surveillance camera, an onboard camera, and a stationary type camera, for example, a camera attached to a television set or a personal computer.
The present invention is not limited to each of the above embodiments as it is, and the components can be modified and embodied at the stage of practicing the invention within a range not departing from the spirit of the invention. Further, various inventions can be formed by appropriately combining the plurality of components disclosed in the above embodiments. For example, some of the components shown in the embodiments may be deleted, and components of different embodiments may be appropriately combined.
Note that, even if an operation flow in the claims, the specification, and the drawings is described using "first", "next", or the like for convenience, this does not mean that it is essential to perform the operation flow in that order. It goes without saying that any step constituting the operation flow that does not influence the essence of the invention can be omitted as appropriate.
In the techniques described here, many of the controls and functions described mainly with flowcharts can be set by a program, and the controls and functions described above can be realized by a computer reading and executing the program. The whole or a part of the program can be recorded or stored, as a computer program product, in a portable medium such as a nonvolatile memory (for example, a flexible disk or a CD-ROM) or in a storage medium such as a hard disk or a volatile memory, and can be distributed or provided at the time of shipment of the product, by a portable medium, or through a communication line. A user can easily realize the display apparatus and display method of the present embodiments by downloading the program via a communication network and installing it into a computer, or by installing it into the computer from a recording medium.
Number | Date | Country | Kind
---|---|---|---
2014-060534 | Mar 24, 2014 | JP | national