The present invention relates to an electronic device that can change a display portion of an image and a control method therefor.
Hitherto, there have been methods for changing the display range of an image in accordance with the orientation of a device. Japanese Patent Laid-Open No. 2012-75018 discloses that, when a digital camera rotates and moves in a panorama playback mode, the range of a portion of a panorama image in the direction in which the digital camera faces is displayed. In addition, there are methods for switching an image displayed on a display screen. Japanese Patent Laid-Open No. 2014-222829 discloses that a plurality of images are aligned vertically in a display region and the image displayed in the display region can be switched by scrolling.
In Japanese Patent Laid-Open No. 2012-75018, when display of an image is started, a range of the image corresponding to the orientation of the digital camera at that moment is displayed. Thus, the image cannot be checked from a reference range or from a range desired by the user, and the user needs to change the display range by changing the orientation of the digital camera. In Japanese Patent Laid-Open No. 2014-222829, when display of an image is started, in a case where the user wants to display the range of a portion of the image in an enlarged manner, the user needs to perform an enlargement operation.
PTL 1: Japanese Patent Laid-Open No. 2012-75018
PTL 2: Japanese Patent Laid-Open No. 2014-222829
The present invention has been made in light of the above-described problems, and an object of the present invention is to increase ease of operation for displaying the range of a portion of an image in a case where a plurality of images can be arranged and displayed in a display region.
In order to achieve the above-described object, an electronic device according to the present invention includes a detection means capable of detecting a change in an orientation of a display means, a switching means that switches an image displayed on a display surface among a plurality of images, a change means that changes a portion of an image displayed on the display surface, a recording means that records information regarding a portion of an image displayed on the display surface in a case where switching is performed by the switching means among the plurality of images, and a control means that performs control such that, in a case where a portion of a first image among the plurality of images is displayed on the display surface, the change means changes, in accordance with detection of the change in the orientation of the display means by the detection means, a displayed portion by an amount corresponding to the change in the orientation, the information on a second image is changed in accordance with the change in the orientation in a case where the second image meets a predetermined condition when a displayed image is switched from the first image to the second image, and the information on the second image is not changed in a case where the second image does not meet the predetermined condition.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
In the following, preferred embodiments of the present invention will be described with reference to the drawings.
The CPU 101 is a controller that controls the entirety of the smartphone 100, and is constituted by at least one processor. The memory 102 is constituted by, for example, a random access memory (RAM), which is a volatile memory using a semiconductor device. The memory 102 stores image data, which is digital data converted from data obtained by the image capturing unit 22, and image data to be displayed on the display unit 105. The memory 102 has a storage capacity sufficient for storing a predetermined number of still images and a moving image and audio of a predetermined duration. In addition, the memory 102 also serves as a memory for image display (a video memory). A RAM is used as the system memory 52. For example, constants and variables for the operation of the CPU 101, and a program read out from the nonvolatile memory 103, are loaded into the system memory 52.
The CPU 101 controls, using the memory 102 as a work memory, the individual units of the smartphone 100 in accordance with, for example, a program stored in the nonvolatile memory 103. Image data, audio data, other data, and various programs for the operation of the CPU 101 are stored in the nonvolatile memory 103. The nonvolatile memory 103 is constituted by, for example, a flash memory or a read-only memory (ROM).
On the basis of control performed by the CPU 101, the image processing unit 104 performs various types of image processing on, for example, images stored in the nonvolatile memory 103 and a storage medium 108, a video signal acquired via the external I/F 109, and images acquired via the communication I/F 110. Image processing performed by the image processing unit 104 includes, for example, analog to digital (A/D) conversion processing, digital to analog (D/A) conversion processing, image data encoding processing, compression processing, decoding processing, enlargement/reduction processing (resizing), noise reduction processing, and color conversion processing. In addition, the image processing unit 104 also performs various types of image processing such as panorama development, mapping processing, and conversion on an omnidirectional image or a wide range image having data of a wide but not omnidirectional range. The image processing unit 104 may be constituted by a dedicated circuit block for performing specific image processing. In addition, depending on the type of image processing, it is also possible that the CPU 101 performs image processing in accordance with a program without using the image processing unit 104.
On the basis of control performed by the CPU 101, the display unit 105 displays, for example, images and a graphical user interface (GUI) screen constituting a GUI. The CPU 101 generates a display control signal in accordance with a program, and controls various units of the smartphone 100 so as to generate a video signal to be displayed on the display unit 105 and to output the video signal to the display unit 105. The display unit 105 displays images based on the output video signal.
Note that the smartphone 100 itself may include only up to an interface for outputting a video signal to be displayed on the display unit 105, and the display unit 105 may be configured as an external monitor (such as a television).
The operation unit 106 is an input device for accepting an operation by a user, and examples of the operation unit 106 include a character information input device such as a keyboard, pointing devices such as a mouse and a touch panel, and buttons, a dial, a joystick, a touch sensor, and a touch pad. Note that the touch panel is an input device that overlies the display unit 105 in a planar manner and that is configured to output coordinate information corresponding to a touched position.
The storage medium 108 such as a memory card, a compact disc (CD), or a digital versatile disc (DVD) is removable from the storage medium I/F 107. On the basis of control performed by the CPU 101, the storage medium I/F 107 reads out data from the storage medium 108 attached thereto and writes data into the storage medium 108.
The external I/F 109 is an interface that is connected to an external device by a cable or wirelessly and that inputs and outputs video signals and audio signals.
The communication I/F 110 is an interface for communicating with, for example, an external device and the Internet 111 and for transmitting and receiving various types of data such as files and commands.
The audio output unit 112 outputs voice and sound from a moving image or music data, operation sound, a ring tone, and various types of notification sound. The audio output unit 112 may perform audio output through, for example, wireless communication.
The orientation detection unit 113 detects the orientation of the smartphone 100 (the display unit 105) with respect to the direction of gravity. On the basis of the orientation detected by the orientation detection unit 113, it can be determined, for example, whether the smartphone 100 is horizontally held, is vertically held, faces upward, faces downward, or is tilted. As the orientation detection unit 113, at least one of sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, and an azimuth sensor can be used, and it is also possible to combine and use a plurality of sensors.
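As a rough illustration, the determination from the detected orientation can be pictured as in the following minimal sketch, which assumes a 3-axis accelerometer reading; the axis conventions and threshold values are illustrative assumptions, not part of the embodiment.

```python
# Minimal sketch: classifying device orientation from a 3-axis
# accelerometer reading (gravity vector), as the orientation
# detection unit 113 might. Axis conventions and thresholds are
# illustrative assumptions.
def classify_orientation(ax, ay, az, g=9.8, tilt_threshold=0.8):
    """Return a rough orientation label from acceleration in m/s^2."""
    # Normalize each component relative to gravity.
    nx, ny, nz = ax / g, ay / g, az / g
    if nz > tilt_threshold:
        return "faces upward"       # display surface pointing up
    if nz < -tilt_threshold:
        return "faces downward"
    if abs(ny) > abs(nx):
        return "vertically held"    # long edge roughly along gravity
    if abs(nx) > abs(ny):
        return "horizontally held"
    return "tilted"

print(classify_orientation(0.2, 9.6, 1.0))  # -> "vertically held"
```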
Note that the operation unit 106 includes the touch panel 106a. The CPU 101 can detect the following operations performed on or the following states of the touch panel 106a.
The touch panel 106a is newly touched by a user's finger or pen that has not been touching the touch panel 106a, that is, touching is started (hereinafter referred to as Touch-Down).
A state in which the touch panel 106a is being touched by the user's finger or pen (hereinafter referred to as Touch-On).
The user's finger or pen is moving while touching the touch panel 106a (hereinafter referred to as Touch-Move).
The user's finger or pen touching the touch panel 106a is moved away from the touch panel 106a, that is, touching is ended (hereinafter referred to as Touch-Up).
A state in which nothing is touching the touch panel 106a (hereinafter referred to as Touch-Off).
When Touch-Down is detected, Touch-On is simultaneously detected, too. Unless Touch-Up is detected after detection of Touch-Down, Touch-On is usually continuously detected.
Also in a case where Touch-Move is detected, Touch-On is simultaneously detected. Even when Touch-On is detected, if the touch position is not being moved, Touch-Move is not detected.
When Touch-Up is detected for all the fingers or pens that have been touching, Touch-Off is detected.
These operations and states, together with the coordinates of the position where the user's finger or pen is touching the touch panel 106a, are reported to the CPU 101 via the internal bus, and the CPU 101 determines, on the basis of the reported information, what operation (touch operation) has been performed on the touch panel 106a.
Regarding Touch-Move, the movement direction of the user's finger or pen moving on the touch panel 106a can also be determined on the basis of changes in the position coordinates, for each of the vertical component and the horizontal component on the touch panel 106a. In a case where Touch-Move is detected over at least a predetermined distance, it is determined that a slide operation has been performed.
An operation in which the user's finger is quickly moved some distance on the touch panel 106a while touching the touch panel 106a and is then simply moved away from the touch panel 106a is called a flick. In other words, a flick is an operation for moving the user's finger quickly along the surface of the touch panel 106a such that the touch panel 106a is struck lightly with the finger. When it is detected that Touch-Move has been performed for at least a predetermined distance at a predetermined speed or faster and Touch-Up is then detected, it can be determined that a flick has been performed (that is, that a flick has been performed following a slide operation).
Furthermore, when a plurality of positions (for example, two positions) are touched simultaneously, an operation for bringing the touch positions close to each other is called Pinch-In, and an operation for moving the touch positions away from each other is called Pinch-Out. Pinch-Out and Pinch-In are collectively called pinch operations (or simply Pinch).
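How such gestures might be classified from the reported coordinates can be pictured with the following minimal sketch; the distance and speed thresholds and the event format are assumptions, not values from the embodiment.

```python
# Minimal sketch of gesture classification under the rules above:
# a slide is a Touch-Move of at least a minimum distance, and a
# flick is a fast Touch-Move followed by Touch-Up. Thresholds are
# illustrative assumptions.
import math

SLIDE_MIN_DIST = 20.0   # pixels (assumed)
FLICK_MIN_SPEED = 1.0   # pixels per millisecond (assumed)

def classify_single_touch(down, up):
    """down/up: (x, y, t_ms) at Touch-Down and Touch-Up."""
    dist = math.hypot(up[0] - down[0], up[1] - down[1])
    dt = max(up[2] - down[2], 1)  # avoid division by zero
    if dist < SLIDE_MIN_DIST:
        return "tap"
    if dist / dt >= FLICK_MIN_SPEED:
        return "flick"            # fast slide, then release
    return "slide"

def classify_pinch(p0_start, p1_start, p0_end, p1_end):
    """Compare the distance between two touch points before and after."""
    d_start = math.hypot(p0_start[0] - p1_start[0], p0_start[1] - p1_start[1])
    d_end = math.hypot(p0_end[0] - p1_end[0], p0_end[1] - p1_end[1])
    return "Pinch-Out" if d_end > d_start else "Pinch-In"

print(classify_single_touch((0, 0, 0), (120, 0, 80)))     # -> "flick"
print(classify_pinch((0, 0), (100, 0), (0, 0), (40, 0)))  # -> "Pinch-In"
```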
As the touch panel 106a, any touch panel may be used among touch panels using various methods such as a resistive film method, a capacitive sensing method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an image recognition method, and an optical sensing method.
There are a method in which touching is detected when something is in contact with a touch panel and a method in which touching is detected when a user's finger or pen is in close vicinity to a touch panel, and either method may be used.
First, a VR image is an image that can be displayed in VR. VR images include, for example, an omnidirectional image (a 360° image) captured by an omnidirectional camera (a 360° camera) and a panorama image having a wider image range (an effective image range) than a display range that can be displayed at once on a display means. In addition, VR images (VR content) include not only images captured by cameras but also images that are generated using computer graphics (CG) and that can be displayed in VR. VR images also include not only still images but also moving images and live view images (images output to a display unit by acquiring, in almost real time, an image signal continuously read out from an image pickup element of a camera). A VR image has an image range (an effective image range) having a field of view of up to 360 degrees in the up-down direction (a vertical angle, an angle from the zenith, an elevation angle, a depression angle, an altitude angle) and up to 360 degrees in the left-right direction (a horizontal angle, an azimuth angle). In addition, VR images include an image having a wider angle of view (field of view) than the angle of view that can be captured by a normal camera, or a wider image range (effective image range) than a display range that can be displayed at once on a display means, even in a case where the image covers less than 360 degrees in the up-down direction and less than 360 degrees in the left-right direction. For example, an image captured by an omnidirectional camera capable of capturing an image of a subject covering a field of view (an angle of view) of 360 degrees in the left-right direction (a horizontal angle, an azimuth angle) and a vertical angle of 210 degrees centered on the zenith is a type of VR image. That is, an image having an image range with a field of view of 180 degrees (±90 degrees) or more in each of the up-down direction and the left-right direction, and having a wider image range than the range that a person can view at once, is a type of VR image. When this VR image is displayed in VR, a seamless omnidirectional image can be viewed in the left-right direction (the horizontal rotation direction) by changing the orientation in the left-right rotation direction. In the up-down direction (the vertical rotation direction), a seamless image can be viewed within ±105 degrees of the point directly above the user (the zenith); the range beyond 105 degrees from the zenith is a blank region where no image is present. A VR image can also be called “an image whose image range is at least a portion of virtual space (VR space)”.
VR display is a display method with which a display range of a VR image can be changed, the display range displaying an image having a field of view corresponding to the orientation of the smartphone 100 detected by the orientation detection unit 113. In a case where the user views an image using the smartphone 100 set in VR goggles, an image having a field of view corresponding to the orientation of the user's face is displayed. For example, in a VR image, an image having a viewing angle (an angle of view) in which the position located at 0 degrees in the left-right direction (a specific azimuth, for example, the north) and at 90 degrees in the up-down direction (90 degrees from the zenith, that is, horizontal) is treated as the center is displayed at a certain point in time. When, from this state, the orientation of the smartphone is changed such that the smartphone is facing in the opposite direction (for example, the display surface is changed to face the north from the south), the display range is changed, in the same VR image, to an image having a viewing angle in which the position located at 180 degrees in the left-right direction (the opposite azimuth, for example, the south) and at 90 degrees in the up-down direction (horizontal) is treated as the center. In the case where the user views an image using the smartphone 100 set in the VR goggles, when the user turns his or her face so as to face the south from the north (that is, when the user turns around), the image displayed on the smartphone 100 is changed from an image for the north to an image for the south. By performing such VR display, it is possible to visually cause the user to have a feeling as if he or she were actually at the place in the VR image (in the VR space).
As described above, since only a portion of an image is displayed in VR display, a display angle α indicating the display range on the display unit 105 is used; the circle 213 in the lower portion of the corresponding drawing illustrates this display angle.
Display processing according to the present embodiment will be described with reference to the corresponding flowchart.
In S301, the CPU 101 acquires a plurality of images to be displayed, that is, images having image numbers 1 to N via the communication I/F 110. The image numbers indicate the order in which a plurality of images included in one post are displayed. That is, in S301, image data is acquired such that each of the plurality of images to be displayed from now on can be displayed in VR. In S301, the image data acquired via the communication I/F 110 is loaded into the system memory 52.
In S302, the CPU 101 acquires, via the communication I/F 110, display angles α1 to αN indicating display start positions of the images having image numbers 1 to N and display information indicating, for example, whether the images are tagged. For each image, a tag indicates whether the user has determined in advance a display range displayed when display of the image is started. That is, in a case where the user has set a display range in order to start displaying a 360° image from a portion of the 360° image where, for example, a main subject or a subject of interest is seen, the image is tagged. For an image that is not tagged, display is started with a display range the center of which is the display angle αn=0, that is, a reference position. For an image that is tagged, display is started with a display range the center of which is a position specified by the user such as αn=30° or 60°. Also in S302, the display information regarding the image acquired via the communication I/F 110 is loaded into the system memory 52.
In S303, the CPU 101 displays, on the display unit 105, a display range the center of which is at the display angle α1 of the image having image number 1. The displayed image is treated as a display image H, and the display image H is set to the image number of the displayed image. Even in a case where the display angle α1 is set to the center, the range displayed on the display unit 105 changes in accordance with a display magnification and the angle β of the display unit 105 (display means). For example, at α1, the display range in a case of β=30° differs from that in a case of β=210° such that the display range shows an area above the camera at the time of image capturing in the former case and an area below the camera in the latter case.
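One way to picture the relationship among the display angle α, the angle β of the display unit, and the display magnification is the following minimal sketch; the base field-of-view value and the return format are assumptions for illustration.

```python
# Minimal sketch: deriving the displayed range of a 360-degree image
# from the display angle alpha (horizontal), the angle beta of the
# display unit (vertical), and a display magnification. The
# field-of-view value is an illustrative assumption.
def display_range(alpha_deg, beta_deg, magnification=1.0, base_fov=90.0):
    """Return (yaw_min, yaw_max, pitch_center) of the displayed range."""
    fov = base_fov / magnification       # enlarging narrows the range
    yaw_min = (alpha_deg - fov / 2) % 360
    yaw_max = (alpha_deg + fov / 2) % 360
    # beta = 30 looks above the camera and beta = 210 looks below it,
    # matching the example in the text.
    pitch_center = beta_deg % 360
    return yaw_min, yaw_max, pitch_center

print(display_range(0, 30))        # centered at alpha = 0, above the camera
print(display_range(0, 210, 2.0))  # enlarged view below the camera
```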
In S305, the CPU 101 determines whether to change the display image. In a case where it is determined that the display image is to be changed, the process proceeds to S306. Otherwise, the process proceeds to S307. The display image can be changed by performing a scroll operation (issuing a scroll command) on the touch panel 106a, that is, the display surface of the display unit 105. When scrolling is performed upward on the display unit 105, images having larger image numbers are displayed, and when scrolling is performed downward, images having smaller image numbers are displayed.
In S306, the CPU 101 performs active image determination processing. The active image determination processing will be described later with reference to the corresponding flowchart.
In S309, the CPU 101 sets a user operation flag of the active image to ON, and records that in the system memory 52. The user operation flag is a flag for preventing the display angle of the current image from being unintentionally changed. When the user operation flag is set to ON, the display angle of the image is not changed even in a case where the display angles of images other than its related images are changed in accordance with a change in the orientation of the display unit 105. That is, in a case where an image other than its related images is the active image, the display angle of the image for which the user operation flag is ON is not changed even in a case where the orientation of the display unit 105 is changed.
In S310, the CPU 101 sets a user operation flag of the inactive image to ON, and records that in the system memory 52.
In S311, the CPU 101 records, in the system memory 52, the display angle α of the current active image as a display start position and the active image and the inactive image that are simultaneously touched as related images.
In S312, the CPU 101 determines whether to end the display processing. The display processing ends when a touch operation is performed on a return item such as an item 703 illustrated in the corresponding drawing.
In S314, the CPU 101 sets the user operation flag of the image (target image) for which the switching command has been issued in S307 to ON, and the process proceeds to S304 and then to S305.
In S315, the CPU 101 performs display range change processing. The display range change processing is processing in which the display angle α of an active image is changed in accordance with the orientation of the display unit 105, and will be described later with reference to the corresponding flowchart.
Next, the active image determination processing according to the present embodiment will be described.
In S401, the CPU 101 acquires the display state of the image having image number 1. The display state of an image indicates whether the image is displayed and how large the region of the displayed image is within the display region of the display unit 105.
In S403, the CPU 101 determines whether the image having image number n is currently being displayed. In a case where it is determined that the image having image number n is being displayed, the process proceeds to S404. Otherwise, the process proceeds to S407.
In S404, the CPU 101 determines whether, among the images displayed on the display unit 105, the area of a region where the image having image number n is displayed is larger than the areas of regions where the other display images are individually displayed. In a case where it is determined that the region where the image having image number n is displayed is the largest, the process proceeds to S405. Otherwise, the process proceeds to S406.
In S405, the CPU 101 sets the state of the image having image number n to active.
In S406, the CPU 101 sets the state of the image having image number n to inactive.
In S407, the CPU 101 determines whether the determination in S403 has been made for the images having image numbers up to N. That is, it is determined whether all the images included in the post have been determined to be any one of an active image, an inactive image, and an undisplayed image. In a case where the image having image number n corresponds to image number N and it is determined that the above-described determination has been completed, the active image determination processing ends. Otherwise, the process proceeds to S408.
In S408, the CPU 101 increments the image number n (n = n + 1). That is, the determination in and after S403 is performed for the next image number.
In S409, the CPU 101 acquires the display state of the image having image number n as in S401, and the process proceeds to S402 and then to S403.
Note that the active image determination is not limited to the processing described above. Alternatively, in step S404, it may be determined which one of the display target images is the closest to a predetermined position of the display region (for example, the upper left or the center), and the closest image may be determined to be the active image in step S405.
Alternatively, in step S404, it may be determined whether an image is selected, and in a case where an image is selected, the image may be determined to be the active image in step S405.
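A minimal sketch of this active image determination, assuming a hypothetical Image structure that records the displayed area of each image, might look as follows.

```python
# Minimal sketch of the active image determination (S401 to S409):
# among the displayed images, the one occupying the largest area of
# the display region becomes active and the others become inactive.
# The Image structure is a hypothetical stand-in for the display state.
from dataclasses import dataclass

@dataclass
class Image:
    number: int
    displayed_area: float      # 0.0 when the image is not displayed
    state: str = "undisplayed"

def determine_active_image(images):
    shown = [im for im in images if im.displayed_area > 0]    # S403
    if not shown:
        return None
    largest = max(shown, key=lambda im: im.displayed_area)    # S404
    for im in images:
        if im.displayed_area == 0:
            im.state = "undisplayed"
        else:
            im.state = "active" if im is largest else "inactive"  # S405/S406
    return largest

imgs = [Image(1, 0.0), Image(2, 0.6), Image(3, 0.4)]
print(determine_active_image(imgs).number)  # -> 2
```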
Next, display range change processing according to the present embodiment will be described with reference to the corresponding flowchart.
In S501, the CPU 101 determines whether there is a change in the orientation of the smartphone 100 by using the orientation of the smartphone 100 detected by the orientation detection unit 113. In a case where it is determined that there is a change in the orientation of the smartphone 100, the process proceeds to S502. Otherwise, the process proceeds to S505. In S502, the CPU 101 acquires an orientation change amount γ.
In S503, the CPU 101 changes the display range of the active image, and records the display angle of the active image as αa=αa+γ in the system memory 52.
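A minimal sketch of S502 and S503, assuming that the display angle of a 360° image wraps around at 360 degrees, is as follows.

```python
# Minimal sketch of S502/S503: applying an orientation change amount
# gamma to the display angle of the active image. The wraparound at
# 360 degrees is an assumption for a 360-degree image.
def apply_orientation_change(alpha_a, gamma):
    return (alpha_a + gamma) % 360

alpha = 350.0
alpha = apply_orientation_change(alpha, 20.0)
print(alpha)  # -> 10.0: the display angle wraps around the full circle
```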
In S504, the CPU 101 performs update processing. The update processing is processing for updating the display angles of the other images on the basis of a change in the display range of the active image due to a change in the orientation of the smartphone 100. The update processing will be described later with reference to the corresponding flowchart.
In S505, the CPU 101 determines whether the active image is displayed in full screen in the display region of the display unit 105. In a case where it is determined that the active image is displayed in full screen on the display unit 105, the process proceeds to S506. Otherwise, the process proceeds to S304 described above.
In S506, similarly to as in S307, the CPU 101 determines whether the display angle switching command has been issued through a touch operation or a button operation performed by the user. In a case where it is determined that the display angle switching command has been issued, the process proceeds to S507. Otherwise, the process proceeds to S509.
In S507, the CPU 101 switches the display angle of the active image, for which the switching command has been issued in S506.
In S508, the CPU 101 determines whether the full screen display has been ended. The full screen display ends when a tapping operation is performed on the image again. In a case where it is determined that the full screen display ends, the process proceeds to S512. Otherwise, the process returns to S506.
The processing in S509 to S511 is substantially the same as that in S501 to S503 described above.
In S512, the CPU 101 determines whether the active image has been displayed for 360° or more, on the basis of the determinations made in S505 to S511 that have just been performed. That is, it is determined whether the entire range on the XY plane of the image has been displayed in the state of full screen display. In a case where it is determined that all 360° have been displayed, the process proceeds to S513. Otherwise, the process proceeds to S514. Note that, in step S512, it is sufficient to determine whether almost the entire range of the image has been displayed. Thus, when the active image corresponds to 180°, it is determined whether the active image has been displayed for 180° or more. In addition, the determination may be made not only for all 360° or 180° but also for an arbitrary display angle such as 350° or 160°.
In S513, the CPU 101 updates the display angle αa of the active image to the angle updated in S507 or S511.
In S514, the CPU 101 returns the display angle αa of the active image to the angle used before the full screen display was performed. In a case where the user has viewed the range covering 360° or more, that is, the entire range, in the full screen display, it is highly likely that a subject desired by the user is present in the display range at the display angle at which display is currently performed. In contrast, in a case where the user has viewed only a range smaller than 360°, it is highly likely that the image displayed in full screen is not the one the user wants to view, or that the user merely displayed a portion thereof in an enlarged manner, and thus the image is displayed at the previous display angle.
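The commit-or-revert decision in S512 to S514 can be pictured with the following minimal sketch, which assumes hypothetical bookkeeping of how many degrees were viewed during full screen display.

```python
# Minimal sketch of S512 to S514: keep the new display angle only if
# (almost) the whole horizontal range was viewed in full screen,
# otherwise revert to the previous angle. The viewed-degrees
# bookkeeping is an illustrative assumption.
def finish_full_screen(alpha_before, alpha_now, viewed_deg,
                       image_range_deg=360.0, coverage_ratio=1.0):
    """Return the display angle to keep when full screen display ends."""
    if viewed_deg >= image_range_deg * coverage_ratio:  # S512
        return alpha_now % 360     # S513: keep the angle the user reached
    return alpha_before % 360      # S514: revert to the previous angle

print(finish_full_screen(30.0, 120.0, viewed_deg=365.0))  # -> 120.0
print(finish_full_screen(30.0, 120.0, viewed_deg=90.0))   # -> 30.0
```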
Next, update processing according to the present embodiment will be described with reference to the corresponding flowchart.
In S601, the CPU 101 acquires a display angle of and display information on the image having image number 1. In this case, the display information on the image includes whether the image is an active image, information on a time period during which the image has been displayed, tag information, and related-image information.
In S603, the CPU 101 determines whether the image having image number f is the active image. In a case where it is determined that the image is the active image, the process proceeds to S609. Otherwise, the process proceeds to S604.
S604 to S607 illustrate conditions as to whether to change the display angle of each image as the display angle of the active image changes with a change in the orientation of the smartphone 100.
In S604, the CPU 101 determines whether the user operation flag of the image f, which is the target image, is ON. In a case where it is determined that the user operation flag of the image f is ON, the process proceeds to S605. Otherwise, the process proceeds to S606.
In S605, the CPU 101 determines whether a predetermined time period has elapsed after the image f was hidden. The predetermined time period is, for example, three minutes or ten minutes, and the elapsed time is counted only within the period in which the currently displayed post has been displayed; a period elapsed since the images of the currently displayed post were displayed on a previous occasion is not counted. In a case where it is determined that the predetermined time period has elapsed after the image f was hidden, the process proceeds to S608. Otherwise, the process proceeds to S609.
In S606, the CPU 101 determines whether the image f is a tagged image. In a case where it is determined that the image f is a tagged image on the basis of the display information, the process proceeds to S609. Otherwise, the process proceeds to S607.
In S607, the CPU 101 determines whether the image f is an image related to the active image. In a case where it is determined that the image f is an image related to the active image, the process proceeds to S608. Otherwise, the process proceeds to S609.
In S608, the CPU 101 performs update, so that the display angle αf=αf+γ. That is, the display angle of the image f is changed by the change in the display angle of the active image corresponding to the change in the orientation of the smartphone 100. In this manner, in S608, the display angle of an image for which the user operation flag is ON but for which at least the predetermined time period has elapsed, and the display angle of an untagged image related to the active image, are changed in synchronization with the change in the display angle of the active image due to the change in the orientation of the smartphone 100.
In S609, the CPU 101 determines whether the determination in S603 has been made for the images having image numbers up to N. That is, it is determined whether the determination as to whether to change the display angle has been made for all the images included in the post. In a case where the determination has been made for the image having image number N, the update processing ends. Otherwise, the process proceeds to S610.
In S610, the CPU 101 increments the image number f (f = f + 1). That is, the determination in and after S603 is performed for the next image number.
In S611, the CPU 101 acquires the display angle of and the display information on the image having image number f as in S601, and the process proceeds to S602 and then to S603.
In this manner, for each of the images having image numbers 1 to N, whether to change the display angle of the image is determined. In S608, the display angles of the images that meet the conditions in the determinations performed in S604 to S607 are changed.
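Putting the conditions of S603 to S608 together, the following minimal sketch shows one way the update could be implemented; the VRImage fields and the elapsed-time bookkeeping are illustrative assumptions.

```python
# Minimal sketch of the update processing (S601 to S611): for each
# non-active image, the display angle follows the active image's
# change gamma only under the conditions of S604 to S607.
from dataclasses import dataclass

@dataclass
class VRImage:
    number: int
    alpha: float = 0.0            # display angle
    active: bool = False
    user_operated: bool = False   # user operation flag (S309/S310/S314)
    tagged: bool = False          # display start position set in advance
    related: bool = False         # related to the active image (S311)
    hidden_for_s: float = 0.0     # time since the image was hidden

def update_display_angles(images, gamma, hide_timeout_s=180.0):
    for im in images:
        if im.active:                              # S603
            continue                               # already updated in S503
        if im.user_operated:                       # S604
            if im.hidden_for_s < hide_timeout_s:   # S605
                continue                           # keep the user's angle
        elif im.tagged:                            # S606
            continue
        elif not im.related:                       # S607
            continue
        im.alpha = (im.alpha + gamma) % 360        # S608

imgs = [VRImage(1, active=True), VRImage(2, related=True),
        VRImage(3, tagged=True), VRImage(4, user_operated=True)]
update_display_angles(imgs, gamma=30.0)
print([im.alpha for im in imgs])  # -> [0.0, 30.0, 0.0, 0.0]
```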
In contrast, the display angle of an image for which the user operation flag is ON and for which the predetermined time period has not elapsed after the image was hidden, that of a tagged image, and that of an untagged image unrelated to the active image are not changed. The fact that the user operation flag of an image is ON means that the user has changed the display angle of the image through, for example, a touch operation. Thus, if the display angle were changed in accordance with a change in the display angle of another image (the active image), the display range the user has been viewing would shift. In a case where the user has caused the display unit 105 to display a desired subject by performing a display range change operation, even in a case where another image is once displayed on the display unit 105 and the orientation of the smartphone 100 is changed, it is preferable that, when the image for which the flag is ON is displayed again, the user can see the desired subject that the user was looking at immediately before. By not changing the display angle as described above, the user can compare a plurality of images including the same subject and see the display range that the user wants to check in each image without searching for it every time the display images are switched.
For example, suppose that the images having image numbers M+1 and M+2 are displayed and that the orientation of the smartphone 100 is changed, so that the display angle of the active image is changed as described above. When the images displayed on the display unit 105 are then switched to the images having image numbers M and M+1, the display angle of each image is updated only in a case where the image meets the conditions in S604 to S607. In contrast, in a case where the display angle α(M+2)=90° has originally been set through a user operation, the display angle of that image is not changed.
The related image determination method is not limited to the method indicated in S308 described above; related images may be determined by another method.
According to the embodiment as described above, in a case where the user displays a plurality of images in order, the user can check a desired portion of each image in a user-friendly manner. Even in a case where the display angle of one of the plurality of images is changed as the orientation of the smartphone 100 is changed, in a case where the user has already adjusted the display angle of an image, that display angle is not changed, and thus the display angle is not unintentionally changed.
Even in a case where, after the display angle of a first image is adjusted, another image is displayed once, the orientation of the smartphone 100 is changed, and the display angle of the other image is changed accordingly, the subject displayed last time can be checked when the first image is displayed again.
Note that not all of the determinations in S605, S606, and S607 described above have to be performed; one or some of the determinations may be performed.
In addition, in a case where the images having image numbers 1 to N are displayed in order and where an image that has not yet been displayed is displayed in accordance with the display image switching operation in S305 described above, the image is displayed with a display range centered at its display start position.
Note that the embodiment above has been described by taking 360° images as an example; however, any image of which a portion is displayed on the display unit 105 and of which the displayed portion is changed as the orientation of the smartphone 100 is changed, such as a panorama image or a 180° image, may also be used. Note that, in the embodiment above, it has been described that the display angle is changed as the orientation of the smartphone 100 is changed on the XY plane; however, the case where the display angle is changed is not limited to this, and the display angle may also be changed as the orientation of the smartphone 100 is changed on the XZ plane.
In addition, the embodiment above has been described by taking, as an example, a case where a post in an SNS is selected and a plurality of images are displayed; however, this is not the only case, and the embodiment is also applicable to a case where an image file is selected and a plurality of images in the image file are displayed. Furthermore, it has been described that a plurality of images are aligned in the Y-axis direction on the display unit 105 and that, when each image is displayed, a portion of the image is displayed on the display unit 105; however, the individual images may instead be aligned in both the X-axis direction and the Y-axis direction. Furthermore, the embodiment may also be applied to a case where not a portion of each image but the entirety of the image is displayed, and a portion of an image is displayed in accordance with selection of the image. In that case, the processing described above is performed on the display angle at which display is performed in accordance with the selection of the image.
Note that the above-described various types of control performed by the CPU 101 may be performed by a single piece of hardware, or the entirety of the device may be controlled by a plurality of pieces of hardware sharing the processing.
In addition, the present invention has been described in detail on the basis of its preferred embodiments; however, the present invention is not limited to these specific embodiments, and various embodiments within the scope not departing from the gist of the invention are also included in the present invention. Furthermore, each of the above-described embodiments is merely one embodiment of the present invention, and the individual embodiments may be combined as needed.
In addition, in the embodiments described above, the case where the present invention is applied to a smartphone has been described as an example; however, the present invention is not limited to this example and may be applied to any electronic device that can change a display portion of an image. That is, the present invention is applicable to, for example, a portable telephone terminal, a portable image viewer, a printer device with a finder, a digital photo frame, a music player, a game machine, and an electronic book reader.
The present invention can also be realized by executing the following processing: software (a program) that realizes the functions of the above-described embodiment is supplied to a system or a device via a network or various types of recording media, and a computer (or a central processing unit (CPU), a microprocessor unit (MPU), or the like) of the system or the device reads out and executes the program code. In this case, the program and the recording medium or media storing the program are included in the present invention.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Number | Date | Country | Kind |
---|---|---|---|
2017-223649 | Nov 2017 | JP | national |
2018-207347 | Nov 2018 | JP | national |
This application is a Continuation of International Patent Application No. PCT/JP2018/041811, filed Nov. 12, 2018, which claims the benefit of Japanese Patent Application No. 2017-223649, filed Nov. 21, 2017 and No. 2018-207347, filed Nov. 2, 2018, all of which are hereby incorporated by reference herein in their entirety.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2018/041811 | Nov 2018 | US
Child | 16860931 | | US