Embodiments of the present disclosure relate to an electronic apparatus.
Various techniques relating to an electronic apparatus have conventionally been proposed.
An electronic apparatus, control device, recording medium, and display method are disclosed. In one embodiment, an electronic apparatus comprises a display. The display reproduces a video. The display displays a seek bar in which a slider moves on a line-shaped object in accordance with progress of a reproduction of the video and a position of the slider on the line-shaped object indicates a reproduced part of the video. The line-shaped object has a curved shape in accordance with first information according to the reproduced part. The position of the slider indicates the reproduced part and the first information according to the reproduced part.
In one embodiment, a control device is a control device for controlling an electronic apparatus that reproduces a video, the control device being included in the electronic apparatus. The control device makes the electronic apparatus reproduce the video. The control device makes the electronic apparatus display a seek bar in which a slider moves on a line-shaped object in accordance with progress of a reproduction of the video and a position of the slider on the line-shaped object indicates a reproduced part of the video. The line-shaped object has a curved shape in accordance with information according to the reproduced part. The position of the slider indicates the reproduced part and the information according to the reproduced part.
In one embodiment, a recording medium is a computer-readable non-transitory recording medium storing a control program for controlling an electronic apparatus reproducing a video. The control program makes the electronic apparatus reproduce the video. The control program makes the electronic apparatus display a seek bar in which a slider moves on a line-shaped object in accordance with progress of a reproduction of the video and a position of the slider on the line-shaped object indicates a reproduced part of the video. The line-shaped object has a curved shape in accordance with information according to the reproduced part. The position of the slider indicates the reproduced part and the information according to the reproduced part.
In one embodiment, a display method is a display method in an electronic apparatus. The display method comprises displaying a seek bar in which a slider moves on a line-shaped object in accordance with progress of a reproduction of a video and a position of the slider on the line-shaped object indicates a reproduced part of the video. The line-shaped object has a curved shape in accordance with information according to the reproduced part. The position of the slider indicates the reproduced part and the information according to the reproduced part.
As shown in
A receiver hole 12 is located in an upper end of the front surface 1a of the electronic apparatus 1 (the front surface of the apparatus case 10). A speaker hole 13 is located in a lower end of the front surface 1a of the electronic apparatus 1. A microphone hole 14 is located in a lower side surface 1c of the electronic apparatus 1.
A lens 191 included in a first camera 190, which will be described below, can be visually recognized from the upper end of the front surface 1a of the electronic apparatus 1. As shown in
An operation button group 18 having a plurality of operation buttons 15, 16, and 17 is located in a lower end of the front surface 1a of the electronic apparatus 1. Each of the operation buttons 15, 16, and 17 is a hardware button. Specifically, each of the operation buttons 15, 16, and 17 is a press button. Each of the operation buttons 15, 16, and 17 may also be a software button displayed in the display region 11.
The operation button 15 is a back button, for example. The back button is an operation button for switching a display in the display region 11 to an immediately preceding display. The user operates the operation button 15 to switch the display in the display region 11 to the immediately preceding display.
The operation button 16 is a home button, for example. The home button is an operation button for displaying a home screen in the display region 11. The user operates the operation button 16 to display the home screen in the display region 11.
The operation button 17 is a history button, for example. The history button is an operation button to display a history of an application executed by the electronic apparatus 1 in the display region 11. When the user operates the operation button 17, the history of the application executed by the electronic apparatus 1 is displayed in the display region 11.
The controller 100 is a type of arithmetic processing device, and is a type of electrical circuit. The controller 100 controls the other components of the electronic apparatus 1, thereby collectively managing the operation of the electronic apparatus 1. The controller 100 includes at least one processor for providing control and processing capability to execute various functions as described in detail below.
In accordance with various embodiments, the at least one processor may be implemented as a single integrated circuit (IC) or as multiple communicatively coupled ICs and/or discrete circuits. The at least one processor can be implemented in accordance with various known techniques.
In one embodiment, the processor includes one or more circuits or units configurable to perform one or more data computing procedures or processes by executing instructions stored in an associated memory, for example. In another embodiment, the processor may be implemented as firmware (a discrete logic component, for example) configurable to perform one or more data computing procedures or processes.
In accordance with various embodiments, the processor may comprise one or more processors, controllers, microprocessors, microcontrollers, application specific integrated circuits (ASICs), digital signal processors, programmable logic devices, field programmable gate arrays, or any combination of these devices or structures, or other known devices and structures, to perform the functions described below.
As shown in
The storage 103 comprises a volatile memory 103a such as a random access memory (RAM) and a non-volatile memory 103b such as a flash read only memory (ROM). Each of the volatile memory 103a and the non-volatile memory 103b is a non-transitory recording medium readable by the CPU 101 and the DSP 102. The non-volatile memory 103b stores a plurality of control programs 103bb to control the electronic apparatus 1. The CPU 101 and the DSP 102 execute the various control programs 103bb in the storage 103 to achieve various functions of the controller 100.
All or some of the functions of the controller 100 may be achieved by a hardware circuit that needs no software to achieve the functions above. The storage 103 may include a non-transitory computer readable recording medium other than the ROM and the RAM. The storage 103 may include, for example, a compact hard disk drive and a solid state drive (SSD).
The plurality of control programs 103bb in the storage 103 include various applications (application programs). The storage 103 stores, for example, a call application to perform a voice call and a video call, a browser to display a website, and a mail application to create, browse, send, and receive an e-mail. The storage 103 also stores a camera application to take a picture of an object using the first camera 190 and the second camera 200, a video reproduction application to reproduce a video, a map display application to display a map, and a music reproduction control application to control a reproduction of music data. At least one application may be stored in the storage 103 in advance. Alternatively, the electronic apparatus 1 may download at least one application from another device and store it in the storage 103.
The wireless communication unit 110 includes an antenna 111. The wireless communication unit 110 can perform a wireless communication under control of the controller 100, using the antenna 111. The wireless communication unit 110 can receive, through the antenna 111, a signal from a mobile phone different from the electronic apparatus 1 or a signal from a communication device such as a web server connected to the Internet, via a base station, for example. The wireless communication unit 110 can perform amplification processing and a down-conversion on the received signal and output the processed signal to the controller 100. The controller 100 can perform demodulation processing, for example, on the received signal which has been input, to acquire user data and control data, for example, contained in the received signal. The wireless communication unit 110 can perform an up-conversion and amplification processing on a transmission signal generated in the controller 100 and containing the user data and the control data, and wirelessly transmit the processed transmission signal from the antenna 111. The mobile phone different from the electronic apparatus 1 or the communication device connected to the Internet, for example, receives the transmission signal transmitted from the antenna 111 via the base station, for example.
The display 120 has the display region 11 located in the front surface 1a of the electronic apparatus 1 and a display panel 130. The display 120 can display various types of information in the display region 11. The display panel 130 is a liquid crystal display panel or an organic EL panel, for example. The display panel 130 can display various types of information such as characters, symbols, and graphics under control of the controller 100. The display panel 130 faces the display region 11 in the apparatus case 10. The information displayed on the display panel 130 is displayed in the display region 11.
The touch panel 140 can detect an operation performed on the display region 11 with the operator such as the finger. The touch panel 140 is, for example, a projected capacitive touch panel. The touch panel 140 is located on a back side of the display region 11, for example. When the user performs the operation on the display region 11 with the operator such as his/her finger, the touch panel 140 can input, to the controller 100, an electrical signal in accordance with the operation. The controller 100 can specify contents of the operation performed on the display region 11 based on the electrical signal from the touch panel 140 and perform processing in accordance with the contents.
When the user operates the operation buttons 15, 16, and 17 of the operation button group 18, each of the operation buttons 15, 16, and 17 can output to the controller 100 an operation signal indicating that it has been operated. The controller 100 can accordingly determine whether or not each operation button has been operated for each of the operation buttons 15, 16, and 17. The controller 100 to which the operation signal is input controls the other components, thereby causing the electronic apparatus 1 to execute the function allocated to the operated operation button described above.
The GPS receiver 150 has an antenna 151. The GPS receiver 150 can receive a wireless signal from a satellite of Global Positioning System (GPS) under control of the controller 100, using the antenna 151. The GPS receiver 150 can calculate a current position of the electronic apparatus 1 based on the received wireless signal. The current position obtained in the GPS receiver 150 is input to the controller 100. The GPS receiver 150 functions as a position acquisition unit to acquire a current position of the electronic apparatus 1.
The microphone 180 can convert a sound from the outside of the electronic apparatus 1 into an electrical sound signal and then output the electrical sound signal to the controller 100. The sound from the outside of the electronic apparatus 1 is taken inside the electronic apparatus 1 through the microphone hole 14 and input to the microphone 180.
The speaker 170 is, for example, a dynamic speaker. The speaker 170 can convert an electrical sound signal from the controller 100 into a sound and then output the sound. The sound being output from the speaker 170 is output outside through the speaker hole 13. The sound being output from the speaker hole 13 can be heard in a place apart from the electronic apparatus 1.
The receiver 160 can output a received sound. The receiver 160 is, for example, a dynamic speaker. The receiver 160 can convert an electrical sound signal from the controller 100 into a sound and then output the sound. The sound being output from the receiver 160 is output outside through the receiver hole 12. A volume of the sound being output through the receiver hole 12 is set to be smaller than a volume of the sound being output through the speaker hole 13. The sound being output through the receiver hole 12 can be heard when the user brings the receiver hole 12 close to his/her ear. The electronic apparatus 1 may comprise a vibration element such as a piezoelectric vibration element for causing a portion of the front surface of the apparatus case 10 to vibrate instead of the receiver 160. In the above case, the sound is transmitted to the user from the portion of the front surface.
The first camera 190 comprises the lens 191 and an imaging element, for example. The second camera 200 has the lens 201 and an imaging element, for example. Each of the first camera 190 and the second camera 200 can take a still image or a video under control of the controller 100 and then output the still image or the video to the controller 100.
The lens 191 of the first camera 190 can be visually recognized from the front surface 1a of the electronic apparatus 1. Accordingly, the first camera 190 can take an image of an object located on a side of the front surface 1a (a side of the display region 11) of the electronic apparatus 1. The lens 201 of the second camera 200 can be visually recognized from the rear surface 1b of the electronic apparatus 1. Accordingly, the second camera 200 can take an image of an object located on the side of the rear surface 1b of the electronic apparatus 1. The first camera 190 is referred to as the “in-camera 190”, and the second camera 200 is referred to as the “out-camera 200” in some cases hereinafter. Each of the in-camera 190 and the out-camera 200 may be simply referred to as “the camera” in a case where they need not be specifically distinguished from each other.
The accelerometer 210 can detect an acceleration rate of the electronic apparatus 1 and output a detection signal in accordance with the detected acceleration rate to the controller 100. The controller 100 can operate the accelerometer 210 and stop the operation of the accelerometer 210.
The temperature sensor 220 can detect a temperature of the electronic apparatus 1 and output a detection signal in accordance with the detected temperature to the controller 100. The controller 100 can operate the temperature sensor 220 and stop the operation of the temperature sensor 220.
The geomagnetic sensor 230 can detect geomagnetism and output a detection signal in accordance with the detected geomagnetism to the controller 100. The controller 100 can operate the geomagnetic sensor 230 and stop the operation of the geomagnetic sensor 230.
The RTC 240 can measure a current date and time and output the current date and time to the controller 100. The RTC 240 functions as a date and time acquisition unit to acquire the current date and time.
The pressure sensor 250 can detect a pressure of a gas and a liquid. The pressure sensor 250 can detect a pressure on the electronic apparatus 1 and output a detection signal in accordance with the detected pressure. The controller 100 can operate the pressure sensor 250 and stop the operation of the pressure sensor 250.
The battery 260 can output power for the electronic apparatus 1. The battery 260 is, for example, a rechargeable battery. The battery 260 can supply power to various components such as the controller 100 and the wireless communication unit 110 included in the electronic apparatus 1.
The electronic apparatus 1 has a speed acquisition unit 350 capable of acquiring a speed of the electronic apparatus 1. The speed acquisition unit 350 has the accelerometer 210 and the speed calculation unit 300. The speed calculation unit 300 can calculate the speed of the electronic apparatus 1 based on the detection signal being output from the accelerometer 210. Hereinafter, “the speed” means the speed of the electronic apparatus 1 unless otherwise described.
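The following is a minimal, illustrative sketch of one way the speed calculation unit 300 could derive a speed estimate from successive accelerometer samples. The class and method names and the simple integration scheme are assumptions for illustration; the present embodiment does not specify the calculation.

```python
# Illustrative sketch (not the embodiment's specified algorithm): estimating speed
# by integrating successive accelerometer samples. All names are assumptions.
class SpeedCalculator:
    def __init__(self):
        self.speed_mps = 0.0  # current speed estimate of the electronic apparatus 1 [m/s]

    def update(self, acceleration_mps2, dt_s):
        """acceleration_mps2: acceleration along the direction of travel [m/s^2],
        dt_s: elapsed time since the previous detection signal [s]."""
        self.speed_mps += acceleration_mps2 * dt_s
        return max(self.speed_mps, 0.0)
```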
The electronic apparatus 1 has a temperature acquisition unit 360 capable of acquiring a temperature of the electronic apparatus 1. The temperature acquisition unit 360 has the temperature sensor 220 and the temperature calculation unit 310. The temperature calculation unit 310 can calculate the temperature of the electronic apparatus 1 based on the detection signal being output from the temperature sensor 220. Hereinafter, “the temperature” means the temperature of the electronic apparatus 1 unless otherwise described.
The electronic apparatus 1 comprises an atmospheric pressure acquisition unit 370 capable of acquiring an atmospheric pressure around the electronic apparatus 1 and an altitude acquisition unit 380 capable of acquiring an altitude of a position of the electronic apparatus 1. The atmospheric pressure acquisition unit 370 has the pressure sensor 250 and the atmospheric pressure calculation unit 320. The altitude acquisition unit 380 has the pressure sensor 250, the atmospheric pressure calculation unit 320, and the altitude calculation unit 330. The atmospheric pressure calculation unit 320 can calculate the atmospheric pressure around the electronic apparatus 1 based on the detection signal being output from the pressure sensor 250. The altitude calculation unit 330 can calculate the altitude of the position of the electronic apparatus 1 based on the atmospheric pressure obtained in the atmospheric pressure calculation unit 320. Hereinafter, “the altitude” means the altitude of the position of the electronic apparatus 1 unless otherwise described.
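As an illustration of the relationship between the atmospheric pressure calculation unit 320 and the altitude calculation unit 330, the following sketch converts a pressure reading into an altitude using the international barometric formula. The formula and the sea-level reference pressure are assumptions, since the embodiment does not specify the conversion.

```python
# Illustrative sketch: altitude from atmospheric pressure via the international
# barometric formula. The reference pressure P0 is an assumed constant.
SEA_LEVEL_PRESSURE_HPA = 1013.25

def altitude_from_pressure(pressure_hpa: float, p0: float = SEA_LEVEL_PRESSURE_HPA) -> float:
    """Return an altitude estimate in meters for a pressure reading in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / p0) ** (1.0 / 5.255))
```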
The electronic apparatus 1 comprises a direction acquisition unit 390 capable of acquiring a direction in which lenses of the in-camera 190 and the out-camera 200 face. The direction acquisition unit 390 has the geomagnetic sensor 230 and the direction specifying unit 340. The direction specifying unit 340 can specify the direction in which the lens 191 of the in-camera 190 faces based on the detection signal being output from the geomagnetic sensor 230. The direction specifying unit 340 can specify the direction in which the lens 201 of the out-camera 200 faces based on the detection signal being output from the geomagnetic sensor 230.
The electronic apparatus 1 may take the video with the in-camera 190. The state of display of the electronic apparatus 1 taking the video is not limited to the example in
As shown in
Next, if the touch panel 140 receives a start instruction operation for instructing to start the video shooting in Step s4, the controller 100 makes the camera being used start taking the video in Step s5. The video taken with the camera being used is stored in the non-volatile memory 103b. The video taken with the camera being used means not the through image but the video stored in the non-volatile memory 103b unless otherwise described. The video taken with the camera being used and stored in the non-volatile memory 103b is simply referred to as the “shooting video” in some cases.
If the camera being used starts the video shooting, the controller 100 makes the display 120 display the shooting video and the shooting supplemental information in Step s6. While the camera being used takes the video, the display 120 displays the video taken with the camera being used and the shooting supplemental information acquired in parallel with the video shooting in real time.
Herein, if the touch panel 140 receives the start instruction operation for instructing to start the video shooting, the controller 100 activates those components which are necessary to acquire the shooting supplemental information but whose operation is currently stopped. Specifically, the controller 100 activates the accelerometer 210, the geomagnetic sensor 230, and the pressure sensor 250 if their operation is stopped. The speed acquisition unit 350 having the accelerometer 210 thereby starts acquiring the speed. The altitude acquisition unit 380 having the pressure sensor 250 starts acquiring the altitude. The direction acquisition unit 390 having the geomagnetic sensor 230 starts acquiring the direction of the camera lens. Since the RTC 240 always operates, for example, while the electronic apparatus 1 operates, the RTC 240 is already operating when the touch panel 140 receives the start instruction operation for instructing to start the video shooting.
When the camera being used starts the video shooting, the controller 100 makes the display 120 display, in real time, the video taken with the camera being used and the shooting supplemental information acquired in parallel with the video shooting, that is, the speed, the time, the altitude, and the direction of the camera lens in the present example.
The supplemental information screen 430 includes an analog clock 440 indicating a current time acquired by the RTC 240. The supplemental information screen 430 includes speed information 450 indicating, by a numerical value, a current speed acquired by the speed acquisition unit 350 and an analog speedometer 460 indicating the current speed. The supplemental information screen 430 includes altitude information 470 indicating a current altitude acquired by the altitude acquisition unit 380 and direction information 480 indicating a current direction of the camera lens acquired by the direction acquisition unit 390.
The altitude information 470 includes a scale axis 471 on which a plurality of scales 472 are marked. A numerical value 473 is marked on each scale 472. The altitude information 470 includes a triangle mark 474 pointing to the scale 472 corresponding to the current altitude. The numerical value 473 marked on the scale 472 pointed to by the mark 474 indicates the current altitude.
The direction information 480 includes a scale axis 481 on which a plurality of scales 482 each indicating a direction are marked. Characters 483 each indicating a direction are marked on some of the plurality of scales 482. The character 483 of "N" shown in
After Step s6, if the touch panel 140 receives a finish instruction operation for instructing to finish the video shooting in Step s7, the controller 100 makes the camera being used finish taking the video in Step s8. Next, the controller 100 makes the display 120 display the through image taken with the camera being used in Step s9. If the video shooting is finished, the display 120 does not display the shooting supplemental information.
The shooting supplemental information may include only some of the speed, the time, the altitude, and the direction of the camera lens. The shooting supplemental information may also include information other than the speed, the time, the altitude, and the direction of the camera lens. The shooting supplemental information may also include a current position of the electronic apparatus 1 acquired by the GPS receiver 150, for example. The shooting supplemental information may also include a temperature acquired by the temperature acquisition unit 360. The shooting supplemental information may also include an atmospheric pressure acquired by the atmospheric pressure acquisition unit 370.
The partial screen 501 shows a full-screen display button 510 and a character string 511 indicating that the screen displayed on the display 120 is the reproduction screen 500.
The partial screen 502 includes a shooting date and time 520 of a frame image in a currently-reproduced part of the video which is being reproduced. The controller 100 specifies the shooting date and time of each frame image of the video while the video is taken with the camera being used, based on the date and time being output from the RTC 240. Then, the controller 100 associates each frame image with its shooting date and time, and stores them in the non-volatile memory 103b. Accordingly, the display 120 can display the shooting date and time 520 of the frame image in the currently-reproduced part of the video 530 which is being reproduced, that is to say, the frame image which is currently displayed, under control of the controller 100. The shooting date and time 520 changes from moment to moment during the reproduction of the video 530. The partial screen 502 shows a reproduction time 521 of the video 530. The controller 100 can obtain the reproduction time 521 based on the date and time being output from the RTC 240.
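A minimal sketch of the association described above is shown below. The record layout and function names are assumptions used only to illustrate how the shooting date and time 520 of the currently displayed frame image could be looked up.

```python
# Illustrative sketch: associating each frame image with its shooting date and time
# (from the RTC 240) and looking up the value shown as the shooting date and time 520.
import datetime

frame_shot_at = {}  # frame index -> datetime of shooting

def record_frame_time(frame_index: int, rtc_now: datetime.datetime) -> None:
    frame_shot_at[frame_index] = rtc_now

def shooting_date_and_time_520(current_frame_index: int) -> datetime.datetime:
    return frame_shot_at[current_frame_index]
```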
The partial screen 503 shows the video 530 being reproduced. The partial screen 503 sequentially displays each frame image 531 of the video 530 which is being reproduced. The partial screen 504 shows a seek bar 540 indicating the currently-reproduced part of the video 530 which is being reproduced. The seek bar is also referred to as a progress bar or a reproduction indicator. The partial screen 505 shows a pause button 550, a fast-backward button 551, and a fast-forward button 552.
If the touch panel 140 detects a predetermined operation (a tap operation, for example) performed on the full-screen display button 510, the controller 100 makes the display 120 display the video 530 in almost the whole display region 11.
If the touch panel 140 detects a predetermined operation (a long-tap operation, for example) performed on the fast-backward button 551, the controller 100 turns back the reproduced part of the video 530 which is reproduced by the display 120 to the previous one. The controller 100 turns back the reproduced part of the video 530 to the previous one while the user touches the fast-backward button 551 with an operator such as his/her finger.
If the touch panel 140 detects a predetermined operation (a long-tap operation, for example) performed on the fast-forward button 552, the controller 100 moves the reproduced part of the video 530 which is reproduced by the display 120 forward. The controller 100 moves the reproduced part of the video 530 forward while the user touches the fast-forward button 552 with the operator such as his/her finger.
If the touch panel 140 detects a predetermined operation (a tap operation, for example) performed on the pause button 550, the controller 100 controls the display 120 so that the reproduction of the video 530 in the display 120 is stopped. At this time, the reproduction of the video 530 is stopped in the reproduced part of the video 530 when the touch panel 140 detects the predetermined operation performed on the pause button 550.
The reproduction screen 500 in which the video is being reproduced is referred to as a “reproduction screen 500a” and the reproduction screen 500 in which the reproduction of the video is stopped is referred to as a “reproduction screen 500b” in some cases.
The seek bar 540 is described in detail next. In the seek bar 540, a slider 542 moves on a line-shaped object 541 in accordance with a progress of the reproduction of the video 530. The slider 542 moves on the line-shaped object 541 from left to right as the reproduction of the video 530 proceeds. A position of the slider 542 on the line-shaped object 541 indicates a currently-reproduced part of the video 530 which is being reproduced. In the present example, if a center 542a of the slider 542 is located in a left end 541a of the line-shaped object 541, the reproduced part of the video 530 falls on an initial frame image. If the center 542a of the slider 542 is located in a right end 541b of the line-shaped object 541, the reproduced part of the video 530 falls on a last frame image.
As shown in
As described above, the position of the slider 542 on the line-shaped object 541 is deemed to indicate the reproduced part of the video 530 regardless of whether the video 530 is reproduced or the reproduction of the video 530 is stopped. In other words, the position of the slider 542 on the line-shaped object 541 is deemed to indicate which frame image in the video 530 the display 120 currently displays.
In the seek bar 540, a first portion 541c in the line-shaped object 541 corresponding to a part in which the reproduction of the video 530 is finished is displayed as a thick line. That is to say, the first portion 541c located on a left side of the center 542a of the slider 542 in the line-shaped object 541 has a larger width. In the meanwhile, a second portion 541d in the line-shaped object 541 corresponding to a part in which the reproduction of the video 530 is not yet finished is displayed as a thin line. That is to say, the second portion 541d located on a right side of the center 542a of the slider 542 in the line-shaped object 541 has a smaller width.
In the present example, the line-shaped object 541 has a curved shape in accordance with predetermined information according to the reproduced part of the video 530. Then, the predetermined information according to the reproduced part of the video 530 is indicated by the position of the slider 542 indicating the reproduced part. That is to say, the position of the slider 542 moving on the curved line-shaped object 541 indicates not only the reproduced part of the video 530 but also the predetermined information according to the reproduced part. This predetermined information is referred to as the "reproduction supplemental information" hereinafter. The curved line-shaped object means a line-shaped object which does not have a straight shape. Accordingly, the curved line-shaped object includes a bent line-shaped object, a folded-line-shaped object, and a pulse-shaped line-shaped object, for example. The curved line-shaped object is also deemed to be a non-straight line-shaped object.
A position of the slider 542 on the line-shaped object 541 in the X axis direction indicates the reproduced part of the video 530. That is to say, the position of the slider 542 on the line-shaped object 541 in the X axis direction indicates the reproduced part which is reproduced when the elapsed time for the reproduction coincides with an X coordinate value X1 of the center 542a of the slider 542. In other words, the position of the slider 542 on the line-shaped object 541 in the X axis direction indicates the reproduced part which is reproduced when the X coordinate value X1 indicating the position has elapsed since the reproduction of the video 530 has been started from its beginning. During the reproduction of the video 530, the position of the slider 542 on the line-shaped object 541 in the X axis direction indicates the currently-reproduced part of the video 530. In a case where the reproduction of the video 530 is stopped, the position of the slider 542 on the line-shaped object 541 in the X axis direction indicates the reproduced part in which the reproduction of the video 530 is started, that is to say, the reproduced part which is reproduced first when the reproduction of the video 530 is started.
The X coordinate value on the left end 541a of the line-shaped object 541 indicates 0. The X coordinate value on the right end 541b of the line-shaped object 541 indicates the shooting time of the video 530. The X coordinate value on the right end 541b of the line-shaped object 541 is deemed to indicate a time necessary to reproduce the video 530 from beginning to end. If the shooting time is ten minutes, for example, the X coordinate value on the right end 541b of the line-shaped object 541 indicates ten minutes. A length of the line-shaped object 541 in the X axis direction is also deemed to indicate the shooting time of the video 530 which has been taken.
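The X-axis mapping described above can be illustrated by the following sketch, in which the slider position is proportional to the elapsed reproduction time; the function name and the pixel-coordinate arguments are assumptions.

```python
# Illustrative sketch: the X coordinate of the center 542a of the slider 542 is
# proportional to the elapsed reproduction time, with the left end 541a mapped to 0
# and the right end 541b mapped to the total shooting time of the video 530.
def slider_x(elapsed_s: float, shooting_time_s: float, x_left: float, x_right: float) -> float:
    if shooting_time_s <= 0:
        return x_left
    fraction = min(max(elapsed_s / shooting_time_s, 0.0), 1.0)
    return x_left + fraction * (x_right - x_left)
```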
The position of the slider 542 on the line-shaped object 541 in the X axis direction is referred to as the “X axis direction position of the slider 542” hereinafter. The position of the slider 542 on the line-shaped object 541 in the Y axis direction is referred to as the “Y axis direction position of the slider 542”.
In the meanwhile, the Y axis direction position of the slider 542 indicates the reproduction supplemental information in accordance with the reproduced part indicated by the X axis direction position of the slider 542. A Y coordinate value Y1 of the center 542a of the slider 542 indicating the Y axis direction position of the slider 542 indicates the reproduction supplemental information in accordance with the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1. In other words, the Y coordinate value Y1 indicates the reproduction supplemental information in accordance with the reproduced part which is reproduced when the X coordinate value X1 has elapsed since the reproduction of the video 530 has been started from its beginning. During the reproduction of the video 530, the Y axis direction position of the slider 542 indicates the reproduction supplemental information in accordance with the currently-reproduced part of the video 530. In the case where the reproduction of the video 530 is stopped, the Y axis direction position of the slider 542 indicates the reproduction supplemental information in accordance with the reproduced part which is reproduced first when the reproduction of the video 530 is started.
Adopted as the reproduction supplemental information is, for example, an altitude of a shooting position of a frame image of the video 530 at the time of taking the frame image. As described above, the controller 100 associates each frame image of the shooting video with its shooting date and time, and stores them in the non-volatile memory 103b. While the video is taken with the camera being used, the controller 100 further associates the altitude acquired by the altitude acquisition unit 380 at the time of taking the frame image of the video and the frame image, and stores them in the non-volatile memory 103b. The altitude acquired by the altitude acquisition unit 380 at the time of taking the frame image is deemed as the altitude of the position in which the electronic apparatus 1 is located at the time of taking the frame image. Accordingly, the altitude acquired by the altitude acquisition unit 380 at the time of taking the frame image is deemed as the altitude of the shooting position of the frame image at the time of taking the frame image. Thus, as shown in
In the case where the shooting position altitude of the frame image is adopted as the reproduction supplemental information, the Y axis 582 indicates the shooting position altitude of the frame image. The Y axis direction position of the slider 542 indicates the shooting position altitude of the frame image in the reproduced part indicated by the X axis direction position of the slider 542. The Y coordinate value Y1 of the center 542a of the slider 542 indicates the shooting position altitude of the frame image in the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1 of the center 542a. During the reproduction of the video 530, the Y axis direction position of the slider 542 indicates the shooting position altitude of the frame image in the currently-reproduced part of the video 530. In the case where the reproduction of the video 530 is stopped, the Y axis direction position of the slider 542 indicates the shooting position altitude of the frame image in the reproduced part which is reproduced first when the reproduction of the video 530 is started. In the present example, the shooting position altitude indicated by the Y axis 582 increases toward a plus direction of the Y axis 582.
As described above, the frame image and the shooting position altitude of the frame image are associated with each other in the non-volatile memory 103b. The elapsed time for the reproduction and the shooting time can be specified from the shooting date and time associated with each frame image. Thus, the controller 100 can make the display 120 display the seek bar 540 shown in
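One possible way of building the curved line-shaped object 541 from the per-frame altitudes stored in the non-volatile memory 103b is sketched below. The data layout and the normalization to pixel coordinates are assumptions for illustration, not the embodiment's specified drawing procedure.

```python
# Illustrative sketch: building the polyline of the curved line-shaped object 541.
# Each frame contributes a point whose X is its elapsed reproduction time and whose
# Y is the shooting position altitude associated with that frame.
def seek_bar_polyline(frame_times_s, frame_altitudes_m, x_left, x_right, y_bottom, y_top):
    """frame_times_s: elapsed time of each frame [s] (non-empty, last value is the shooting time);
    frame_altitudes_m: shooting position altitude per frame [m].
    Returns a list of (x, y) points in seek-bar pixel coordinates."""
    total_s = frame_times_s[-1] if frame_times_s[-1] > 0 else 1.0
    alt_min, alt_max = min(frame_altitudes_m), max(frame_altitudes_m)
    alt_span = (alt_max - alt_min) or 1.0
    points = []
    for t, alt in zip(frame_times_s, frame_altitudes_m):
        x = x_left + (t / total_s) * (x_right - x_left)
        y = y_bottom + ((alt - alt_min) / alt_span) * (y_top - y_bottom)
        points.append((x, y))
    return points
```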
The user can make the electronic apparatus 1 change the position of the slider 542 on the line-shaped object 541 to be located in a desired position by operating the electronic apparatus 1 regardless of whether or not the reproduction of the video 530 is stopped. For example, as shown in
If the user makes the operator such as his/her finger come in contact with the line-shaped object 541, the center 542a of the slider 542 comes to be located in a position where the operator comes in contact in the line-shaped object 541. Thus, the user can make the electronic apparatus 1 change the position of the slider 542 on the line-shaped object 541 to be located in the desired position.
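The following sketch illustrates how a contact position on the line-shaped object 541 could be converted back into a reproduced part; the function name and arguments are assumptions.

```python
# Illustrative sketch: converting the X coordinate of a contact position on the
# line-shaped object 541 into the elapsed reproduction time at which reproduction
# of the video 530 should resume.
def seek_time_from_touch(touch_x: float, x_left: float, x_right: float, shooting_time_s: float) -> float:
    fraction = min(max((touch_x - x_left) / (x_right - x_left), 0.0), 1.0)
    return fraction * shooting_time_s
```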
As described above, in the seek bar 540, the position of the slider 542 on the curved line-shaped object 541 indicates not only the reproduced part of the video 530 but also the reproduction supplemental information in accordance with the reproduced part. Thus, the user can easily find the desired part in the video 530 based on the reproduction supplemental information indicated by the position of the slider 542.
In the present example, the reproduction supplemental information is the shooting position altitude of the frame image, thus the user can easily find the desired part in the video 530 based on the shooting position altitude of the frame image indicated by the position of the slider 542. For example, the user can easily specify the part of the video 530 taken in a position located at a high altitude. Thus, the user can make the electronic apparatus 1 change the position of the slider 542 to the position in which the part taken in the position located at the high altitude is reproduced as shown in
For example, the user can easily specify the part of the video 530 taken in a position located at a low altitude. Thus, the user can make the electronic apparatus 1 change the position of the slider 542 to the position in which the part taken in the position located at the low altitude is reproduced as shown in
The reproduction supplemental information may be the information other than the shooting position altitude of the frame image. The reproduction supplemental information may be, for example, an atmospheric pressure in a shooting position of a frame image at the time of taking the frame image. In this case, while the video is taken with the camera being used, the controller 100 associates the atmospheric pressure acquired by the atmospheric pressure acquisition unit 370 at the time of taking the frame image of the video and the frame image, and stores them in the non-volatile memory 103b. The atmospheric pressure acquired by the atmospheric pressure acquisition unit 370 at the time of taking the frame image is deemed as the atmospheric pressure around the electronic apparatus 1 at the time of taking the frame image. The atmospheric pressure acquired by the atmospheric pressure acquisition unit 370 at the time of taking the frame image is deemed as the atmospheric pressure in the shooting position of the frame image at the time of taking the frame image. Thus, the controller 100 can acquire the atmospheric pressure in the shooting position of the frame image at the time of taking the frame image. In the case where the reproduction supplemental information is the atmospheric pressure in the shooting position of the frame image at the time of taking the frame image, the Y axis direction position of the slider 542 indicates the atmospheric pressure of the shooting position of the frame image at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542. In other words, the Y coordinate value Y1 of the center 542a of the slider 542 indicates the atmospheric pressure in the shooting position of the frame image at the time of taking the frame image in the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1 of the center 542a. The user can easily find the desired part in the video 530 based on the atmospheric pressure indicated by the position of the slider 542. The atmospheric pressure in the shooting position of the frame image at the time of taking the frame image is simply referred to as the “shooting position atmospheric pressure” in some cases hereinafter.
The reproduction supplemental information may be the speed of the electronic apparatus 1 at the time of taking the frame image. In this case, while the video is taken with the camera being used, the controller 100 associates the speed acquired by the speed acquisition unit 350 at the time of taking the frame image of the video and the frame image, and stores them in the non-volatile memory 103b. The speed acquired by the speed acquisition unit 350 at the time of taking the frame image is deemed as the speed of the electronic apparatus 1 at the time of taking the frame image. Thus, the controller 100 can acquire the speed of the electronic apparatus 1 at the time of taking the frame image. In the case where the reproduction supplemental information is the speed of the electronic apparatus 1 at the time of taking the frame image, the Y axis direction position of the slider 542 indicates the speed of the electronic apparatus 1 at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542. In other words, the Y coordinate value Y1 of the center 542a of the slider 542 indicates the speed of the electronic apparatus 1 at the time of taking the frame image in the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1 of the center 542a. The user can easily find the desired part in the video 530 based on the speed indicated by the position of the slider 542.
The reproduction supplemental information may be the temperature of the electronic apparatus 1 at the time of taking the frame image. In this case, while the video is taken with the camera being used, the controller 100 associates the temperature acquired by the temperature acquisition unit 360 at the time of taking the frame image of the video and the frame image, and stores them in the non-volatile memory 103b. The temperature acquired by the temperature acquisition unit 360 at the time of taking the frame image is deemed as the temperature of the electronic apparatus 1 at the time of taking the frame image. Thus, the controller 100 can acquire the temperature of the electronic apparatus 1 at the time of taking the frame image. In the case where the reproduction supplemental information is the temperature of the electronic apparatus 1 at the time of taking the frame image, the Y axis direction position of the slider 542 indicates the temperature of the electronic apparatus 1 at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542. In other words, the Y coordinate value Y1 of the center 542a of the slider 542 indicates the temperature of the electronic apparatus 1 at the time of taking the frame image in the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1 of the center 542a. The user can easily find the desired part in the video 530 based on the temperature indicated by the position of the slider 542.
The reproduction supplemental information may be a distance of the electronic apparatus 1 moving from a position of starting taking the video 530 to a position of taking the frame image of the video 530. In this case, while the video is taken with the camera being used, the controller 100 associates the position of the electronic apparatus 1 acquired by the GPS receiver 150 at the time of taking the frame image of the video and the frame image, and stores them in the non-volatile memory 103b. Then, the controller 100 obtains the distance of the electronic apparatus 1 moving from the position of starting taking the video 530 to the position of taking each frame image based on the position associated with each frame image in the non-volatile memory 103b. The distance of the electronic apparatus 1 moving from the position of starting taking the video 530 to the position of taking the frame image is referred to as the “moving distance at the time of taking the frame image” in some cases hereinafter.
In the case where the reproduction supplemental information is the moving distance at the time of taking the frame image, the Y axis direction position of the slider 542 indicates the moving distance at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542. In other words, the Y coordinate value Y1 of the center 542a of the slider 542 indicates the moving distance at the time of taking the frame image in the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1 of the center 542a. The user can easily find the desired part in the video 530 based on the moving distance at the time of taking the frame image indicated by the position of the slider 542.
The reproduction supplemental information may be information indicating whether or not a predetermined event occurs at the time of taking the frame image.
In the case where the reproduction supplemental information is the information indicating whether or not the predetermined event occurs at the time of taking the frame image, the controller 100 determines whether or not the predetermined event has occurred while the video 530 is taken. If the controller 100 determines that the predetermined event has occurred while the video 530 is taken, the controller 100 specifies an event occurrence period when the predetermined event occurs. The event occurrence period can be specified based on the time acquired by the RTC 240 while the video 530 is taken. Then, the controller 100 specifies the frame image taken in the event occurrence period. Accordingly, the controller 100 can specify whether or not the predetermined event occurs at the time of taking the frame image for each frame image of the video 530. In a case where the predetermined event has occurred several times while the video 530 is taken, the controller 100 specifies the frame image taken in each event occurrence period for the predetermined event having occurred several times.
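A minimal sketch of specifying the frame images taken in an event occurrence period is shown below; the function name and the per-frame time representation are assumptions.

```python
# Illustrative sketch: flagging, for each frame image, whether it was taken within
# an event occurrence period specified by the controller 100.
def frames_in_event(frame_shot_times, event_start, event_end):
    """frame_shot_times: list of datetimes, one per frame image of the video 530.
    Returns a list of booleans indicating whether the predetermined event occurred
    at the time of taking each frame image."""
    return [event_start <= t <= event_end for t in frame_shot_times]
```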
Various events are considered as the predetermined event. For example, a trouble of the user of the electronic apparatus 1 is considered as the predetermined event. For example, in a case where the electronic apparatus 1 is fixed to the bicycle 60 of the user 50 as shown in
An event that the electronic apparatus 1 goes under the water may be the predetermined event. The controller 100 can determine whether or not the electronic apparatus 1 is located in the water based on the detection signal being output from the pressure sensor 250. If the controller 100 determines that the electronic apparatus 1 goes under the water while the video 530 is taken, the controller 100 specifies a period when the electronic apparatus 1 is located in the water. Then, the controller 100 specifies the frame image in the video 530 taken in the period when the electronic apparatus 1 is located in the water. Accordingly, the controller 100 can specify whether or not the electronic apparatus 1 is located in the water at the time of taking the frame image for each frame image of the video 530.
The controller 100 may switch a type of the reproduction supplemental information indicated by the Y axis direction position of the slider 542 in accordance with the instruction from the user. For example, as shown in
Considered herein, for example, is a case where the shooting position altitude and the shooting position atmospheric pressure of the frame image are used as the reproduction supplemental information. In a case where the reproduction supplemental information which the Y axis direction position of the slider 542 currently indicates is the shooting position altitude of the frame image, if the user performs the predetermined operation on the switching button 650, the controller 100 switches the reproduction supplemental information indicated by the Y axis direction position of the slider 542 to the shooting position atmospheric pressure of the frame image. Then, if the user performs the predetermined operation on the switching button 650 again, the controller 100 switches the reproduction supplemental information indicated by the Y axis direction position of the slider 542 to the shooting position altitude of the frame image. The controller 100 operates in the similar manner afterward.
Considered is a case where the shooting position altitude of the frame image, the speed of the electronic apparatus 1 at the time of taking the frame image, and the temperature of the electronic apparatus 1 at the time of taking the frame image are used as the reproduction supplemental information. In a case where the reproduction supplemental information which the Y axis direction position of the slider 542 currently indicates is the shooting position altitude of the frame image, if the user performs the predetermined operation on the switching button 650, the controller 100 switches the reproduction supplemental information indicated by the Y axis direction position of the slider 542 to the speed of the electronic apparatus 1 at the time of taking the frame image, for example. Then, if the user performs the predetermined operation on the switching button 650 again, the controller 100 switches the reproduction supplemental information indicated by the Y axis direction position of the slider 542 to the temperature of the electronic apparatus 1 at the time of taking the frame image. Then, if the user performs the predetermined operation on the switching button 650 again, the controller 100 switches the reproduction supplemental information indicated by the Y axis direction position of the slider 542 to the shooting position altitude of the frame image. The controller 100 operates in the similar manner afterward.
Since such a switching button 650 is provided, the user can easily make the electronic apparatus 1 change the type of the reproduction supplemental information indicated by the Y axis direction position of the slider 542.
As shown in
The controller 100 may make the display 120 display a selection screen 670 for the user to select the type of the reproduction supplemental information indicated by the Y axis direction position of the slider 542.
As shown in
The various modification examples of the electronic apparatus 1 are described below.
In the example described above, the electronic apparatus 1 itself acquires the reproduction supplemental information, however the user may input the reproduction supplemental information to the electronic apparatus 1. In the present example, the reproduction supplemental information is a favorite degree of the user on the frame image. The favorite degree is also deemed as a degree of importance or a degree of interest. In the present example, the Y axis direction position of the slider 542 indicates the favorite degree of the user on the frame image in the reproduced part indicated by the X axis direction position of the slider 542. In other words, the Y coordinate value Y1 of the center 542a of the slider 542 indicates the favorite degree of the user on the frame image in the reproduced part which is reproduced when the elapsed time for the reproduction coincides with the X coordinate value X1 of the center 542a. The user can input the favorite degree on the frame image to the electronic apparatus 1 by operating the line-shaped object 541 of the seek bar 540. In the present example, the favorite degree increases as the Y axis coordinate value gets larger.
The electronic apparatus 1 according to the present example has an input mode, as an operation mode, which enables the user to input the favorite degree of the frame image to the electronic apparatus 1. For example, if the long-tap operation is performed on the area other than the seek bar 540 in the partial screen 504 in the reproduction screen 500, the operation mode of the electronic apparatus 1 is switched to the input mode. A condition where the operation mode of the electronic apparatus 1 is switched to the input mode is not limited thereto.
If the user performs an operation along the Y axis 582 on the line-shaped object 541 displayed on the display 120 (an operation along an upper and lower direction in
In the present example, if the user performs the slide operation of sliding the operator such as his/her finger along the Y axis 582 with the operator being in contact with a certain position in the line-shaped object 541, the height of the certain position along the Y axis 582 changes in accordance with the slide operation. Assumed as shown in
Assumed is a case where the user performs the slide operation of sliding the finger 600 along a minus direction of the Y axis 582 with the finger 600 being in contact with a certain position in the line-shaped object 541. In this case, the height of the position in the line-shaped object 541, in the Y axis direction, with which the finger 600 comes in contact (the Y coordinate value in the position) decreases, and the position coincides with the position in which the slide operation of the finger 600 is finished.
As described above, the height of the position in the line-shaped object 541 on which the operation has been performed along the Y axis 582 changes in accordance with the operation performed by the user along the Y axis 582 on the line-shaped object 541. In the present example, the Y axis 582 indicates the favorite degree of the frame image; thus, the user can, by performing the operation along the Y axis 582 on the line-shaped object 541 with the operator such as his/her finger, input, to the electronic apparatus 1, the favorite degree of the frame image corresponding to the position in the line-shaped object 541 on which the operation is performed. For example, the user performs the slide operation of sliding the finger 600, with reference to the frame image shown in the partial screen 703 when the finger 600 comes in contact with the certain position in the line-shaped object 541, to make the electronic apparatus 1 change the height of the position in the Y axis direction. Accordingly, the favorite degree of the frame image corresponding to the certain position, that is to say, the frame image shown in the partial screen 703, is adjusted.
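The adjustment described above can be illustrated by the following sketch, in which the Y coordinate reached by the slide operation is converted into a favorite degree for the frame image corresponding to the touched X position. All names and the 0-to-1 scale are assumptions for illustration.

```python
# Illustrative sketch: updating the favorite degree of the frame image that
# corresponds to the touched position on the line-shaped object 541. The favorite
# degree increases as the Y coordinate value gets larger.
favorite_degree = {}  # frame index -> favorite degree in [0.0, 1.0] (assumed scale)

def set_favorite_from_touch(touch_x, touch_y, x_left, x_right, y_bottom, y_top, frame_count):
    x_fraction = min(max((touch_x - x_left) / (x_right - x_left), 0.0), 1.0)
    frame_index = int(x_fraction * (frame_count - 1))
    favorite_degree[frame_index] = min(max((touch_y - y_bottom) / (y_top - y_bottom), 0.0), 1.0)
    return frame_index, favorite_degree[frame_index]
```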
For example, if the long-tap operation is performed on the area other than the seek bar 540 in the partial screen 704 in the input screen 700, the input mode is released and the display 120 displays the reproduction screen 500. The condition where the input mode is released is not limited thereto.
Even if the reproduction supplemental information is other than the favorite degree of the frame image, the user can input the reproduction supplemental information to the electronic apparatus 1 by performing the operation in the similar manner on the line-shaped object 541.
In the present example, the line-shaped object 541 is color-coded in accordance with the second reproduction supplemental information according to the reproduced part of the video 530. Then, the second reproduction supplemental information according to the reproduced part of the video 530 is indicated by a color of the part of the line-shaped object 541 in which the slider 542, whose X axis direction position indicates the reproduced part, is located. The second reproduction supplemental information according to the reproduced part of the video 530 is other than information indicating whether or not the reproduced part has been reproduced. This point is specifically described hereinafter.
The first reproduction supplemental information is the shooting position altitude of the frame image, for example. The second reproduction supplemental information is information that a predetermined type of value regarding the reproduced part falls within a predetermined range, for example.
In the present example, two types of information are used as the second reproduction supplemental information, for example. The first information is, for example, information that the predetermined type of value regarding the reproduced part is equal to or larger than a first predetermined value. The second information is, for example, information that the predetermined type of value regarding the reproduced part is equal to or smaller than a second predetermined value. In the present example, the speed at the time of taking the frame image is adopted as the predetermined type of value. Accordingly, in the present example, the first information is the information that the speed at the time of taking the frame image is equal to or larger than the first predetermined value. The second information is the information that the speed at the time of taking the frame image is equal to or smaller than the second predetermined value. The first predetermined value is set to a value larger than the second predetermined value, for example. The first predetermined value may be the same as the second predetermined value.
The line-shaped object 541 of the seek bar 540 is color-coded to have three colors. For example, the line-shaped object 541 has a blue part 750, a red part 751, and a green part 752. The parts of the line-shaped object 541 other than the red part 751 and the green part 752 are the blue part 750. The red part 751 corresponds to the first information. The green part 752 corresponds to the second information. The blue part 750 does not correspond to the second reproduction supplemental information. The red part 751 and the green part 752 are thicker than the blue part 750. In
The red part 751 of the line-shaped object 541 means that the speed at the time of taking the frame image in the reproduced part corresponding to the red part 751 is equal to or larger than the first predetermined value. That is to say, the speed at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542, in which the position of the center 542a is located on the red part 751, is equal to or larger than the first predetermined value.
The green part 752 of the line-shaped object 541 means that the speed at the time of taking the frame image in the reproduced part corresponding to the green part 752 is equal to or smaller than the second predetermined value. That is to say, the speed at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542, in which the position of the center 542a is located on the green part 752, is equal to or smaller than the second predetermined value.
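One way to realize this color-coding is to classify each reproduced part by comparing the speed at the time of taking its frame image with the first and second predetermined values. The following Kotlin sketch shows that classification only; the enum, the function name, and the example threshold values are hypothetical and are not limiting.

```kotlin
// Hypothetical classification of segments of the line-shaped object by shooting speed.
enum class SegmentColor { BLUE, RED, GREEN }

fun segmentColor(
    speedAtShooting: Double,
    firstPredetermined: Double,   // assumed example: 20.0 km/h; larger than the second value
    secondPredetermined: Double   // assumed example: 5.0 km/h
): SegmentColor = when {
    speedAtShooting >= firstPredetermined  -> SegmentColor.RED   // first information
    speedAtShooting <= secondPredetermined -> SegmentColor.GREEN // second information
    else                                   -> SegmentColor.BLUE  // no second supplemental information
}
```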
As described above, in the present modification example, the second reproduction supplemental information according to the reproduced part of the video 530 is indicated by a color of a part in which the slider 542, whose X axis direction position in the line-shaped object 541 indicates the reproduced part, is located. Thus, the user can easily find the desired part in the video 530 based on the second reproduction supplemental information.
The types of colors used for color-coding the line-shaped object 541 are not limited to the example described above. Also in the present example, the reproduction supplemental information may be information other than the shooting position altitude of the frame image. The first reproduction supplemental information is the temperature at the time of taking the frame image, for example. The second reproduction supplemental information is not limited to the example described above. Modification examples of the second reproduction supplemental information are described below.
The second information need not be adopted as the second reproduction supplemental information. In this case, the green part 752 becomes the blue part 750. The first information need not be adopted as the second reproduction supplemental information. In this case, the red part 751 becomes the blue part 750.
The first information may be information that the predetermined type of value regarding the reproduced part is larger than the first predetermined value. The second information may be information that the predetermined type of value regarding the reproduced part is smaller than the second predetermined value.
Third information that the predetermined type of value regarding the reproduced part is equal to or larger than a third predetermined value and equal to or smaller than a fourth predetermined value may be adopted as the second reproduction supplemental information. The fourth predetermined value is set to a value larger than the third predetermined value. In the case where the third information is adopted as the second reproduction supplemental information, only the first information in the first and second information may be adopted as the second reproduction supplemental information. In this case, the fourth predetermined value is set to be equal to or smaller than the first predetermined value. In the case where the third information is adopted as the second reproduction supplemental information, only the second information in the first and second information may be adopted as the second reproduction supplemental information. In this case, the third predetermined value is set to be equal to or larger than the second predetermined value. In the case where the third information is adopted as the second reproduction supplemental information, the first and second information may be adopted as the second reproduction supplemental information. In this case, the first predetermined value is set to be larger than the second predetermined value, the third predetermined value is set to be equal to or larger than the second predetermined value, and the fourth predetermined value is set to be equal to or smaller than the first predetermined value. The third information may be information that the predetermined type of value regarding the reproduced part is larger than the third predetermined value and equal to or smaller than the fourth predetermined value. The third information may be information that the predetermined type of value regarding the reproduced part is equal to or larger than the third predetermined value and smaller than the fourth predetermined value. The third information may be information that the predetermined type of value regarding the reproduced part is larger than the third predetermined value and smaller than the fourth predetermined value.
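The relations among the first to fourth predetermined values described above can be checked mechanically. The following Kotlin sketch, with hypothetical identifiers, validates one of the combinations described above, namely the case where the first, second, and third information are all adopted as the second reproduction supplemental information.

```kotlin
// Hypothetical check of the threshold relations when the first, second, and third
// information are all adopted as the second reproduction supplemental information.
fun thresholdsConsistent(
    first: Double,   // lower bound for the first information (value >= first)
    second: Double,  // upper bound for the second information (value <= second)
    third: Double,   // lower bound of the third information's range
    fourth: Double   // upper bound of the third information's range
): Boolean =
    first > second &&   // the first and second ranges do not overlap
    third >= second &&  // the third range starts at or above the second predetermined value
    fourth <= first &&  // the third range ends at or below the first predetermined value
    fourth > third      // the third range is not empty
```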
The predetermined type of value regarding the second reproduction supplemental information may be a value other than the speed at the time of taking the frame image. For example, the predetermined type of value may be the temperature at the time of taking the frame image, the shooting position altitude of the frame image, or the shooting position atmospheric pressure of the frame image. The first to fourth predetermined values are appropriately determined in accordance with the selected type of value.
The second reproduction supplemental information may be information that an event occurs at the time of taking the frame image.
The red part 751 means that a predetermined event occurs at the time of taking the frame image in the reproduced part corresponding to the red part 751. That is to say, the predetermined event occurs at the time of taking the frame image in the reproduced part indicated by the X axis direction position of the slider 542, in which the position of the center 542a is located on the red part 751. For example, trouble involving the user of the electronic apparatus 1, as described above, is considered as the predetermined event. An event in which the electronic apparatus 1 goes under the water is also considered as the predetermined event. The predetermined event is not limited thereto.
The type of the second reproduction supplemental information can be changed by the user, as is the case with the type of the first reproduction supplemental information. In this case, for example, a switching button similar to the switching button 650 shown in
One end 541e of the line-shaped object 541 corresponds to a starting point of the moving route. Accordingly, if the center 542a of the slider 542 is located at the one end 541e, the reproduced part of the video 530 falls on an initial frame image. That is to say, if the center 542a of the slider 542 is located at the one end 541e, the frame image taken at the starting point of the moving route is reproduced. The other end 541f of the line-shaped object 541 corresponds to an ending point of the moving route. Accordingly, if the center 542a of the slider 542 is located at the other end 541f, the reproduced part of the video 530 falls on a last frame image. That is to say, if the center 542a of the slider 542 is located at the other end 541f, the frame image taken at the ending point of the moving route is reproduced. A length of the line-shaped object 541 is deemed to indicate a total moving distance that the electronic apparatus 1 has moved from start to finish of taking the video 530. A moving direction of the slider 542 is deemed to indicate a moving direction of the electronic apparatus 1, in other words, a traveling direction of the user.
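In this modification example the slider position corresponds to the distance travelled along the moving route rather than to the elapsed time, so reproducing a frame requires mapping between the frame index and the cumulative moving distance. The following Kotlin sketch illustrates that mapping under the assumption that a cumulative moving distance has been recorded for each frame image; the function names and the array layout are hypothetical.

```kotlin
import kotlin.math.abs

// Hypothetical mapping between frame index and position along the moving route.
// cumulativeMeters[i] is the distance moved from the start of shooting up to frame i;
// cumulativeMeters.last() is the total moving distance indicated by the line-shaped object.
fun routeFractionOfFrame(frameIndex: Int, cumulativeMeters: DoubleArray): Double {
    val total = cumulativeMeters.last()
    return if (total == 0.0) 0.0 else cumulativeMeters[frameIndex] / total
}

// Inverse mapping: the frame whose cumulative distance is closest to the selected route fraction.
fun frameOfRouteFraction(fraction: Double, cumulativeMeters: DoubleArray): Int {
    val target = fraction.coerceIn(0.0, 1.0) * cumulativeMeters.last()
    var best = 0
    for (i in cumulativeMeters.indices) {
        if (abs(cumulativeMeters[i] - target) < abs(cumulativeMeters[best] - target)) best = i
    }
    return best
}
```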
The display 120 according to the present modification example has first and second display modes as a display mode. The display 120 switches the display mode in accordance with an instruction from the user. In the first display mode, the seek bar 540 is displayed on a map so that the line-shaped object 541 coincides with the moving route on the map. In the meanwhile, in the second display mode, the seek bar 540 is displayed without the display of the map, and the video 530 is displayed to be larger than that in the first display mode. The reproduction screen 500 shown in
The partial screen 504 of the reproduction screen 500A shows a switching button 545 for switching the display mode of the display 120 from the second display mode to the first display mode. If the user performs a predetermined operation (a tap operation, for example) on the switching button 545, the display mode of the display 120 changes from the second display mode to the first display mode.
The reproduction screen 500B includes partial screens 506 to 509. The partial screens 506 and 509 are the same as the partial screens 501 and 505 of the reproduction screen 500A.
The partial screen 507 shows a switching button 570 for switching the display mode of the display 120 from the first display mode to the second display mode. If the user performs a predetermined operation (a tap operation, for example) on the switching button 570, the display mode of the display 120 changes from the first display mode to the second display mode.
The partial screen 507 further shows the video 530 being reproduced. As shown in
The partial screen 508 shows a map 580 including the moving route. The partial screen 508 shows the seek bar 540 on the map 580 so that the line-shaped object 541 coincides with the moving route on the map 580. Accordingly, the user can recognize how the electronic apparatus 1 has moved on the map 580 at the time of taking the video 530. In other words, the user can recognize how the user has moved on the map 580 at the time of taking the video 530.
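The switching between the first and second display modes described above can be viewed as toggling a single state that determines whether the seek bar 540 is laid over the map. The following Kotlin sketch models only that state; the enum and method names are hypothetical, and the wiring of the switching buttons 545 and 570 to actual views is omitted.

```kotlin
// Hypothetical state holder for the first/second display modes of the reproduction screen.
enum class ReproductionDisplayMode { WITH_MAP, WITHOUT_MAP }  // first and second display modes

class ReproductionScreenState(
    var mode: ReproductionDisplayMode = ReproductionDisplayMode.WITHOUT_MAP
) {
    // Called when the switching button 545 (second -> first) is operated.
    fun onSwitchToMapTapped() { mode = ReproductionDisplayMode.WITH_MAP }

    // Called when the switching button 570 (first -> second) is operated.
    fun onSwitchToVideoTapped() { mode = ReproductionDisplayMode.WITHOUT_MAP }

    // In the first display mode the seek bar is drawn over the map so that the
    // line-shaped object coincides with the moving route; otherwise the video is enlarged.
    fun showsMap(): Boolean = mode == ReproductionDisplayMode.WITH_MAP
}
```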
The display mode of the display 120 may include only the first display mode in the first and second display modes. That is to say, the reproduction screen 500A shown in
The slider 542 may indicate the direction of the camera lens when the electronic apparatus 1 takes the frame image in the reproduced part indicated by the position of the slider 542.
As shown in
The display 120 may display a display object having a shape similar to the shape of the seek bar 540 when the camera being used takes the video.
The shooting screen 800 includes partial screens 801 to 805. The partial screens 801 to 805 are arranged in this order from above in the shooting screen 800. The partial screen 801 shows a character string 810 indicating that the video is being taken. The partial screen 802 shows a current shooting date and time 820 of the video and an elapsed time for a video shooting 821. The partial screen 803 shows a video 830 being taken. The partial screen 803 sequentially shows the frame images taken with the camera being used. The partial screen 805 shows a shooting finish button 850 for finishing the shooting of the video 830. If the user performs a predetermined operation (a tap operation, for example) on the shooting finish button 850, the shooting of the video 830 is finished. The predetermined operation performed on the shooting finish button 850 falls under the finish instruction operation in Step s7 described above. The partial screen 804 shows a shooting indicator 840 indicating how the shooting of the video 830 proceeds.
The shooting indicator 840 includes a line-shaped object 841 which becomes elongated in accordance with the progress of the shooting of the video 830. A tip 841a of the line-shaped object 841 is indicated by a mark 842. The mark 842 indicates the position of the tip 841a of the line-shaped object 841. A position of a center 842a of the mark 842 coincides with the position of the tip 841a of the line-shaped object 841. The line-shaped object 841 has a curved shape in accordance with predetermined information according to the frame images included in the already-shot part of the video 830 being taken. This predetermined information is referred to as the "second shooting supplemental information" to distinguish it from the shooting supplemental information described above. The position of the tip 841a of the line-shaped object 841 indicates how the shooting of the video 830 proceeds and the second shooting supplemental information in accordance with the current shooting frame image. In other words, the position of the mark 842 indicates how the shooting of the video 830 proceeds and the second shooting supplemental information in accordance with the current shooting frame image. In still other words, the position of the tip 841a of the line-shaped object 841 indicates how the shooting of the video 830 proceeds and the second shooting supplemental information in accordance with the frame image currently displayed in the partial screen 803.
The X coordinate value X2 of the tip 841a of the line-shaped object 841 indicates the elapsed time for the shooting. Thus, the position of the tip 841a of the line-shaped object 841 in the X axis direction is deemed to indicate how the shooting of the video 830 proceeds. As the elapsed time for the shooting increases, the X coordinate value X2 of the tip 841a of the line-shaped object 841 increases. In other words, as the elapsed time for the shooting increases, the length of the line-shaped object 841 in the X axis direction increases. The X coordinate value of a fixed end 841b, located on the opposite side of the line-shaped object 841 from the tip 841a, is zero.
In the meanwhile, the position of the tip 841a of the line-shaped object 841 in the Y axis direction indicates the second shooting supplemental information in accordance with the current shooting frame image. That is to say, the Y coordinate value Y2 of the tip 841a of the line-shaped object 841 indicates the second shooting supplemental information in accordance with the current shooting frame image. The Y coordinate value at a certain position in the line-shaped object 841 indicates the second shooting supplemental information in accordance with the frame image taken when the shooting of the video 830 has proceeded for the period of time indicated by the X coordinate value at that position. The Y coordinate value of the fixed end (the starting point) 841b of the line-shaped object 841 indicates the second shooting supplemental information in accordance with the frame image taken first.
Information similar to the reproduction supplemental information described above, for example, can be adopted as the second shooting supplemental information. Adopted as the second shooting supplemental information is, for example, an altitude of a shooting position of a frame image of the video 830 at the time of taking the frame image (a shooting position altitude of the frame image). In other words, adopted as the second shooting supplemental information is an altitude acquired by the altitude acquisition unit 380 at the time of taking the frame image.
In the case where the shooting position altitude of the frame image is adopted as the second shooting supplemental information, the Y axis 852 indicates the shooting position altitude of the frame image. The position of the tip 841a of the line-shaped object 841 in the Y axis direction indicates the shooting position altitude of the current shooting frame image. That is to say, the Y coordinate value Y2 of the tip 841a of the line-shaped object 841 indicates the shooting position altitude of the current shooting frame image. In the present modification example, the shooting position altitude indicated by the Y axis 852 increases toward the plus direction of the Y axis 852. The X axis 851 and the Y axis 852 shown in
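Merely as an illustration of the bookkeeping implied by the description above, the following Kotlin sketch appends one point per captured frame, using the elapsed shooting time for the X coordinate and the altitude acquired at that moment for the Y coordinate. The class ShootingIndicatorModel and the other identifiers are hypothetical; drawing of the resulting polyline and of the mark 842 is omitted.

```kotlin
// Hypothetical model of the shooting indicator's growing line-shaped object.
data class IndicatorPoint(val elapsedSec: Double, val altitudeM: Double)

class ShootingIndicatorModel {
    private val points = mutableListOf<IndicatorPoint>()

    // Called whenever a frame image is taken; the altitude would be the value
    // acquired by the altitude acquisition unit at that moment.
    fun onFrameCaptured(elapsedSec: Double, altitudeM: Double) {
        points += IndicatorPoint(elapsedSec, altitudeM)
    }

    // The tip 841a corresponds to the most recently captured frame.
    fun tip(): IndicatorPoint? = points.lastOrNull()

    // The fixed end 841b corresponds to the first frame; its elapsed time is zero.
    fun fixedEnd(): IndicatorPoint? = points.firstOrNull()
}
```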
As described above, in the shooting indicator 840, the position of the tip 841a of the line-shaped object 841 indicates how the shooting of the video proceeds and the second shooting supplemental information in accordance with the current shooting frame image. Thus, the user can easily recognize how the shooting of the video proceeds and the second shooting supplemental information in accordance with the current shooting frame image.
The second shooting supplemental information may be the information other than the shooting position altitude of the frame image. The second shooting supplemental information may be the shooting position atmospheric pressure of the frame image, for example. In this case, the position of the tip 841a of the line-shaped object 841 indicates how the shooting of the video proceeds and the shooting position atmospheric pressure of the current shooting frame image. The second shooting supplemental information may be the speed of the electronic apparatus 1 at the time of taking the frame image, the temperature of the electronic apparatus 1 at the time of taking the frame image, or the moving distance at the time of taking the frame image.
The second shooting supplemental information may be information indicating whether or not a predetermined event occurs at the time of taking the frame image.
In a manner similar to the switching of the type of the reproduction supplemental information indicated by the Y axis direction position of the slider 542, the controller 100 may switch the type of the second shooting supplemental information indicated by the position of the tip 841a of the line-shaped object 841 in the Y axis direction in accordance with the instruction from the user. In this case, for example, a switching button similar to the switching button 650 shown in
The shooting screen 800 may include specific information similar to the specific information 660 shown in
The controller 100 may make the display 120 display a selection screen similar to the selection screen 670 shown in
The display 120 may display a value similar to the value 680 of the reproduction supplemental information shown in
In the electronic apparatus 1 according to the fourth modification example described above, information different from the second shooting supplemental information indicated by the position of the tip 841a of the line-shaped object 841 in the Y axis direction may be indicated by color-coding the line-shaped object 841 in a manner similar to the second modification example described above. This other information is referred to as the "third shooting supplemental information" hereinafter.
The line-shaped object 841 shown in
Information similar to the second reproduction supplemental information described above, for example, can be adopted as the third shooting supplemental information. Information that the speed at the time of taking the frame image is equal to or larger than the first predetermined value, for example, can be adopted as the third shooting supplemental information. In this case, the red part 901 means that the speed at the time of taking the shooting frame image corresponding to the red part 901 is equal to or larger than the first predetermined value. If the speed at the time of taking the current shooting frame image is equal to or larger than the first predetermined value, the tip 841a of the line-shaped object 841 becomes the red part 901. That is to say, if the current speed of the electronic apparatus 1 is equal to or larger than the first predetermined value, the tip 841a of the line-shaped object 841 becomes the red part 901. In a case where a certain frame image in the video 830 is taken when a certain period of time has elapsed since the shooting of the video 830 was started, if the color of the part of the line-shaped object 841 whose X coordinate value indicates the certain period of time is red, the speed at the time of taking the certain frame image is equal to or larger than the first predetermined value.
As described above, in the present modification example, the third shooting supplemental information according to the shooting frame image is indicated by a color of a part corresponding to the shooting frame image in the line-shaped object 841. Accordingly, the user can easily recognize how the shooting of the video 830 proceeds, the second shooting supplemental information, and the third shooting supplemental information by referring to the shooting indicator 840.
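While the video 830 is being taken, the color of each newly appended segment can be decided from the speed at the moment the corresponding frame image is taken, as described above. The following Kotlin sketch reuses the classification idea of the second modification example; the threshold value and all identifiers are hypothetical and are not limiting.

```kotlin
// Hypothetical per-segment coloring of the line-shaped object 841 during shooting.
data class ShootingSegment(val elapsedSec: Double, val altitudeM: Double, val red: Boolean)

fun appendSegment(
    segments: MutableList<ShootingSegment>,
    elapsedSec: Double,
    altitudeM: Double,
    currentSpeed: Double,
    firstPredetermined: Double   // assumed example: 20.0 km/h
) {
    // The tip becomes the red part 901 whenever the current speed is at or above the threshold.
    segments += ShootingSegment(elapsedSec, altitudeM, currentSpeed >= firstPredetermined)
}
```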
In the electronic apparatus 1 according to the fourth and fifth modification examples described above, the line-shaped object 841 may have a curved shape in accordance with a route along which the electronic apparatus 1 moves while taking the video 830 as is the case of the third modification example described above.
In the shooting indicator 840 shown in
The display 120 according to the present modification example has third and fourth display modes as a display mode. The display 120 switches the display mode in accordance with an instruction from the user. In the third display mode, the shooting indicator 840 is displayed on a map so that the line-shaped object 841 coincides with the moving route on the map. In the meanwhile, in the fourth display mode, the shooting indicator 840 is displayed without the display of the map, and the video 830 is displayed to be larger than that in the third display mode. The shooting screen 800 shown in
The partial screen 804 of the shooting screen 800A shows a switching button 910 for switching the display mode of the display 120 from the fourth display mode to the third display mode. If the user performs a predetermined operation (a tap operation, for example) on the switching button 910, the display mode of the display 120 changes from the fourth display mode to the third display mode.
The shooting screen 800B includes partial screens 806 to 809. The partial screens 806 and 809 are the same as the partial screens 801 and 805 of the shooting screen 800A.
The partial screen 807 shows a switching button 920 for switching the display mode of the display 120 from the third display mode to the fourth display mode. If the user performs a predetermined operation (a tap operation, for example) on the switching button 920, the display mode of the display 120 changes from the third display mode to the fourth display mode.
The partial screen 807 further shows the video 830 being taken. As shown in
The partial screen 808 shows a map 930 including the moving route. The partial screen 808 shows the shooting indicator 840 on the map 930 so that the line-shaped object 841 coincides with the moving route on the map 930. Accordingly, the user can recognize how the electronic apparatus 1 has moved on the map 930 at the time of taking the video 830. In other words, the user can recognize how the user has moved on the map 930 at the time of taking the video 830.
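Making the line-shaped object 841 coincide with the moving route on the map 930 implies projecting each recorded shooting position into the screen coordinates of the displayed map area and drawing the indicator through the projected points. The following Kotlin sketch uses a naive linear projection over a rectangular visible region purely for illustration; a real map component would supply its own projection, and all identifiers here are hypothetical.

```kotlin
// Hypothetical projection of recorded shooting positions onto the displayed map area.
data class GeoPoint(val lat: Double, val lng: Double)
data class ScreenPoint(val x: Float, val y: Float)

fun projectRoute(
    route: List<GeoPoint>,                      // shooting positions recorded while taking the video
    mapWidthPx: Float, mapHeightPx: Float,
    south: Double, north: Double, west: Double, east: Double  // visible map bounds
): List<ScreenPoint> = route.map { p ->
    val x = ((p.lng - west) / (east - west) * mapWidthPx).toFloat()
    val y = ((north - p.lat) / (north - south) * mapHeightPx).toFloat() // screen Y grows downward
    ScreenPoint(x, y)
}
```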
The display mode of the display 120 may include only the third display mode in the third and fourth display modes. That is to say, the shooting screen 800A shown in
The mark 842 indicating the tip 841a of the line-shaped object 841 may indicate the current direction of the camera lens.
As shown in
In the second and fifth modification examples described above, for example, the line-shaped object 541 and the line-shaped object 841 are indicated by plural types of lines whose colors are different from each other; however, they may be indicated by plural types of lines whose thicknesses are different from each other.
Each of the line-shaped object 541 and the line-shaped object 841 may be indicated by the plural types of lines including a discontinuous line.
As described above, if the line-shaped object 541 is indicated by the plural types of lines, the second reproduction supplemental information according to the reproduced part is indicated by the type of line in a part where the slider 542 whose position indicates the reproduced part is located. If the line-shaped object 841 is indicated by the plural types of lines, the third shooting supplemental information according to the shooting frame image is indicated by the type of line in a part of the line-shaped object 841 corresponding to the shooting frame image.
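Using plural line types instead of plural colors only changes the stroke attribute that encodes the supplemental information. The following Kotlin sketch, with hypothetical identifiers and example values, maps the same three classes to a stroke width and a dash flag rather than to a color.

```kotlin
// Hypothetical mapping from the supplemental-information class to a line style
// when thickness or a discontinuous (dashed) line is used instead of color.
enum class SegmentClass { NONE, FIRST_INFO, SECOND_INFO }
data class LineStyle(val strokeWidthPx: Float, val dashed: Boolean)

fun styleOf(segmentClass: SegmentClass): LineStyle = when (segmentClass) {
    SegmentClass.FIRST_INFO  -> LineStyle(strokeWidthPx = 8f, dashed = false) // thick solid line
    SegmentClass.SECOND_INFO -> LineStyle(strokeWidthPx = 8f, dashed = true)  // thick discontinuous line
    SegmentClass.NONE        -> LineStyle(strokeWidthPx = 4f, dashed = false) // default thin line
}
```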
Although the electronic apparatus 1 reproduces the video which is taken with the electronic apparatus 1 itself in the example described above, the electronic apparatus 1 may reproduce the video which is taken with a photographing device different from the electronic apparatus 1. Although the electronic apparatus 1 acquires the first and second reproduction supplemental information and the first to third shooting supplemental information by itself, a device different from the electronic apparatus 1 may acquire the first and second reproduction supplemental information and the first to third shooting supplemental information.
Although the electronic apparatus 1 is a mobile phone, such as a smartphone, in the above-mentioned examples, the electronic apparatus 1 may be another type of electronic apparatus. The electronic apparatus 1 may be a tablet terminal, a personal computer, or a wearable apparatus, for example.
While the electronic apparatus 1 has been described above in detail, the above description is in all aspects illustrative and not restrictive. The various modifications described above are applicable in combination as long as they are not mutually inconsistent. It is understood that numerous modifications which have not been exemplified can be devised without departing from the scope of the present disclosure.
The present application is a National Phase entry based on PCT Application No. PCT/JP2017/004572 filed on Feb. 8, 2017, which claims the benefit of Japanese Application No. 2016-032976, filed on Feb. 24, 2016. PCT Application No. PCT/JP2017/004572 is entitled “ELECTRONIC DEVICE, CONTROL DEVICE, RECORDING MEDIUM, AND DISPLAY METHOD”, and Japanese Application No. 2016-032976 is entitled “ELECTRONIC APPARATUS, CONTROL DEVICE, CONTROL PROGRAM, AND DISPLAY METHOD”.