Aspects of the disclosure generally relate to an imaging apparatus transmitting a captured image to an external apparatus, and a control method thereof.
Conventionally, a learning support system that distributes a learning material to a terminal such as a personal computer or a tablet device has been discussed. Japanese Patent Application Laid-Open No. 2006-337495 discusses a learning support system that distributes a learning material based on the skill level and learning progress of a user in order to give a sense of accomplishment of learning objectives to the user.
The learning support system discussed in Japanese Patent Application Laid-Open No. 2006-337495 distributes an English word question or a crossword puzzle as a learning material, and does not distribute the assignment content to be accomplished by the user performing image capturing.
According to some embodiments, there is provided a system, an apparatus, or a method for distributing assignment content regarding image capturing (hereinafter referred to as an image capturing mission) to a terminal, and prompting a user to perform image capturing.
According to some embodiments, an imaging apparatus includes an image capture unit, a management unit configured to manage a mission regarding image capturing, a communication unit configured to transmit an image captured based on the mission to an external apparatus, and receive an evaluation of the transmitted image, and a display unit configured to perform a display indicating progress of the mission managed by the management unit, based on the received evaluation.
According to some embodiments, a method includes managing a mission regarding image capturing, transmitting an image captured based on the mission to an external apparatus, receiving an evaluation of the transmitted image, and performing a display indicating progress of the managed mission, based on the received evaluation.
Further aspects of the embodiments will become apparent from the following description of exemplary embodiments.
Exemplary embodiments, features, and aspects of the disclosure will be described below with reference to the drawings. However, aspects of the disclosure are not limited to the following embodiments.
The exemplary embodiments described below are examples of the present disclosure, and may be appropriately modified or changed depending on the configuration of an apparatus to which any of the exemplary embodiments is applied, and various conditions.
A control unit 101 controls each component of the digital camera 100 based on an input signal and a program to be described below. Instead of the control unit 101 controlling the entire digital camera 100, a plurality of hardware components may share the processing to control the entire digital camera 100.
An image capture unit 102 includes, for example, an optical lens unit, an optical system for controlling the aperture, zoom, and focus, and an image sensor for converting light incident thereon through the optical lens unit into an electrical video signal. As the image sensor, a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor is generally used.
The image capture unit 102, which is controlled by the control unit 101, causes the image sensor to convert a formed subject light image into an electric signal, and performs a noise reduction process on the electric signal and outputs image data as digital data. The digital camera 100 according to the exemplary embodiment records the image data in a recording medium 110 in compliance with, for example, the Design Rule for Camera File system (DCF) standard.
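For illustration only, the following minimal Python sketch shows one way a DCF-style recording path might be generated for newly captured image data. The directory prefix "CANON" and the file prefix "IMG_" are assumptions for the sketch and are not part of the exemplary embodiment.

```python
import os

DCIM_ROOT = "DCIM"     # DCF images are recorded under a DCIM directory
DIR_FREE = "CANON"     # five free characters of the directory name (assumption)
FILE_FREE = "IMG_"     # four free characters of the file name (assumption)

def dcf_path(dir_number: int, file_number: int) -> str:
    """Build a DCF-style path such as DCIM/100CANON/IMG_0001.JPG.

    A DCF directory name is a three-digit number (100-999) followed by five
    characters; a DCF file name is four characters followed by a four-digit
    number (0001-9999).
    """
    if not (100 <= dir_number <= 999 and 1 <= file_number <= 9999):
        raise ValueError("directory number must be 100-999 and file number 1-9999")
    directory = f"{dir_number:03d}{DIR_FREE}"
    filename = f"{FILE_FREE}{file_number:04d}.JPG"
    return os.path.join(DCIM_ROOT, directory, filename)

if __name__ == "__main__":
    print(dcf_path(100, 1))   # DCIM/100CANON/IMG_0001.JPG
```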
A non-volatile memory 103 is electrically erasable and recordable, and stores a program (described below) to be executed by the control unit 101.
A work memory 104 is used as a buffer memory for temporarily holding image data of an image captured by the image capture unit 102, an image display memory for a display unit 106, and a work area for the control unit 101.
An operation unit 105 receives from a user an instruction to operate the digital camera 100. The operation unit 105 includes, for example, a power button for the user to give an instruction to turn on or off the digital camera 100, a release switch 105a (refer to
The operation unit 105 may include a touch panel 105d (refer to
The display unit 106 displays a viewfinder image during image capturing, reproduces and displays captured image data, and displays characters for performing an interactive operation with the user. The display unit 106 does not necessarily need to be built into the digital camera 100. The digital camera 100 only needs to be able to connect to the display unit 106 inside or outside the digital camera 100, and have at least the function of controlling the display of the display unit 106. For example, the control unit 101 can be configured to have this display control function and function as a display control unit.
The recording medium 110 records image data output from the image capture unit 102. The recording medium 110 may be configured to be attachable to or detachable from the digital camera 100, or may be built into the digital camera 100. The digital camera 100 only needs to have at least a method for accessing the recording medium 110.
The communication unit 111 is an interface for wirelessly connecting to an information terminal as an external apparatus. The digital camera 100 according to the exemplary embodiment can transmit or receive image data to or from the information terminal via the communication unit 111. For example, the digital camera 100 can transmit image data generated by the image capture unit 102 to the information terminal via the communication unit 111. The image capturing by the image capture unit 102 may be controlled from the information terminal via the communication unit 111. In the exemplary embodiment, the communication unit 111 includes an interface for communicating with the information terminal via a wireless local area network (LAN) compliant with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard. The control unit 101 controls the communication unit 111 to implement wireless communication with the information terminal.
A near-field wireless communication unit 112 includes, for example, an antenna for wireless communication, a modulation/demodulation circuit for processing a wireless signal, and a communication controller. The near-field wireless communication unit 112 outputs a modulated wireless signal from the antenna and demodulates a wireless signal received by the antenna, thereby implementing near-field wireless communication compliant with the IEEE 802.15 standard (i.e., Bluetooth®).
In the exemplary embodiment, Bluetooth® communication employs Bluetooth® Low Energy version 4.0, which features low power consumption. In Bluetooth® Low Energy communication, the range where communication can be performed is narrower (i.e., the distance within which communication can be performed is shorter) than that in wireless LAN communication. In addition, the communication speed of Bluetooth® Low Energy communication is slower than that of wireless LAN communication. Meanwhile, the power consumption of Bluetooth® Low Energy communication is lower than that of wireless LAN communication. The digital camera 100 can transmit or receive data to or from the external apparatus via the near-field wireless communication unit 112. For example, the image capturing by the image capture unit 102 of the digital camera 100 may be controlled from the information terminal via the near-field wireless communication unit 112. However, since the communication speed is slow, image data generated by the image capture unit 102 is not transmitted via the near-field wireless communication unit 112.
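For illustration, the following minimal Python sketch shows one way a control unit might route traffic in line with this trade-off, sending small control messages over Bluetooth® Low Energy and bulk image data over the wireless LAN. The message kinds and the routing policy are assumptions and are not part of the exemplary embodiment.

```python
from enum import Enum, auto

class Transport(Enum):
    WIRELESS_LAN = auto()   # communication unit 111: fast, higher power consumption
    BLUETOOTH_LE = auto()   # near-field wireless communication unit 112: slow, low power

def choose_transport(payload_kind: str) -> Transport:
    """Route small control traffic over Bluetooth LE and bulk data over the wireless LAN.

    Illustrative policy only: the exemplary embodiment states that image data
    is not transmitted via the near-field wireless communication unit because
    of its slow communication speed.
    """
    if payload_kind in ("remote_shutter", "status_notification"):
        return Transport.BLUETOOTH_LE
    if payload_kind in ("image_data", "mission_data"):
        return Transport.WIRELESS_LAN
    raise ValueError(f"unknown payload kind: {payload_kind}")

if __name__ == "__main__":
    print(choose_transport("remote_shutter"))   # Transport.BLUETOOTH_LE
    print(choose_transport("image_data"))       # Transport.WIRELESS_LAN
```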
The communication unit 111 of the digital camera 100 has an access point (AP) mode where the digital camera 100 operates as an access point in an infrastructure mode, and a client (CL) mode where the digital camera 100 operates as a client in the infrastructure mode.
When the digital camera 100 causes the communication unit 111 to operate in the CL mode, the digital camera 100 can operate as a CL device in the infrastructure mode. The digital camera 100 operating as the CL device connects to an AP device near the digital camera 100, so that the digital camera 100 can participate in a network formed by the AP device.
When the digital camera 100 causes the communication unit 111 to operate in the AP mode, the digital camera 100 can operate as a simple AP, which is a type of AP but has more limited functions. The digital camera 100 operating as the simple AP forms a wireless network using the digital camera 100 itself as an access point. An apparatus near the digital camera 100 can recognize the digital camera 100 as an AP device and participate in the network formed by the digital camera 100. A program for operating the digital camera 100 as described above is held in the non-volatile memory 103.
The digital camera 100 operating in the AP mode is a simple AP that does not have a gateway function for transferring data received from a CL device to an Internet provider. Thus, even if the digital camera 100 receives data from another apparatus participating in the network formed by the digital camera 100, the digital camera 100 cannot transfer the data to a network such as the Internet.
Next, an external appearance of the digital camera 100 will be described.
The release switch 105a, the reproduction button 105b, a direction key 105c, and the touch panel 105d are operation members included in the operation unit 105. The display unit 106 reproduces and displays an image obtained as a result of the image capturing by the image capture unit 102. The digital camera 100 includes the antenna of the near-field wireless communication unit 112 on the side surface of the camera housing. When the near-field wireless communication unit 112 of the digital camera 100 is brought close to the near-field wireless communication unit of another device within a certain distance, the digital camera 100 can establish near-field wireless communication with the other device. As a result, the digital camera 100 can communicate with the other device in a contactless manner without using a cable, and can also limit communication partners based on the user's intention.
The above is the description of the digital camera 100.
A control unit 201 controls each component of the mobile phone 200 based on an input signal and a program to be described below. Instead of the control unit 201 controlling the entire mobile phone 200, a plurality of hardware components may share the processing to control the entire mobile phone 200.
An image capture unit 202 converts a subject light image formed by a lens included in the image capture unit 202 into an electric signal, performs a noise reduction process on the electric signal, and outputs image data as digital data. The captured image data is stored in a buffer memory, then subjected to a predetermined calculation by the control unit 201, and recorded in a recording medium 207.
A non-volatile memory 203 is electrically erasable and recordable. In the non-volatile memory 203, an operating system (OS), which is basic software to be executed by the control unit 201, and an application that cooperates with the OS to provide application functions are recorded. The non-volatile memory 203 also stores a camera application to be described below.
A work memory 204 is used as an image display memory for a display unit 206 and a work area for the control unit 201.
An operation unit 205 receives from a user an instruction to operate the mobile phone 200. The operation unit 205 includes, for example, a power button 205a (refer to
The display unit 206 reproduces and displays image data, and displays characters for performing an interactive operation with the user. The display unit 206 does not necessarily need to be built into the mobile phone 200. The mobile phone 200 only needs to be able to connect to the display unit 206 and have at least the function of controlling the display of the display unit 206.
The mobile phone 200 includes, as one of the operation members of the operation unit 205, the touch panel 205d capable of detecting contact with the display unit 206. The touch panel 205d and the display unit 206 are configured in an integrated manner. For example, the touch panel 205d is configured to have a light transmittance that does not hinder the display of the display unit 206, and is attached to the upper layer of the display surface of the display unit 206. Then, input coordinates on the touch panel 205d are associated with display coordinates on the display unit 206. This makes it possible to configure a graphical user interface (GUI) that allows the user to feel as if the user could directly operate the screen displayed on the display unit 206. As the touch panel 205d, a touch panel using any of various methods such as a resistive method, a capacitive method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an image recognition method, and a photosensor method may be used.
The control unit 201 can detect the following operations on the touch panel 205d.
(1) Bringing a finger or a pen into contact with the touch panel 205d (hereinafter referred to as a “touch-down”)
(2) Keeping the finger or the pen in contact with the touch panel 205d (hereinafter referred to as a “touch-on”)
(3) Moving the finger or the pen while keeping the finger or the pen in contact with the touch panel 205d (hereinafter referred to as a “move”)
(4) Releasing from the touch panel 205d the finger or the pen being in contact with the touch panel 205d (hereinafter referred to as a “touch-up”)
(5) Bringing nothing into contact with the touch panel 205d (hereinafter referred to as a “touch-off”)
The touch panel 205d acquires information regarding these operations and the positional coordinates where the finger or the pen is in contact with the touch panel 205d, and notifies the control unit 201 of the information. Based on the information, the control unit 201 determines what operation is performed on the touch panel 205d. In the case of a move, the control unit 201 can also determine, based on changes in the positional coordinates, the moving direction of the finger or the pen on the touch panel 205d with respect to each of the vertical and horizontal components on the touch panel 205d. If the user performs a touch-down, a certain move, and a touch-up in a consecutive manner on the touch panel 205d, the user is regarded as drawing a stroke. Hereinafter, the operation of quickly drawing a stroke will be referred to as a flick. A flick is an operation of quickly moving the finger by some distance while keeping the finger on the touch panel 205d, and releasing the finger from the touch panel 205d immediately after the quick move. In other words, a flick is an operation of quickly tracing the touch panel 205d with the finger in a flipping manner. If a move performed by a predetermined distance or more at a predetermined speed or more is detected, and a touch-up is detected immediately after the move, the control unit 201 determines that a flick is performed. If a move performed by a predetermined distance or more at less than the predetermined speed is detected, the control unit 201 determines that a drag is performed.
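For illustration, the flick/drag determination described above can be sketched in Python as follows. The distance and speed thresholds are placeholders, since the exemplary embodiment specifies only a predetermined distance and a predetermined speed.

```python
from dataclasses import dataclass
from typing import Optional

# Placeholder thresholds; the exemplary embodiment says only "predetermined".
MIN_MOVE_DISTANCE_PX = 20.0
FLICK_MIN_SPEED_PX_PER_S = 500.0

@dataclass
class Stroke:
    """One touch-down, move, touch-up sequence reported by the touch panel 205d."""
    start_x: float
    start_y: float
    end_x: float
    end_y: float
    duration_s: float

def classify_stroke(stroke: Stroke) -> Optional[str]:
    """Return "flick" or "drag" following the rule described above, or None
    if the move is shorter than the predetermined distance."""
    dx = stroke.end_x - stroke.start_x
    dy = stroke.end_y - stroke.start_y
    distance = (dx * dx + dy * dy) ** 0.5
    if distance < MIN_MOVE_DISTANCE_PX:
        return None
    speed = distance / stroke.duration_s if stroke.duration_s > 0 else float("inf")
    # A move of the predetermined distance or more at the predetermined speed
    # or more, immediately followed by a touch-up, is determined to be a flick;
    # the same move at less than the predetermined speed is determined to be a drag.
    return "flick" if speed >= FLICK_MIN_SPEED_PX_PER_S else "drag"

if __name__ == "__main__":
    print(classify_stroke(Stroke(0, 0, 120, 0, 0.1)))  # flick
    print(classify_stroke(Stroke(0, 0, 120, 0, 1.0)))  # drag
```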
The recording medium 207 records image data output from the image capture unit 202. The recording medium 207 may be configured to be attachable to or detachable from the mobile phone 200, or may be built into the mobile phone 200. The mobile phone 200 only needs to have at least a method for accessing the recording medium 207.
A communication unit 211 is an interface for wirelessly connecting to and communicating with the digital camera 100. The mobile phone 200 can transmit or receive data to or from the digital camera 100 via the communication unit 211. The communication unit 211 includes an antenna, and the control unit 201 can connect to the digital camera 100 via the antenna.
The connection to the digital camera 100 may be a direct connection or a connection via an access point. As a protocol for data communication, for example, Picture Transfer Protocol over Internet Protocol (PTP/IP) via a wireless LAN can be used. The communication with the digital camera 100 is not limited thereto. For example, the communication unit 211 can include a wireless communication module such as an infrared communication module, a Bluetooth® communication module, or a Wireless Universal Serial Bus (USB) communication module.
A near-field wireless communication unit 212 includes, for example, an antenna for wireless communication, a modulation/demodulation circuit for processing a wireless signal, and a communication controller. The near-field wireless communication unit 212 outputs a modulated wireless signal from the antenna and demodulates a wireless signal received by the antenna, thereby implementing near-field wireless communication compliant with the IEEE 802.15 standard (i.e., Bluetooth®). In the exemplary embodiment, similarly to the configuration of the digital camera 100, Bluetooth® communication employs Bluetooth® Low Energy, which features low power consumption.
In near-field wireless communication with the digital camera 100, the mobile phone 200 first needs to connect to the near-field wireless communication unit 112 of the digital camera 100 by performing a pairing operation, which is an operation for establishing a one-to-one connection in near-field wireless communication. In the pairing operation, the digital camera 100 operates as a peripheral device in Bluetooth® Low Energy and periodically transmits a signal called an Advertise signal for notifying other devices about the presence of the digital camera 100, using the near-field wireless communication unit 112.
Then, the mobile phone 200 operates as a central device and performs a scan operation using the near-field wireless communication unit 212. Accordingly, the mobile phone 200 receives the Advertise signal from the digital camera 100, thereby finding the digital camera 100. When finding the digital camera 100, the mobile phone 200 performs an initiate operation to make a participation request, thereby establishing a connection for near-field wireless communication.
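For illustration, the advertise, scan, and initiate sequence described above can be sketched in plain Python as follows. The sketch simulates the two roles without calling a real Bluetooth® stack, and the class and method names are assumptions.

```python
from typing import Dict, List, Optional

class Peripheral:
    """Plays the role of the digital camera 100, which periodically advertises itself."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.connected_to: Optional[str] = None

    def advertise(self) -> Dict[str, str]:
        # The Advertise signal notifies nearby devices of this device's presence.
        return {"type": "ADV", "name": self.name}

    def accept(self, central_name: str) -> bool:
        # Accept the participation request (initiate) from the central device.
        self.connected_to = central_name
        return True

class Central:
    """Plays the role of the mobile phone 200, which scans and initiates a connection."""

    def __init__(self, name: str) -> None:
        self.name = name

    def scan(self, signals: List[Dict[str, str]]) -> List[str]:
        # The scan operation finds peripherals from their Advertise signals.
        return [s["name"] for s in signals if s.get("type") == "ADV"]

    def initiate(self, peripheral: Peripheral) -> bool:
        # The initiate operation makes a participation request to the peripheral.
        return peripheral.accept(self.name)

if __name__ == "__main__":
    camera = Peripheral("digital camera 100")
    phone = Central("mobile phone 200")
    found = phone.scan([camera.advertise()])
    if camera.name in found and phone.initiate(camera):
        print("near-field wireless connection established with", camera.connected_to)
```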
A public network communication unit 213 is an interface used to perform public wireless communication. The mobile phone 200 can make a telephone call with another device via the public network communication unit 213. At this time, the control unit 201 implements the telephone call by inputting and outputting sound signals via a microphone 214 and a loudspeaker 215. The public network communication unit 213 includes an antenna, and the control unit 201 can connect to a public network via the antenna. A single antenna can be shared by the communication unit 211 and the public network communication unit 213.
Next, an external appearance of the mobile phone 200 will be described.
In the mobile phone 200, a standard camera application is installed on the OS that controls the mobile phone 200. The camera application can automatically detect a Quick Response (QR) code (registered trademark) in a captured subject image, analyze the QR code, and obtain a character string. The mobile phone 200 can recognize, from the character string, Wi-Fi network information or a Uniform Resource Locator (URL), which is information for accessing data.
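For illustration, the following minimal Python sketch distinguishes Wi-Fi network information from a URL in a decoded character string. The "WIFI:" encoding convention used here is an assumption for the sketch, since the exemplary embodiment does not specify the format of the character string.

```python
from urllib.parse import urlparse

def parse_qr_payload(text: str) -> dict:
    """Classify a character string decoded from a QR code.

    The "WIFI:S:<ssid>;T:<auth>;P:<password>;;" convention is an assumption
    used for this sketch; the exemplary embodiment only states that the
    character string carries Wi-Fi network information or a URL.
    """
    if text.startswith("WIFI:"):
        fields = {}
        for part in text[len("WIFI:"):].split(";"):
            key, sep, value = part.partition(":")
            if sep:
                fields[key] = value
        return {"kind": "wifi", "ssid": fields.get("S"), "password": fields.get("P")}
    parsed = urlparse(text)
    if parsed.scheme in ("http", "https"):
        return {"kind": "url", "url": text}
    return {"kind": "unknown", "raw": text}

if __name__ == "__main__":
    print(parse_qr_payload("WIFI:S:camera-ap;T:WPA;P:12345678;;"))
    print(parse_qr_payload("https://example.com/album"))
```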
The above is the description of the mobile phone 200.
The digital camera 100 has two modes, namely a “mission mode” and a “collection mode”. The mobile phone 200 has an application (hereinafter referred to as a communication application) that manages the “mission mode” and the “collection mode”, and communicates with the digital camera 100 operating in the “mission mode” or the “collection mode” to transmit or receive information to or from the digital camera 100.
An overview of the “mission mode” and the “collection mode” will be described next.
The “mission mode” is a mode for capturing images using the digital camera 100 based on an image capturing mission transmitted from the mobile phone 200. This assumes a use case where a parent uses the mobile phone 200 to transmit an image capturing mission, which the parent wishes to assign to a child, to the digital camera 100, and the child operates the digital camera 100 to capture images and complete the image capturing mission. More specifically, the mission mode is performed by the following procedure. In the following description, an “image capturing mission” will be referred to simply as a “mission”.
(1) The parent operates the mobile phone 200 to select a mission that the parent wishes to assign to the child and then transmit the mission to the digital camera 100.
(2) The child operates the digital camera 100 to capture images based on the received mission. A single mission includes one or more tasks, and each task gives an instruction to capture a still image.
(3) When image capturing is completed for all the tasks, the digital camera 100 enters a state where it can submit the mission data to the mobile phone 200. In this state, the child operates the digital camera 100 to submit the mission data to the mobile phone 200. More specifically, the mission data is submitted by transmitting the captured image obtained for each task.
(4) The parent evaluates the submitted mission data. More specifically, the parent uses the mobile phone 200 to check and evaluate the captured image for each task (on a pass/fail basis, for example). If the captured image is in accordance with the content of the task, the parent evaluates the task as “pass”.
(5) The evaluation is fed back from the mobile phone 200 to the digital camera 100. If all the tasks are evaluated as “pass”, the digital camera 100 treats the mission as completed. The digital camera 100 manages mission completion points, and every time a mission is completed, the digital camera 100 increments the mission completion points by one.
(6) When the mission is completed, the digital camera 100 can receive a present as a reward from the mobile phone 200. The reception of the reward will be described below.
Specific examples of the mission include the following. Each mission prompts the user to capture a still image or a moving image based on a theme. As described above, a single mission includes one to about six tasks, and the tasks are related to each other.
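For illustration, a mission, its tasks, the pass/fail evaluation, and the mission completion points described above might be modeled as in the following Python sketch. The class and field names and the sample mission are assumptions and are not part of the exemplary embodiment.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Task:
    """One task of a mission: an instruction to capture a single still image."""
    instruction: str
    captured_image: Optional[str] = None   # path of the captured image, if any
    evaluation: Optional[str] = None       # "pass" or "fail", set by the evaluator

@dataclass
class Mission:
    """A mission holds one to about six related tasks."""
    title: str
    tasks: List[Task] = field(default_factory=list)

    def ready_to_submit(self) -> bool:
        # The mission data can be submitted once every task has a captured image.
        return bool(self.tasks) and all(t.captured_image is not None for t in self.tasks)

    def completed(self) -> bool:
        # The mission is treated as completed when every task is evaluated as "pass".
        return bool(self.tasks) and all(t.evaluation == "pass" for t in self.tasks)

class MissionManager:
    """Camera-side bookkeeping of mission completion points."""

    def __init__(self) -> None:
        self.completion_points = 0

    def apply_evaluation(self, mission: Mission, results: List[str]) -> None:
        for task, result in zip(mission.tasks, results):
            task.evaluation = result
        if mission.completed():
            self.completion_points += 1   # one point added per completed mission

if __name__ == "__main__":
    mission = Mission("sample mission", [Task("capture subject A"), Task("capture subject B")])
    mission.tasks[0].captured_image = "DCIM/100CANON/IMG_0001.JPG"
    mission.tasks[1].captured_image = "DCIM/100CANON/IMG_0002.JPG"
    manager = MissionManager()
    if mission.ready_to_submit():
        manager.apply_evaluation(mission, ["pass", "pass"])
    print(mission.completed(), manager.completion_points)   # True 1
```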
Next, an overview of the “collection mode” will be described.
The “collection mode” is a mode for repeating image capturing under a theme set by the user, without a predetermined number of images. In other words, in the “collection mode”, unlike the “mission mode”, an image capturing target is not designated. More specifically, the “collection mode” is a mode in which the child voluntarily sets the theme of a subject and captures images under the theme. Also in the “collection mode”, the parent can use the mobile phone 200 to check an image captured by the child.
In either mode, the parent and the child can communicate with each other through image capturing, and the digital camera 100 and the mobile phone 200 function as communication tools. By actually capturing images using the digital camera 100, the child can not only master camera operation, but also acquire knowledge of their own and cultivate sensitivity through the image capturing experience.
With reference to
In step S401, when the control unit 201 reads the communication application from the non-volatile memory 203 and executes the application, a startup screen 501 is displayed on the display unit 206.
In step S402, the control unit 201 determines whether the communication application is started for the first time. If the communication application is started for the first time (YES in step S402), the process of
In step S403, the control unit 201 displays on the display unit 206 an avatar information input screen 511 as a screen for inputting information regarding the user of the digital camera 100 (hereinafter referred to as “avatar information”).
In step S404, the control unit 201 displays on the display unit 206 a pairing setting screen 521 as a screen for setting pairing with the digital camera 100. When pairing with the digital camera 100 is set, the mobile phone 200 communicates with the digital camera 100 via the communication unit 211 and performs the pairing.
In step S405, the control unit 201 performs an application home screen process (described below).
With reference to
In step S601, the control unit 201 displays an application home screen 701 on the display unit 206.
In step S602, the control unit 201 determines whether the submission of mission data is received from the digital camera 100. If mission data is submitted (YES in step S602), the process of
In step S603, the control unit 201 displays on the display unit 206 a notification indicating that mission data has been submitted. The display unit 206 displays a submitted mission data notification 702 in a superimposed manner on the application home screen 701. Alternatively, the notification may be given by another method such as a sound.
In step S604, the control unit 201 determines whether the mission tab button 703 is selected. If the mission tab button 703 is selected (YES in step S604), the process of
In step S605, the control unit 201 displays on the display unit 206 a mission management screen 711 for managing missions for the digital camera 100.
In step S606, the control unit 201 determines whether an operation for adding a mission is performed on the operation unit 205. If the operation is performed (YES in step S606), the process of
In step S607, the control unit 201 displays a mission search screen 721 on the display unit 206.
In step S608, the control unit 201 displays a mission details screen 731 indicating the details of a mission selected by the user.
In step S609, the control unit 201 adds, to a slot, the mission to be transmitted to the digital camera 100.
In step S610, the control unit 201 transmits the mission 742 added to the slot to the digital camera 100 via the communication unit 211.
In step S611, the control unit 201 determines whether a mission evaluation operation is performed on the operation unit 205. If the mission evaluation operation is performed (YES in step S611), the process of
In step S612, the control unit 201 performs a mission evaluation process for evaluating the mission data submitted from the digital camera 100. The mission evaluation process will be described below.
In step S613, the control unit 201 transmits the evaluation results of the mission to the digital camera 100.
In step S614, the control unit 201 determines whether an operation for returning to the home screen is performed on the operation unit 205. If the operation for returning to the home screen is performed (YES in step S614), the process of
In step S615, the control unit 201 determines whether another operation is performed on the home screen. If another operation is performed (YES in step S615), the process of
In step S616, the control unit 201 performs another process. A move to another tab is made in this step.
In step S617, the control unit 201 determines whether an end operation is performed. If the end operation is performed (YES in step S617), the process of
With reference to
In step S801, the control unit 201 displays on the display unit 206 a submitted mission data details screen 901 indicating the details of the mission data submitted from the digital camera 100.
In step S802, the control unit 201 determines whether the evaluate button 903 is operated. If the evaluate button 903 is selected (YES in step S802), the process of
In step S803, the control unit 201 displays one of the images 902 captured for the selected mission in an enlarged manner for evaluation.
In step S804, the control unit 201 gives the captured image 912 an evaluation corresponding to the pressed one of the buttons 913.
In step S805, the control unit 201 determines whether all the images included in the images 902 are evaluated. If all the images are evaluated (YES in step S805), the process of
In step S806, the control unit 201 displays an evaluation result transmission screen 921.
In step S807, the control unit 201 determines whether the evaluation result transmission button 922 is operated. If the evaluation result transmission button 922 is operated (YES in step S807), the control unit 201 transmits the evaluation results. Then, the process of
In step S808, the control unit 201 determines whether another operation is performed. If another operation is performed (YES in step S808), the process of
In step S809, the control unit 201 performs another process.
In step S810, the control unit 201 determines whether an end operation is performed. If the end operation is performed (YES in step S810), the process of
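For illustration, the evaluation loop of steps S803 to S807 might be sketched in Python as follows. The function and parameter names are assumptions, and the judge callable stands in for the parent's pass/fail button operations.

```python
from typing import Callable, Dict, List, Optional, Tuple

def evaluate_mission_data(
    images: List[str],
    judge: Callable[[str], Optional[str]],
) -> Tuple[Dict[str, str], bool]:
    """Collect a pass/fail evaluation for each submitted image.

    `judge` stands in for the parent viewing an enlarged image and pressing a
    pass or fail button (step S804); it returns "pass", "fail", or None if the
    image has not been evaluated yet.
    """
    results: Dict[str, str] = {}
    for image in images:
        verdict = judge(image)
        if verdict in ("pass", "fail"):
            results[image] = verdict
    # Step S805: the evaluation results become transmittable (steps S806-S807)
    # only after every submitted image has been evaluated.
    all_evaluated = len(results) == len(images)
    return results, all_evaluated

if __name__ == "__main__":
    submitted = ["IMG_0001.JPG", "IMG_0002.JPG"]
    results, ready = evaluate_mission_data(submitted, judge=lambda img: "pass")
    print(results, ready)   # both images pass; ready to transmit the results
```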
With reference to
In step S1001, the control unit 101 displays the startup screen 1101 on the display unit 106.
In step S1002, the control unit 101 determines whether a mode dial included in the operation unit 105 is at a mission mode position. If the mode dial is at the mission mode position (YES in step S1002), the process of
In step S1003, the control unit 101 performs a mission mode process (described below).
In step S1004, the control unit 101 determines whether the mode dial included in the operation unit 105 is at a collection mode position. If the mode dial is at the collection mode position (YES in step S1004), the process of
In step S1005, the control unit 101 performs a collection mode process (described below).
In step S1006, the control unit 101 determines whether another operation is performed. If another operation is performed (YES in step S1006), the process of
In step S1007, the control unit 101 performs another process. For example, the control unit 101 can perform an operation such as a normal image capturing operation not related to the mission mode or the collection mode. In a case where a normal image capturing operation is performed, a setting for applying an image capturing frame obtained as a reward (present) may be enabled.
In step S1008, the control unit 101 determines whether an end operation is performed. If the end operation is performed (YES in step S1008), the process of
With reference to
In step S1201, the control unit 101 displays on the display unit 106 a mission list screen 1301 indicating a list of missions installed on the digital camera 100.
In step S1202, the control unit 101 determines whether a mission status display operation is performed using the operation unit 105. If the operation is performed (YES in step S1202), the process of
In step S1203, the control unit 101 performs a status display process (described below).
In step S1204, the control unit 101 determines whether a mission selection operation is performed using the operation unit 105. If the operation is performed (YES in step S1204), the process of
In step S1205, the control unit 101 displays a task list screen 1311 on the display unit 106.
In step S1206, the control unit 101 determines whether the return button 1314 is operated. If the return button 1314 is operated (YES in step S1206), the process of
In step S1207, the control unit 101 determines whether the select button 1315 is operated. If the select button 1315 is operated (YES in step S1207), the process of
In step S1208, the control unit 101 displays an image capturing screen 1321 on the display unit 106.
In step S1209, the control unit 101 performs display for the user to confirm the image captured in step S1208.
In step S1210, the control unit 101 determines whether image capturing is completed for all the tasks. If image capturing is completed (YES in step S1210), the process of
In step S1211, the control unit 101 determines whether the submit button 1353 is operated. If the submit button 1353 is operated (YES in step S1211), the process of
In step S1212, the control unit 101 transmits mission data including the captured images for the respective tasks to the mobile phone 200 via the communication unit 111, and the process of
In step S1213, the control unit 101 determines whether another operation is performed. If another operation is performed (YES in step S1213), the process of
In step S1214, the control unit 101 performs another process.
In step S1215, the control unit 101 determines whether an end operation is performed. If the end operation is performed (YES in step S1215), the process of
With reference to
In step S1401, the control unit 101 displays on the display unit 106 a status screen 1501 indicating the status of the user of the digital camera 100.
In step S1402, the control unit 101 determines whether completed missions are selected on the status screen 1501 using the operation unit 105. If the completed missions are selected (YES in step S1402), the process of
In step S1403, the control unit 101 displays a completed mission screen 1511.
In step S1404, the control unit 101 determines whether an operation corresponding to the view button 1514 is performed. If the operation is performed (YES in step S1404), the process of
In step S1405, the control unit 101 displays a completed mission data reproduction screen 1521.
In step S1406, the control unit 101 determines whether an operation corresponding to the return button 1513 is performed. If the operation is performed (YES in step S1406), the process of
In step S1407, the control unit 101 determines whether an operation for displaying a list of presents, which are acquired by the user as rewards for the accomplishment of missions, is performed. If the operation is performed (YES in step S1407), the process of
In step S1408, the control unit 101 displays a present list screen 1531.
In step S1409, the control unit 101 determines whether an operation corresponding to the view button 1534 is performed. If the operation is performed (YES in step S1409), the process of
In step S1410, the control unit 101 displays a present details screen 1541 on the display unit 106.
In step S1411, the control unit 101 determines whether an operation corresponding to the return button 1533 is performed. If the operation is performed (YES in step S1411), the process of
In step S1412, the control unit 101 determines whether another operation is performed. If another operation is performed (YES in step S1412), the process of
In step S1413, the control unit 101 performs another process.
In step S1414, the control unit 101 determines whether an end operation is performed. If the end operation is performed (YES in step S1414), the process of
With reference to
In step S1601, the control unit 101 displays a collection list screen 1701 on the display unit 106.
In step S1602, the control unit 101 determines whether an operation for selecting a slot where a theme is not set is performed using the operation unit 105. If the operation is performed (YES in step S1602), the process of
In step S1603, the control unit 101 displays a collection theme selection screen 1711 as a screen for selecting the theme of a collection.
If an operation for selecting a theme is performed on the collection theme selection screen 1711, then as illustrated in
After step S1603, the process of
In step S1604, the control unit 101 determines whether an operation for selecting a slot where a theme is set is performed using the operation unit 105. If the operation is performed (YES in step S1604), the process of
In step S1605, the control unit 101 displays a collection screen 1731 on the display unit 106.
In step S1606, the control unit 101 determines whether an operation corresponding to the return button 1734 is performed. If the operation is performed (YES in step S1606), the process of
In step S1607, if an operation corresponding to the select button 1735 is performed, the process of
In step S1608, the control unit 101 displays an image capturing screen 1741 on the display unit 106.
In step S1609, the control unit 101 performs display for the user to confirm the captured image.
In step S1610, the control unit 101 determines whether another operation is performed. If the operation is performed (YES in step S1610), the process of
If the view button 1763 is operated in the state where the collection screen 1761 is displayed, the control unit 101 displays a view screen 1771 illustrated in
In step S1611, the control unit 101 performs another process.
In step S1612, the control unit 101 determines whether an end operation is performed. If the end operation is performed (YES in step S1612), the process of
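For illustration, the collection slots used in this process might be modeled as in the following Python sketch. The number of slots and the sample theme are assumptions and are not part of the exemplary embodiment.

```python
from typing import Dict, List, Optional

class CollectionSlots:
    """Camera-side sketch of collection slots: each slot may hold a user-set theme
    and an open-ended list of images captured under that theme."""

    def __init__(self, slot_count: int = 3) -> None:
        self.themes: List[Optional[str]] = [None] * slot_count
        self.images: Dict[int, List[str]] = {i: [] for i in range(slot_count)}

    def set_theme(self, slot: int, theme: str) -> None:
        # Corresponds to selecting a theme for a slot where no theme is set yet.
        if self.themes[slot] is not None:
            raise ValueError("a theme is already set for this slot")
        self.themes[slot] = theme

    def add_image(self, slot: int, image_path: str) -> None:
        # Unlike the mission mode, the number of images per theme is not fixed.
        if self.themes[slot] is None:
            raise ValueError("set a theme before capturing images for this slot")
        self.images[slot].append(image_path)

if __name__ == "__main__":
    slots = CollectionSlots()
    slots.set_theme(0, "flowers")                    # sample theme (assumption)
    slots.add_image(0, "DCIM/100CANON/IMG_0003.JPG")
    slots.add_image(0, "DCIM/100CANON/IMG_0004.JPG")
    print(slots.themes[0], len(slots.images[0]))     # flowers 2
```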
While the description has been given by using the example where the digital camera 100 and the mobile phone 200 are connected to each other through wireless communication, the digital camera 100 and the mobile phone 200 may be connected to each other through wired communication. Alternatively, the digital camera 100 and the mobile phone 200 may be connected to each other via another communication apparatus or another communication network.
The control unit 101 of the digital camera 100 may be implemented by a single piece of hardware, or may be implemented by a plurality of hardware components sharing the processing. The control unit 201 of the mobile phone 200 may likewise be implemented by a single piece of hardware, or by a plurality of hardware components sharing the processing.
Exemplary embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, or the like.
While aspects of the disclosure are described with reference to exemplary embodiments, it is to be understood that the aspects of the disclosure are not limited to the exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures.
This application claims the benefit of Japanese Patent Application No. 2020-028055, filed Feb. 21, 2020, which is hereby incorporated by reference herein in its entirety.