The present disclosure relates to a technique that manages content uploaded by a system user in a server and charges a fee to the system user depending on a usage state of the content.
Conventionally, content providing systems that provide content such as games and educational materials to users via the internet have been available. Such a system is typically designed to encourage the user to continuously use the system by frequently updating the content.
In the area of image capturing, because a user of an image capturing apparatus such as a digital camera has to personally look for an object as an image capturing target, it can be difficult for the user to maintain enthusiasm for capturing images. As one solution, there is a method that introduces a gaming element, prompting the user to continuously use the camera by providing assignment content relating to image capturing (hereinafter called an "image capturing mission") to the user from the camera.
In the system using the above-described method, a business model that makes a profit from an advertising effect of the content (image capturing mission) in addition to making a profit from distribution of image capturing apparatuses can be considered. For example, by distributing an image capturing mission that can only be achieved by capturing an image at a specific location, an effect of attracting customers to that specific location can be expected. For example, if a business partner (alliance partner) that manages an amusement park uses the system in order to acquire an advertising effect, a system usage fee can be collected from the alliance partner.
Generally, in the above-described system, such a business model will not work unless an appropriate amount of system usage fee is charged to the alliance partner. Therefore, it is important to set a system usage fee satisfactory to both the alliance partner (system user) and the system provider.
Japanese Patent Application Laid-Open No. 2001-216416 discusses a technique of increasing an advertising/promotion effect by providing a participatory game that makes a user enthusiastically browse an advertising page displayed on an advertising site of the internet. Specifically, according to the technique discussed in the above document, the advertising cost charged to the alliance partner is determined by calculating the advertising effect depending on the number of participants of the game.
Because “the number of participants of the game” in the business model discussed in Japanese Patent Application Laid-Open No. 2001-216416 merely corresponds to “the number of users who have installed the image capturing mission” in the above-described image capturing system, it is uncertain whether the advertising/promotion effect can be increased by simply installing the image capturing mission. Therefore, if the system usage fee is determined depending on the number of participants of the game as discussed in Japanese Patent Application Laid-Open No. 2001-216416, there is a possibility that the alliance partner (system user) cannot experience a sense of satisfaction.
According to an aspect of the present disclosure, a charging system includes a server configured to manage an image capturing mission uploaded by a user, a communication unit configured to communicate with a communication terminal having an image capturing function, an acquisition unit configured to acquire information indicating that an image capturing mission downloaded by the communication terminal is achieved by the communication terminal, and a calculation unit configured to calculate a charging amount charged to a user who has uploaded the image capturing mission based on the number of image capturing missions achieved by the communication terminal, which is determined based on the acquired information.
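As a non-authoritative illustration only, the units named above might be modeled in software roughly as follows. Every class, method, and value in this Python sketch is a hypothetical assumption for explanation and is not part of the disclosure.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: the "server" manages uploaded missions, the
# "acquisition unit" records achievement reports from terminals, and the
# "calculation unit" derives a charging amount from the achievement count.

@dataclass
class Mission:
    mission_id: str
    uploader_id: str                                # the system user (alliance partner)
    achieved_by: set = field(default_factory=set)   # IDs of terminals that achieved it

class ChargingSystem:
    def __init__(self, fee_per_achievement: float = 100.0):  # assumed unit price
        self.missions = {}
        self.fee_per_achievement = fee_per_achievement

    def register_mission(self, mission: Mission) -> None:
        self.missions[mission.mission_id] = mission

    def record_achievement(self, mission_id: str, terminal_id: str) -> None:
        # "Acquisition unit": a terminal reports that it achieved the mission.
        self.missions[mission_id].achieved_by.add(terminal_id)

    def calculate_charge(self, uploader_id: str) -> float:
        # "Calculation unit": the charge grows with the number of achievements.
        achievements = sum(len(m.achieved_by) for m in self.missions.values()
                           if m.uploader_id == uploader_id)
        return achievements * self.fee_per_achievement
```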
Further features will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
An exemplary embodiment will be described below in detail with reference to the appended drawings.
The exemplary embodiment described below is merely an example for implementing the present disclosure, and can be modified or changed as appropriate depending on a configuration or various conditions of an apparatus to which the present disclosure is applied.
A camera (also called a "digital camera") 100 includes a display unit 28 for displaying a captured image and information about various settings relating to image capturing operation. The display unit 28 includes a rear face display panel 28a and an electronic viewfinder 28b, and display is switched between them depending on the operation state.
The camera 100 includes various operation units. A shutter button 61 arranged on an upper face of the camera 100 is an operation unit for receiving an image capturing instruction. A mode shifting switch 60 arranged on a rear face thereof is an operation unit for shifting an image capturing mode. An operation unit 70 includes operation members such as various switches, buttons, and a touch panel for receiving various types of operation from a user. A controller wheel 73 included in the operation unit 70 is an operation member that can be operated rotationally.
A power switch 72 arranged on the upper face of the camera 100 is a push button for switching the power of the camera 100 between ON and OFF. A connection cable 111 for connecting the camera 100 to an external apparatus such as a personal computer or a printer is attached to a connector 112 arranged on a side face of the camera 100.
A recording medium slot 201 for storing a recording medium 200 such as a memory card or a hard disk is arranged on a lower face of the camera 100. When the recording medium 200 is stored in the recording medium slot 201, the recording medium 200 can communicate with the camera 100, so that an image can be recorded in the recording medium 200, and an image recorded in the recording medium 200 can be reproduced by the camera 100. A cover 202 covers the recording medium slot 201.
A lens barrel 300 is arranged on a front face of the camera 100, and a part of the operation unit 70 is arranged on a side face of the lens barrel 300. The user can operate the camera 100 by using the operation unit 70 arranged on the side face of the lens barrel 300.
In
An image processing unit 24 executes resizing processing and color conversion processing such as pixel interpolation and reduction on data output from the A/D converter 23 and the memory control unit 15. Predetermined calculation processing is executed by the image processing unit 24 by using captured image data, and exposure processing and range finding control are executed by a system control unit 50 based on the acquired calculation result. With this configuration, autofocus (AF) processing, autoexposure (AE) processing, and electronic flash (EF) pre-emission processing using a through-the-lens (TTL) method are executed. The image processing unit 24 also executes predetermined calculation processing by using the captured image data, and executes auto-white balance (AWB) processing using the TTL method based on the acquired calculation result.
Data output from the A/D converter 23 is written into a memory 32 via the image processing unit 24 and the memory control unit 15, or via the memory control unit 15. The memory 32 stores image data that is acquired by the image capturing unit 22 and converted into digital data by the A/D converter 23 and image data that is to be displayed on the display unit 28. The memory 32 has a storage capacity sufficient for storing a predetermined number of still images and a predetermined period of a moving image and audio data.
The memory 32 also serves as an image-display memory (video memory). A digital-to-analog (D/A) converter 13 converts image-display data stored in the memory 32 into an analog signal and supplies the analog signal to the display unit 28. In this way, image data used for displaying an image, which is written into the memory 32, is displayed on the display unit 28 via the D/A converter 13. In addition, the digital signals converted by the A/D converter 23 and accumulated in the memory 32 are converted into analog signals by the D/A converter 13 and sequentially transferred to and displayed on the display unit 28, so that the display unit 28 can execute live-view display.
A non-volatile memory 56 is a memory capable of electrically recording and deleting data. A memory such as an electrically erasable programmable read-only memory (EEPROM) is used as the non-volatile memory 56. The non-volatile memory 56 stores a constant number for operating the system control unit 50 and a program. Herein, the program includes a computer program for executing various flowcharts described below.
The system control unit 50 controls the camera 100. The system control unit 50 executes the program stored in the non-volatile memory 56 to implement respective pieces of processing described below. The system memory 52 is a random access memory (RAM) used for loading a constant and a variable for operating the system control unit 50, and a program read from the non-volatile memory 56. The system control unit 50 can execute display control by controlling the memory 32, the D/A converter 13, and the display unit 28.
A system timer 53 is a timer unit that measures time used for various types of control and time of a built-in clock.
The mode shifting switch 60, the shutter button 61, and the operation unit 70 are operation units for inputting various operation instructions to the system control unit 50. By operating the mode shifting switch 60, the user can shift the operation mode of the system control unit 50 to any one of a still image recording mode, a moving image capturing mode, or a reproduction mode.
A first shutter switch 62 is turned ON and generates a first shutter switch signal SW1 when the user inputs an image capturing preparation instruction by halfway pressing the shutter button 61 arranged on the digital camera 100. The system control unit 50 starts executing the operation for the AF processing, the AE processing, the AWB processing, and the EF pre-emission processing when the first shutter switch signal SW1 is input thereto.
A second shutter switch 64 is turned ON and generates a second shutter switch signal SW2 when the user inputs an image capturing instruction by fully pressing the shutter button 61. The system control unit 50 starts executing the operation for a series of image capturing processing including processing for reading a signal from the image capturing unit 22 and writing image data to the recording medium 200 when the second shutter switch signal SW2 is input thereto.
A power control unit 80 includes a battery detection circuit, a direct current-to-direct current (DC-DC) converter, and a switching circuit for switching a block to be energized, and detects the presence or absence of a mounted battery, the type of battery, and the remaining battery level. Based on the detection result and an instruction from the system control unit 50, the power control unit 80 controls the DC-DC converter to supply required voltage to the respective units including the recording medium 200 for a necessary period.
A power supply unit 30 includes a primary battery such as an alkaline battery or a lithium battery, or a secondary battery such as a nickel-cadmium (NiCd) battery, a nickel-metal hydride (NiMH) battery, or a lithium-ion (Li) battery, and an alternating-current (AC) adapter. A recording medium interface (I/F) 18 serves as an interface between the camera 100 and the recording medium 200. The recording medium 200 is a recording medium such as a memory card for recording a captured image, configured of a semiconductor memory, an optical disk, or a magnetic disk.
A communication unit 54 is a communication interface for executing wireless or wired communication with an external apparatus. The communication unit 54 transmits and receives a video signal and an audio signal to/from the external apparatus. The communication unit 54 can also connect to a wireless local area network (LAN) and the internet. The communication unit 54 can transmit an image (including a live view image) captured by the image capturing unit 22 and an image recorded in the recording medium 200 to the external apparatus and receive image data and various types of information from the external apparatus.
An orientation detection unit 55 detects the orientation of the digital camera 100 in the gravitational direction. It is possible to determine whether the image captured by the image capturing unit 22 is an image captured by the digital camera 100 held in a horizontal orientation or a vertical orientation based on the orientation detected by the orientation detection unit 55. The system control unit 50 can attach direction information based on the orientation detected by the orientation detection unit 55 to an image file of the image captured by the image capturing unit 22 to rotate and record the image. An acceleration sensor or a gyroscope can be used as the orientation detection unit 55.
An eyepiece detection unit 57 detects an approaching eye (object) of the photographer. Depending on the state detected by the eyepiece detection unit 57, the system control unit 50 switches display and non-display of the rear face display panel 28a and the electronic viewfinder 28b.
The camera 100 can include a global positioning system (GPS) device (not illustrated) for acquiring a position of the camera 100. Attribute information such as the user's age can be input via the above-described operation members and stored in the camera 100. By using this information when an external device such as the server apparatus described below distributes assignment content or when a captured image is output, content appropriate for the user can be distributed.
The digital camera 100 provides a mission mode as one of the image capturing modes. When a camera user selects the mission mode by operating a mode selection dial, an image capturing mission is displayed on a display screen of the digital camera 100.
An application of the image capturing mission (i.e., mission application) is installed in the smartphone 303. The user operates the mission application and communicates with the digital camera 100 and the mission server 302 to transmit and receive data relating to the image capturing mission.
A series of processing including the processing for creating and registering (uploading) an image capturing mission, distributing (downloading) the image capturing mission, executing and evaluating the image capturing mission, and charging a fee to a system user will be described mainly with reference to
Turning to
Turning to
In step S402, the PC 301 stores the mission data based on the operation of the system user.
In step S403, the PC 301 transmits the mission data to the mission server 302 based on the operation of the system user.
In step S404, the mission server 302 receives the mission data transmitted from the PC 301 and stores the mission data in a storage area.
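A minimal sketch of the PC-side portion of these steps is shown below; the JSON fields, the file name, and the server endpoint are illustrative assumptions and are not part of the disclosure.

```python
import json
import urllib.request

# Hypothetical sketch of steps S402-S403: the PC 301 stores mission data
# locally and then uploads it to the mission server 302.
mission_data = {
    "title": "Find the parade vehicle",
    "description": "Capture an image of a specific vehicle inside the park",
    "uploader_id": "alliance-partner-001",
}

# S402: store the mission data on the PC.
with open("mission_draft.json", "w") as f:
    json.dump(mission_data, f)

# S403: transmit the mission data to the mission server (assumed endpoint).
request = urllib.request.Request(
    "https://mission-server.example.com/missions",
    data=json.dumps(mission_data).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# S404 is performed on the server side, which receives and stores the data.
# urllib.request.urlopen(request)   # uncomment to send against a real server
```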
Returning to
Turning to
In step S406, the mission server 302 changes the data attribute of the mission to be distributed. For example, the data attribute is changed so that the smartphone 303 can confirm that the mission is released and downloadable via the mission application.
In step S407, the smartphone 303 receives a notification indicating that the attribute of the mission has been changed by the mission server 302, and displays a list of downloadable missions. The user operates the smartphone 303 to select a desired mission from among the missions displayed in the list.
In step S408, the smartphone 303 communicates with the mission server 302 to receive and store the mission data in the storage area.
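A minimal sketch of steps S406 to S408 is shown below, with an in-memory table standing in for the server's storage area; the identifiers and titles are illustrative.

```python
# Hypothetical sketch: the mission server marks a mission as released (S406),
# the smartphone lists the released missions (S407), and the selected mission
# is downloaded and stored locally (S408).
missions = {
    "m-001": {"title": "Find the parade vehicle", "released": False},
    "m-002": {"title": "Sunset at the main gate", "released": True},
}

def release(mission_id):
    # S406: change the data attribute so the mission becomes downloadable.
    missions[mission_id]["released"] = True

def list_downloadable():
    # S407: the mission application displays only released missions.
    return [mid for mid, m in missions.items() if m["released"]]

def download(mission_id):
    # S408: the smartphone receives and stores a copy of the mission data.
    return dict(missions[mission_id])

release("m-001")
print(list_downloadable())        # ['m-001', 'm-002']
local_copy = download("m-001")
```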
Returning to
Turning to
In step S410, the smartphone 303 transmits the mission to be installed to the digital camera 100.
In step S411, the digital camera 100 receives the mission from the smartphone 303.
In step S412, the digital camera 100 installs the received mission.
Turning back to
Turning to
Returning to
Herein, the mission 508 relates to a virtual amusement park called "Service Vehicle Land". When the user performs a determination operation with the cursor 509 positioned on the mission 508, the mission selection screen 504 shifts to a detail screen that displays the content of the mission.
Returning to
In step S416, the system control unit 50 of the digital camera 100 superimposes the mission icon on the captured image and records the captured image.
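A minimal sketch of such compositing is shown below, assuming the Pillow imaging library and illustrative file names; the camera's actual rendering pipeline is not disclosed.

```python
from PIL import Image  # Pillow is assumed to be installed

# Hypothetical sketch of step S416: superimpose a mission icon on the
# captured image and record the composited result.
captured = Image.open("captured.jpg").convert("RGBA")
icon = Image.open("mission_icon.png").convert("RGBA")

# Place the icon in the lower-right corner with a small margin.
x = captured.width - icon.width - 16
y = captured.height - icon.height - 16
captured.paste(icon, (x, y), icon)   # the icon's alpha channel is used as a mask

# JPEG has no alpha channel, so convert back to RGB before recording.
captured.convert("RGB").save("captured_with_icon.jpg")
```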
Turning back to
Turning to
In step S418, as described above, the digital camera 100 transmits the application for approval based on the request content to the smartphone 303. The application for approval can be executed when the user touches any one of the pieces of request content 602 to 604 displayed on the screen in
In step S419, the smartphone 303 receives the application for approval.
In step S420, the smartphone 303 notifies the user of receipt of the application for approval by displaying a notification on the screen.
Returning to
Turning to
In step S422, based on the user operation, the smartphone 303 displays a list of images recorded in the digital camera 100 associated with the smartphone 303 via the mission application.
In step S423, based on the user operation, the smartphone 303 evaluates whether the image for which approval has been requested achieves the mission to a degree that satisfies a predetermined standard.
In step S424, based on the user operation, the smartphone 303 transmits the evaluation result to the digital camera 100.
In step S425, the digital camera 100 displays the evaluation result.
In step S426, based on the evaluation result, the digital camera 100 updates an achievement degree of each mission managed by the digital camera 100.
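A minimal sketch of how the evaluation result might be reflected in the achievement degree managed by the camera is shown below; the data layout and the threshold are assumptions.

```python
# Hypothetical sketch of steps S423-S426: the evaluation result is produced on
# the smartphone side and applied to the achievement degree held by the camera.
achievement = {"m-001": 0}     # per-mission achievement degree on the camera
REQUIRED_CAPTURES = 1          # this example assumes a single-capture mission

def evaluate(image_meets_standard):
    # S423: judge whether the submitted image satisfies the predetermined standard.
    return {"mission_id": "m-001", "approved": image_meets_standard}

def apply_evaluation(result):
    # S425/S426: (display omitted) update the achievement degree if approved.
    if result["approved"]:
        achievement[result["mission_id"]] += 1

apply_evaluation(evaluate(True))
mission_achieved = achievement["m-001"] >= REQUIRED_CAPTURES   # True
```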
After the user executes evaluation via the smartphone 303, in step S316, the mission application transmits information about the achievement degree of the mission to the mission server 302. As a result, the mission server 302 shares an achievement state of each mission with the smartphone 303. The processing for acquiring the achievement state executed in step S316 will be described in detail with reference to
Turning to
In step S428, the smartphone 303 transmits information about the achievement degree of each mission to the mission server 302.
In step S429, the mission server 302 stores the received achievement degree of the mission.
In step S430, the mission server 302 calculates the amount of the usage fee charged to the user who has created the mission, depending on the number of achievements of the mission. For each mission, the mission server 302 identifies the terminals that have downloaded the mission and acquires the achievement state of the mission from those terminals. Accordingly, the mission server 302 functions as a charging system that calculates a system usage fee depending on the number of achievements of the mission. For example, the mission server 302 sets the usage fee higher as the number of achievements of the mission becomes greater.
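The disclosure states only that the fee grows with the number of achievements; the following sketch therefore assumes a simple fixed unit price per achievement, and the price and names are illustrative.

```python
# Hypothetical sketch of step S430: charge a usage fee that increases
# monotonically with the number of achieved missions.
def usage_fee(num_achievements, unit_price=100.0):
    return num_achievements * unit_price

# Example: the server has recorded three terminals that reported achievement.
achieved_terminals = {"cam-01", "cam-07", "cam-12"}
fee = usage_fee(len(achieved_terminals))   # 300.0
```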
As described above, when the image capturing mission is to search the Service Vehicle Land for a specific vehicle to capture an image of that vehicle, a user who has achieved the mission should be a visitor of the Service Vehicle Land. Accordingly, from a viewpoint of a system user (i.e., a business operator of the amusement park) who has created the image capturing mission, this image capturing mission has achieved a customer attracting effect proportionate to the number of achievements of the image capturing mission. Therefore, with the system usage fee that is priced depending on the number of users who have achieved the mission, the system user can experience a sense of satisfaction. In other words, for the system user who creates and registers the image capturing mission, the system usage fee is commensurate with the customer attracting effect, and thus the image capturing mission is an effective advertisement commensurate with cost.
In addition, priority of missions displayed on the smartphone 303 can be changed depending on the achievement degree of each of the missions. For example, by preferentially displaying a mission that has been achieved by only a small number of users, a customer attracting effect can be provided to a wide range of system users.
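As a minimal sketch of such priority control, assuming the list below (achievement counts are illustrative), missions achieved by fewer users can simply be sorted to the top of the display:

```python
# Hypothetical sketch: list missions achieved by fewer users first so that
# every registered mission gets exposure.
missions = [
    {"id": "m-001", "title": "Find the parade vehicle", "achievements": 120},
    {"id": "m-002", "title": "Sunset at the main gate", "achievements": 3},
    {"id": "m-003", "title": "Mascot close-up", "achievements": 41},
]

display_order = sorted(missions, key=lambda m: m["achievements"])
# -> m-002, m-003, m-001: rarely achieved missions are displayed first.
```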
The mission server 302 can also acquire position information from the smartphone 303 and distribute a mission depending on the position of the user. For example, a mission can be registered to be distributed to a user who is visiting a specific facility such as a tourist site.
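A minimal sketch of such position-dependent distribution is shown below; the facility coordinates and the distribution radius are illustrative assumptions.

```python
import math

# Hypothetical sketch: distribute a mission only to smartphones whose reported
# position lies within a given radius of the target facility.
def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    r = 6371.0  # mean Earth radius in kilometers
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

FACILITY = (35.633, 139.880)   # assumed coordinates of the facility
RADIUS_KM = 2.0                # assumed distribution radius

def should_distribute(user_lat, user_lon):
    return distance_km(user_lat, user_lon, *FACILITY) <= RADIUS_KM
```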
While the above-described mission can be completed by a single image capture, a mission may also require a plurality of image captures or moving-image capturing to be achieved. The mission server 302 can determine whether a mission is completed by a single image capture or by a plurality of image captures and change the calculation method of the charging amount depending on the determination result. Because a mission completed by a plurality of image captures has a greater advertising effect, a higher charging amount can be set for it.
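A sketch of such a determination-dependent calculation is shown below; the multiplier and the base price are illustrative assumptions rather than values from the disclosure.

```python
# Hypothetical sketch: bill a multi-capture (or moving-image) mission at a
# higher rate than a single-capture mission.
def per_achievement_price(required_captures, base_price=100.0):
    multiplier = 1.0 if required_captures <= 1 else 1.5
    return base_price * multiplier

def usage_fee(num_achievements, required_captures):
    return num_achievements * per_achievement_price(required_captures)

single_mission_fee = usage_fee(10, required_captures=1)   # 1000.0
multi_mission_fee = usage_fee(10, required_captures=5)    # 1500.0
```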
In the above-described exemplary embodiment, the user has to capture an image at a specific location in order to achieve the mission. However, a mission can instead require the user to capture an image relating to a specific product. For example, there is a case where a system user such as a company that sells a specific product registers the image capturing mission. Specifically, a food manufacturer can distribute a mission prompting a customer to capture an image of food seasoned with seasonings manufactured by the food manufacturer.
Evaluation of the achievement degree of the mission is typically executed manually by the user of the smartphone 303 by checking the submitted captured image. However, the evaluation can be executed automatically by the mission application. In this case, the smartphone 303 requests the digital camera 100 to transmit, together with the captured image, parameters associated with the image capturing operation, and executes the evaluation by using the following acquired parameters:
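The parameter list itself is not reproduced in this text. As a non-authoritative sketch, assuming that the acquired parameters include a capture position and a capture time, the mission application might evaluate the image automatically as follows; the facility bounds and opening hours are illustrative assumptions.

```python
from datetime import datetime, time

# Hypothetical sketch of automatic evaluation: the image is judged as
# achieving the mission if it was captured inside the facility during
# opening hours.
FACILITY_BOUNDS = (35.62, 35.64, 139.87, 139.89)   # assumed lat/lon bounding box
OPENING, CLOSING = time(9, 0), time(21, 0)          # assumed opening hours

def auto_evaluate(lat, lon, captured_at):
    lat_min, lat_max, lon_min, lon_max = FACILITY_BOUNDS
    inside = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
    during_hours = OPENING <= captured_at.time() <= CLOSING
    return inside and during_hours

print(auto_evaluate(35.633, 139.880, datetime(2019, 11, 28, 14, 30)))   # True
```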
As a reward for achievement of the mission, service content can be distributed to the user who has achieved the image capturing mission. For example, in the case of a mission that the user has to capture an image at a specific location such as an amusement park, a coupon that offers a discount on the admission fee of a next visit can be distributed to the smartphone 303.
In the above-described exemplary embodiment, as illustrated in
While an exemplary embodiment has been provided, this exemplary embodiment is not seen to be limiting. Many variations that do not depart from the essential spirit of the present disclosure are applicable. A configuration in which the above-described exemplary embodiments are appropriately combined is also included in the scope of the present disclosure.
The above-described exemplary embodiment is described using a digital camera. The exemplary embodiment is not limited thereto, and any communication terminal having an image capturing function, such as a tablet PC, a personal digital assistant (PDA), or a mobile phone, is applicable.
Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
According to the aspect of the present disclosure, a system usage fee can be appropriately set for a system user who has uploaded content depending on an advertising effect of the uploaded content.
While exemplary embodiments have been described, these exemplary embodiments are not seen to be limiting. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2019-215445, filed Nov. 28, 2019, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country | Kind
---|---|---|---
2019-215445 | Nov. 28, 2019 | JP | national