The present disclosure relates to an information processing device, an information processing system, an information processing method, and a program.
In recent years, with increasing health consciousness, various services for health promotion have been proposed. One of such services is an insurance product called health promotion medical insurance. In life insurance and medical insurance, an insurance premium is determined on the basis of attribute information (age, sex, address, occupation, medical history, smoking history, etc.) of an insured person, but in the health promotion insurance, the health condition and health enhancement efforts (e.g., walking) of the insured person are evaluated, and a discount of insurance premium or repayment thereof is given according to the evaluation. According to such insurance, the insured person actively and continuously promotes health enhancement, such as walking, even after purchasing an insurance policy, in order to obtain an incentive such as the discount of insurance premium or repayment thereof.
However, the creation of such a product as described above leads some insured persons to falsify the number of steps or movement distance in order to obtain fraudulent incentives, for example, by causing a pedometer or the like to measure the number of steps or movement distance by an improper method instead of measuring the number of steps or movement distance in actual walking.
Therefore, the present disclosure proposes an information processing device, an information processing system, an information processing method, and a program that are configured to prevent falsification of the number of steps or the like.
According to the present disclosure, there is provided an information processing device including: a sensing data acquisition unit that acquires a plurality of pieces of sensing data from a device worn by a user or carried by the user; a calculation unit that calculates the number of steps or a movement distance of the user based on inertial data included in the plurality of pieces of sensing data; a reliability calculation unit that calculates reliability based on a feature of each of position data, biological data, and environment data about the user obtained from the plurality of pieces of sensing data; a determination unit that determines whether to accept the number of steps or movement distance calculated, based on the reliability calculated; and an output unit that outputs data of the number of steps or movement distance received.
Furthermore, according to the present disclosure, there is provided an information processing system including: a server that calculates an incentive to a user; and an information processing device that is worn by the user or carried by the user. In the information processing system, the information processing device includes: a sensing data acquisition unit that acquires a plurality of pieces of sensing data from a device worn by the user or carried by the user; a calculation unit that calculates the number of steps or a movement distance of the user based on inertial data included in the plurality of pieces of sensing data; a reliability calculation unit that calculates reliability based on a feature of each of position data, biological data, and environment data about the user obtained from the plurality of pieces of sensing data; a determination unit that determines whether to accept the number of steps or movement distance calculated, based on the reliability calculated; an output unit that outputs, to the server, data of the number of steps or movement distance received; and a presentation unit that presents, to the user, the incentive calculated by the server based on the data of the number of steps or movement distance.
Furthermore, according to the present disclosure, there is provided an information processing method including: acquiring a plurality of pieces of sensing data from a device worn by a user or carried by the user; calculating the number of steps or a movement distance of the user based on inertial data included in the plurality of pieces of sensing data; calculating reliability based on a feature of each of position data, biological data, and environment data about the user obtained from the plurality of pieces of sensing data; determining whether to accept the number of steps or movement distance calculated, based on the reliability calculated; and outputting data of the number of steps or movement distance received, by an information processing device.
Furthermore, according to the present disclosure, there is provided a program causing a computer to perform: a function of acquiring a plurality of pieces of sensing data from a device worn by a user or carried by the user; a function of calculating the number of steps or a movement distance of the user based on inertial data included in the plurality of pieces of sensing data; a function of calculating reliability based on a feature of each of position data, biological data, and environment data about the user obtained from the plurality of pieces of sensing data; a function of determining whether to accept the number of steps or movement distance calculated, based on the reliability calculated; and a function of outputting data of the number of steps or movement distance received.
Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Note that in the present description and the drawings, component elements having substantially the same functional configurations are denoted by the same reference numerals, and redundant descriptions thereof will be omitted. Furthermore, in the present description and the drawings, a plurality of component elements having substantially the same functional configurations is distinguished by giving the same reference numerals followed by different alphabets in some cases. However, when there is no need to particularly distinguish the plurality of component elements having substantially the same or similar functional configuration, the components are denoted by the same reference numeral alone.
Note that the description will be given in the following order.
First, before the embodiments of the present disclosure are described, the background that led the present inventors to create the embodiments of the present disclosure will be briefly described.
As described above, with increasing health consciousness, various services for health promotion have been devised. One of such services is an insurance product called health promotion medical insurance. In life insurance and medical insurance, an insurance premium is determined on the basis of attribute information (age, sex, address, occupation, medical history, smoking history, etc.) of an insured person, but in the health promotion insurance, the health condition and health enhancement efforts of the insured person are evaluated, and a discount of insurance premium or repayment thereof is given according to the evaluation. According to such insurance, the insured person actively and continuously promotes health enhancement, even after purchasing an insurance policy, in order to obtain an incentive such as the discount of insurance premium or repayment thereof. For example, daily walking is one of the efforts for health promotion for which the incentive can be obtained. Specifically, the number of steps of the insured person is detected, and in a case where the number of steps is expected to be enough to obtain the effect of promoting the health of the insured person, an insurance company provides the above-described incentive depending on the number of steps, to the insured person.
However, the creation of such a product as described above leads some insured persons to falsify the number of steps in order to obtain fraudulent incentives. For example, insured persons who perform such a fraudulent act arbitrarily apply vibrations to devices that measure the number of steps to cause the devices to count steps even though the persons are not actually walking. Specifically, the number of steps is calculated through analysis of acceleration data, and it is therefore difficult to distinguish a change in acceleration caused by a vibrator from a change in acceleration caused by actual walking. In addition, it is conceivable to detect the number of steps by using a global navigation satellite system (GNSS) signal, but it is difficult to accurately detect the GNSS signal indoors, which inevitably increases the measurement error. Furthermore, calculation of the insurance premium (discount amount) with the number of steps including a large error gives a sense of dissatisfaction with the insurance premium to insured persons who do not perform the fraudulent act described above, leading to a reduction in the number of insurance subscribers.
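As context for the difficulty described above, step counting from acceleration data is commonly implemented as peak detection on the acceleration magnitude signal. The following sketch illustrates this general approach; the sampling rate, threshold, and minimum step interval are illustrative assumptions, not values from the present disclosure.

```python
import numpy as np

def count_steps(accel_magnitude, fs=50.0, threshold=1.2, min_interval=0.3):
    """Count steps as local peaks in the acceleration magnitude signal (in g)
    sampled at rate fs (Hz).  A step is registered when the signal is a local
    maximum above `threshold` and at least `min_interval` seconds have passed
    since the previous step.  All parameter values are illustrative."""
    min_gap = int(min_interval * fs)
    steps, last_idx = 0, -min_gap
    for i in range(1, len(accel_magnitude) - 1):
        x = accel_magnitude[i]
        # local maximum above threshold, separated from the previous peak
        if (x > threshold and x >= accel_magnitude[i - 1]
                and x >= accel_magnitude[i + 1] and i - last_idx >= min_gap):
            steps += 1
            last_idx = i
    return steps

# Synthetic 4-second walk at 2 steps per second: a 2 Hz oscillation around 1 g
fs = 50.0
t = np.arange(0, 4, 1 / fs)
signal = 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)
```

Because only the waveform is examined, an artificially shaken device produces peaks indistinguishable from real walking, which is exactly the weakness at issue here.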
Therefore, in view of such a situation, the present inventors have created the embodiments of the present disclosure, which are configured to prevent falsification of the number of steps. In the embodiments of the present disclosure described below, the reliability of the measured number of steps is calculated using various sensing data about a user, such as behavior recognition, and whether to adopt the number of steps in calculating the insurance premium is determined depending on the reliability. Furthermore, in the present embodiment, in order to prevent falsification, it is preferable to perform personal authentication.
The embodiments of the present disclosure described below are applied to information processing for acquiring the number of steps for health promotion insurance. Note that the present embodiment is not limited to application to the information processing for acquiring the number of steps for calculation of the insurance premium of the health promotion insurance or for the discount (repayment) thereof. For example, the present embodiment may be applied to a service of giving discount points (incentive) that can be used for shopping or the like instead of cash to the user depending on the number of steps, or a service of giving health advice to the user depending on the number of steps.
Note that, in the present description, “the number of steps” includes not only the number of steps in the user's walking but also the number of steps in the user's running. In other words, “the number of steps” in the present description can be said to be the number of times the user steps while moving his/her own body.
First, an outline of an information processing system 10 according to an embodiment of the present disclosure will be described with reference to
As illustrated in
The wearable device 100 can be a device that is configured to be worn on a part of the user's body (earlobe, neck, arm, wrist, ankle, etc.) or an implant device (implant terminal) inserted into the user's body. More specifically, the wearable device 100 can be wearable devices of various types, such as a head mounted display (HMD) type, spectacle type, ear device type, anklet type, bracelet (wristband) type, collar type, eyewear type, pad type, badge type, and clothing type. Furthermore, the wearable device 100 includes, for example, a plurality of sensors including a sensor detecting a pulse wave signal from the user's pulse. Note that, in the following description, the wearable device 100 is assumed to be, for example, a bracelet (wristband) wearable device. Furthermore, details of the wearable device 100 will be described later.
The mobile device 200 is an information processing terminal carried by the user. Specifically, the mobile device 200 is configured to receive information input from the user and sensing data from the wearable device 100, process the received information and the like, and output the processed information and the like to the server 300 which is described later. For example, the mobile device 200 can be a device such as a tablet personal computer (PC), smartphone, mobile phone, laptop PC, notebook PC, or HMD. Furthermore, the mobile device 200 includes a display unit (not illustrated) that performs display for the user, an input unit (not illustrated) that receives an input operation from the user, a speaker (not illustrated) that outputs voice to the user, a microphone (not illustrated) that acquires surrounding voice, and the like. Note that, in the following description, the mobile device 200 is assumed to be, for example, a smartphone. Furthermore, details of the mobile device 200 will be described later.
Note that, in the present embodiment, the mobile device 200 may be provided with various sensors included in the wearable device 100 described above, or the sensors may be provided separately from the wearable device 100 and the mobile device 200.
The server 300 includes, for example, a computer and the like. For example, the server 300 processes sensing data or information acquired by the wearable device 100 or the mobile device 200, and outputs information obtained by the processing to another device (e.g., the mobile device 200). Specifically, for example, the server 300 is configured to process data about the number of steps, which the mobile device 200 obtains by processing the sensing data from the wearable device 100, to calculate the insurance premium (e.g., a discount amount) as an incentive. Furthermore, the server 300 is configured to output the calculated insurance premium to the mobile device 200. Note that details of the server 300 will be described later.
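As a concrete illustration of the server-side incentive calculation, the toy rule below grants a fixed premium discount for each day on which the accepted step count reaches a target. The rule, the target, and the amounts are invented for illustration; the disclosure does not specify a particular formula.

```python
def monthly_discount_yen(daily_steps, target=8000, per_day_discount=30):
    """Toy incentive rule: each day whose accepted step count meets the
    target earns a fixed premium discount (in yen).  The rule, target,
    and amount are illustrative assumptions, not taken from the disclosure."""
    return per_day_discount * sum(1 for s in daily_steps if s >= target)
```

A day whose step count was rejected by the reliability determination would simply be excluded from `daily_steps` before this calculation.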
Note that, in
Furthermore, the information processing system 10 according to the present embodiment may not include the wearable device 100. In such a configuration, for example, the mobile device 200 may function as the wearable device 100, and sensing data acquired by the mobile device 200 or information obtained by processing the sensing data may be output to the server 300.
Next, a detailed configuration of the wearable device 100 according to an embodiment of the present disclosure will be described with reference to
As described above, as the wearable device 100, the wearable devices of various types, such as a bracelet type and an HMD type can be adopted.
Specifically, as illustrated in
Furthermore, as illustrated in
The input unit 110 receives inputs of data and a command from the user to the wearable device 100. More specifically, the input unit 110 is implemented by a touch screen, a button, a microphone, or the like. Furthermore, in the present embodiment, the input unit 110 may be, for example, a gaze sensor that detects the line of sight of the user and receives a command associated with a display the user is gazing at. The gaze sensor can be implemented by, for example, an imaging device including a lens, an imaging element, and the like. Furthermore, the input unit 110 may be an input unit that receives an input by detecting a gesture of a hand or arm on which the wearable device 100 is worn, by using an inertial measurement unit (IMU) 152 included in the sensor unit 150 which is described later.
The authentication information acquisition unit 120 is configured to acquire a fingerprint pattern image, an iris pattern image, vein pattern image, or face image of the user, a voiceprint based on the user's voice, or the like, in order to perform personal authentication for the user, and the acquired information is transmitted to the mobile device 200 which is described later. Furthermore, in the present embodiment, the authentication information acquisition unit 120 may receive a password, a trajectory shape, and the like input from the user for the personal authentication for the user.
In the present embodiment, for example, in a case where the personal authentication is performed on the basis of fingerprint information of the user, the authentication information acquisition unit 120 can be a capacitive fingerprint sensor that acquires a fingerprint pattern by sensing the capacitance at each point on a sensing surface generated when a fingertip of the user is placed on the sensing surface. The capacitive fingerprint sensor is configured to detect a fingerprint pattern by applying a small current to microelectrodes arranged in a matrix on the sensing surface to detect a potential difference appearing in capacitance generated between the microelectrode and the fingertip.
Furthermore, in the present embodiment, the authentication information acquisition unit 120 may be, for example, a pressure fingerprint sensor that acquires the fingerprint pattern by sensing pressure at each point on the sensing surface generated when the fingertip is placed on the sensing surface. In the pressure fingerprint sensor, for example, semiconductor micro-sensors whose resistance values change depending on pressure are arranged in a matrix on the sensing surface.
Furthermore, in the present embodiment, the authentication information acquisition unit 120 may be, for example, a thermal fingerprint sensor that acquires the fingerprint pattern by sensing a temperature difference generated when the fingertip is placed on the sensing surface. In the thermal fingerprint sensor, for example, temperature micro-sensors whose resistance values change depending on temperature are arranged in a matrix on the sensing surface.
Furthermore, in the present embodiment, the authentication information acquisition unit 120 may be, for example, an optical fingerprint sensor that acquires a captured image of a fingerprint pattern by detecting reflected light generated when the fingertip is placed on the sensing surface. The optical fingerprint sensor includes, for example, a micro lens array (MLA) which is an example of a lens array, and a photoelectric conversion element. In other words, the optical fingerprint sensor can be said to be one type of imaging device.
Furthermore, in the present embodiment, the authentication information acquisition unit 120 may be, for example, an ultrasonic fingerprint sensor that acquires a fingerprint pattern by emitting an ultrasonic wave and detecting ultrasonic waves reflected on uneven skin surface of the fingertip.
The display unit 130 is a device for presenting information to the user, and, for example, outputs various information to the user by using images. More specifically, the display unit 130 is implemented by a display or the like. Note that some of the functions of the display unit 130 may be provided by the mobile device 200. Furthermore, in the present embodiment, a functional block that presents information to the user is not limited to the display unit 130, and the wearable device 100 may have a functional block such as a speaker, earphone, light emitting element (e.g., light emitting diode (LED)), or vibration module.
The control unit 140 is provided in the wearable device 100 so as to control each functional unit of the wearable device 100 and acquire sensing data from the sensor unit 150 which is described later. The control unit 140 is implemented by hardware such as a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM). Note that some of the functions of the control unit 140 may be provided by the server 300 which is described later.
The sensor unit 150 is provided in the wearable device 100 worn on the body of the user, includes various sensors that detect the conditions of the user or the conditions of the surrounding environment of the user, and transmits the sensing data acquired by these various sensors to the mobile device 200 which is described later. Specifically, the sensor unit 150 includes the inertial measurement unit (IMU) 152 that detects inertial data generated by the movement of the user, a positioning sensor 154 that measures a position of the user, and a biological information sensor 156 that detects the pulse or heart rate of the user. Furthermore, the sensor unit 150 can include an image sensor 158 that acquires an image (moving image) around the user, one or a plurality of microphones 160 that detects environmental sound around the user, and the like. Details of various sensors of the sensor unit 150 will be described below.
˜IMU 152˜
The IMU 152 is configured to acquire sensing data (inertial data) indicating a change in acceleration or angular velocity that occurs with the motion of the user. Specifically, the IMU 152 includes an acceleration sensor, a gyro sensor, a geomagnetic sensor, and the like (not illustrated).
The positioning sensor 154 is a sensor that detects the position of the user wearing the wearable device 100, and can be specifically a global navigation satellite system (GNSS) receiver or the like. In this configuration, the positioning sensor 154 can generate sensing data indicating the latitude and longitude of the current location of the user on the basis of a signal (GNSS signal) from a GNSS satellite. Furthermore, in the present embodiment, it is possible to detect a relative positional relationship of the user from, for example, information about radio frequency identification (RFID), a Wi-Fi access point, a radio base station, and the like, and therefore, such a communication device can also be used as the positioning sensor 154.
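When successive latitude/longitude fixes are available from a positioning sensor such as the one described above, a movement distance can be estimated by summing great-circle distances between consecutive fixes. The sketch below uses the standard haversine formula with a spherical Earth model; treating this particular formula as the distance calculation is an assumption for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) fixes in
    degrees, using the haversine formula with a spherical Earth (R = 6371 km)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def path_length_m(fixes):
    """Total movement distance over a chronological sequence of (lat, lon) fixes."""
    return sum(haversine_m(*a, *b) for a, b in zip(fixes, fixes[1:]))
```

One degree of latitude corresponds to roughly 111 km, which gives a quick sanity check on the formula.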
Note that, in the present embodiment, a proof of location (PoL) technology may also be used to increase the reliability of positioning by the GNSS signal. For example, the PoL technology is a technology in which, simultaneously with positioning by the GNSS signal, short-range communication is performed with a fixed access point in the vicinity of the position determined by the GNSS signal, thereby confirming that the user is actually at that position and increasing the reliability of positioning by the GNSS signal.
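The PoL idea described above can be sketched as a simple consistency check: the GNSS fix is trusted only if a known fixed access point near that position is actually visible over short-range communication. The data structures, the trusted-AP registry, and the 50 m range are assumptions for illustration, not details taken from the disclosure.

```python
import math

def approx_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular distance approximation (meters), adequate over the
    tens-of-meters scale relevant to short-range communication."""
    r = 6_371_000.0
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)

def pol_confirmed(gnss_fix, visible_ap_ids, trusted_aps, max_distance_m=50.0):
    """Return True if any access point currently visible to the device is a
    trusted fixed AP located within `max_distance_m` of the GNSS fix.
    `gnss_fix` is (lat, lon); `trusted_aps` maps AP id -> (lat, lon).
    Illustrative sketch only."""
    for ap_id in visible_ap_ids:
        if ap_id in trusted_aps:
            ap_lat, ap_lon = trusted_aps[ap_id]
            if approx_distance_m(gnss_fix[0], gnss_fix[1],
                                 ap_lat, ap_lon) <= max_distance_m:
                return True
    return False
```

A GNSS fix that no nearby trusted access point can corroborate would then contribute a lower position reliability score.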
The biological information sensor 156 is a sensor that detects biological information of the user, and can be, for example, various sensors that are directly attached to a part of the user's body to measure the user's heart rate, pulse, blood pressure, brain wave, respiration, perspiration, myoelectric potential, skin temperature, electrical skin resistance, and the like.
For example, a heart rate sensor is a sensor that detects the heart rate, that is, the pulsation of the heart of the user. In addition, a pulse sensor is a sensor that detects the pulse on a body surface or the like, the pulse being a pulsation of an artery caused by a change in pressure on the inner wall of the artery as blood is sent to the whole body through the arteries by the pulsation (heart rate) of the heart. Furthermore, a blood flow sensor (including a blood pressure sensor) is, for example, a sensor that emits infrared rays or the like to the body to obtain the absorptivity or reflectance of light or a change thereof and thereby detects the blood flow rate, pulse, heart rate, and blood pressure. Furthermore, the heart rate sensor or the pulse sensor may be an imaging device that images the skin of the user. In this case, the pulse or heart rate of the user can be detected on the basis of a change in light reflectance on the skin obtained from the image of the skin of the user.
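The image-based pulse detection mentioned above relies on a small periodic change in skin reflectance with each heartbeat. The sketch below estimates heart rate from the per-frame mean of a skin region's green channel by locating the dominant spectral peak inside a plausible pulse band; it is a deliberately simplified version of image-based photoplethysmography, with the frame rate, band limits, and signal model all assumed for illustration.

```python
import numpy as np

def heart_rate_bpm(green_means, fs=30.0, band=(0.7, 3.0)):
    """Estimate heart rate (beats per minute) from a time series of per-frame
    mean green-channel intensity of a skin region, sampled at frame rate fs
    (Hz).  The dominant spectral peak inside `band` (Hz) is taken as the
    pulse frequency.  A simplified illustrative sketch."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                      # remove DC (baseline skin tone)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    peak = freqs[in_band][np.argmax(spectrum[in_band])]
    return 60.0 * peak

# Synthetic 10-second clip at 30 fps with a 1.2 Hz (72 bpm) pulse component
t = np.arange(300) / 30.0
demo = 100.0 + 0.01 * np.sin(2 * np.pi * 1.2 * t)
```

Such a biological signal is useful for the reliability determination because a device shaken by a vibrator shows no corresponding pulse elevation.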
For example, a respiration sensor can be a respiratory flow sensor that detects a change in respiration. A brain wave sensor is a sensor that detects a brain wave by removing noise from a change in potential difference between a plurality of electrodes measured by attaching the electrodes to a scalp of the user to extract periodic waves. A skin temperature sensor is a sensor that detects the surface temperature of the user, and a skin conductivity sensor is a sensor that detects the electrical skin resistance of the user. A perspiration sensor is a sensor that is worn on the skin of the user to detect the change in voltage or resistance between two points on the skin, caused by perspiration. Furthermore, a myoelectric sensor is a sensor that quantitatively detects a muscle activity of a muscle by measuring a myoelectric potential. The myoelectric potential is measured based on an electric signal that is generated in a muscle fiber when a muscle of an arm or the like is contracted and that is propagated to a body surface, with a plurality of electrodes attached to the arm or the like of the user.
The image sensor 158 is, for example, an image sensor for color imaging that has a Bayer array capable of detecting blue, green, and red light. Furthermore, this RGB sensor may include a pair of image sensors in order to recognize a depth (stereo system).
Furthermore, the image sensor 158 may be a time of flight (ToF) sensor that acquires depth information of the real space around the user. Specifically, the ToF sensor emits irradiation light such as infrared light around the user and detects the reflected light returned from the surface of an object around the user. Then, the ToF sensor calculates a phase difference between the irradiation light and the reflected light to acquire the distance (depth information) from the ToF sensor to the real object. Note that the method of obtaining distance information based on the phase difference in this manner is referred to as an indirect ToF method. Furthermore, in the present embodiment, it is also possible to use a direct ToF method, in which the round-trip time from emission of light to reception of the light reflected from the object is measured to acquire the distance (depth information) from the ToF sensor to the object. Since the ToF sensor acquires the distance (depth information) to the object in either method, a distance image including distance information (depth information) indicating the distance to the object can be obtained as three-dimensional shape data of the real space. Here, the distance image is, for example, image information generated by associating the distance information (depth information) acquired for each pixel of the ToF sensor with position information of the corresponding pixel.
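The two ranging principles described above reduce to simple formulas: the indirect method maps the measured phase shift of the modulated light to distance via d = c·Δφ/(4πf), and the direct method maps the measured round-trip time via d = c·t/2. A minimal sketch follows; the 20 MHz modulation frequency in the test is an illustrative value.

```python
import math

def itof_distance_m(phase_shift_rad, mod_freq_hz):
    """Indirect-ToF range: the phase shift between emitted and reflected
    modulated light maps to round-trip time, hence d = c * phi / (4 * pi * f).
    The result is unambiguous only within c / (2 * f)."""
    c = 299_792_458.0
    return c * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def dtof_distance_m(round_trip_s):
    """Direct-ToF range from the measured round-trip time of a light pulse."""
    return 299_792_458.0 * round_trip_s / 2
```

For example, at a 20 MHz modulation frequency a half-cycle phase shift (π radians) corresponds to a range of about 3.75 m.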
˜Microphone 160˜
The microphone 160 is a sound sensor that detects a sound generated by speech or motion of the user or a sound generated around the user. Note that, in the present embodiment, the number of microphones 160 is not limited to one, and a plurality of the microphones 160 may be employed. Furthermore, in the present embodiment, the microphone 160 is provided in the wearable device 100, but the arrangement is not limited thereto, and one or more microphones 160 may be installed around the user.
Furthermore, the sensor unit 150 may include an ambient environment sensor that detects the conditions of the surrounding environment of the user, and specifically, may include various sensors that detect the temperature, humidity, brightness, and the like of the surrounding environment of the user. In the present embodiment, the sensing data from these sensors may be used to improve the accuracy in recognition of user's behaviors which are described later.
Furthermore, the sensor unit 150 may incorporate a clock mechanism (not illustrated) that provides the accurate time, so as to associate the acquired sensing data with the time at which the sensing data is acquired. Furthermore, as described above, the various sensors may not be provided in the sensor unit 150 of the wearable device 100. For example, the various sensors may be provided separately from the wearable device 100, or may be provided in another device or the like used by the user.
Furthermore, the sensor unit 150 may include a sensor for detecting a mounted state of the sensor unit 150. For example, the sensor unit 150 may include a pressure sensor or the like that detects appropriate mounting of the sensor unit 150 on a part of the body of the user (e.g., mounting thereof in close contact with the part of the body).
The storage unit 170 is provided in the wearable device 100 to store programs, information, and the like for the control unit 140 to execute various processing, and information obtained by the processing. Note that the storage unit 170 is implemented by a nonvolatile memory or the like such as a flash memory.
The communication unit 180 is provided in the wearable device 100 so as to transmit and receive information to and from an external device, such as the mobile device 200 or the server 300. In other words, it can be said that the communication unit 180 is a communication interface having a function of transmitting and receiving data. Note that the communication unit 180 is implemented by communication devices such as a communication antenna, a transmission/reception circuit, and a port. Furthermore, in the present embodiment, the communication unit 180 may be caused to function as a radio wave sensor that detects a radio field intensity or radio arrival direction.
Note that, in the present embodiment, the configuration of the wearable device 100 is not limited to that illustrated in
Next, a detailed configuration of the mobile device 200 according to the present embodiment will be described with reference to
The input unit 210 receives inputs of data and a command from the user to the mobile device 200. More specifically, the input unit 210 is implemented by a touch screen, a button, a microphone, and the like.
The display unit 230 is a device for presenting information to the user, and, for example, is configured to output various information to the user by using images based on information acquired from the server 300. More specifically, the display unit 230 is implemented by a display or the like. Furthermore, in the present embodiment, a functional block that presents information to the user is not limited to the display unit 230, and the mobile device 200 may have a functional block such as a speaker, earphone, light emitting element, or vibration module.
The processing unit 240 is configured to process the sensing data from the sensor unit 150 of the wearable device 100. The processing unit 240 is implemented by hardware such as CPU, ROM, and RAM. As illustrated in
The authentication information acquisition unit 242 is configured to acquire, from the authentication information acquisition unit 120 of the wearable device 100, the fingerprint pattern image, iris pattern image, vein pattern image, or face image of the user, the voiceprint based on the user's voice, or the like, in order to perform personal authentication for the user. Furthermore, the authentication information acquisition unit 242 is configured to output the acquired information to the authentication unit 244 which is described later. In the present embodiment, for example, in a case where the personal authentication is performed on the basis of the fingerprint information of the user, the authentication information acquisition unit 242 may acquire the fingerprint pattern of the user from the authentication information acquisition unit 120 of the wearable device 100 to perform predetermined processing for emphasizing the fingerprint pattern, removing noise, and the like. More specifically, the authentication information acquisition unit 242 is configured to use various filters for smoothing and removing noise, such as a moving average filter, difference filter, median filter, and Gaussian filter. Furthermore, the authentication information acquisition unit 242 may perform processing by using, for example, various algorithms for binarization and thinning.
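As an illustration of the preprocessing mentioned above, the sketch below applies a 3x3 moving-average filter and then binarizes around the global mean intensity, mapping dark ridge pixels to 1. The filter size and thresholding rule are assumptions for illustration; practical pipelines typically add thinning and more robust adaptive thresholding.

```python
import numpy as np

def smooth3x3(img):
    """3x3 moving-average filter over a grayscale image (edges handled by
    padding with the border value), one of the smoothing filters mentioned
    above."""
    h, w = img.shape
    p = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros((h, w), dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += p[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
    return out / 9.0

def binarize(img):
    """Binarize around the global mean: dark ridge pixels -> 1, valleys -> 0."""
    return (img < img.mean()).astype(np.uint8)
```

The binarized image would then be thinned to one-pixel-wide ridges before feature points are extracted.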
The authentication unit 244 is configured to collate the fingerprint information (fingerprint pattern), iris information, face image, password, trajectory, and the like of the user acquired from the authentication information acquisition unit 242 described above with personal authentication information registered in advance in association with a personal identification (ID) in a personal information database (DB) stored in the storage unit 270 which is described later, for personal authentication.
In the present embodiment, for example, when personal authentication is performed based on the fingerprint information of the user, the authentication unit 244 calculates features of the fingerprint pattern. Here, the features of the fingerprint pattern refer to a distribution of the feature points on the fingerprint pattern, that is, the number of feature points or a distribution density (distribution information) of the feature points. Furthermore, the feature points refer to attribute information such as the shapes, directions, and positions (relative coordinates) of a center point of the fingerprint pattern, and of a branch point, intersection point, and end point of a ridge of the fingerprint pattern (referred to as minutiae). Furthermore, the feature points may be attribute information such as the shapes, directions, widths, intervals, and distribution density of the ridges.
Then, for example, the authentication unit 244 is also configured to collate feature points extracted from part of the fingerprint pattern output from the authentication information acquisition unit 242 with feature points of the fingerprint pattern recorded in the storage unit 130 or the like in advance, for authentication of the user (minutiae method). Furthermore, for example, the authentication unit 244 is configured to collate the fingerprint pattern output from the authentication information acquisition unit 242 described above with a fingerprint template of the fingerprint pattern stored in the storage unit 270 or the like in advance, for authentication of the user (pattern matching method). Furthermore, for example, the authentication unit 244 is also configured to perform spectral analysis of a pattern for each sliced fingerprint pattern obtained by slicing the fingerprint pattern into strips, and perform collation by using a result of spectral analysis of the fingerprint pattern stored in advance in the storage unit 270 or the like, for performing authentication.
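As a rough illustration of the minutiae method described above, the following sketch counts probe feature points that have a nearby same-type template point and returns the matched fraction. The function name, the (type, x, y) point representation, and the tolerance are assumptions for illustration; a real matcher would additionally align the two patterns for rotation and translation before comparing.

```python
def minutiae_match_score(probe, template, tol=5.0):
    """Toy minutiae matching: fraction of probe feature points
    (type, x, y) that have a same-type template point within
    `tol` pixels. Alignment for rotation/translation is omitted."""
    if not probe:
        return 0.0
    matched = 0
    for ptype, px, py in probe:
        for ttype, tx, ty in template:
            # match only points of the same type (end point, branch point, ...)
            if ptype == ttype and (px - tx) ** 2 + (py - ty) ** 2 <= tol ** 2:
                matched += 1
                break
    return matched / len(probe)
```

The score could then be compared with an acceptance threshold to decide whether the authentication succeeds.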
In the present embodiment, when the personal authentication of the user is successfully performed by the authentication unit 244, it is possible to start acquisition of the sensing data, process the sensing data, and transmit data obtained by processing the sensing data to an external device (e.g., the server 300).
The sensing data acquisition unit 246 is configured to acquire a plurality of pieces of sensing data from the wearable device 100 and output the sensing data to the step calculation unit 248 and the feature calculation unit 250 which are described later.
The step calculation unit 248 is configured to calculate (count) the number of steps of the user on the basis of a change in the inertial data (acceleration data, angular velocity data, etc.) from the sensing data acquisition unit 246 described above. Note that the step calculation unit 248 may calculate the number of steps of the user, referring to a model obtained in advance by machine learning. Furthermore, the step calculation unit 248 is configured to output data about the number of steps calculated, to the output unit 256 which is described later, and the like. Note that in the present embodiment, the step calculation unit 248 may calculate a movement distance of the user.
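A minimal sketch of how a unit like the step calculation unit 248 might count steps from inertial data, assuming simple peak detection on the acceleration magnitude; the threshold and minimum peak spacing are illustrative values, not taken from the disclosure.

```python
import math

def count_steps(accel_samples, threshold=11.0, min_gap=3):
    """Count steps as local peaks of acceleration magnitude above a
    threshold. accel_samples: list of (ax, ay, az) in m/s^2;
    threshold and min_gap (minimum samples between peaks) are
    illustrative tuning values."""
    steps = 0
    last_peak = -min_gap
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in accel_samples]
    for i in range(1, len(mags) - 1):
        # a local maximum above the threshold, far enough from the last peak
        if mags[i] > threshold and mags[i] >= mags[i - 1] and mags[i] > mags[i + 1]:
            if i - last_peak >= min_gap:
                steps += 1
                last_peak = i
    return steps
```

A model obtained by machine learning, as mentioned above, could replace this heuristic while keeping the same interface.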
The feature calculation unit 250 is configured to calculate the features from the inertial data, position data, biological data, and environment data included in the plurality of pieces of sensing data from the sensing data acquisition unit 246 described above (details of the inertial data, position data, biological data, and environment data will be described later). Furthermore, the feature calculation unit 250 is configured to output the calculated features to the reliability calculation unit 252 which is described later. For example, the feature calculation unit 250 is configured to calculate the features by performing statistical processing (average, variance, normalization, etc.) on one or more of the plurality of pieces of sensing data. Alternatively, the feature calculation unit 250 may calculate the features from the sensing data, referring to a model obtained in advance by machine learning.
Furthermore, for example, the feature calculation unit 250 is also configured to obtain a walking distance of the user (second distance data) by multiplying the data about the number of walking steps of the user obtained from the inertial data, and data about a stride of the user input from the user. Furthermore, the feature calculation unit 250 may calculate a distance (first distance data) through which the user has moved by walking on the basis of the sensing data from the positioning sensor 154 and calculate, as the features, a difference between a walking distance based on the inertial data and a walking distance based on the sensing data from the positioning sensor 154.
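The comparison of the stride-based distance (second distance data) with the positioning-sensor distance (first distance data) can be sketched as follows; the function name and the flat (x, y) track representation in metres are illustrative assumptions.

```python
def distance_difference_feature(step_count, stride_m, gps_track):
    """Difference between the stride-based walking distance and the
    distance accumulated along a positioning-sensor track.

    gps_track: list of (x, y) positions in metres in a local plane.
    """
    inertial_distance = step_count * stride_m
    gps_distance = 0.0
    for (x0, y0), (x1, y1) in zip(gps_track, gps_track[1:]):
        gps_distance += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return abs(inertial_distance - gps_distance)
```

A small difference suggests the inertially counted steps are consistent with the user's actual movement; a large one would lower the reliability.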
Furthermore, the feature calculation unit 250 is configured to recognize the user's behaviors (walking, running, driving, etc.) as the features, on the basis of at least one of the plurality of pieces of sensing data from the sensing data acquisition unit 246 described above. For example, in a case where the same type of sensor (e.g., IMU 152) is mounted on both of the wearable device 100 and the mobile device 200, the feature calculation unit 250 is configured to compare the same type of sensing data (inertial data) from different devices to recognize the user's behaviors. Note that details of calculation of the features in the present embodiment will be described later.
The reliability calculation unit 252 is configured to calculate the reliability on the basis of the features obtained from the position data, biological data, environment data, and behavior recognition data about the user obtained by the feature calculation unit 250 described above. Furthermore, the reliability calculation unit 252 is configured to output the calculated reliability to the determination unit 254 which is described later. Specifically, the reliability calculation unit 252 is configured to calculate the reliability by weighting each of the features with a predetermined coefficient given to each feature. Furthermore, in the present embodiment, the reliability calculation unit 252 may dynamically change the predetermined coefficient according to the position, behavior recognition data, a change in position, and the like about the user. Note that details of the calculation of the reliability in the present embodiment will be described later.
The determination unit 254 is configured to determine whether to accept the data about the number of steps calculated by the step calculation unit 248 (specifically, whether the number of steps is true or false) on the basis of the reliability calculated by the reliability calculation unit 252 described above. Specifically, the determination unit 254 compares the reliability with a predetermined threshold. For example, when the reliability is equal to or above the predetermined threshold, the determination unit 254 determines to accept the data about the number of steps calculated, and outputs the result of determination to the output unit 256 which is described later. Furthermore, in the present embodiment, the determination unit 254 may dynamically change the predetermined threshold on the basis of information from the server 300.
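The weighting performed by the reliability calculation unit 252 and the threshold comparison performed by the determination unit 254 can be sketched together as follows; the feature names, coefficient values, and threshold are illustrative assumptions.

```python
def reliability(features, coeffs):
    """Weighted sum of per-feature scores; the coefficient values used
    below in the test are illustrative, not the actual coefficients."""
    return sum(coeffs[name] * value for name, value in features.items())

def accept_step_count(features, coeffs, threshold):
    """Accept the calculated number of steps only when the reliability
    is equal to or above the threshold."""
    return reliability(features, coeffs) >= threshold
```

Dynamically changing the coefficients or the threshold, as described above, would amount to updating `coeffs` or `threshold` at run time.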
The output unit 256 outputs the data about the number of steps calculated by the step calculation unit 248 to the server 300, on the basis of the determination by the determination unit 254 described above. Note that the output unit 256 may output the data about the number of steps to the display unit 230 or the storage unit 270.
The insurance premium information acquisition unit 260 is configured to acquire, from the server 300, information about the insurance premium or the discount amount (incentive) of the insurance premium that is calculated on the basis of the data about the number of steps, and to output the information to the display unit 230.
The storage unit 270 is provided in the mobile device 200 to store programs, information, and the like for the above-described processing unit 240 to perform various processing, and information obtained by the processing. Note that the storage unit 270 is implemented by a nonvolatile memory or the like such as a flash memory.
The communication unit 280 is provided in the mobile device 200 so as to transmit and receive information to and from an external device such as the wearable device 100 or the server 300. In other words, it can be said that the communication unit 280 is a communication interface having a function of transmitting and receiving data. Note that the communication unit 280 is implemented by communication devices such as a communication antenna, a transmission/reception circuit, and a port. Furthermore, in the present embodiment, the communication unit 280 may be caused to function as a radio wave sensor that detects a distance to the wearable device 100 or that detects a radio field intensity or radio arrival direction.
Note that, in the present embodiment, the configuration of the mobile device 200 is not limited to that illustrated in
Next, a detailed configuration of the server 300 according to the present embodiment will be described with reference to
The input unit 310 receives inputs of data and a command from the user to the server 300. More specifically, the input unit 310 is implemented by a touch screen, a keyboard, and the like.
The display unit 330 includes, for example, a display, a video output terminal, and the like and outputs various information to the user by using images or the like.
The processing unit 340 is provided in the server 300 to control each block of the server 300. Specifically, the processing unit 340 controls various processing such as calculation of the insurance premium performed in the server 300. The processing unit 340 is implemented by hardware such as CPU, ROM, and RAM. Note that the processing unit 340 may perform some of the functions of the processing unit 240 of the mobile device 200. Specifically, as illustrated in
The step information acquisition unit 342 is configured to acquire the data about the number of steps from the mobile device 200 and output the data to the insurance premium calculation unit 344 and the storage unit 370 which are described later.
The insurance premium calculation unit 344 is configured to calculate the insurance premium of the user on the basis of the data about the number of steps from the step information acquisition unit 342 described above and output the calculated insurance premium to the output unit 356 which is described later. Specifically, the insurance premium calculation unit 344 is configured to calculate the insurance premium of the user, on the basis of the number of steps of the user and the attribute information (gender, age, places where the user has lived, medical history, occupation, desired compensation, etc.) of the user, referring to an insurance premium table stored in the storage unit 370 which is described later. At this time, the insurance premium calculation unit 344 may also calculate and output a difference (discount amount) between the current insurance premium of the user and a newly calculated insurance premium.
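A sketch of a table-based premium calculation with a step-count discount, along the lines of the insurance premium calculation unit 344; the age bands, premium amounts, and discount tiers are purely illustrative assumptions, not actual insurance figures.

```python
def calculate_premium(base_table, age_band, avg_daily_steps):
    """Look up a base premium by age band and apply a step-based
    discount rate; also return the discount amount (the difference
    from the undiscounted premium)."""
    base = base_table[age_band]
    if avg_daily_steps >= 10000:
        rate = 0.20
    elif avg_daily_steps >= 8000:
        rate = 0.10
    elif avg_daily_steps >= 6000:
        rate = 0.05
    else:
        rate = 0.0
    premium = base * (1.0 - rate)
    discount = base - premium  # difference from the current premium
    return premium, discount
```

A real table would additionally be keyed by the other attribute information (gender, medical history, desired compensation, and so on).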
The threshold calculation unit 346 is configured to determine the predetermined threshold used by the determination unit 254 of the mobile device 200 described above, referring to the past histories (the number of steps or the like) of a plurality of users and the coverage and management results of the insurance company, and to output the predetermined threshold to the mobile device 200 via the output unit 356 which is described later. Specifically, for example, the threshold calculation unit 346 adjusts the threshold so that the insurance company can secure a profit even when the insurance premium is reduced for each user according to the result of the number of steps of each user.
The output unit 356 is configured to output the insurance premium and threshold calculated by the insurance premium calculation unit 344 and the threshold calculation unit 346 described above, to the mobile device 200.
The storage unit 370 is provided in the server 300 to store programs and the like for the processing unit 340 described above to perform various processing, and information obtained by the processing. More specifically, the storage unit 370 is implemented by a magnetic recording medium such as a hard disk (HD).
The communication unit 380 is provided in the server 300 so as to transmit and receive information to and from an external device such as the mobile device 200. Note that the communication unit 380 is implemented by, for example, communication devices such as a communication antenna, a transmission/reception circuit, and a port.
Note that, in the present embodiment, the configuration of the server 300 is not limited to that illustrated in
Next, an information processing method according to an embodiment of the present disclosure will be described with reference to
Specifically, as illustrated in
First, the wearable device 100 or the mobile device 200, which are user-side devices, performs personal authentication of the user (Step S100). For example, in Step S100, the mobile device 200 performs unlocking using the fingerprint pattern of the fingertip of the user.
Furthermore, in the present embodiment, the authentication information such as the fingerprint pattern used in the unlocking described above and information about the user's insurance policy (insured person identification information, insurance content information, insurance premium, etc.) are stored in advance in the server 300 in association with each other at the time of first use. For example, as illustrated in
Next, the wearable device 100 or the mobile device 200, which is the user-side device, transmits authentication information of the user or identification information (policyholder information) associated with the user to the server 300, and inquires for the policyholder information (Step S200).
Then, the server 300 confirms whether the policyholder information or the like transmitted from the wearable device 100 or the mobile device 200 matches policyholder information or the like stored in advance, and transmits a confirmation result as an inquiry result to the wearable device 100 or the mobile device 200 (Step S300).
When the confirmation result transmitted from the server 300 indicates that the pieces of information match, the wearable device 100 or the mobile device 200 starts to detect the number of steps (Step S400). On the other hand, when the confirmation result indicates that the pieces of information do not match, the wearable device 100 or the mobile device 200 finishes the processing. At this time, for example, as illustrated in
Furthermore, in the present embodiment, during the detection of the number of steps, for example, the wearable device 100 or the like may notify the user that the number of steps is being detected, as illustrated in
Then, for example, when the wearable device 100 is dismounted or the short-range communication between the wearable device 100 and the mobile device 200 is interrupted, the wearable device 100 or the mobile device 200 deauthenticates the user (Step S500). Furthermore, upon deauthentication, the wearable device 100 or the mobile device 200 transmits the calculated data, such as the number of steps and the reliability, to the server 300. Note that, in the present embodiment, the transmission timing is not limited to the deauthentication timing; the data may instead be transmitted, for example, at the end of a day or every predetermined time period.
At this time, as illustrated in
Next, the server 300 calculates the insurance premium on the basis of data such as the number of steps transmitted from the wearable device 100 or mobile device 200 (Step S600). Then, the server 300 redefines and updates an insurance policy condition and the like of the user on the basis of the calculated insurance premium, and transmits information, such as the insurance premium (discount amount of the insurance premium) and the insurance policy condition, to the wearable device 100 or mobile device 200.
Then, the wearable device 100 or the mobile device 200 presents information, such as the insurance premium and the insurance policy condition, transmitted from the server 300 to the user (Step S700). For example, as illustrated in
Furthermore, in the present embodiment, the wearable device 100 or the mobile device 200 may analyze past sensing data (e.g., sensing data acquired by the wearable device 100 before purchasing the insurance policy) that has already been stored to calculate the number of steps to be reflected in the insurance premium. Furthermore, in the present embodiment, the wearable device 100 or the mobile device 200 may have a standalone configuration to simulate or determine the insurance premium by using past data. At this time, for example, as illustrated in
Furthermore, in the present embodiment, in a case where environmental sound around the user is acquired in order to obtain the features, for example, when the environmental sound includes a sound from which danger to the user may be inferred (e.g., a car horn or the like), the wearable device 100 may alert the user by giving a screen display, vibration, or the like to the user, as illustrated in
Next, details of calculation of the number of steps, illustrated in Step S400 of
First, as described in Step S100 of
The wearable device 100 or the mobile device 200 determines whether to start to detect the number of steps (Substep S402). For example, when the personal authentication of the user is successfully performed in Substep S401 described above and an operation indicating the intention to approve the detection of the number of steps is received from the user (Substep S402: Yes), the wearable device 100 or the mobile device 200 proceeds to Substep S403 in order to start to detect the number of steps. On the other hand, for example, when the personal authentication of the user fails in Substep S401 described above or when the operation indicating the intention to approve the detection of the number of steps is not received from the user, the wearable device 100 or the mobile device 200 repeats the processing of Substep S401 (Substep S402: No).
The wearable device 100 or the mobile device 200 sets the stored data Dstep about the number of steps to 0 and starts the detection of the number of steps (specifically, acquisition of the inertial data) (Substep S403).
The wearable device 100 or the mobile device 200 starts acquisition of the sensing data to acquire the features (Substep S404). Note that details of calculation of the features in the present embodiment will be described later.
The wearable device 100 or the mobile device 200 calculates (counts) the data Dstep about the number of steps of the user on the basis of the change in the inertial data (the acceleration data, the angular velocity data, etc.) acquired so far (Substep S405). Note that, in the present embodiment, the number of steps of the user may be calculated by analyzing the inertial data, referring to the model obtained in advance by machine learning.
The wearable device 100 or the mobile device 200 determines whether the number of steps has not been detected for a predetermined time period or more (Substep S406). When the number of steps has not been detected for the predetermined time period or more (Substep S406: Yes), the wearable device 100 or the mobile device 200 proceeds to Substep S407. On the other hand, when the number of steps is still being detected (Substep S406: No), the process returns to Substep S405.
The wearable device 100 or the mobile device 200 calculates the features on the basis of the sensing data acquired so far, and calculates the reliability on the basis of the calculated features (Substep S407). Note that details of the features and the reliability in the present embodiment will be described later.
Note that, in the present embodiment, the timing interval for calculating the reliability and the time length of the sensing data acquisition period for calculating the features are, from the viewpoint of reliability, basically preferably long. However, when the time length is too long, there is a high possibility of including many time slots during which the user is not walking. Therefore, in the present embodiment, it is preferable to set and adjust the time length in view of the sensing data acquisition state, the numerical value of the reliability, and the balance between the processing load, power consumption, and the like in the mobile device 200.
The wearable device 100 or the mobile device 200 determines whether the reliability calculated in Substep S407 described above is equal to or above the predetermined threshold (Substep S408). When the reliability is equal to or above the predetermined threshold (Substep S408: Yes), the wearable device 100 or the mobile device 200 proceeds to Substep S409. On the other hand, when the reliability is not equal to or above the predetermined threshold (Substep S408: No), the wearable device 100 or the mobile device 200 proceeds to Substep S410.
Note that, in the present embodiment, the predetermined threshold described above may be fixed to a preset value or may be determined or changed by the server 300, referring to the histories (the number of steps and the like) of a plurality of users or the coverage and management results of the insurance company. With such a configuration, the insurance company can secure a profit even when, for example, the insurance premium is reduced for each user according to the result of the number of steps.
The wearable device 100 or the mobile device 200 outputs the number of steps calculated so far to the server 300 (Substep S409).
The wearable device 100 or the mobile device 200 finishes the acquisition of the sensing data for acquiring the features, and the process returns to Substep S402 (Substep S410).
Note that, in the present embodiment, it is preferable to acquire the sensing data and calculate the number of steps, the features, and the like, only when the number of steps is detected, and such a configuration makes it possible to suppress the increase in processing load and power consumption in the wearable device 100 or the mobile device 200.
In the present embodiment, the reliability of the number of steps is calculated in order to confirm that the number of steps calculated as described above is not a falsified number of steps. Then, in the present embodiment, for calculation of the reliability, the features characterizing the plurality of pieces of sensing data are calculated from the plurality of pieces of sensing data obtained by the various sensors mounted in the wearable device 100, and the reliability is calculated using the calculated features. For example, in the present embodiment, the number of steps or walking distance estimated from the features is compared with the number of steps calculated using the inertial data, or with the walking distance obtained by multiplying that number of steps by the stride registered in advance by the user, and when the difference in the number of steps or walking distance is small, it is determined that the number of steps is not falsified. Then, in the present embodiment, the number of steps obtained from the inertial data can be regarded as a valid number of steps that can be reflected in the calculation of the insurance premium.
Specifically, in the present embodiment, the features can be calculated from the inertial data, the position data, the biological data, and the environment data included in the plurality of pieces of sensing data. Hereinafter, details of the pieces of data will be sequentially described with reference to
In the present embodiment, the inertial data is data that changes due to three-dimensional inertial motion (translational motion and rotational motion in three orthogonal axis directions) of the user, and specifically refers to the acceleration data, the angular velocity data, and the like. Specifically, in the present embodiment, as described above, it is possible to calculate the data Dstep about the number of steps of the user on the basis of the change in the inertial data (the acceleration data, the angular velocity data, etc.). Furthermore, in the present embodiment, it is possible to calculate the walking distance as a feature by multiplying the calculated number of steps by the stride registered in advance by the user. Note that, in the present embodiment, when measuring the stride is troublesome, for example, the height of the user may be multiplied by a predetermined coefficient (e.g., 0.45) and the result may be used in place of the stride. Furthermore, in the present embodiment, the predetermined coefficient described above may be dynamically changed according to a result of the recognition of the user's behavior (e.g., walking, running, etc.); for example, since the stride changes between walking and running, the coefficient is increased for running relative to that for walking.
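The stride approximation above can be sketched as follows; the walking coefficient 0.45 appears in the text, while the running coefficient here is an illustrative assumption reflecting the larger running stride.

```python
def estimate_stride(height_m, behavior="walking"):
    """Approximate stride as height times a behavior-dependent
    coefficient. 0.45 for walking is the example from the text;
    the running value is an assumed, larger coefficient."""
    coeff = {"walking": 0.45, "running": 0.60}[behavior]
    return height_m * coeff
```

The walking distance feature is then simply the counted steps multiplied by this estimated stride.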
Furthermore, in the present embodiment, the recognition of the user's behavior (running, walking, etc.) can be performed as one of the features, on the basis of the inertial data (the acceleration data, the angular velocity data, etc.), and a result of the recognition is referred to as the behavior recognition data. Here, the behavior recognition data means data indicating the movement or motion of the user, and specifically refers to data indicating the movement or motion, such as walking or running, of the user. Note that, in the present embodiment, at this time, the recognition of the behavior may be performed referring to a model obtained in advance by machine learning using the inertial data of many users. Furthermore, in the present embodiment, the recognition of the behavior may be performed using a model obtained by machine learning using the inertial data obtained by the IMU 152 worn by a target user, and this configuration makes it possible to increase the accuracy in recognition of the behavior of the specific user.
Furthermore, in the present embodiment, not only the inertial data but also, for example, a schedule (wake-up time, clock-in time, quitting time, bedtime, etc.) input by the user in advance may be used for the recognition of the behavior. Alternatively, in the present embodiment, the recognition of the behavior may be performed using position data (sensing data) (e.g., home, company, school, station, etc.) obtained by the positioning sensor 154. In this way, the accuracy of the recognition of the behavior can be improved.
Furthermore, in the present embodiment, for example, in a case where the IMU 152 is mounted on both the wearable device 100 and the mobile device 200, recognition of the user's behavior may be performed by comparing the inertial data between the two devices. More specifically, walking may be recognized when the time difference between the peaks of acceleration in the gravity direction of the wearable device 100 and the mobile device 200 is within a predetermined time period. Alternatively, walking may be recognized when there is a periodic change in the arm swing direction in the acceleration data obtained by the wearable device 100 but no periodic change in the arm swing direction in the acceleration data obtained by the mobile device 200. Furthermore, walking may be recognized only when these two conditions are both satisfied. In addition, in the present embodiment, in a case where it is determined, for example on the basis of the radio field intensity or the like, that short-range communication can be performed between the wearable device 100 and the mobile device 200, that is, that the wearable device 100 and the mobile device 200 are located within a predetermined distance (e.g., within 1 m), walking may be recognized only when this condition and the above two conditions are all satisfied.
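The two-device walking check can be sketched as follows; the 0.2 s peak-time tolerance and the -60 dBm proximity floor are illustrative assumptions, and the peak lists hold timestamps (in seconds) of gravity-direction acceleration peaks.

```python
def recognize_walking(wearable_peaks, mobile_peaks, wearable_swing_periodic,
                      mobile_swing_periodic, rssi_dbm, rssi_floor=-60):
    """Combine the three conditions described above into one decision."""
    # Condition 1: each wearable peak has a mobile peak within the tolerance.
    peaks_aligned = all(
        any(abs(w - m) <= 0.2 for m in mobile_peaks) for w in wearable_peaks
    )
    # Condition 2: arm swing is periodic at the wrist but not at the phone.
    swing_ok = wearable_swing_periodic and not mobile_swing_periodic
    # Condition 3: both devices are close enough for short-range communication.
    nearby = rssi_dbm >= rssi_floor
    return peaks_aligned and swing_ok and nearby
```

Requiring all three conditions makes it harder to inflate the count by, for example, shaking only one of the devices.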
In the present embodiment, the position data is information indicating the position of the user in a global coordinate system or a relative coordinate system. Specifically, in the present embodiment, it is possible to acquire the position data of the user on the basis of the sensing data from the positioning sensor 154. For example, as illustrated in
In the present embodiment, the biological data is information indicating the conditions of the body of the user, such as the pulse rate, heart rate, blood pressure, blood flow rate, respiration, skin temperature, perspiration, brain wave, myoelectric potential, and skin resistance level of the user. Specifically, in the present embodiment, it is possible to perform the recognition of the user's behavior, as the feature, from the biological data of the user based on the sensing data from the biological information sensor 156. For example, as illustrated in
In the present embodiment, the environment data is, for example, information indicating the conditions of the environment around the user that are obtained as an image, sound, and radio wave (specifically, for example, a change in radio field intensity). Specifically, in the present embodiment, it is possible to perform the recognition of the behavior of the user and to calculate the number of steps, the walking distance, and the like of the user, as the features, from the environment data based on the sensing data from the microphone 160, the image sensor 158, and the communication unit 180. For example, in the present embodiment, as illustrated in
Furthermore, in the present embodiment, for example, as illustrated in
Furthermore, in the present embodiment, the movement of the user, that is, walking (running), may be detected as the feature on the basis of a change in the intensity of a radio wave (e.g., WiFi (registered trademark), Bluetooth (registered trademark), etc.) detected by the communication unit 180. Specifically, when the user moves (walks or runs), the intensity of the radio wave from each access point, detected by the communication unit 180 of the wearable device 100 worn by the user, should change. Therefore, in the present embodiment, when the radio field intensity can be detected for each piece of identification information (e.g., SSID) of an access point, a change over a certain period of time in the intensity of the radio wave having the same identification information may be recognized as walking (running). Furthermore, in the present embodiment, an attenuation rate of the radio field intensity can be applied to a predetermined formula to calculate the movement distance of the user, and therefore it is also possible to calculate the movement distance of the user per unit time, that is, a speed. Then, by comparing the calculated speed with a predetermined threshold, it is possible to determine whether the user is walking or running. Furthermore, dividing the distance by the stride of the user (the stride may be changed between walking and running) makes it possible to calculate the number of steps of the user.
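One common "predetermined formula" for converting radio field intensity into distance is the log-distance path-loss model; the sketch below uses it, with an assumed reference intensity and path-loss exponent, to derive speed, a walking/running classification, and a step estimate. The speed threshold and parameter values are illustrative assumptions.

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: distance (m) to the access point
    from the received intensity. Reference intensity at 1 m and the
    path-loss exponent are assumed values."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

def classify_and_count(rssi_start, rssi_end, seconds, stride_m,
                       run_speed_threshold=2.5):
    """Movement distance from the change in distance to one access
    point, then speed, walking/running classification, and a step
    estimate (distance / stride)."""
    moved = abs(rssi_to_distance(rssi_end) - rssi_to_distance(rssi_start))
    speed = moved / seconds
    behavior = "running" if speed > run_speed_threshold else "walking"
    steps = moved / stride_m
    return behavior, steps
```

In practice, intensities from several access points would be combined, since a single radial distance change underestimates movement along other directions.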
Furthermore, in the present embodiment, the user's behavior or walking distance may be detected as the feature, on the basis of the change of an image acquired by the image sensor 158. Specifically, as illustrated in
Note that the present embodiment is not limited to the sensing data acquired by the various sensors described above and the features obtained from the sensing data, and may be, but not particularly limited, sensing data or the features obtained from another sensor.
Next, details of calculation of the reliability according to the present embodiment will be described with reference to
It is considered that the reliability of the recognition of the behavior (walking and running) and the reliability of the distance or the number of steps differ depending on the type of the feature. Therefore, in the present embodiment, upon calculating the reliability, a plurality of features is used, and the weighting (coefficient) applied in obtaining the reliability is changed for each type of feature.
Specifically, in the present embodiment, as illustrated in
Then, in the present embodiment, for the coefficient CoeffHR (the coefficient for the number of steps based on the change in the pulse rate), the coefficient CoeffWiFi (the coefficient for the number of steps based on the change in the radio field intensity), and the coefficient Coeffvideo (the coefficient for the number of steps based on the change in the feature points in the images) of mathematical formula (1), values determined in advance according to the properties of the features can be used. More specifically, for the features related to the distance and the number of steps, a difference from each of the distance and the number of steps obtained from the inertial data is calculated, and a mean value (or variance value) or a normalized value of the differences is multiplied by the corresponding coefficient (coefficient ratio) and the results are added together; the reliability ri can thus be obtained.
Note that, in the present embodiment, the calculation formula of the reliability ri is not limited to the above mathematical formula (1); any formula in which the respective features are weighted (by coefficients) in the calculation may be used.
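The weighted-sum idea behind formula (1) can be sketched as follows. This is a hedged illustration only: the source does not give the actual formula or coefficient values, so the normalization by the inertial step count, the conversion of differences into an "agreement" score, and the coefficient names and values are all assumptions introduced for this sketch.

```python
def reliability(steps_inertial, steps_by_feature, coeffs):
    """Weighted reliability r_i in the spirit of formula (1): for each
    feature-derived step count, compute its normalized deviation from the
    step count obtained from the inertial data, weight the resulting
    agreement by the feature's coefficient, and sum.  Smaller deviations
    yield a reliability closer to 1.0."""
    total = 0.0
    for name, steps in steps_by_feature.items():
        diff = abs(steps - steps_inertial) / max(steps_inertial, 1)
        agreement = max(0.0, 1.0 - diff)      # 1.0 = perfect agreement
        total += coeffs[name] * agreement
    return total / sum(coeffs.values())       # normalize by coefficient sum
```

For example, with coefficients {"hr": 0.3, "wifi": 0.3, "video": 0.4} and feature-derived step counts of 950, 1100, and 1000 against an inertial count of 1000, the reliability is 0.955; a determination unit could then accept the inertially calculated step count when this value exceeds a predetermined threshold.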
Furthermore, in the present embodiment, as illustrated in
Furthermore, in the present embodiment, as illustrated in
Note that, in the present embodiment, the values of the coefficients are not limited to the values illustrated in
As described above, according to the embodiments of the present disclosure, it is possible to prevent the falsification of the number of steps or the like; therefore, the insurance premium is calculated by using a fairly obtained number of steps or the like, providing the insured person with a sense of satisfaction with the insurance premium. As a result, the health of the insured person is enhanced, and an increase in the number of insurance subscribers can be expected. Note that, in the description of the above embodiments, an exemplary application to the prevention of falsification of the number of steps has been described, but the present embodiment is not limited thereto and can also be applied to the prevention of falsification of the movement distance of the user.
Note that, in the above description, although it has been described that the number of steps and the reliability are calculated mainly by the mobile device 200 and the insurance premium is calculated by the server 300, the embodiments of the present disclosure are not limited to such a form. In the embodiments of the present disclosure, for example, the processing from the calculation of the number of steps and the reliability to the calculation of the insurance premium may be performed by one or both of the wearable device 100 and the mobile device 200, and all or part of the processing may be performed by a large number of information processing devices on a cloud.
Note that, as described above, the embodiments of the present disclosure are not limited to application to the information processing for acquiring the number of steps for the health promotion insurance. The embodiments of the present disclosure may be applied to, for example, a service of giving discount points (incentive) that can be used for shopping or the like instead of cash to the user depending on the number of steps, or a service of giving health advice to the user depending on the number of steps.
As illustrated in
The CPU 901 functions as an arithmetic processing unit and a control device, and controls all or some of the operations in the smartphone 900 according to various programs recorded in the ROM 902, the RAM 903, the storage device 904, or the like. The ROM 902 stores programs, calculation parameters, and the like used by the CPU 901. The RAM 903 temporarily stores the programs used in the execution of the CPU 901, parameters that change as appropriate in the execution of the programs, and the like. The CPU 901, the ROM 902, and the RAM 903 are connected to each other by the bus 914. Furthermore, the storage device 904 is a data storage device configured as an example of a storage unit of the smartphone 900. The storage device 904 includes, for example, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or the like. The storage device 904 stores the programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
The communication module 905 is a communication interface that includes, for example, a communication device or the like for connection to a communication network 906. The communication module 905 can be, for example, a communication card or the like for a wired or wireless local area network (LAN), Bluetooth (registered trademark), or a wireless USB (WUSB). Furthermore, the communication module 905 may be a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), a modem for various communications, or the like. The communication module 905 transmits and receives signals and the like to and from the Internet or other communication devices by using a predetermined protocol such as Transmission Control Protocol (TCP)/Internet Protocol (IP). Furthermore, the communication network 906 connected to the communication module 905 is a network connected in a wired or wireless manner, and is, for example, the Internet, a home LAN, infrared communication, satellite communication, or the like.
The sensor module 907 includes various sensors, such as a motion sensor (e.g., acceleration sensor, gyroscope sensor, geomagnetic sensor, etc.), a biological information sensor (e.g., pulse sensor, blood pressure sensor, fingerprint sensor, etc.), and a position sensor (e.g., global navigation satellite system (GNSS) receiver, and the like).
The imaging device 909 is provided on a surface of the smartphone 900 to image a target or the like positioned on the front or back side of the smartphone 900. Specifically, the imaging device 909 is configured to include an imaging element (not illustrated) such as a complementary MOS (CMOS) image sensor, and a signal processing circuit (not illustrated) that performs imaging signal processing on a signal obtained by photoelectric conversion by the imaging element. Furthermore, the imaging device 909 can further include an optical system mechanism (not illustrated) that includes an imaging lens, a zoom lens, a focusing lens, and the like, and a drive system mechanism (not illustrated) that controls the operations of the optical system mechanism. Then, the optical system mechanism collects incident light from the target to form an optical image on the imaging element, the imaging element photoelectrically converts the formed optical image in units of pixels, and the signal processing circuit reads the signal of each pixel as an imaging signal and performs image processing to acquire a captured image.
The display device 910 is provided on a surface of the smartphone 900 and can be a display device such as a liquid crystal display (LCD) or organic electroluminescence (EL) display. The display device 910 is configured to display an operation screen, the captured image acquired by the imaging device 909 described above, and the like.
The speaker 911 is configured to output, for example, a voice call, voice accompanying image content displayed by the display device 910 described above, and the like, to the user.
The microphone 912 is configured to collect, for example, a voice call of the user, a voice including a command for activating a function of the smartphone 900, and sound in a surrounding environment of the smartphone 900.
The input device 913 is a device such as a button, touch screen, or mouse that is operated by the user. The input device 913 includes an input control circuit that generates an input signal on the basis of information input by the user and that outputs the input signal to the CPU 901. The user can operate the input device 913 to input various data to the smartphone 900 or give an instruction for processing operation.
An exemplary hardware configuration of the smartphone 900 has been described above. Note that the hardware configuration of the smartphone 900 is not limited to the configuration illustrated in
Furthermore, the smartphone 900 according to the present embodiment may be applied to a system including a plurality of devices on the premise of connection to a network (or communication between the devices), such as cloud computing. In other words, the mobile device 200 according to the present embodiment described above can also be implemented, for example, as the information processing system 10 that performs the processing related to the information processing method according to the present embodiment by using the plurality of devices.
Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. A person skilled in the art may obviously find various alterations or modifications within the scope of the technical concept described in the claims, and it should be understood that such alterations and modifications will naturally come under the technical scope of the present disclosure.
Note that the embodiments of the present disclosure described above can include, for example, a program for causing a computer to function as the information processing device according to the present embodiment, and a non-transitory tangible medium on which the program is recorded. In addition, the program may be distributed via a communication line (including wireless communication) such as the Internet.
Furthermore, the respective steps in the processing of each embodiment described above may not necessarily be processed in the order described. For example, the order of the processing may be changed as appropriate between the respective steps. In addition, the processing of the respective steps may be performed partially in parallel or individually, instead of being performed in chronological order. Furthermore, the processing method of each step need not necessarily be performed according to the described method, and may be performed, for example, by another method using another functional unit.
Furthermore, the effects described herein are merely illustrative or exemplary effects, and are not limitative. In other words, with or in place of the above effects, the technology according to the present disclosure can provide other effects that are apparent to those skilled in the art from the description herein.
Note that the present technology can also have the following configurations.
(1) An information processing device comprising:
Number | Date | Country | Kind
---|---|---|---
2021-120537 | Jul 2021 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/008880 | 3/2/2022 | WO |