The present disclosure relates to an information processing apparatus, an information processing method, and a program.
Conventionally, a person selectively uses a plurality of characters (personas) that he/she possesses according to various scenes (places, environments, and the like). For example, a person switches his/her attitude, way of thinking, way of speaking, and the like, naturally or consciously, between the time of being in an office and the time of being with a friend, or between the time of being at school and the time of being at home.
In recent information presentation technology, a virtual agent is provided in a system so that the agent provides information desired by a user by voice or image. With regard to such a technology, for example, Patent Document 1 below discloses an information terminal apparatus that enables a user to intuitively understand a change in his/her tastes or hobbies by gradually changing the displayed visual aspects of a character according to the amount of change in the user's characteristics.
However, the related art does not detect or analyze the plurality of characters possessed by a user; it merely changes the visual aspects of an agent according to changes in the user's tastes or hobbies.
Furthermore, in a case where the user is tired or busy, it may be difficult for the user to be aware of, or to change to, the character suitable for the current scene.
Thus, the present disclosure proposes an information processing apparatus, an information processing method, and a program capable of presenting appropriate information so as to bring out a more preferable character of the user.
The present disclosure proposes an information processing apparatus including a control unit that determines a character of a user, determines whether or not it is a timing to change to a predetermined character, and performs control to output a trigger for prompting a change to the predetermined character at the change timing.
The present disclosure proposes an information processing method, by a processor, including determining a character of a user, determining whether or not it is a timing to change to a predetermined character, and performing control to output a trigger for prompting a change to the predetermined character at the change timing.
The present disclosure proposes a program for causing a computer to function as a control unit configured to determine a character of a user, determine whether or not it is a timing to change to a predetermined character, and control to output a trigger for prompting a change to the predetermined character at the change timing.
As described above, according to the present disclosure, it is possible to present appropriate information so as to bring out a more preferable character of the user.
Note that the above-described effect is not necessarily restrictive; together with or in place of the above-described effect, any one of the effects described in this specification, or other effects that can be grasped from this specification, may be exhibited.
Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Note that, in the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description will be omitted.
Furthermore, the description will be given in the following order.
<<1. Overview of Information Processing System According to Embodiment of Present Disclosure>>
Furthermore, in this specification, as the name of a character, for example, a name expressing emotions (“short-tempered character,” “lonely character,” “crybaby character,” “dopey character,” “fastidious character,” etc.), a name expressing a role or position in society (“mommy character,” “daddy character,” “work character,” “clerk character,” “high school girl character,” “celebrity character,” etc.), or a name combining these (“short-tempered mommy character,” “cheerful clerk character,” etc.) is given as an example as appropriate.
As illustrated in
In a case where it is determined that the character change is necessary, the agent V provides the user with information (hereinafter, also referred to as a change trigger) as a trigger for the user's character change. For example, as illustrated in the middle part of
Thus, in the present embodiment, it is possible to prompt the user to change to a more desirable character. Note that, by setting for each character whether or not the character makes the user feel happy (hereinafter, such a character is referred to as a “happy character”) and by activating a change trigger so as to prioritize a change to the happy character, it is possible to lead the user to a happier state as a result.
Here, as an example, the agent determines the necessity of a character change from the user's situation or state and gives an appropriate change trigger; however, the present embodiment is not limited to this, and a change to a character linked to a schedule input in advance by the user may be prompted. As a result, the change trigger can be provided at the timing desired by the user. The user may input in advance, along with the schedule, when, where, and what kind of character the user wants to be.
Furthermore, the virtual agent V in the present system can give a change trigger from a user terminal 1 (refer to
An output method of the change trigger varies according to the functions of the device. For example, sound information (voice, environmental sound, sound effect, etc.), display information (words, agent images, pictures, videos, etc., on a display screen), vibration, smell, and the like are conceivable. For example, in the case of a glasses-type terminal (AR eyewear), a change trigger may be output by text, words, or graphics superimposed on a space or on an object by AR technology. Furthermore, in the case of an ear-mounted headset or an open-air earphone, a change trigger may be output by whispering at the ear, blowing air, applying heat, or the like such that other people do not notice it.
Furthermore, in the case of a smartphone, a tablet terminal, or the like, a change trigger is given by text, words, figures, or characters appearing at an edge or a part of the display screen. Furthermore, in the case of a shoulder-type terminal, basically, as with an ear-mounted headset, it is possible to give a change trigger by whispering at the ear, blowing air, or applying heat such that other people do not notice it, and also by vibration, a shift in center of gravity, a pull on the hair, or the like.
Subsequently, the entire configuration of the information processing system according to the present embodiment will be described with reference to
As illustrated in
The user terminal 1 transmits to the server 2 various types of information regarding the user situation used for character determination and change determination, such as position information and a user's uttered voice.
The server 2 has functions as a virtual agent, such as the determination of the user's character and the activation of a change trigger, on the basis of the information transmitted from the user terminal 1.
Note that, although the system configuration mainly performing processing on the server 2 side (cloud server) is exemplified in the present embodiment, the present disclosure is not limited to this, and a part or all of various processing such as character determination and change trigger activation may be performed on the user terminal 1 side. Furthermore, the processing according to the present embodiment may be performed by a plurality of external devices (distributed processing), or some processing may be performed by an edge server (edge computing).
The information processing system according to an embodiment of the present disclosure has been described above. Subsequently, specific configurations of respective devices included in the information processing system according to the present embodiment will be described with reference to the drawings.
<<2. Configuration>>
<2-1. Configuration of User Terminal 1>
The control unit 10 functions as an arithmetic processing device and a control device and controls the overall operation in the user terminal 1 according to various programs. The control unit 10 is realized by, for example, an electronic circuit such as a central processing unit (CPU) or a microprocessor. Furthermore, the control unit 10 may include a read only memory (ROM) that stores programs to be used, operation parameters, and the like, and a random access memory (RAM) that temporarily stores parameters and the like that change appropriately.
Furthermore, the control unit 10 according to the present embodiment performs control such that voice information input by the voice input unit 13 and various sensor information detected by the sensor 14 are transmitted from the communication unit 11 to the server 2. Furthermore, the control unit 10 controls the display unit 15 or the voice output unit 16 to output a change trigger received from the server 2 via the communication unit 11.
The communication unit 11 is connected to the network 3 by wire or wirelessly, and transmits/receives data to/from an external device (for example, a peripheral device, a router, a base station, the server 2 or the like). The communication unit 11 communicates with external devices by, for example, a wired/wireless local area network (LAN), Wi-Fi (registered trademark), a mobile communication network (long term evolution (LTE), third generation mobile communication system (3G)) or the like.
The operation input unit 12 receives an operation instruction from a user and outputs operation contents of the instruction to the control unit 10. The operation input unit 12 may be a touch sensor, a pressure sensor, or a proximity sensor. Alternatively, the operation input unit 12 may have a physical configuration such as a button, a switch, or a lever.
The voice input unit 13 is realized by a microphone, a microphone amplifier unit that amplifies a voice signal obtained by the microphone, and an A/D converter that converts the voice signal into a digital signal, and the voice input unit 13 outputs the resulting voice signal to the control unit 10.
The sensor 14 detects a user's situation, state, or surrounding environment, and outputs detection information to the control unit 10. The sensor 14 may be a plurality of sensor groups or a plurality of types of sensors. Examples of the sensor 14 include a motion sensor (acceleration sensor, gyro sensor, geomagnetic sensor, etc.), a position sensor (indoor positioning based on communication with Wi-Fi (registered trademark), Bluetooth (registered trademark), etc., or outdoor positioning using GPS etc.), a biological sensor (heartbeat sensor, pulse sensor, sweat sensor, body temperature sensor, electroencephalogram sensor, myoelectric sensor, etc.), an imaging sensor (camera), and an environment sensor (temperature sensor, humidity sensor, luminance sensor, rain sensor, etc.).
The display unit 15 is a display device that outputs an operation screen, a menu screen, and the like. The display unit 15 may be, for example, a display device such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display. Furthermore, the display unit 15 according to the present embodiment can output a user questionnaire for character determination as described later and a video as a change trigger under the control of the control unit 10.
The voice output unit 16 has a speaker for reproducing a voice signal and an amplifier circuit for the speaker. The voice output unit 16 according to the present embodiment outputs a change trigger such as voice of an agent or music under the control of the control unit 10.
The storage unit 17 is realized by a read only memory (ROM) that stores a program used for processing of the control unit 10, calculation parameters, and the like, and a random access memory (RAM) that temporarily stores parameters and the like that change appropriately.
The configuration of the user terminal 1 according to the present embodiment has been specifically described above. Note that the configuration of the user terminal 1 is not limited to the example illustrated in
Furthermore, at least a part of the configuration illustrated in
<2-2. Configuration of Server 2>
(Control Unit 20)
The control unit 20 functions as an arithmetic processing device and a control device and controls the overall operation in the server 2 according to various programs. The control unit 20 is realized by, for example, an electronic circuit such as a central processing unit (CPU) or a microprocessor. Furthermore, the control unit 20 may include a read only memory (ROM) that stores programs to be used, operation parameters, and the like, and a random access memory (RAM) that temporarily stores parameters and the like that change appropriately.
Furthermore, the control unit 20 according to the present embodiment also functions as a user situation/action recognition unit 201, a character determination unit 202, a change trigger output control unit 203, and a user information management unit 204.
The user situation/action recognition unit 201 recognizes (and analyzes) the user's situation, the surrounding situation (peripheral environment), and the user's action on the basis of sensor information and voice information transmitted from the user terminal 1. Furthermore, the user situation/action recognition unit 201 can also perform action recognition on the basis of a schedule registered in advance by the user and contents posted to a social network service (text, images, position information, who the user is with).
The character determination unit 202 determines the characters possessed by the user and the character currently appearing. For example, the character determination unit 202 makes the determination on the basis of answers to a predetermined questionnaire input by the user, a posting history to a social network service, a schedule history, and an action history based on sensor information. In this case, the character determination unit 202 may perform the character determination with reference to character determination rules registered in advance in the character information storage unit 22a, or may learn the appearance status of the user's characters by machine learning.
The change trigger output control unit 203 determines whether or not to change the current character of the user and performs control to output information serving as a trigger for changing the character. Examples of the change trigger include information presentation or a voice call using the agent's voice, other voices, music, videos, pictures, the user's past posting history to a social network service, smells, and the like.
The user information management unit 204 registers various types of information related to the user, such as characters possessed by the user, appearance patterns of respective characters, and action history of the user, in the user information storage unit 22b and manages them.
(Communication Unit 21)
The communication unit 21 transmits and receives data to and from an external device by wire or wirelessly. The communication unit 21 communicates with the user terminal 1 via the network 3 by, for example, a wired/wireless local area network (LAN), wireless fidelity (Wi-Fi, registered trademark) or the like.
(Character Information Storage Unit 22a)
The character information storage unit 22a stores various information related to characters. For example, the character information storage unit 22a stores character determination rules. Examples of the rules include a rule of determining “work character” or “school character” by determining that the user is at the office or at school (which can be estimated from the age of the user) in a case where the user stays at the same place relatively regularly during the daytime on weekdays, and a rule of determining “mommy character” by estimating that the user is at home in a case where the user stays at the same place relatively regularly at night, also taking into account the family structure and the like of the user. Furthermore, there are a rule of determining “character in girls' association” in a case where a conversation is lively in a situation where the user is with friends at a restaurant, identified with reference to map information, and a rule of determining “relaxed character” in a case where it is recognized from biological information and the like that the user is relaxed. Furthermore, the character information storage unit 22a also stores information (name, features, change trigger information, etc.) of characters created on the system side.
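For illustration only, the following is a minimal Python sketch of how such determination rules could be expressed; the `Situation` fields, time thresholds, and rule details are assumptions made for the example, not the actual rules stored in the character information storage unit 22a.

```python
from dataclasses import dataclass

@dataclass
class Situation:
    """Observed context assembled from position, schedule, and sensor information."""
    is_weekday: bool
    hour: int                     # 0-23, local time
    place: str                    # e.g., "office", "school", "home", "restaurant"
    with_friends: bool = False
    conversation_lively: bool = False
    relaxed: bool = False         # e.g., estimated from biological information

def determine_character(s: Situation) -> str:
    """Applies the registered rules in order and returns a character name."""
    if s.is_weekday and 9 <= s.hour < 18 and s.place in ("office", "school"):
        return "work character" if s.place == "office" else "school character"
    if s.hour >= 19 and s.place == "home":
        return "mommy character"
    if s.place == "restaurant" and s.with_friends and s.conversation_lively:
        return "character in girls' association"
    if s.relaxed:
        return "relaxed character"
    return "unknown"
```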
(User Information Storage Unit 22b)
The user information storage unit 22b stores various types of information related to the user, such as characters possessed by the user, appearance patterns of respective characters, and action history of the user. Furthermore, change trigger information for each character of the user may also be stored.
The configuration of the server 2 according to the present embodiment has been specifically described above. Note that the configuration of the server 2 illustrated in
<<3. Operation Processing>>
Subsequently, operation processing of the information processing system according to the present embodiment will be specifically described using the drawings.
<3-1. Character Determination Processing>
First, character determination processing will be described with reference to
As illustrated in
Furthermore, the server 2 acquires the user's past posting history to a social network service (comments, images, exchanges with friends), schedule history, position information history, action history, and the like, and analyzes what kind of behavior and remarks the user makes, and what kind of emotions the user has, in what kind of situations.
In this way, one or more characters (main characters) possessed by the user are determined on the basis of the analysis results of the attribute information, questionnaire response information, posting history, schedule history, action history, and the like acquired as initial settings. As an example, it is determined that one user has four characters (personalities): “lively character” (who is always cheerful, loves festivals, and appears at girls' associations when going back to his/her hometown), “work character” (who emphasizes harmony within the organization and does not express his/her feelings while keeping a hardworking image), “dark-natured character” (who has dark and negative feelings, cannot express his/her feelings, and is tired and unmotivated), and “shrewd character” (who manipulates the people around him/her and puts priority on his/her own benefit).
The determined character information of the user is accumulated in the user information storage unit 22b. Furthermore, the character determination unit 202 may set, among the characters of the user, a character that makes the user feel happy or have fun as a happy character. Here,
As illustrated in
Next, after the server 2 grasps the characters to a certain degree at the initial setting, the server 2 continuously accumulates the daily action history and the like of the user (for example, every five minutes) (step S106). The daily action history and the like include, for example, daily conversations of the user acquired by the user terminal 1, position information (transit history), action history (when, where, and what action (walking, running, sitting, riding a train, and the like) is taken), music the user listened to, the environmental sounds of the city where the user walked, scheduler input information, posts to a social network, and the like, and these are accumulated in the user information storage unit 22b.
Then, in a case where such information has been accumulated for a predetermined period (for example, one week) (step S109/Yes), the character determination unit 202 learns the characters corresponding to the user's daily action patterns on the basis of the accumulated information (step S112). Accumulation and learning are repeated periodically to improve the accuracy of the character information.
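As a minimal sketch of this learning step, the following Python example counts which character was observed in each (weekday, hour, place) slot of the accumulated history and predicts the most frequent one; the slot structure and field names are illustrative assumptions, not the disclosed learning method (which may, for example, use machine learning as noted above).

```python
from collections import Counter, defaultdict

class CharacterPatternLearner:
    """Learns which character tends to appear in each (weekday, hour, place)
    slot from the action history accumulated over the predetermined period."""

    def __init__(self):
        self.slots = defaultdict(Counter)

    def accumulate(self, is_weekday, hour, place, character):
        # One record (e.g., every five minutes): where the user was and which
        # character was determined to be appearing at that moment.
        self.slots[(is_weekday, hour, place)][character] += 1

    def predict(self, is_weekday, hour, place):
        # Return the character most frequently observed in this slot, if any.
        counter = self.slots.get((is_weekday, hour, place))
        return counter.most_common(1)[0][0] if counter else None
```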
Here,
The character determination processing has been specifically described above. Note that it is also possible to notify the user of the character determination result and have the user correct it. Furthermore, the above questionnaire may be periodically performed to correct and update the character information. Furthermore, the user may register by him/herself that the current character is the “XX character,” and may register a character for each time zone using a scheduler.
Furthermore, although the time and the place have mainly been described as the situations in which the user uses different characters, the present embodiment is not limited to this. For example, in a case where the user uses a plurality of social network services, different characters may be used for each service. Therefore, it is also possible to determine which character appears in which social network service and register this as character information.
<3-2. Change Trigger Output Processing>
Subsequently, how to output a change trigger for changing a character of the user will be described with reference to
As illustrated in
Next, the change trigger output control unit 203 determines whether or not a character change is necessary (step S126). For example, in a case where the user is in the office and in the “work character” during a time zone in which the user is usually in the “mommy character” (refer to
Then, in a case where it is determined that the character change is necessary (step S126/Yes), the change trigger output control unit 203 performs control to output a change trigger, that is, a trigger for changing to the happy character (step S129). The change trigger is, for example, provision of information for prompting a change in action, and a proposal by the agent (for example, a proposal that prompts the user at least to leave the “office,” such as “Why don't you buy some sweets on the way home?”), an environmental sound related to the happy character (for example, a sound that evokes the environment of the “mommy character,” such as a child's voice), a video (for example, a video that evokes the environment of the “mommy character,” such as a picture of a child), a smell (for example, a smell that evokes the environment of the “mommy character,” such as the smell of the house), and the like are assumed.
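The following Python sketch illustrates, under assumed inputs, the shape of the decision in steps S126 and S129: compare the appearing character with the character expected for the current time zone, then select the highest-priority trigger registered for the happy character. The trigger table contents are hypothetical examples, not values defined in the present disclosure.

```python
def needs_character_change(current_character, expected_character,
                           stays_past_schedule=False):
    """Decision of step S126: a change is deemed necessary when the appearing
    character differs from the character expected for the current time zone
    and place, or when the user stays past a scheduled character switch."""
    return current_character != expected_character or stays_past_schedule

def select_change_trigger(happy_character, trigger_table):
    """Step S129: pick the highest-priority trigger registered for the happy
    character (agent proposal, environmental sound, video, smell, ...)."""
    candidates = trigger_table.get(happy_character, [])
    return candidates[0] if candidates else None

# Hypothetical registration of triggers for the "mommy character".
triggers = {"mommy character": ["agent proposal: buy sweets on the way home",
                                "environmental sound: child's voice",
                                "video: picture of a child"]}
if needs_character_change("work character", "mommy character"):
    print(select_change_trigger("mommy character", triggers))
```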
The output control of a change trigger according to the present embodiment has been described above. Note that, in the example described above, the necessity of a character change is automatically determined from the situation of the user, but the present embodiment is not limited to this, and the necessity of the character change may be determined on the basis of a schedule input in advance by the user. For example, in a case where work is scheduled until 17:00 and the time for the mommy character is scheduled from 18:00, if the user is still at the office after 18:00 and remains in the work character, it may be determined that a character change is necessary.
This allows the user to appropriately control his/her character with the assistance of the agent, and to unconsciously or deliberately take actions that establish beneficial, fortunate, or favorable relationships for the user.
For example, suppose that a female user A in her twenties who works at a company walks out of the office after 22:00 on a Friday, having finished a meeting that lasted until 21:00. A wearable device (user terminal 1) worn by the user A recognizes, on the basis of her gait and sighing, biological information, this week's schedule information, web history, and the like, that the user A has finished a busy week at work, is walking to the station with heavy footsteps without looking at the shopping website that she always checks, and is changing from the “work character” to the “dark-natured character.”
Note that, as an example, each function of the server 2 illustrated in
As described above, since the user A has changed to the “dark-natured character,” the user terminal 1 searches for a happy character possessed by the user A. For example, in a case where the user A's “character who loves hometown” (a character who has a strong love for her hometown and feels at ease with her friends (childhood friends) there) is set as a happy character, the user terminal 1 searches for history information from times when the user A returned to her hometown and became the “character who loves hometown” (voices and laughter of friends, photos, videos, posting history, etc., from a fun drinking party with hometown friends), and provides the user with the information as a change trigger. For example, in a case where the user terminal 1 is an open-air earphone worn on the ears of the user A, the voices and laughter of friends recorded at a fun drinking party with hometown friends may be mixed with the surrounding environmental sounds (noise) and output so as to be faintly audible.
Note that, in a case where the user terminal 1 is a glasses-type terminal, a slide show of photos taken when the user returned to her hometown may be shown within a range that does not disturb the user's view. In the case of a smartphone, remarks of friends at a drinking party in her hometown may be displayed faintly in speech bubbles, or posts from that time may be displayed at the edge of the display screen.
In this manner, the user A can recall the time when she had fun and was energetic, and the user terminal prompts her to change to a happy character by herself.
As a result, the user A changes from the “dark-natured character” to a bright and energetic character and, for example, as a voluntary action, makes a reservation for yoga while thinking, “I will get up early tomorrow and do the morning yoga I have been wanting to do.” In a case where the morning yoga schedule appears in the scheduler, the user terminal 1 can estimate that the character change was successful (effective).
In addition, suppose the user A changes from the “dark-natured character” to the “character who loves hometown” and takes out her smartphone to call or send a message to a hometown friend. In a case where the friend is contacted or a schedule to meet the friend appears, the user terminal 1 can likewise estimate that the character change was successful (effective).
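A minimal sketch of this success estimation is shown below; the event kinds are hypothetical labels assumed for the example, standing in for whatever the user terminal 1 actually observes (scheduler entries, contact with a friend, and the like).

```python
def estimate_change_success(events):
    """Estimates that a character change was effective, as in the examples
    above: a follow-up action appears, such as a new scheduler entry (morning
    yoga) or contact with / a plan to meet a friend. `events` is a hypothetical
    list of (kind, detail) records observed after the change trigger output."""
    positive_kinds = {"schedule_added", "friend_contacted", "meetup_scheduled"}
    return any(kind in positive_kinds for kind, _ in events)

# Example: the yoga reservation observed after the trigger counts as success.
print(estimate_change_success([("schedule_added", "morning yoga")]))  # True
```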
<3-3. Parameter Correction Processing>
Subsequently, parameter correction processing will be described with reference to
(Correction of Priority of Change Trigger)
Note that, in the present embodiment, in a case where there is a plurality of change triggers for a certain character, their priorities may be set in advance as defaults. The default priorities may be random, or may be arranged by estimating the priorities from tendencies in the user's past history. For example, a case will be described where there are change triggers whose priorities are set as follows, and the change triggers are output in order from the top.
After outputting the change trigger of the highest-priority method, the control unit 20 of the server 2 determines whether or not the character change was successful (step S212). For example, in the case of outputting a change trigger for prompting a change to the happy character, the control unit 20 can determine whether or not the character change was successful on the basis of whether or not a happy (cheerful, positive) change in behavior occurs, such as the user's sighing decreasing, his/her footsteps becoming lighter, the user smiling, a schedule for meeting or going out with someone being input, or the user contacting a friend or lover. Furthermore, it may also be determined that the character change was successful in a case where there is a change away from a situation (place, environment) that the user wants to leave, such as leaving the office, even before the change to the happy character is complete.
Next, in a case where the character change is not successful (step S212/No), the change trigger output control unit 203 changes to the change trigger with the next highest priority (step S215), returns to step S209, and outputs the change trigger (step S209).
Next, in a case where the character change is successful (step S212/Yes), the priority of the change trigger is corrected (step S218). In other words, the change trigger (method, content) with which the character change was successful is learned.
For example, in a case where there is no change in the user with the “agent's voice,” a change trigger is given with “music,” and in a case where this is successful, the priority of the change triggers is changed as indicated in Table 2 below.
Furthermore, in a case where there is no change in the user with “music” either, a change trigger is given with the next method, “video,” and in a case where this is successful, the priority of the change triggers is changed as indicated in Table 3 below.
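The reordering described above amounts to a simple learning loop: try the triggers in priority order, and promote the method that produced a change. The following Python sketch shows that loop under assumed callback interfaces; it is an illustration, not the disclosed implementation.

```python
def deliver_with_learning(triggers, output, change_succeeded):
    """Outputs change triggers in priority order (step S209); when one succeeds
    (step S212/Yes), moves it to the top of the list (step S218), mirroring the
    reordering of Tables 2 and 3. `output` and `change_succeeded` are assumed
    callbacks standing in for the user terminal and the sensor-based check."""
    for i, trigger in enumerate(triggers):
        output(trigger)
        if change_succeeded():
            triggers.insert(0, triggers.pop(i))  # learn the successful method
            return trigger
    return None  # no trigger produced a character change this time
```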
(Correction of Parameters by User)
For example, in a case where the user is in the work character outside the office despite a time zone in which the user is usually in the mommy character, for example at 20:00 on a weekday (this can be determined from word usage, conversation contents, fatigue level, smile level, etc.), the user terminal 1 presents a change trigger toward the happy character. However, in a case where it is actually correct to be the work character because the user is eating and drinking with a superior or a business partner, the user requests parameter correction via the operation input unit 12. The correction contents of the parameters may be manually input by the user, or may be automatically input by recognizing the situation on the user terminal 1 side. For example, time zones, places, and people around the user are recognized (recognizable by a camera, a voiceprint, speech contents, schedule information), and the parameters are corrected. Here,
(Supplementing Characters)
Furthermore, the user can supplement (add) characters that he/she wants to become, either for a fee or free of charge. The obtained character information is accumulated in the user information storage unit 22b together with the main characters.
For example, in a case where a celebrity character (happy character) is obtained, the following change trigger is supplemented as celebrity character information.
The activation timing (parameters such as time and place) of the celebrity character may be set by the user (for example, input to a linked scheduler). Furthermore, a recommended setting may be used such that the system makes the determination appropriately.
<3-4. Processing of Outputting Change Trigger to Minor Character>
Subsequently, change trigger output processing for a rare character that is rarely detected in everyday action patterns will be described. As a result of the character determination at the initial setting, minor characters (characters that appear only under certain situations, or characters that are normally suppressed intentionally) can be extracted in addition to the user's main characters. Such minor characters are grasped from the initial questionnaire or past history; however, in a case where the frequency of appearance is, for example, once in three months, they are not learned as characters corresponding to a daily action pattern, and in a case where the user intentionally suppresses the characters, there is a high possibility that the characters are not registered in a linked scheduler.
However, it is also possible for the system to determine that it is better to change to such a minor character according to the user situation, and to provide a change trigger to such a character.
For example, among the minor characters of the user A, there is a proactive female character (in particular, a character who takes active actions in her love life) that rarely appears. For example, as illustrated in
However, at present, ten years after marriage, this character rarely appears in the user A. When the user A relaxes at home with her husband, there is almost no conversation between the husband, who was originally a man of few words, and the user A, who is absorbed in reading, and communication is lacking. The user A, who wants to communicate more with her husband, go on dates, or spend happy times with him, sets the system to output a change trigger that prompts a change to the proactive female character.
Specifically, for example, when the user A is at home with her husband, the change trigger may be activated when there is no conversation between the couple for more than five minutes, or in a situation where laughter or a smile has not been detected for a certain period of time. Hereinafter, this will be specifically described with reference to
Next, the change trigger output control unit 203 of the server 2 determines the timing of a change to the minor character on the basis of the voice information and sensor information acquired by the user terminal 1, and outputs a change trigger at a timing satisfying the condition (step S309). The change trigger to the minor character may be, for example, as indicated in Table 5 below, but can be changed as appropriate.
Next, in a case where the character change fails (step S312/No), the change trigger is changed to the next highest priority change trigger (step S318).
Next, in a case where the character change is successful (step S312/Yes), the priority of the change trigger is corrected with the successful content (step S315).
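A minimal sketch of the activation-condition check used in the change timing determination is given below, assuming the five-minute silence condition set by the user A; the 30-minute laughter window and the function interface are assumptions made for illustration.

```python
from datetime import datetime, timedelta

def minor_character_timing(last_utterance_at, last_laughter_at, now=None,
                           silence_limit=timedelta(minutes=5),
                           laughter_limit=timedelta(minutes=30)):
    """Checks the user-set activation condition (step S309): no conversation
    between the couple for more than five minutes, or no laughter or smile
    detected for a certain period (the 30-minute window is an assumed value)."""
    now = now or datetime.now()
    return (now - last_utterance_at > silence_limit
            or now - last_laughter_at > laughter_limit)
```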
<3-5. Change Trigger Output Processing Among a Plurality of Agents>
Subsequently, change trigger output processing among a plurality of agents will be described with reference to
Between members of a predetermined group formed after mutual approval (for example, lovers, a specific friend group, a family, etc.), the character information of each user is shared between agents, and one agent can request a character change from the other agent at the optimal timing.
As illustrated in
Next, the agent Vb reminds the user B, who seems to be bored, of the user A by showing a picture from a date with the user A or playing a voice recording from the date, so as to prompt the user B to change to a date character.
If the character change is successful, the user B is expected to contact the user A. Since the user A receives contact from the user B just when she feels lonely about not hearing from him, the user A feels as if it were telepathy and feels happy.
Although the outline has been described above, the agent Va and the agent Vb are virtual, and the operation of each agent can be performed in the server 2 and each user terminal 1. Furthermore, in a case where an application for realizing the function of the server 2 illustrated in
Subsequently, the change trigger output processing among a plurality of agents will be described with reference to
As illustrated in
Next, the user terminal 1b of the user B outputs a change trigger so as to change the user B to a date character who contacts the user A (the contact thereby serving as a change trigger for changing the user A to a happy character) (step S412).
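For illustration, the following Python sketch models this inter-agent flow under assumed class and method names (none of which are APIs defined in the present disclosure): character information is consulted only within a mutually approved group, and one agent asks the other to output a change trigger.

```python
class Agent:
    """Minimal stand-in for a virtual agent; this structure is an assumption
    for the example, not the implementation of the present disclosure."""

    def __init__(self, user_id):
        self.user_id = user_id
        self.approved_members = set()  # group formed after mutual approval
        self.current_character = None

    def request_change(self, other, target_character):
        # Character information is shared only within the approved group.
        if other.user_id in self.approved_members:
            other.output_change_trigger(target_character)

    def output_change_trigger(self, target_character):
        print(f"[{self.user_id}] outputting change trigger toward '{target_character}'")

# The agent Va notices that the user A is in a lonely character and asks the
# agent Vb to change the user B to a date character who will contact the user A.
va, vb = Agent("user A"), Agent("user B")
va.approved_members.add("user B")
va.current_character = "lonely character"
if va.current_character == "lonely character":
    va.request_change(vb, "date character")
```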
<3-6. Advertising>
Furthermore, in the present embodiment, it is also possible to present an advertisement according to the character of the user. Depending on the character, the sense of money, the items to be purchased, and the services desired to be used may differ, so it is possible to present the user with an optimal advertisement in accordance with the character.
Furthermore, on the basis of the user's various histories (time, place, companions, purchase history, etc.) when each character is appearing, the user's preferences and tendencies at the time each character appears are analyzed, and an advertisement matching the character can also be provided. Advertisements are provided by the user terminal 1 in the form of images, voice, and the like.
Furthermore, the timing of the advertisement provision may be in accordance with the current character, or may be in accordance with the character expected to appear next.
Furthermore, when a character managing the family budget, such as the mommy character, is appearing, advertisements for other characters may be presented together, and advertisements may be presented intensively when the user is in a neutral character, for example, during vehicle travel, when the travel time is longest.
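As a sketch of such character-dependent advertisement selection, the following Python example maps characters to advertisement categories and optionally includes advertisements for the character expected to appear next; the category contents are invented for illustration.

```python
# Assumed mapping from a character to advertisement categories (illustrative).
AD_CATEGORIES = {
    "work character": ["business books", "career services"],
    "mommy character": ["household goods", "family travel"],
    "lively character": ["event tickets", "fashion"],
}

def select_advertisements(current_character, next_expected_character=None):
    """Returns advertisements matching the appearing character; advertisements
    for the character expected to appear next may be included as well."""
    ads = list(AD_CATEGORIES.get(current_character, []))
    if next_expected_character:
        ads += AD_CATEGORIES.get(next_expected_character, [])
    return ads

print(select_advertisements("mommy character", next_expected_character="work character"))
```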
<3-7. Guidance to Potential Character the User Wants to Be>
In the present embodiment, it is also possible to determine a potential character that the user wants to be on the basis of past murmurs, contents posted to a social network service, and the like, and to provide an opportunity to change to such a character.
In such a case, as indicated in the upper part of
In this way, the system can also give a change trigger to a potential character that the user has forgotten.
<<5. Summary>>
As described above, with the information processing system according to the embodiment of the present disclosure, it is possible to present appropriate information so as to bring out a more preferable character of the user.
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the present technology is not limited to such examples. It is obvious that persons having ordinary knowledge in the technical field of the present disclosure can conceive various modifications or corrections within the scope of the technical idea described in the claims, and it is naturally understood that such modifications or corrections belong to the technical scope of the present disclosure.
For example, a computer program for causing hardware such as the CPU, ROM, and RAM built into the user terminal 1 or the server 2 described above to exhibit the functions of the user terminal 1 or the server 2 can also be created. Furthermore, a computer-readable storage medium storing the computer program is also provided.
Furthermore, the effects described in the present specification are merely illustrative or exemplary, and not limiting. That is, the technology according to the present disclosure can exhibit other effects apparent to those skilled in the art from the description of the present specification, in addition to or instead of the effects described above.
Note that the present technology can also have the following configurations.
(1)
An information processing apparatus including a control unit configured to:
(2)
The information processing apparatus according to (1), in which the control unit refers to information regarding one or more characters possessed by the user, and determines a character of the user according to a current time, place, or environment.
(3)
The information processing apparatus according to (1) or (2), in which the control unit refers to information regarding one or more characters possessed by the user, and determines a character of the user on the basis of at least one of voice information, action recognition, or biological information.
(4)
The information processing apparatus according to any one of (1) to (3), in which the control unit determines whether or not it is a timing to change to the predetermined character on the basis of at least one of time, place, environment, voice information, action recognition, or biological information.
(5)
The information processing apparatus according to any one of (1) to (4),
(6)
The information processing apparatus according to any one of (1) to (5),
(7)
The information processing apparatus according to any one of (1) to (6),
(8)
The information processing apparatus according to any one of (1) to (7),
(9)
The information processing apparatus according to any one of (1) to (8),
(10)
The information processing apparatus according to any one of (1) to (9),
(11)
The information processing apparatus according to any one of (1) to (10),
(12)
An information processing method, by a processor, including:
(13)
A program for causing a computer to function as a control unit configured to:
Number | Date | Country | Kind |
---|---|---|---|
2017-071508 | Mar 2017 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/002215 | 1/25/2018 | WO | 00 |