The present disclosure relates to an information processing apparatus and an information processing method.
There has been known a guide system that gives a facility user (hereinafter, appropriately referred to as “user”) a guide regarding exhibits and the interior of a facility by using voice information and image information in an exhibit appreciation facility such as an art gallery or a museum.
Such a guide system includes a device that automatically reproduces a voice guide regarding an exhibit in accordance with a position of the user with respect to the exhibit (e.g., see Patent Literature 1).
The above-described conventional technique, however, has room for further improvement in giving a guide in accordance with the state and preference of the user.
Specifically, when the above-described conventional technique is used, a voice guide for an exhibit may be automatically reproduced even when the user is close to the exhibit but is not actually appreciating the exhibit.
In this regard, for example, automatic reproduction can be limited to a case where the user is in an appreciation state by considering, for example, an orientation of the face and a line of sight of the user. Note, however, that, in this case, only the same explanation can generally be given to users having different appreciation styles and preferences, or redundant explanations may be given for an exhibit in which the user is not interested.
Thus, the present disclosure proposes an information processing apparatus and an information processing method capable of giving a guide in accordance with the state and preference of the user.
In order to solve the above problems, one aspect of an information processing apparatus according to the present disclosure includes: an acquisition unit that acquires a state of a user who appreciates an appreciated object; an estimation unit that estimates an appreciation pattern, which is a type of an appreciation style of the user, based on the state of the user acquired by the acquisition unit; a determination unit that determines a mode of a guide regarding the appreciated object presented to the user in accordance with the appreciation pattern estimated by the estimation unit; and a guide control unit that executes guide control in which control is performed such that the guide is presented to the user in accordance with the mode of the guide determined by the determination unit.
An embodiment of the present disclosure will be described in detail below with reference to the drawings. Note that, in the following embodiment, the same reference signs are attached to the same parts to omit duplicate description.
Furthermore, in the present specification and the drawings, a plurality of components having substantially the same functional configuration may be distinguished by attaching hyphenated different numbers after the same reference signs. For example, a plurality of configurations having substantially the same functional configuration is distinguished as a guide terminal 30-1 and a guide terminal 30-2, as necessary. Note, however, that, when it is unnecessary to particularly distinguish a plurality of components having substantially the same functional configuration, only the same reference signs are attached. For example, when it is unnecessary to particularly distinguish the guide terminal 30-1 and the guide terminal 30-2, the guide terminal 30-1 and the guide terminal 30-2 are simply referred to as guide terminals 30.
Furthermore, the present disclosure will be described in accordance with the following item order.
First, an outline of an information processing method according to the embodiment of the present disclosure will be described.
A schematic configuration of the guide terminal 30 according to the embodiment of the present disclosure will be described. The guide terminal 30 is a terminal device that is lent to a user, for example, at the time of entrance to an exhibit appreciation facility (hereinafter, simply referred to as “facility”) such as an art gallery and a museum and carried by the user in the facility.
Furthermore, the guide terminal 30 presents guide information on an exhibit P approached by the user, floor guidance, and the like to the user by using voice information and image information. The guide information on the exhibit P is, for example, work explanation. The guide information on floor guidance includes a place of each exhibition area and a movement route.
As illustrated in
The voice output unit 34 is, for example, a speaker, and outputs guide information using voice as a user interface. Note that the voice output unit 34 may be earphones, headphones, or the like.
The display unit 35 is, for example, a display, and outputs guide information based on an image. The guide information based on an image can be presented to the user by using, for example, text, a symbol, or an avatar Avr in
Furthermore, the guide terminal 30 may be a portable terminal such as a smartphone owned by the user himself/herself. In such a case, the user installs and operates a dedicated guide application, for example, before or at the time of entering the facility to cause the portable terminal of the user himself/herself to function as the guide terminal 30 in the facility.
Furthermore, the guide terminal 30 may be a wearable terminal. In such a case, the guide terminal 30 is, for example, augmented reality (AR) glasses, and the user wears the AR glasses in the facility.
By the way, an existing technique using a terminal device such as the guide terminal 30 has room for further improvement in giving a guide in accordance with the state and preference of the user.
Specifically, in the existing technique in which a voice guide regarding the exhibit P approached by the user is automatically reproduced, the voice guide may be reproduced even when the user is close to the exhibit P but is not actually appreciating the exhibit P.
In this regard, for example, automatic reproduction can be limited to a case where the user is in an appreciation state by considering, for example, an orientation of the face and a line of sight of the user. Note, however, that, in this case, only the same explanation can generally be given to users having different appreciation styles and preferences, or redundant explanations may be given for an exhibit in which the user is not interested.
Thus, in the information processing method according to the embodiment, an information processing system 1 executes guide control. In the guide control, the information processing system 1 acquires the state of a user who appreciates the exhibit P, estimates an appreciation pattern, which is a type of an appreciation style of the user, based on the acquired state of the user, determines a mode of a guide related to the exhibit P to be presented to the user in accordance with the estimated appreciation pattern, and performs control such that a guide is presented to the user in accordance with the determined mode of the guide. As illustrated in
Specifically, in the information processing method according to the embodiment, the server device 50 first appropriately acquires the state of the user from the in-facility sensor unit 10 and the guide terminal 30. The state of the user includes behaviors of appreciation actions of the user, which represent the appreciation style and preference of the user. Then, the server device 50 estimates the appreciation pattern of the user based on the acquired state of the user (Step S1).
The “appreciation patterns” here are types of the appreciation style of the user. In the embodiment of the present disclosure, as illustrated in
Then, the server device 50 determines the mode of the guide to the user in accordance with the estimated appreciation pattern (Step S2). Then, the server device 50 executes guide control on each guide terminal 30 in accordance with the determined mode of the guide (Step S3). The “guide control” here includes control of what is used as a user interface for presenting the guide information, in other words, modal presentation control.
Then, the guide terminal 30 gives a guide to the user by performing modal presentation in accordance with an instruction of the guide control (Step S4).
As described above, in the information processing method according to the embodiment, guide control is executed. In the guide control, the state of a user who appreciates the exhibit P is acquired, an appreciation pattern, which is a type of an appreciation style of the user, is estimated based on the acquired state of the user, a mode of a guide related to the exhibit P to be presented to the user is determined in accordance with the estimated appreciation pattern, and control is performed such that a guide is presented to the user in accordance with the determined mode of the guide.
Therefore, according to the information processing method of the embodiment, a guide can be given in accordance with the state and preference of the user.
A configuration example of the information processing system 1 to which the information processing method according to the above-described embodiment is applied will be more specifically described below.
In other words, components in
Furthermore, in description with reference to
As described above, as illustrated in
The in-facility sensor unit 10 is a group of various sensors provided at various places in a facility. As illustrated in
The description returns to
The guide terminal 30 appropriately transmits the state of the user to the server device 50, and gives a guide to the user based on an instruction related to the guide control from the server device 50 in the mode of the guide in accordance with the instruction.
The server device 50 is implemented as, for example, a cloud server, and appropriately acquires the state of the user from the in-facility sensor unit 10 and the guide terminal 30. Then, the server device 50 estimates the appreciation pattern of the user based on the acquired state of the user, and determines the mode of the guide for each user in accordance with the estimated appreciation pattern. Then, the server device 50 executes guide control on the guide terminal 30 in accordance with the determined mode of the guide.
Next,
As illustrated in
The in-terminal sensor unit 32 is a group of various sensors provided in the guide terminal 30. As illustrated in
The description returns to
The storage unit 36 is implemented by, for example, a semiconductor memory element, such as a random access memory (RAM), a read only memory (ROM), and a flash memory, or a storage device, such as a hard disk and an optical disk.
In the example in
The control unit 37 is a controller, and is implemented by, for example, a central processing unit (CPU) and a micro processing unit (MPU) executing various programs (not illustrated) stored in the storage unit 36 using a RAM as a work area. Furthermore, the control unit 37 can be implemented by an integrated circuit such as an application specific integrated circuit (ASIC) and a field programmable gate array (FPGA).
The control unit 37 includes an acquisition unit 37a, a transmission unit 37b, and an output control unit 37c. The control unit 37 implements or executes functions and effects of information processing described below.
The acquisition unit 37a acquires an instruction operation of the user via the operation unit 31 and the microphone 32c, and causes the output control unit 37c to perform output control in accordance with the instruction operation. Furthermore, the acquisition unit 37a acquires the states of the guide terminal 30 and the user from the in-terminal sensor unit 32, and causes the transmission unit 37b to transmit the states to the server device 50.
Furthermore, the acquisition unit 37a acquires an instruction related to the guide control from the server device 50 via the communication unit 33, and causes the output control unit 37c to perform output control for giving a guide to the user in accordance with the instruction.
The transmission unit 37b appropriately transmits the states of the guide terminal 30 and the user acquired by the acquisition unit 37a to the server device 50 via the communication unit 33.
The output control unit 37c performs output control on the voice output unit 34 and the display unit 35 in accordance with the instruction operation of the user acquired by the acquisition unit 37a. Furthermore, the output control unit 37c performs output control on the voice output unit 34 and the display unit 35 in accordance with the instruction related to the guide control from the server device 50 acquired by the acquisition unit 37a.
Next,
Similarly to the above-described communication unit 33, the communication unit 51 is implemented by, for example, an NIC. The communication unit 51 is connected to the in-facility sensor unit 10 and the guide terminal 30 via the network N by wire or wirelessly, and transmits and receives various pieces of information to and from the in-facility sensor unit 10 and the guide terminal 30.
Similarly to the above-described storage unit 36, the storage unit 52 is implemented by, for example, a semiconductor memory element, such as a RAM, a ROM, and a flash memory, or a storage device, such as a hard disk and an optical disk.
In the example in
The pattern estimation model 52b is an estimation model that estimates the appreciation pattern of the user based on the states of the user and the guide terminal 30 acquired from the in-facility sensor unit 10 and the guide terminal 30. The pattern estimation model 52b is generated as a learning model using an algorithm of machine learning such as deep learning. In such a case, for example, when the state of the user acquired by an acquisition unit 53a to be described later is input, the pattern estimation model 52b outputs the estimated appreciation pattern and the accuracy thereof.
The pattern history DB 52c is information on a history of appreciation patterns estimated in the past by an estimation unit 53b to be described later. The user information DB 52d is information on the attribute of the user and the like, and includes, for example, an impairment type indicating that the user is visually impaired or hearing impaired.
Similarly to the above-described control unit 37, the control unit 53 is a controller. The control unit 53 is implemented by, for example, a CPU and an MPU executing various programs (not illustrated) stored in the storage unit 52 using a RAM as a work area. Furthermore, the control unit 53 can be implemented by an integrated circuit such as an ASIC and an FPGA.
The control unit 53 includes the acquisition unit 53a, the estimation unit 53b, a determination unit 53c, and a guide control unit 53d. The control unit 53 implements or executes functions and effects of information processing to be described below.
The acquisition unit 53a acquires the state of the user transmitted from the in-facility sensor unit 10 and the guide terminal 30 via the communication unit 51. The estimation unit 53b estimates the appreciation pattern of the user based on the state of the user acquired by the acquisition unit 53a. Note that the estimation unit 53b analyzes the acquired state of the user by performing image analysis or the like, as necessary. Furthermore, the estimation unit 53b estimates the appreciation pattern of the user based on a movement trajectory, a distance to the exhibit P, and the like obtained as a result of the analysis.
Here, the appreciation pattern according to the embodiment of the present disclosure will be specifically described.
As illustrated in
In the exhibition area EA, as illustrated in
Furthermore, as illustrated in
Furthermore, as illustrated in
Furthermore, as illustrated in
The estimation unit 53b inputs the state of the user acquired by the acquisition unit 53a to the pattern estimation model 52b, and receives a label value of one of the four types and the accuracy thereof from the pattern estimation model 52b.
Then, when the accuracy is equal to or more than a predetermined threshold, the estimation unit 53b estimates an appreciation pattern corresponding to the received label value as the appreciation pattern of the corresponding user. Furthermore, when the accuracy is less than the predetermined threshold, the estimation unit 53b estimates an appreciation state of the user. Note that, in this case, one of the four types of appreciation patterns may instead be regarded as being adopted. In the embodiment of the present disclosure, when the accuracy is less than the predetermined threshold, for example, the appreciation pattern “B” is uniformly regarded as being adopted.
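The threshold-based estimation described above can be sketched as follows. The threshold value, the model output shape, and the function name are illustrative assumptions for this sketch; the disclosure itself only specifies the comparison against a predetermined threshold and the uniform fallback to the appreciation pattern “B”.

```python
# Sketch of the accuracy-threshold decision of the estimation unit 53b.
# ACCURACY_THRESHOLD and the (label, accuracy) model interface are assumptions.

ACCURACY_THRESHOLD = 0.7   # hypothetical "predetermined threshold"
DEFAULT_PATTERN = "B"      # pattern uniformly adopted when accuracy is low

def estimate_pattern(model_output):
    """model_output: (label, accuracy) pair from the pattern estimation model 52b."""
    label, accuracy = model_output
    if accuracy >= ACCURACY_THRESHOLD:
        return label
    # Low confidence: uniformly regard the default pattern as being adopted.
    return DEFAULT_PATTERN
```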
Furthermore, when there is a certain number or more of users who are in an appreciation state obviously different from the four types, a new appreciation pattern may be added. The obviously different appreciation state is, for example, a case where the user appreciates the exhibit P without approaching the exhibit P, returns to the route, and approaches and appreciates only some exhibits P.
Furthermore, in a case of a group that takes the same action, the estimation unit 53b may estimate the appreciation pattern based on the state of any one (e.g., leader) in the group, and regard all members in the group as having the same appreciation pattern. In such a case, the determination unit 53c to be described later determines the same one guide mode for all the members in the group in accordance with the appreciation pattern.
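The group handling above can be expressed as a small sketch: estimate the pattern from one representative member (for example, the leader) and apply it to every member of the group. The estimator interface and identifiers here are hypothetical stand-ins, not part of the disclosure.

```python
# Sketch of group handling by the estimation unit 53b: one member's state
# determines the appreciation pattern regarded as shared by the whole group.

def assign_group_pattern(estimate_fn, leader_state, member_ids):
    """Apply the pattern estimated from the leader's state to every member."""
    pattern = estimate_fn(leader_state)
    return {member_id: pattern for member_id in member_ids}
```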
The description returns to
As illustrated in
Furthermore, for example, in a case of the appreciation pattern “G”, the determination unit 53c determines the mode of the guide such that only the exhibit P at a position where the user has stopped is explained for the corresponding user. Furthermore, in the case of the appreciation pattern “G”, the determination unit 53c determines the mode of the guide such that a related exhibit and the like that have not been appreciated by the user are recommended, for example.
Furthermore, for example, in a case of the appreciation pattern “F”, the determination unit 53c determines the mode of the guide such that a schematic explanation is prioritized over a detailed explanation for the corresponding user. The schematic explanation includes, for example, an outline and the historical background of the exhibit P, explanation in a case where the exhibit P is looked at from a distance, and the like. Furthermore, in the case of the appreciation pattern “F”, the determination unit 53c determines the mode of the guide such that the exhibit P to be looked at next by the user and the like are recommended, for example.
Furthermore, for example, in a case of the appreciation pattern “B”, the determination unit 53c determines the mode of the guide such that a simple explanation is made for the corresponding user. Furthermore, when the user continues to stop, the determination unit 53c determines the mode of the guide such that explanation is detailed. Furthermore, in the case of the appreciation pattern “B”, the determination unit 53c determines the mode of the guide such that a general-purpose recommendation is made or such that a recommendation is not particularly made, for example.
Note that, if the user requests a guide in a particular mode, that mode is prioritized over the mode of the guide determined by the determination unit 53c.
The description returns to
By the way, the appreciation pattern of each user is not necessarily fixed. Therefore, the determination unit 53c may change the mode of the guide following time-series changes of the appreciation pattern of the user.
Note that, in general, the user is most likely to maintain the previous appreciation pattern. When the previous appreciation pattern is maintained, the determination unit 53c continues the mode of the guide determined in accordance with the previous appreciation pattern.
Furthermore, when the transition probability is equal to or more than a certain value, the determination unit 53c changes the mode of the guide such that whether a transition to the appreciation pattern has actually been made can be confirmed. For example, the determination unit 53c intentionally presents the mode of the guide corresponding to another appreciation pattern on a trial basis, and formally determines the mode of the guide in accordance with the reaction of the user. Furthermore, when the transition probability exceeds a second, larger value, the determination unit 53c determines the mode of the guide in accordance with the changed appreciation pattern without performing such confirmation.
For example, when the largest transition probability is less than 0.2, the determination unit 53c continues the mode of the guide in accordance with the previous appreciation pattern. Furthermore, when the largest transition probability is equal to or more than 0.2 and less than 0.5, the determination unit 53c confirms whether a recommendation to the user is desired, for example. Furthermore, when the largest transition probability is equal to or more than 0.5, the determination unit 53c determines the mode of the guide in accordance with the changed appreciation pattern. In the example of
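The three-way decision rule above, with the example thresholds 0.2 and 0.5 given in the text, can be sketched as follows. The function name, the dictionary shape of the transition probabilities, and the action labels are illustrative assumptions.

```python
# Sketch of the transition-probability decision rule of the determination unit 53c.
# Thresholds 0.2 and 0.5 are the example values from the text.

def decide_guide_action(transition_probs, previous_pattern):
    """transition_probs: dict mapping candidate pattern -> transition probability."""
    candidate = max(transition_probs, key=transition_probs.get)
    p = transition_probs[candidate]
    if p < 0.2:
        return ("continue", previous_pattern)   # keep the previous mode of the guide
    if p < 0.5:
        return ("confirm", candidate)           # trial presentation, then confirm
    return ("switch", candidate)                # switch without confirmation
```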
By the way, through a preliminary experiment, the applicant has obtained a result that there is a difference between evaluations of the user for an appreciated object depending on a difference between modes presented to the user.
First, explanation tends to improve the evaluation of the user for an appreciated object. Furthermore, even in a case of explanation of the same contents, as illustrated in
That is, it is known that change of the mode may influence the evaluation of the same appreciated object. Thus, in the embodiment of the present disclosure, modal presentation control is performed in consideration of such a point.
First, as illustrated in
Note that, when an explanation is not easily understood only by voice, for example, when a specific place or position of the exhibit P is desired to be pointed, a line of sight of the user may be led by using text and a symbol as a supplementary mode.
As illustrated in
Furthermore, when the guide type is “floor guidance”, the determination unit 53c selects “avatar” as the presented mode, and causes the guide terminal 30 to perform floor guidance performed by an avatar. A mode other than the avatar is appropriately selected as the supplementary mode.
As illustrated in
Furthermore, in a case where the guide type is “companion appreciation”, the determination unit 53c selects “avatar” as the presented mode, and causes the guide terminal 30 to perform the companion appreciation. In the companion appreciation, an avatar accompanies the user, and enjoys the interest of the user together. The supplementary mode is particularly unnecessary.
As illustrated in
Furthermore, when the guide type is “request response”, that is, in a case of a guide in which each user request is responded to as it is made, the determination unit 53c selects “avatar” as the presented mode, and causes the guide terminal 30 to perform the request response. In the request response, an avatar is called only when a user request is made, and gives a guide. The supplementary mode is particularly unnecessary.
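The guide-type to presented-mode selection described over the preceding paragraphs can be summarized as a simple lookup. The table below is read from the examples in the text (work explanation by voice, floor guidance, companion appreciation, and request response by avatar); the key names and the supplementary-mode lists are illustrative assumptions.

```python
# Sketch of the presented-mode selection of the determination unit 53c.
# Each entry: guide type -> (presented mode, supplementary modes).

PRESENTED_MODE = {
    "work_explanation":       ("voice",  ["text", "symbol"]),  # lead the line of sight
    "floor_guidance":         ("avatar", ["other_than_avatar"]),
    "companion_appreciation": ("avatar", []),                  # no supplementary mode
    "request_response":       ("avatar", []),                  # no supplementary mode
}

def presented_mode(guide_type):
    mode, supplementary = PRESENTED_MODE[guide_type]
    return mode, supplementary
```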
As illustrated in
Furthermore, as illustrated in
By the way, in general, the user may have visual impairment or hearing impairment. Thus, in such a case, the determination unit 53c can determine the mode of the guide such that a presented mode in accordance with impairment of the user is selected.
Whether or not the user has visual impairment or hearing impairment can be registered to the above-described user information DB 52d preliminarily or at the time of entrance, for example. The determination unit 53c refers to the user information DB 52d. As illustrated in
This enables an optimum presented mode in accordance with impairment of the user to be selected, and enables the guide terminal 30 to give a guide in the presented mode.
Furthermore, a user who has difficulty in seeing or hearing in a certain situation, although the user is not registered in the user information DB 52d as having an impairment or is not recognized as having one, is also assumed.
In such a case, for example, when a guide is given in a predetermined presented mode but an assumed reaction is not obtained from the user, the determination unit 53c can change the presented mode. When the reaction is obtained in the changed presented mode, the determination unit 53c can continue to give the guide in that presented mode.
Specifically, as illustrated in
Furthermore, for example, when the user reacts well to “other than voice”, which is the predetermined presented mode, the determination unit 53c continues the presented mode as other than voice. In contrast, when the user reacts poorly, for example, hesitates over floor guidance given by other than voice, the determination unit 53c switches the presented mode from other than voice to voice.
This enables an optimum presented mode in accordance with a situation to be selected for the user who has difficulty in seeing or hearing in a certain situation although the user is not registered in the user information DB 52d as having an impairment or is not recognized as having one, and enables the guide terminal 30 to give a guide in the presented mode.
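The impairment-aware and reaction-aware selection described above can be sketched as one decision function. The mode names, the `user_info` dictionary shape, and the binary reaction signal are illustrative assumptions; the disclosure describes only the qualitative behavior.

```python
# Sketch of presented-mode selection combining registered impairment information
# (user information DB 52d) with the user's observed reaction.

def select_presented_mode(user_info, current_mode, reacted_well):
    impairment = user_info.get("impairment")
    if impairment == "visual":
        return "voice"                 # present by voice for a visually impaired user
    if impairment == "hearing":
        return "other_than_voice"      # e.g., text, symbol, or avatar
    # No registered impairment: fall back on the user's observed reaction.
    if reacted_well:
        return current_mode
    # Poor reaction: switch between voice and non-voice presentation.
    return "other_than_voice" if current_mode == "voice" else "voice"
```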
Next, a processing procedure executed by the server device 50 will be described with reference to
As illustrated in
Note that the estimation unit 53b determines whether or not the accuracy of the estimated appreciation pattern is equal to or more than a predetermined threshold (Step S103). When the accuracy is equal to or more than the predetermined threshold (Step S103, Yes), the determination unit 53c determines the mode of a guide to the user in accordance with the estimated appreciation pattern (Step S104).
In contrast, when the accuracy is less than the predetermined threshold (Step S103, No), the estimation unit 53b estimates the appreciation state of the user based on the state of the user (Step S105), and the determination unit 53c determines the mode of the guide to the user in accordance with the estimated appreciation state (Step S106).
Then, the guide control unit 53d executes guide control on the guide terminal 30 in accordance with the determined mode of the guide (Step S107). Then, the server device 50 repeats the processing from Step S101.
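One pass of the processing procedure (Steps S101 to S107) can be sketched as follows. The component interfaces are hypothetical stand-ins for the acquisition unit 53a, the estimation unit 53b, the determination unit 53c, and the guide control unit 53d; only the step ordering and the threshold branch come from the text.

```python
# Sketch of one iteration of the server-side loop (Steps S101-S107).

def process_once(acquisition, estimation, determination, guide_control):
    user_state = acquisition.acquire_user_state()                   # Step S101
    pattern, accuracy = estimation.estimate_pattern(user_state)     # Step S102
    if accuracy >= estimation.threshold:                            # Step S103, Yes
        mode = determination.mode_for_pattern(pattern)              # Step S104
    else:                                                           # Step S103, No
        state = estimation.estimate_appreciation_state(user_state)  # Step S105
        mode = determination.mode_for_state(state)                  # Step S106
    guide_control.execute(mode)                                     # Step S107
```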
Note that the above-described embodiment can include some variations.
Although the guide terminal 30 ideally accumulates guide content covering all modes in the guide content DB 36a, the guide terminal 30 may have a limited storage capacity.
Therefore, the guide terminal 30 preferably acquires guide content from the server device 50, as necessary. If guide content is acquired each time the appreciation pattern changes, however, a delay may occur.
Thus, the server device 50 may select guide content to which transition is likely to be performed in consideration of the current appreciation pattern of the user based on, for example, the above-described pattern history DB 52c, and preliminarily distribute the guide content to the guide terminal 30. This can reduce the possibility of the occurrence of a delay without wasting the storage capacity of the guide terminal 30.
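The prefetch idea above can be sketched as selecting, from a transition-probability table derived from the pattern history DB 52c, only the guide content for patterns the user is likely to transition to. The probability cut-off, the table shape, and the content map are illustrative assumptions.

```python
# Sketch of preliminary distribution: pick guide content for likely next
# appreciation patterns so the guide terminal 30 can cache it in advance.

def select_prefetch_content(current_pattern, transition_probs, content_by_pattern,
                            min_prob=0.2):
    """Return guide content worth distributing to the terminal in advance."""
    likely = [p for p, prob in transition_probs.get(current_pattern, {}).items()
              if prob >= min_prob]
    return {p: content_by_pattern[p] for p in likely if p in content_by_pattern}
```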
Furthermore, the user may be led so as to change from an action in accordance with the current appreciation pattern of the user to an action in accordance with a desired appreciation pattern, depending on, for example, an organizer intention or a degree of congestion of a facility.
As illustrated in
In such a case, the determination unit 53c determines the mode of the guide such that a recommendation is made based on the organizer intention, for example. As illustrated in the figure, the recommendation in such a case includes, for example, “This work is recommended” and “It is good to come close to look”. That is, in the case, the determination unit 53c leads the user in the appreciation pattern “G” or “F” to take an action in the appreciation pattern “A”.
This can increase the possibility that the user U, who is originally not likely to approach the notable work PX, approaches the notable work PX based on the organizer intention.
Furthermore, as illustrated in
In such a case, as illustrated in the figure, the determination unit 53c determines the mode of the guide such that a recommendation of “It is good to look from a distance” is made, for example. Furthermore, as illustrated in the figure, the determination unit 53c determines the mode of the guide such that comparison with another exhibit P, explanation, and the like are made in a state in which the user U is in the central portion of the exhibition area EA, for example. That is, in the case, the determination unit 53c leads the user in the appreciation pattern “A” to take an action in the appreciation pattern “F”.
This enables the user U who originally desires to approach and appreciate the exhibit P to enjoy appreciation without approaching the exhibit P.
Furthermore, as illustrated in
In such a case, as illustrated in the figure, the determination unit 53c determines the mode of the guide such that a recommendation of “Come close and look at the brushwork” is made, for example. Furthermore, as illustrated in the figure, the determination unit 53c determines the mode of the guide such that the user U is caused to compare a plurality of works approaching the exhibit P, for example. That is, in the case, the determination unit 53c leads the user in the appreciation pattern “F” to take an action in the appreciation pattern “A” or “G”.
This makes it possible to give the user U who originally desires to appreciate the exhibit P from a distance an opportunity to approach the exhibit P and enjoy appreciation at a new attention point for the user U such as brushwork in a case of a painting and the like.
Furthermore, in
In such a case, as illustrated in the figure, the determination unit 53c determines the mode of the guide such that a recommendation of “First, look at the next exhibition room” is made, for example. That is, in the case, the determination unit 53c leads the user in the appreciation pattern “A” to take an action in the appreciation pattern “G”.
This enables the user U, who originally desires to appreciate the exhibits P while approaching them along the route, to change the route and enjoy appreciation, and enables the user to enjoy appreciation while approaching the exhibits P again when the congestion is alleviated later.
Furthermore, it has already been described that the guide terminal 30 can receive an utterance instruction of a user. The utterance instruction may contain a rough expression using a pronoun or the like, or may instead contain a somewhat limiting expression.
In such a case, the determination unit 53c determines the mode of the guide such that the user U is led to the central portion of the exhibition area EA and comprehensive work explanation is made in accordance with a rough expression recognized as a result of voice recognition, for example.
This enables the user U to enjoy appreciation of a work group in accordance with rough expressions uttered by the user U in a comprehensive manner.
In contrast, as illustrated in
In such a case, the determination unit 53c determines the mode of the guide such that the user U is led to the vicinity of the corresponding exhibit P and a detailed work explanation is made in accordance with the limiting expressions recognized as a result of voice recognition, for example.
This enables the user U to enjoy appreciation of a work in accordance with limiting expressions uttered by the user U more closely or in detail.
Next,
In such a case, as illustrated in
This can prevent the above-described confusion, enables a guide to be given to the user in the same mode in at least the same exhibition area EA, and enables the user to stably enjoy appreciation.
Next,
As illustrated in
This enables the user to take an appreciation action in accordance with the movement of users other than himself/herself. For example, when a user in the appreciation pattern "A" does not want appreciation to be disturbed by other users, the user can avoid an exhibit P that is likely to be crowded, and can more easily take an appreciation action in accordance with his/her own appreciation pattern and preference.
Furthermore, as illustrated in
This can prevent the avatar Avr from being superimposed on the exhibit P and making the exhibit P difficult to see. Furthermore, this can prevent the brightness of the avatar Avr from disturbing appreciation in a case of low surrounding brightness. Furthermore, when it can be detected that the user is concentrating on appreciation of the exhibit P, the brightness of the avatar Avr may be lowered. Furthermore, reducing the visibility of the avatar Avr may include displaying only the outline of the avatar Avr or making the outline transparent.
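The visibility control described above can be sketched as a simple brightness policy. The thresholds, scaling factors, and function name below are illustrative assumptions; the disclosure does not fix any concrete values.

```python
def avatar_brightness(base: float, ambient: float, concentrating: bool,
                      overlaps_exhibit: bool) -> float:
    """Return a dimmed brightness in [0.0, 1.0] for the avatar Avr.

    All thresholds and factors are hypothetical placeholders.
    """
    b = base
    if ambient < 0.3:
        # Low surrounding brightness: dim the avatar so it does not
        # disturb appreciation.
        b *= 0.5
    if concentrating:
        # User is concentrating on the exhibit P: lower the brightness.
        b *= 0.5
    if overlaps_exhibit:
        # Never let the avatar obscure the exhibit P; an alternative
        # would be to show only the avatar's outline here.
        b = 0.0
    return max(0.0, min(1.0, b))
```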
Furthermore, although, in the above-described embodiment, a case where an algorithm of Bayesian estimation is used for transition estimation of an appreciation pattern has been described as an example, this is merely one example, and the algorithm to be used is not limited. Furthermore, the pattern estimation model 52b is not necessarily used for estimating the appreciation pattern. The appreciation pattern may be estimated by pattern matching based on image analysis or the like.
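For illustration, a Bayesian transition estimation over appreciation patterns might take the following predict-then-update form. The pattern labels, the transition matrix, and the observation likelihoods below are illustrative assumptions; the disclosure fixes none of these values.

```python
# Two hypothetical appreciation patterns, matching the "A"/"G" labels
# used in the description.
PATTERNS = ["A", "G"]

# Hypothetical transition model: P(next pattern | current pattern).
TRANSITION = {
    "A": {"A": 0.8, "G": 0.2},
    "G": {"A": 0.3, "G": 0.7},
}


def update_belief(belief: dict, likelihood: dict) -> dict:
    """One Bayesian step: propagate the belief through the transition
    model, weight by the observation likelihood, then normalize."""
    predicted = {
        q: sum(belief[p] * TRANSITION[p][q] for p in PATTERNS)
        for q in PATTERNS
    }
    posterior = {q: predicted[q] * likelihood[q] for q in PATTERNS}
    total = sum(posterior.values())
    return {q: posterior[q] / total for q in PATTERNS}
```

An observation consistent with pattern "A" (for example, the user's line of sight staying on the exhibit P) would raise the posterior probability of "A" accordingly.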
Furthermore, among pieces of processing described in the above-described embodiment, all or part of processing described as being performed automatically can be performed manually, or all or part of processing described as being performed manually can be performed automatically by a known method. In addition, the processing procedures, specific names, and information including various pieces of data and parameters in the above document and drawings can be optionally changed unless otherwise specified. For example, various pieces of information in each figure are not limited to the illustrated information.
Furthermore, each component of each illustrated device is functional and conceptual, and does not necessarily need to be physically configured as illustrated. That is, the specific form of distribution/integration of each device is not limited to the illustrated one, and all or part of the device can be configured in a functionally or physically distributed/integrated manner in any unit in accordance with various loads and use situations.
For example, the guide terminal 30 may include some or all of processing units of the control unit 53 of the server device 50, and the guide terminal 30 may implement or execute some or all of the functions and effects of information processing executed by the control unit 53 of the server device 50.
Furthermore, the above-described embodiments can be appropriately combined to the extent that the processing contents do not contradict each other. Furthermore, the order of steps in the sequence diagrams or the flowcharts of the embodiment can be appropriately changed.
The guide terminal 30 and the server device 50 according to the above-described embodiment are implemented by a computer 1000 having a configuration as illustrated in
The CPU 1100 operates based on a program stored in the ROM 1300 or the storage 1400, and controls each unit. For example, the CPU 1100 develops a program stored in the ROM 1300 or the storage 1400 on the RAM 1200, and executes processing corresponding to various programs.
The ROM 1300 stores a boot program such as a basic input output system (BIOS) executed by the CPU 1100 at the time when the computer 1000 is started, a program depending on the hardware of the computer 1000, and the like.
The storage 1400 is a computer-readable recording medium that non-transiently records a program executed by the CPU 1100, data used by the program, and the like. Specifically, the storage 1400 is a recording medium that records an information processing program according to the present disclosure. The information processing program is one example of program data 1450.
The communication interface 1500 connects the computer 1000 with an external network 1550. For example, the CPU 1100 receives data from another device, and transmits data generated by the CPU 1100 to another device via the communication interface 1500.
The input/output interface 1600 connects an input/output device 1650 with the computer 1000. For example, the CPU 1100 can receive data from an input device such as a keyboard and a mouse via the input/output interface 1600. Furthermore, the CPU 1100 can transmit data to an output device such as a display, a speaker, and a printer via the input/output interface 1600. Furthermore, the input/output interface 1600 may function as a medium interface that reads a program and the like recorded in a predetermined recording medium. The medium includes, for example, an optical recording medium such as a digital versatile disc (DVD) and a phase change rewritable disk (PD), a magneto-optical recording medium such as a magneto-optical disk (MO), a tape medium, a magnetic recording medium, a semiconductor memory, and the like.
For example, when the computer 1000 functions as the server device 50 according to the embodiment of the present disclosure, the CPU 1100 of the computer 1000 implements the functions of the control unit 53 by executing an information processing program loaded on the RAM 1200. Furthermore, the storage 1400 stores an information processing program according to the present disclosure and data in the storage unit 52. Note that the CPU 1100 reads the program data 1450 from the storage 1400 and executes the program data 1450. In another example, the CPU 1100 may acquire these programs from another device via the external network 1550.
As described above, according to one embodiment of the present disclosure, the server device 50 (corresponding to one example of “information processing apparatus”) includes: the acquisition unit 53a that acquires a state of a user who appreciates the exhibit P (corresponding to one example of “appreciated object”); the estimation unit 53b that estimates an appreciation pattern, which is a type of an appreciation style of the user, based on the state of the user acquired by the acquisition unit 53a; the determination unit 53c that determines a mode of a guide regarding the exhibit P presented to the user in accordance with the appreciation pattern estimated by the estimation unit 53b; and the guide control unit 53d that executes guide control in which control is performed such that the guide is presented to the user in accordance with the mode of the guide determined by the determination unit 53c. This enables a guide to be given in accordance with the state and preference of the user.
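The four units summarized above can be sketched as a simple pipeline. The class names, method names, and the placeholder rules inside them are illustrative assumptions, not the disclosed implementation of the server device 50.

```python
from dataclasses import dataclass


@dataclass
class UserState:
    position: tuple          # user's position relative to the exhibit P
    facing_exhibit: bool     # orientation of face / line of sight


class AcquisitionUnit:
    """Corresponds to the acquisition unit 53a."""
    def acquire(self, sensor_frame: dict) -> UserState:
        return UserState(sensor_frame["position"], sensor_frame["facing"])


class EstimationUnit:
    """Corresponds to the estimation unit 53b (placeholder rule only)."""
    def estimate(self, state: UserState) -> str:
        return "A" if state.facing_exhibit else "G"


class DeterminationUnit:
    """Corresponds to the determination unit 53c."""
    def determine(self, pattern: str) -> str:
        return "detailed_explanation" if pattern == "A" else "recommend_next_room"


class GuideControlUnit:
    """Corresponds to the guide control unit 53d."""
    def present(self, mode: str) -> str:
        return f"presenting guide in mode: {mode}"


def run_pipeline(sensor_frame: dict) -> str:
    state = AcquisitionUnit().acquire(sensor_frame)
    pattern = EstimationUnit().estimate(state)
    mode = DeterminationUnit().determine(pattern)
    return GuideControlUnit().present(mode)
```

The point of the sketch is the data flow: state acquisition feeds pattern estimation, which feeds mode determination, which feeds guide presentation.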
Although the embodiment of the present disclosure has been described above, the technical scope of the present disclosure is not limited to the above-described embodiment as it is, and various modifications can be made without departing from the gist of the present disclosure. Furthermore, components of different embodiments and variations may be appropriately combined.
Furthermore, the effects in the embodiment described in the present specification are merely examples and not limitations. Other effects may be exhibited.
Note that the present technology can also have the configurations as follows.
(1)
An information processing apparatus comprising:
The information processing apparatus according to (1),
The information processing apparatus according to (1),
The information processing apparatus according to (1), (2) or (3),
The information processing apparatus according to (4),
The information processing apparatus according to (5),
The information processing apparatus according to (4), (5) or (6),
The information processing apparatus according to any one of (4) to (8),
The information processing apparatus according to any one of (1) to (9),
The information processing apparatus according to (10),
The information processing apparatus according to (10) or (11),
The information processing apparatus according to (12),
The information processing apparatus according to any one of (10) to (13),
The information processing apparatus according to any one of (1) to (14),
The information processing apparatus according to (15),
The information processing apparatus according to any one of (1) to (16),
The information processing apparatus according to (17),
The information processing apparatus according to any one of (10) to (18),
An information processing method comprising:
Number | Date | Country | Kind |
---|---|---|---|
2021-077241 | Apr 2021 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2022/006013 | 2/15/2022 | WO |