The present disclosure relates to an information processing apparatus, an information processing method, and a program.
In recent years, as the performance of projection devices that project information on a wall surface or the like has improved, such projection devices are being used for notification of various kinds of information.
For example, Patent Document 1 below discloses a technology of notifying a user of information by using a projection device (so-called moving projector) capable of changing the posture (that is, changing the projection direction).
Information that the user is notified of can include information with high confidentiality. In performing notification of information with high confidentiality, it is desirable that at least another user cannot visually recognize the information. Furthermore, it is desirable that the other user does not notice the state where the projection device is being driven to change the posture. Changing the posture of the projection device may indicate to the other user that notification of certain information may be performed, and furthermore may attract the eyes of the other user to the information with high confidentiality.
Therefore, the present disclosure provides a mechanism capable of controlling the posture of a projection device in accordance with confidentiality of information that a user is notified of.
According to the present disclosure, there is provided an information processing apparatus including a control unit configured to control a projection process of notification information including posture control of a projection device on the basis of spatial information of a space where the projection device can perform projection, information indicating the position and the posture of the projection device, information indicating confidentiality of the notification information, information of a first user who is a notification target of the notification information, and information of a second user who is not a notification target of the notification information.
Furthermore, according to the present disclosure, there is provided an information processing method including causing a processor to control a projection process of notification information including posture control of a projection device on the basis of spatial information of a space where the projection device can perform projection, information indicating the position and the posture of the projection device, information indicating confidentiality of the notification information, information of a first user who is a notification target of the notification information, and information of a second user who is not a notification target of the notification information.
Moreover, according to the present disclosure, there is provided a program for causing a computer to function as a control unit configured to control a projection process of notification information including posture control of a projection device on the basis of spatial information of a space where the projection device can perform projection, information indicating the position and the posture of the projection device, information indicating confidentiality of the notification information, information of a first user who is a notification target of the notification information, and information of a second user who is not a notification target of the notification information.
As described above, the present disclosure provides a mechanism capable of controlling the posture of the projection device in accordance with confidentiality of information that the user is notified of. Note that the effects described above are not necessarily limited, and along with or in lieu of the effects described above, any of the effects described in the present Description, or another effect that can be grasped from the present Description may be exhibited.
Hereinafter, a preferred embodiment of the present disclosure will be described in detail with reference to the accompanying drawings. Note that in the present Description and the drawings, the same reference signs denote constituents having substantially the same functional configuration and an overlapping description will be omitted.
Note that the description will be given in the following order.
1. Overview
1.1. Overall configuration example
1.2. Overview of proposed technology
2. Device configuration example
3. Details of projection process
3.1. First case
3.2. Second case
3.3. Third case
4. Process flow
5. Modifications
6. Summary
<<1. Overview>>
<1.1. Overall Configuration Example>
The physical space 30 is a real space in which users (users A and B) can move around. The physical space 30 may be a closed space such as an indoor space or an open space such as an outdoor space. At least, the physical space 30 is a space in which information can be projected by a projection device. The coordinates in the physical space 30 are defined on coordinate axes, that is, a Z axis whose axial direction is the vertical direction, and an X axis and a Y axis that define the horizontal plane as the XY plane. It is assumed that the origin of the coordinate system in the physical space 30 is, for example, a vertex on the ceiling side of the physical space 30.
A projector 121 is a projection device that visually notifies the user of various information by mapping and displaying the various information on any surface of the physical space 30. As the projector 121, a projector (so-called moving projector) capable of changing the posture (that is, changing the projection direction) is used. In the example illustrated in
The input unit 110 is a device for inputting information of the physical space 30 and information of the user. The input unit 110 can be realized as a sensor device that senses various information. The input units 110A and 110B are user-worn sensor devices. In the example illustrated in
The information processing system 100 outputs information with any location in the physical space 30 as an output location. First, the information processing system 100 acquires information inside the physical space 30 by analyzing information input by the input unit 110. The information inside the physical space 30 is information regarding the shape and arrangement of a real object such as a wall, a floor, or furniture in the physical space 30, information regarding the user, and the like. Then, the information processing system 100 sets the projection target area of the display object on the basis of the information inside the physical space 30, and projects the display object on the projection target area that has been set. For example, the information processing system 100 can project the display object 20 on the floor, a wall surface, the top surface of a table, or the like. In a case where a so-called moving projector is used as the projector 121, the information processing system 100 realizes control of such an output location by changing the posture of the projector 121.
A configuration example of the information processing system 100 according to the present embodiment has been described above.
<1.2. Overview of Proposed Technology>
Information that the user is notified of can include information with high confidentiality. In performing notification of information with high confidentiality, it is desirable that at least another user cannot visually recognize the information. Furthermore, it is desirable that the other user does not notice the state where the projection device is being driven to change the posture. Note that in the present Description, driving of the projection device refers to driving performed to change the posture of the projection device, such as driving of a pan/tilt mechanism, unless otherwise specified.
Changing the posture of the projection device may indicate to the other user that notification of certain information may be performed, and furthermore may attract the eyes of the other user to the information with high confidentiality. Attracting the eyes of the other user as described above is also referred to as a gaze attraction effect below. Considering that there is a possibility that information with high confidentiality is notified, it is desirable that posture control of the projection device is performed in consideration of the gaze attraction effect.
Therefore, the present disclosure provides a mechanism capable of controlling the posture of a projection device in accordance with confidentiality of information that a user is notified of. Such a mechanism will be described with reference to
Assume a case where a user existing in the physical space 30 is notified of information. The information that the user is notified of is also referred to as notification information below. The notification information can include an image (still image/moving image) and/or text or the like. A user who is a notification target of the notification information is also referred to as a first user. A user who is not the notification target of the notification information is also referred to as a second user. In the example illustrated in
When the information processing system 100 acquires notification information that the user A should be notified of, the information processing system 100 generates a display object on the basis of the notification information and projects the display object that has been generated on the physical space 30 to perform notification of the notification information. In the example illustrated in
In a case where confidentiality of notification information is high, it is desirable that the notification information that has been projected is visible only to the user A and not visible to the user B. Furthermore, in consideration of the gaze attraction effect, it is desirable that the user B does not visually recognize the state where the projector 121 is driven to project the notification information on a location visually recognized only by the user A. Therefore, the information processing system 100 imposes a restriction on driving, such as not driving the projector or driving the projector at a low speed, in a case where the projector 121 is within the visible range of the user B. As a result, the state where the projector 121 is driven cannot be, or is less likely to be, visually recognized by the user B. Consequently, it is possible to avoid the occurrence of an unintended gaze attraction effect and to ensure confidentiality of the notification information. For example, privacy of the user A is protected.
Moreover, for example, the user A does not have to move far away from the user B in order to receive notification of notification information with high confidentiality, which improves convenience.
Posture control of the projector 121 in consideration of the gaze attraction effect is beneficial not only to the user A but also to the user B. This is because if the user B sees the state in which the projector 121 is driven, attention of the user B is distracted by the projector 121. In this case, there are disadvantages such as interruption of work performed by the user B. In this regard, by performing posture control of the projector 121 in consideration of the gaze attraction effect, it is possible to avoid giving a disadvantage to the user B who is not the notification target of notification information.
The overview of the proposed technology has been described above. The details of the proposed technology will be described below.
<<2. Device Configuration Example>>
(1) Input Unit 110
The input unit 110 has a function of inputting information of the user or the physical space. The input unit 110 can be realized by various input devices.
For example, the input unit 110 can include an imaging device. The imaging device includes a lens system, a drive system, and an imaging element, and captures an image (still image or moving image). The imaging device may be a so-called optical camera or a thermographic camera that can also acquire temperature information.
For example, the input unit 110 can include a depth sensor. The depth sensor is a device that acquires depth information, such as an infrared ranging device, an ultrasonic ranging device, a time of flight (ToF) system ranging device, laser imaging detection and ranging (LiDAR), or a stereo camera.
For example, the input unit 110 can include a sound collecting device (microphone). The sound collecting device is a device that collects ambient sound and outputs audio data converted into a digital signal via an amplifier and an analog-to-digital converter (ADC). The sound collecting device collects, for example, user voice and environment sound.
For example, the input unit 110 can include an inertial sensor. The inertial sensor is a device that detects inertial information such as acceleration or angular velocity. The inertial sensor is worn by the user, for example.
For example, the input unit 110 can include a biosensor. The biosensor is a device that detects biological information such as heartbeat or body temperature of the user. The biosensor is worn by the user, for example.
For example, the input unit 110 can include an environment sensor. The environment sensor is a device that detects environment information such as brightness, temperature, humidity, or atmospheric pressure in the physical space.
For example, the input unit 110 can include a device that inputs information on the basis of physical contact with the user. Examples of such a device include a mouse, a keyboard, a touch panel, a button, a switch, and a lever. These devices can be installed in a terminal device such as a smartphone, a tablet terminal, or a personal computer (PC).
The input unit 110 inputs information on the basis of control performed by the control unit 150. For example, the control unit 150 can control the zoom ratio and the imaging direction of the imaging device.
Note that the input unit 110 may include one of these input devices or a combination thereof, or may include a plurality of input devices of the same type.
Furthermore, the input unit 110 may include a terminal device such as a smartphone, a tablet terminal, a wearable terminal, a personal computer (PC), or a television (TV).
(2) Output Unit 120
The output unit 120 is a device that outputs information to the user. The output unit 120 can be realized by various output devices.
The output unit 120 includes a display device that outputs visual information. The output unit 120 maps visual information on a surface of a real object and outputs the visual information. An example of such an output unit 120 is the projector 121 illustrated in
The output unit 120 can include an audio output device that outputs auditory information. Examples of such an output unit 120 include a speaker, a directional speaker, an earphone, a headphone, and the like.
The output unit 120 can include a haptic output device that outputs haptic information. The haptic information is, for example, vibration, force sense, temperature, electrical stimulation, or the like. Examples of the output unit 120 that outputs haptic information include an eccentric motor, an actuator, a heat source, and the like.
The output unit 120 can include a device that outputs olfactory information. The olfactory information is, for example, a scent. Examples of the output unit 120 that outputs olfactory information include an aroma diffuser and the like.
The output unit 120 outputs information on the basis of control performed by the control unit 150. For example, the projector 121 changes the posture (that is, the projection direction) on the basis of control performed by the control unit 150. Furthermore, the directional speaker changes the directivity on the basis of control performed by the control unit 150.
In the present embodiment, the output unit 120 includes at least the projector 121 including the movable unit whose posture can be changed. The output unit 120 may include a plurality of projectors 121, and may include another display device, an audio output device, or the like in addition to the projector 121.
Furthermore, the output unit 120 may include a terminal device such as a smartphone, a tablet terminal, a wearable terminal, a personal computer (PC), or a television (TV).
(3) Communication Unit 130
The communication unit 130 is a communication module for transmitting and receiving information to and from another device. The communication unit 130 performs wired/wireless communication in compliance with any communication standard such as, for example, a local area network (LAN), a wireless LAN, Wireless Fidelity (Wi-Fi, registered trademark), infrared communication, or Bluetooth (registered trademark).
For example, the communication unit 130 receives notification information and outputs the notification information to the control unit 150.
(4) Storage Unit 140
The storage unit 140 has a function of temporarily or permanently storing information for operating the information processing system 100. The storage unit 140 stores, for example, spatial information, condition information, posture information, notification information, and/or information related to the notification information as described later.
The storage unit 140 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like. The storage unit 140 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, a deleting device that deletes data recorded in the storage medium, and the like.
(5) Control Unit 150
The control unit 150 functions as an arithmetic processing unit and a control device, and controls the overall operation of the information processing system 100 according to various programs. The control unit 150 is realized, for example, by an electronic circuit such as a central processing unit (CPU), or a microprocessor. Furthermore, the control unit 150 may include a read only memory (ROM) that stores a program to be used, a calculation parameter, and the like, and a random access memory (RAM) that temporarily stores a parameter that appropriately changes and the like.
As illustrated in
(5.1) Spatial Information Acquisition Unit 151
The spatial information acquisition unit 151 has a function of acquiring information of the physical space (hereinafter also referred to as spatial information) on the basis of information input by the input unit 110. The spatial information acquisition unit 151 outputs the spatial information that has been acquired to the output control unit 155. The spatial information will be described below.
The spatial information can include information indicating the type and arrangement of a real object in the physical space. Furthermore, the spatial information can include identification information of the real object. For example, the spatial information acquisition unit 151 acquires such information by recognizing the captured image. In addition, the spatial information acquisition unit 151 may acquire such information on the basis of the read result of an RFID tag attached to the real object in the physical space. Furthermore, the spatial information acquisition unit 151 may also acquire such information on the basis of user input. Note that examples of the real object in the physical space include a wall, a floor, furniture, and the like.
The spatial information can include three-dimensional information indicating the shape of the space. The three-dimensional information indicating the shape of the space is information indicating the shape of the space defined by real objects in the physical space. For example, the spatial information acquisition unit 151 acquires three-dimensional information indicating the shape of the space on the basis of depth information. In a case where information indicating the type and arrangement of real objects in the physical space and identification information of the real objects can be acquired, the spatial information acquisition unit 151 may acquire three-dimensional information indicating the shape of the space in consideration of such information.
The spatial information can include information of the material, the color, the texture, or the like of a surface forming the space (for example, a surface of a real object such as a wall, a floor, furniture, or the like). For example, the spatial information acquisition unit 151 acquires such information by recognizing the captured image. In a case where information indicating the type and arrangement of the real object in the physical space and identification information of the real object can be acquired, the spatial information acquisition unit 151 may acquire information of the material, the color, the texture, or the like in consideration of such information.
Spatial information can also include information regarding conditions within the physical space, such as brightness, temperature, and humidity of the physical space. For example, the spatial information acquisition unit 151 acquires such information on the basis of environment information.
The spatial information includes at least one of the pieces of information described above.
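For example, the spatial information described above may be organized into a data structure such as the following sketch, written in Python purely for illustration; the names SpatialInfo and RealObject, the fields, and their types are hypothetical and do not limit the present embodiment.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class RealObject:
        """A wall, floor, piece of furniture, or other real object in the physical space."""
        object_id: str                           # identification information (e.g. read from an RFID tag)
        object_type: str                         # type of the real object ("wall", "floor", "table", ...)
        position: Tuple[float, float, float]     # arrangement in the XYZ coordinate system of the space
        surface_material: Optional[str] = None   # material of the surface forming the space
        surface_color: Optional[str] = None      # color of the surface

    @dataclass
    class SpatialInfo:
        """Spatial information output to the output control unit 155 (illustrative)."""
        objects: List[RealObject] = field(default_factory=list)  # type, arrangement, and identification of real objects
        shape_mesh: Optional[object] = None      # three-dimensional information indicating the shape of the space
        brightness: Optional[float] = None       # environment information in the physical space
        temperature: Optional[float] = None
        humidity: Optional[float] = None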
(5.2) User Information Acquisition Unit 152
The user information acquisition unit 152 has a function of acquiring information of the user (hereinafter also referred to as user information) on the basis of information input by the input unit 110. The user information acquisition unit 152 outputs the user information that has been acquired to the output control unit 155. The user information will be described below.
The user information can include whether or not there is a user in the physical space, the number of users in the physical space, and identification information of each user. For example, the user information acquisition unit 152 acquires such information by recognizing the face part of the user included in the captured image.
The user information can include attribute information of the user. The attribute information is information indicating attributes of the user such as age, sex, job, family structure, or friendship. For example, the user information acquisition unit 152 acquires attribute information of the user on the basis of the captured image or by using the identification information of the user to make an inquiry to the database that stores the attribute information.
The user information can include information indicating the position of the user. For example, the user information acquisition unit 152 acquires information indicating the position of the user on the basis of the captured image and the depth information.
The user information can include information indicating the posture of the user. For example, the user information acquisition unit 152 acquires information indicating the posture of the user on the basis of the captured image, the depth information, and the inertial information. The posture of the user may refer to the posture of the whole body such as standing still, standing, sitting, or lying down, or the posture of part of the body such as the face, the torso, a hand, a foot, or a finger.
The user information can include information indicating the visible range of the user. For example, the user information acquisition unit 152 identifies the positions of the eyes and the line-of-sight direction of the user on the basis of the captured image including the eyes of the user and the depth information, and acquires information indicating the visible range of the user on the basis of such information and the spatial information. Information indicating the visible range is information indicating which location in the physical space is included in the field of view or the visual field of the user. Note that the field of view is a range visible without moving the eyes. The field of view may mean the central field of view, or may mean the central field of view and the peripheral field of view. The visual field is a range that is visible by moving the eyes. In addition, the presence of an obstacle is also taken into consideration in acquisition of the information indicating the visible range. For example, the back of an obstacle as viewed from the user is outside the visible range.
The user information can include information indicating activity of the user. For example, the user information acquisition unit 152 acquires information indicating activity of the user on the basis of biological information of the user. For example, activity is low during sleep or at the time of falling asleep, and the activity is high in other cases.
The user information can include information indicating motion of the user. For example, the user information acquisition unit 152 recognizes motion of the user by any method such as an optical method using an imaging device or an imaging device and a marker, an inertial sensor method using an inertial sensor worn by the user, or a method using depth information, and thus acquires information indicating motion of the user. The motion of the user may refer to motion of using the whole body such as movement, or motion of partially using the body such as a gesture with a hand. Furthermore, as the user information, user input to a screen displayed by mapping on any surface of the physical space as described above with reference to
The user information can include information input with voice by the user. For example, the user information acquisition unit 152 can acquire such information by voice-recognizing a speaking voice of the user.
The user information includes at least one of the pieces of information described above.
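For example, the user information described above may be summarized as in the following hypothetical Python sketch; the class UserInfo and its fields are illustrative only and do not limit the present embodiment.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    Point = Tuple[float, float, float]  # a location in the coordinate system of the physical space

    @dataclass
    class UserInfo:
        """User information output to the output control unit 155 (illustrative)."""
        user_id: str                                    # identification information of the user
        attributes: dict = field(default_factory=dict)  # attribute information such as age, sex, or friendship
        position: Optional[Point] = None                # information indicating the position of the user
        posture: Optional[str] = None                   # e.g. "standing", "sitting", "lying down"
        visible_range: List[Point] = field(default_factory=list)  # sampled locations within the visible range
        activity: float = 1.0                           # low during sleep or at the time of falling asleep
        is_working: bool = False                        # whether the user currently engages in some work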
(5.3) Projector Information Acquisition Unit 153
The projector information acquisition unit 153 has a function of acquiring information regarding the projector 121. The projector information acquisition unit 153 outputs projector information that has been acquired to the output control unit 155. The projector information will be described below.
The projector information includes information indicating the location where the projector 121 is installed. For example, the projector information acquisition unit 153 acquires information indicating the position of the projector 121 on the basis of setting information in installation of the projector 121 or on the basis of a captured image of the projector 121.
The projector information includes information indicating the posture of the projector 121. For example, the projector information acquisition unit 153 may acquire information indicating the posture from the projector 121, or may acquire information indicating the posture of the projector 121 on the basis of a captured image of the projector 121. The information indicating the posture is information indicating the current posture of the projector 121, and includes, for example, current pan angle information and tilt angle information of the projector 121. Furthermore, in a case where the projector 121 makes translational movement, the information indicating the posture also includes information indicating the current position of the projector 121. Specifically, the current position of the projector 121 is the current absolute position of the optical system of the projector 121, or the current relative position of the optical system of the projector 121 with respect to the position where the projector 121 is installed. Note that since the output control unit 155 controls the posture of the projector 121 as described later, information indicating the posture of the projector 121 can be known to the output control unit 155.
The projector information can include information indicating the driving state of the projector 121. The information indicating the driving state is a driving sound or the like for changing the posture of the projector 121. For example, the projector information acquisition unit 153 acquires information indicating the driving state of the projector 121 on the basis of the detection result of the environment sensor.
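For example, the projector information may be represented as in the following hypothetical sketch; the field names and units are illustrative and do not limit the present embodiment.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class ProjectorInfo:
        """Projector information output to the output control unit 155 (illustrative)."""
        installed_position: Tuple[float, float, float]                # location where the projector 121 is installed
        pan_deg: float                                                # current pan angle of the projector 121
        tilt_deg: float                                               # current tilt angle of the projector 121
        optical_offset: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # current relative position of the optical system
        driving_sound_db: float = 0.0                                 # information indicating the driving state (driving sound)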
(5.4) Notification Information Acquisition Unit 154
The notification information acquisition unit 154 has a function of acquiring notification information that the user is to be notified of and information related to the notification information. The notification information acquisition unit 154 outputs information that has been acquired to the output control unit 155. The notification information may be information received from the outside such as an electronic mail, or information generated due to action of the user in the physical space 30 (for example, information for navigation to a user who is moving, or the like). Information related to the notification information will be described below.
Information related to the notification information includes information for identifying the first user. Information for identifying the first user may be identification information of the first user. In this case, the first user is uniquely specified. Information for identifying the first user may also be attribute information of the user. In this case, any user who has predetermined attribute information (for example, female, age group, or the like) is specified as the first user.
Information related to the notification information includes information indicating confidentiality of the notification information (hereinafter, also referred to as confidentiality information). Confidentiality information includes information indicating the level of confidentiality and information designating the range within which the notification information can be disclosed (up to friends, family, or the like). Note that examples of the information indicating the level of confidentiality include a value indicating the degree of confidentiality, a flag indicating whether or not the notification information is information that should be kept confidential, and the like.
Information related to the notification information includes information indicating the priority of the notification information. The priority here may be regarded as an urgency. Notification information with higher priority is preferentially conveyed to the user (that is, projected).
The notification information acquisition unit 154 may acquire information for identifying the first user, confidentiality information, and information indicating the priority by analyzing the content of the notification information. The analysis target includes the sender, the recipient, the importance label of the notification information, the type of the application which has generated the notification information, the generation time (time stamp) of the notification information, and the like.
Information related to the notification information may include information for identifying the second user. Information for identifying the second user may be identification information of the second user. In this case, the second user is uniquely specified. Information for identifying the second user may be attribute information of the user. In this case, any user who has predetermined attribute information (for example, female, age group, or the like) is specified as the second user. In this case, the user other than the second user may be specified as the first user.
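For example, the notification information and the information related to it may be collected as in the following hypothetical sketch; the class NotificationInfo and its fields are illustrative and do not limit the present embodiment.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class NotificationInfo:
        """Notification information and information related to it (illustrative)."""
        content: str                              # image and/or text that the user is notified of
        target_user_id: Optional[str] = None      # information for identifying the first user
        target_attributes: dict = field(default_factory=dict)  # alternatively, attributes specifying the first user
        confidentiality: float = 0.0              # value indicating the degree of confidentiality
        confidential_flag: bool = False           # flag indicating information that should be kept confidential
        disclosure_range: Optional[str] = None    # range within which the information can be disclosed ("friends", "family", ...)
        priority: int = 0                         # priority (urgency) of the notification information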
(5.5) Output Control Unit 155
The output control unit 155 has a function of causing the output unit 120 to output information on the basis of information acquired by the spatial information acquisition unit 151, the user information acquisition unit 152, the projector information acquisition unit 153, and the notification information acquisition unit 154. Specifically, the output control unit 155 causes the projector 121 to perform mapping so that the display object is projected on the projection target area defined on any surface in the physical space.
In particular, the output control unit 155 controls a projection process of the notification information including posture control of the projector 121 on the basis of spatial information, projector information, confidentiality information, user information of the first user, and user information of the second user. First, the output control unit 155 sets the projection target area. Next, the output control unit 155 changes the posture of the projector 121 until the notification information can be projected on the projection target area that has been set. Thereafter, the output control unit 155 causes the projector 121 to project the display object generated on the basis of the notification information on the projection target area. Hereinafter, each process will be specifically described.
Set Projection Target Area
Position of Projection Target Area
The output control unit 155 sets the position of the projection target area. The output control unit 155 sets the projection target area at a different location according to whether or not the confidentiality information satisfies a predetermined condition. The predetermined condition is that the confidentiality information indicates that the notification information is information that should be kept confidential. Whether or not the confidentiality information satisfies the predetermined condition can be determined by threshold determination for determining whether or not confidentiality of the notification information is higher than a predetermined threshold, or by determination using a flag or the like indicating whether or not the notification information is information that should be kept confidential. In a case where confidentiality of the notification information is higher than the predetermined threshold, or a flag indicating that the notification information is information that should be kept confidential is set, it is determined that the confidentiality information satisfies the predetermined condition. In the following, the fact that the confidentiality information satisfies the predetermined condition is also simply referred to as "confidentiality is high". In contrast, in a case where confidentiality of the notification information is lower than the predetermined threshold, or such a flag is not set, it is determined that the confidentiality information does not satisfy the predetermined condition, that is, that the notification information is not information that should be kept confidential. In the following, the fact that the confidentiality information does not satisfy the predetermined condition is also simply referred to as "confidentiality is low".
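For example, the determination described above may be implemented as in the following sketch; the function name and the value of the predetermined threshold are hypothetical.

    CONFIDENTIALITY_THRESHOLD = 0.5  # hypothetical value of the predetermined threshold

    def satisfies_predetermined_condition(confidentiality: float, confidential_flag: bool) -> bool:
        """Return True when the confidentiality information indicates information that should be
        kept confidential, that is, when "confidentiality is high" in the sense described above."""
        # Either threshold determination or determination using a flag may be used.
        return confidential_flag or confidentiality > CONFIDENTIALITY_THRESHOLD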
Specifically, in a case where confidentiality of the notification information is high, the output control unit 155 sets the projection target area where the notification information is projected within the visible range of the first user and outside the visible range of the second user. Regarding notification information with high confidentiality, since the projection target area is set within a range that is visible only to the first user, confidentiality of the notification information can be secured.
Here, setting the projection target area within the visible range of the first user means that at least part of the projection target area overlaps with the visible range of the first user. That is, the entire projection target area does not have to be included within the visible range of the first user. This is because, as long as the first user notices that the notification information is projected, the notification information that has been projected can attract the first user's gaze. Furthermore, setting the projection target area outside the visible range of the second user means that the projection target area and the visible range of the second user do not overlap with each other, which further secures confidentiality of the notification information. Furthermore, it is desirable that a predetermined buffer is provided so that the projection target area and the visible range of the second user are separated from each other. Therefore, it is possible to keep the projection target area outside the visible range of the second user even if the second user, for example, slightly changes the posture, and confidentiality is further secured.
In contrast, in a case where confidentiality of the notification information is low, the output control unit 155 sets the projection target area where the notification information is projected within the visible range of the first user. Regarding notification information with low confidentiality, it is allowed to set the projection target area without considering the second user. That is, the projection target area may be set within the visible range of the second user. As a result, it becomes possible to increase choices of locations for setting the projection target area.
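For example, the setting of the position of the projection target area described above may be sketched as follows; the representation of candidate areas and the value of the buffer are hypothetical simplifications.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class CandidateArea:
        """A candidate projection target area on some surface of the physical space (illustrative)."""
        name: str
        overlaps_first_user_view: bool      # at least part of the area is within the visible range of the first user
        overlaps_second_user_view: bool     # the area overlaps the visible range of some second user
        margin_to_second_user_view: float   # distance from the area to the nearest visible range of a second user

    PREDETERMINED_BUFFER = 0.5  # hypothetical buffer (in meters) between the area and the second user's visible range

    def select_projection_target_area(candidates: List[CandidateArea],
                                      high_confidentiality: bool) -> Optional[CandidateArea]:
        """Set the projection target area at a different location according to confidentiality."""
        for area in candidates:
            if not area.overlaps_first_user_view:
                continue  # the first user must be able to notice the projected notification information
            if high_confidentiality and (area.overlaps_second_user_view
                                         or area.margin_to_second_user_view < PREDETERMINED_BUFFER):
                continue  # keep information with high confidentiality outside the second user's visible range
            return area
        return None  # no suitable location exists; projection may be postponed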
Hereinafter, an example of setting the projection target area for notification information with high confidentiality will be described with reference to
Size of Projection Target Area
The output control unit 155 sets the size of the projection target area.
The output control unit 155 may set the size of the projection target area on the basis of the distance between the position of the first user and the position of the projection target area. For example, the output control unit 155 sets the projection target area smaller as the distance between the position of the first user and the position of the projection target area is smaller, and sets the projection target area larger as the distance is greater. This is to facilitate recognition of projected characters or the like.
The output control unit 155 may set the size of the projection target area on the basis of notification information. For example, the output control unit 155 sets the projection target area larger as the number of characters included in notification information increases, and sets the projection target area smaller in a case where notification information includes only simple icons.
The output control unit 155 may set the size of the projection target area on the basis of spatial information. For example, the output control unit 155 sets the size of the projection target area within the range that does not exceed the size of the surface for which the projection target area is set.
The output control unit 155 may set the size of the projection target area on the basis of projector information. For example, the output control unit 155 sets the size of the projection target area so that the projection target area falls within the current projectable area of the projector 121.
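For example, the four size criteria described above may be combined as in the following sketch; the base size and the scaling constants are hypothetical.

    def projection_target_area_size(distance_to_first_user_m: float,
                                    number_of_characters: int,
                                    surface_size_m: float,
                                    projectable_size_m: float) -> float:
        """Illustrative calculation of one side of the projection target area, in meters."""
        # Larger when the first user is farther away, so that projected characters remain recognizable.
        size = 0.2 + 0.1 * distance_to_first_user_m
        # Larger as the number of characters included in the notification information increases.
        size *= max(1.0, number_of_characters / 50.0)
        # Do not exceed the surface on which the area is set, nor the current projectable area.
        return min(size, surface_size_m, projectable_size_m)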
Control Posture
Next, the output control unit 155 controls the posture of the projector 121. The output control unit 155 may or may not change the posture. That is, the output control unit 155 can cause the projector 121 to project notification information without changing the posture of the projector 121 or by changing the posture of the projector 121.
The output control unit 155 sets the posture of the projector 121 to be taken in projection (hereinafter also referred to as a target posture) so that the projection target area that has been set is included in the projectable area of the projector 121. The target posture includes information indicating the pan angle, the tilt angle, and/or the position of the projector 121 that should be taken in projection. Then, the output control unit 155 performs control to change the posture of the projector 121 in a case where the target posture that has been set and the current posture of the projector 121 obtained as projector information are different from each other. The output control unit 155 may set the target posture so that the projection target area is located at the center of the projectable area. Furthermore, the output control unit 155 may set the target posture so that the projection target area is located at an end part of the projectable area.
The output control unit 155 performs posture control according to whether or not the confidentiality information satisfies the predetermined condition. In a case where confidentiality of the notification information is high, the output control unit 155 imposes a restriction on changing the posture of the projector 121. Here, the restriction specifies a driving method of the projector 121 for visually or acoustically hiding driving of the projector 121 from the second user, and a process to be executed when the projector 121 is driven. In a case where confidentiality of the notification information is high, the output control unit 155 drives the projector 121 by a predetermined driving method and executes a predetermined process. Examples of the restriction that can be imposed are described in the second and third cases of <<3. Details of projection process>> described later, and include stopping the posture change (that is, not changing the posture), positioning the projection target area at an end part of the projectable area, shortening the driving time (that is, reducing the posture change amount), and reducing the driving sound (that is, slowing the driving speed for changing the posture or increasing the environment sound). Other examples of the restriction, described in <<5. Modifications>> later, include returning the posture of the projector 121 to the original posture after changing the posture, darkening an indicator, controlling ambient light, and the like. By imposing such a restriction, the fact that the projector 121 is driving to project notification information with high confidentiality can be made less noticeable to the second user. Note that such a restriction is not imposed in a case where confidentiality of the notification information is low.
In the case of changing the posture of the projector 121, the output control unit 155 generates a drive parameter for changing the posture and transmits the drive parameter to the projector 121. The projector 121 performs driving in the pan/tilt direction and driving in the horizontal direction or the height direction for changing the position according to such a drive parameter. The drive parameter can include information indicating the target posture of the projector 121. In this case, the projector 121 changes the pan angle, the tilt angle, and/or the position so that the posture matches the target posture. The drive parameter may include the posture change amount (the pan angle change amount, the tilt angle change amount, and the position change amount) necessary for the posture of the projector 121 to become the target posture, together with or in lieu of information indicating the target posture of the projector 121. The change amount is obtained by taking the difference between the current posture of the projector 121 obtained as projector information and the target posture that has been set. In this case, the projector 121 changes the pan angle, the tilt angle, and/or the position by the change amount. Furthermore, the drive parameter may include parameters such as drive speed of a motor for changing the posture of the projector 121, acceleration/deceleration and the rotation direction, illuminance, cooling fan strength, and the like. The output control unit 155 determines the drive parameter within a range in which stable operation of the drive mechanism of the projector 121 is realized.
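For example, generation of the drive parameter may be sketched as follows; the class DriveParameter and the default drive speed are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class DriveParameter:
        """Drive parameter transmitted to the projector 121 (illustrative)."""
        target_pan_deg: float      # information indicating the target posture
        target_tilt_deg: float
        delta_pan_deg: float       # posture change amount necessary to reach the target posture
        delta_tilt_deg: float
        speed_deg_per_s: float     # drive speed of the motor for changing the posture

    def make_drive_parameter(current_pan_deg: float, current_tilt_deg: float,
                             target_pan_deg: float, target_tilt_deg: float,
                             speed_deg_per_s: float = 30.0) -> DriveParameter:
        """The change amount is obtained by taking the difference between the current posture
        obtained as projector information and the target posture that has been set."""
        return DriveParameter(
            target_pan_deg=target_pan_deg,
            target_tilt_deg=target_tilt_deg,
            delta_pan_deg=target_pan_deg - current_pan_deg,
            delta_tilt_deg=target_tilt_deg - current_tilt_deg,
            speed_deg_per_s=speed_deg_per_s,
        )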
Perform Projection
The output control unit 155 projects notification information on the projection target area that has been set in a case where the posture control of the projector 121 is completed. Specifically, the output control unit 155 generates a display object (that is, a projection image) on the basis of the notification information. For example, the output control unit 155 generates a display object by shaping the notification information according to the shape and the size of the projection target area. Then, the output control unit 155 causes the projector 121 to project the display object that has been generated.
Supplement
The output control unit 155 may control the projection timing. The projection timing is a concept including the timing of setting the projection target area, the timing of changing the posture of the projector 121, and the timing of performing projection after posture control.
Regarding notification information whose notification target is all the users in the physical space 30, the output control unit 155 may set the projection target area at any location. In this case, the output control unit 155 projects a display object that moves toward the set projection target area, or outputs a voice instructing all the users to direct their eyes to the set projection target area, thereby attracting gaze to the set projection target area. This makes it possible for all the users to visually recognize the notification information.
The output control unit 155 may control the projection process on the basis of information indicating activity of the user. For example, in a case where activity of the first user is low, the output control unit 155 suppresses projection of notification information with low priority. In contrast, in a case where activity of the second user is low, the output control unit 155 controls the projection process without considering the second user.
The output control unit 155 may control the projection process on the basis of information indicating motion of the user. For example, in a case where the first user engages in some work, the output control unit 155 suppresses projection of notification information with low priority. In contrast, in a case where the second user engages in some work, the output control unit 155 controls the projection process without considering the second user.
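For example, the suppression described in this supplement may be sketched as follows; the threshold values are hypothetical.

    def should_project_now(priority: int,
                           first_user_activity: float,
                           first_user_is_working: bool,
                           low_priority_threshold: int = 1,
                           low_activity_threshold: float = 0.3) -> bool:
        """Illustrative check of whether projection of notification information should proceed now."""
        if priority > low_priority_threshold:
            return True   # notification information with high priority is conveyed preferentially
        if first_user_activity < low_activity_threshold:
            return False  # suppress low-priority projection while activity of the first user is low
        if first_user_is_working:
            return False  # suppress low-priority projection while the first user engages in work
        return True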
<<3. Details of Projection Process>>
Hereinafter, projection processes in first to third cases will be described in detail.
<3.1. First Case>
The present case is a case where confidentiality of notification information is low.
Examples of the notification information with low confidentiality include information related to all the users in the physical space 30, general-purpose information such as a weather forecast, and information to be notified additionally due to an operation performed by the user in a state where the user is recognizable to the other user. The operation performed by the user in a state recognizable to the other user is, for example, an explicit utterance to a voice agent, or the like.
It is desirable that notification information with low confidentiality is projected at a location which is most visible to the first user. Therefore, the output control unit 155 sets the projection target area at the location which is most visible to the first user on the basis of spatial information and user information of the first user.
Considering the characteristic of the projector 121 that the display characteristic is better toward the center of the projectable area, and the expandability required in a case where additional notification information is generated due to a user operation on the notification information that is projected, it is desirable that the projection target area is located at the center of the projectable area of the projector 121 in the target posture. Therefore, the output control unit 155 controls the posture of the projector 121 so that the projection target area is included in the center of the projectable area of the projector 121.
A specific example of the present case will be described with reference to
Here, it is assumed that the user C is the notification target. However, since the confidentiality is low, the users A and B are not considered, and the projection target area 22 is set at any position within the visible range 40C of the user C. The projection target area 22 illustrated in
In the present specific example, since the confidentiality is low, the gaze attraction effect due to driving of the projector 121 does not have to be considered. Therefore, the output control unit 155 determines the drive parameter so that projection is performed as quickly as possible within a range in which stable operation of the drive mechanism of the projector 121 is realized.
Furthermore, in a case where there is another output unit 120 (for example, a display device such as a smartphone) at or near the location set as the projection target area, the output control unit 155 may cause the other output unit 120 to output notification information.
<3.2. Second Case>
The present case is a case where confidentiality of notification information is high and the projector 121 is not driven.
A specific example of the present case will be described with reference to
As illustrated in
In a case where the projector 121 has a zoom function, the output control unit 155 may cause the projector 121 to zoom out to expand the projectable area 21. As a result, it becomes possible to increase choices of locations for setting the projection target area.
Furthermore, the output control unit 155 may control the content of the notification information to be projected according to the position and/or size of the projection target area. This point will be described with reference to
Furthermore, in a case where there is another output unit 120 (for example, a display device such as a smartphone) at or near the location set as the projection target area, the output control unit 155 may cause the other output unit 120 to output notification information.
<3.3. Third Case>
The present case is a case where confidentiality of notification information is high and the projector 121 is driven. The overview of the present case will be described with reference to
As illustrated in
In such a case, in order to ensure confidentiality of the notification information, it is desirable to consider the following viewpoints. Hereinafter, a technology of ensuring confidentiality of the notification information will be described from the viewpoints.
First viewpoint: Minimize drive time
Second viewpoint: Make projection target area less likely to be grasped
Third viewpoint: Minimize driving sound
Fourth viewpoint: Reflect behavior of second user
First Viewpoint
The output control unit 155 calculates the minimum posture change amount with which the projection target area can be positioned within the projectable area, and changes the posture of the projector 121 by the calculated change amount. For example, in a case where the projection target area on which notification information is projected is outside the projectable area of the projector 121, the output control unit 155 changes the posture of the projector 121 so that the center of the projectable area of the projector 121 moves along the straight line connecting the center of the current projectable area of the projector 121 and the projection target area (for example, the center of the projection target area). Specifically, the output control unit 155 sets the projectable area centered on the projection target area as the target projectable area, takes the posture that realizes the target projectable area as the target posture, and determines the drive parameter so that the posture change from the current posture to the target posture becomes linear. Linear posture change means that the posture change amount per unit time is constant. Such control can minimize the movement distance of the projectable area (that is, the posture change amount of the projector 121). That is, the drive time of the projector 121 can be minimized.
Furthermore, in a case where the projection target area on which notification information is projected is outside the projectable area of the projector 121, the output control unit 155 stops the posture change of the projector 121, using, as a trigger, the entry of the projection target area into the projectable area of the projector 121 from outside. The projection target area being outside the projectable area means that at least part of the projection target area is outside the projectable area. The projection target area being within the projectable area means that the entirety of the projection target area is located inside the projectable area. Such posture control can reduce the posture change amount and shorten the drive time as compared with the case where the posture is changed until the projection target area becomes located at the center of the projectable area.
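For example, in one dimension (the pan direction), the minimum posture change and the stop trigger described above may be sketched as follows; the angular representation is a hypothetical simplification and assumes that the projection target area is narrower than the projectable area.

    def minimal_pan_change_deg(current_pan_deg: float,
                               area_min_deg: float,
                               area_max_deg: float,
                               half_projectable_deg: float) -> float:
        """Smallest pan change that brings the whole projection target area (given as an
        angular interval) just inside the projectable area (illustrative 1-D sketch)."""
        lo = current_pan_deg - half_projectable_deg  # current projectable interval of the projector
        hi = current_pan_deg + half_projectable_deg
        if area_min_deg >= lo and area_max_deg <= hi:
            return 0.0  # already inside the projectable area: the posture is not changed
        if area_max_deg > hi:
            return area_max_deg - hi   # rotate just far enough that the area enters from the edge
        return area_min_deg - lo       # negative change: rotate the other way by the minimum amount

Stopping at this point leaves the projection target area at an end part of the projectable area, which also serves the second viewpoint described below.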
A specific example of the control described above will be described with reference to
Second Viewpoint
Stopping the posture change of the projector 121 according to the trigger described in the first viewpoint above is also effective from the second viewpoint. In a case where the posture change of the projector 121 is stopped according to that trigger, the projection target area is located at an end part of the projectable area, that is, away from the center of the projectable area. Therefore, even if the second user visually recognizes the projector 121 that is projecting the notification information, it is possible to make the second user less likely to grasp where in the projectable area of the projector 121 the notification information is projected.
The output control unit 155 may control the posture of the projector 121 at the time when the notification information is projected such that the center of the projectable area of the projector 121 is between the first user and the second user. The output control unit 155 sets the projection target area for projecting the notification information whose notification target is the first user, in the area on the first user side in the projectable area. In this case, since the projection direction is directed toward the space between the first user and the second user, the position of the projection target area can be made less likely to be grasped by the second user. A specific example of such control will be described with reference to
Third Viewpoint
In a case where confidentiality of the notification information is high, the output control unit 155 may set the posture change speed of the projector 121 to be slower than the posture change speed of the projector 121 in a case where confidentiality of notification information is low. Specifically, the output control unit 155 decreases the posture change speed in a case where the confidentiality is high, and increases the posture change speed in a case where the confidentiality is low. Typically, the faster the posture change speed is, the louder the driving sound of the projector 121 is, and the slower the posture change speed is, the quieter the drive sound of the projector 121 is. Then, the louder the driving sound is, the more easily the second user notices that the projector 121 is driving to change the posture. Therefore, in a case where the confidentiality is high, by decreasing the posture change speed and lowering the driving sound, driving of the projector 121 can be made less noticeable to the second user.
The output control unit 155 may control the posture change speed according to the volume of the environment sound. Specifically, the output control unit 155 increases the posture change speed as the volume of the environment sound increases, and decreases the posture change speed as the volume of the environment sound decreases. This is because as the volume of the environment sound increases, the volume of the driving sound relatively decreases, and driving of the projector 121 becomes less noticeable to the second user.
The output control unit 155 may control the posture change speed according to the distance between the projector 121 and the second user. Specifically, the output control unit 155 increases the posture change speed as the distance between the projector 121 and the second user increases, and decreases the posture change speed as the distance decreases. This is because the larger the distance between the projector 121 and the second user is, the harder it becomes for the second user to hear the driving sound of the projector 121.
In a case where confidentiality of notification information is high, the output control unit 155 may set the volume of the environment sound around the projector 121 to be louder than the volume of the environment sound around the projector 121 in a case where confidentiality of notification information is low. Specifically, the output control unit 155 increases the volume of the environment sound in a case where the confidentiality is high and decreases the volume of the environment sound in a case where the confidentiality is low. The environmental sound here is, for example, background music (BGM) or the like reproduced in the physical space 30. In a case where the confidentiality is high, by increasing the environment sound and relatively decreasing the volume of the driving sound, driving of the projector 121 can be made less noticeable to the second user.
The output control unit 155 performs the control described above on the basis of a sound collection result of a microphone installed in the physical space 30, a sensor device that monitors the operation sound of the projector 121, or the like. Alternatively, the output control unit 155 may perform the control described above by referring to a table set in advance. In a case where the second user is wearing headphones or the like, the control described above does not have to be performed. Furthermore, the output control unit 155 may set a fan or the like of the main body of the projector 121 as a target for noise reduction.
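A minimal sketch of how such a table-based, microphone-informed decision could look is given below. The noise table values, the 6 dB masking margin, and the headphone rule are assumptions, and the function name `bgm_gain_db` is hypothetical.

```python
# Preset table (illustrative values) giving, for each projector drive speed,
# an expected driving-sound level in dB measured in advance.
DRIVE_NOISE_TABLE_DB = {10.0: 32.0, 20.0: 38.0, 30.0: 44.0}

def bgm_gain_db(high_confidentiality, drive_speed, ambient_db,
                second_user_wears_headphones, margin_db=6.0):
    """Return how many dB to raise the BGM so that the driving sound is masked.

    A minimal sketch: the table values, the margin, and the headphone rule
    are assumptions rather than details taken from the embodiment.
    """
    if not high_confidentiality or second_user_wears_headphones:
        return 0.0  # no masking needed

    drive_db = DRIVE_NOISE_TABLE_DB.get(drive_speed, max(DRIVE_NOISE_TABLE_DB.values()))
    # Raise the BGM only if the ambient level does not already mask the drive noise.
    needed = drive_db + margin_db - ambient_db
    return max(0.0, needed)

print(bgm_gain_db(True, drive_speed=20.0, ambient_db=35.0,
                  second_user_wears_headphones=False))  # boost BGM
print(bgm_gain_db(True, drive_speed=20.0, ambient_db=50.0,
                  second_user_wears_headphones=False))  # ambient already masks it
```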
Fourth Viewpoint
In the fourth viewpoint, in a case where confidentiality of the notification information is high, the output control unit 155 imposes a restriction on the posture change of the projector 121 on the basis of the user information of the second user. Specifically, in a case where confidentiality of the notification information is high, the output control unit 155 determines whether or not to impose a restriction on the posture change of the projector 121 depending on whether or not the projector 121 is located within the visible range of the second user.
In a case where confidentiality of the notification information is high, the output control unit 155 may determine whether or not to change the posture of the projector 121 depending on whether or not the projector 121 is located within the visible range of the second user. Specifically, the output control unit 155 does not change the posture of the projector 121 in a case where the projector 121 is located within the visible range of the second user. In contrast, in a case where the projector 121 is located outside the visible range of the second user (that is, a case where the projector 121 is not located within the visible range of the second user), the output control unit 155 changes the posture of the projector 121. Therefore, it is possible to prevent the second user from visually recognizing the state where the projector 121 is being driven, and thus to prevent the eyes of the second user from being attracted to the projected notification information by using the projection direction of the projector 121 that is being driven as a clue.
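The decision above can be pictured as a simple gate on the posture change, as in the following sketch. The 120 degree field of view and the 10 m range used for the visibility test, and the helper names, are assumptions for illustration.

```python
import math

def within_visible_range(user_pos, user_facing_rad, projector_pos,
                         fov_rad=math.radians(120.0), max_range_m=10.0):
    """Rough visibility test: is the projector inside the second user's field of view?

    The 120-degree field of view and 10 m range are illustrative assumptions.
    """
    dx = projector_pos[0] - user_pos[0]
    dy = projector_pos[1] - user_pos[1]
    distance = math.hypot(dx, dy)
    if distance > max_range_m:
        return False
    angle_to_projector = math.atan2(dy, dx)
    # Smallest signed angle between the user's facing direction and the projector.
    diff = math.atan2(math.sin(angle_to_projector - user_facing_rad),
                      math.cos(angle_to_projector - user_facing_rad))
    return abs(diff) <= fov_rad / 2.0

def may_change_posture(high_confidentiality, second_user_pos, second_user_facing_rad,
                       projector_pos):
    """Allow a posture change only when it will not be seen by the second user."""
    if not high_confidentiality:
        return True
    return not within_visible_range(second_user_pos, second_user_facing_rad, projector_pos)

# Second user at the origin facing +x; a projector behind the user is outside the range.
print(may_change_posture(True, (0.0, 0.0), 0.0, (-3.0, 0.0)))  # True: posture change allowed
print(may_change_posture(True, (0.0, 0.0), 0.0, (3.0, 0.0)))   # False: wait
```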
In a case where confidentiality of the notification information is high, the output control unit 155 may control a noise reduction process of the projector 121 depending on whether or not the projector 121 is located within the visible range of the second user. For example, the output control unit 155 performs the control according to the third viewpoint described above in a case where the projector 121 is located within the visible range of the second user, and does not perform the control in a case where the projector 121 is located outside the visible range. Alternatively, the output control unit 155 may control the degree of noise reduction (for example, the drive speed) depending on whether or not the projector 121 is located within the visible range of the second user.
The output control unit 155 may separately notify the first user of notification information with high priority via a personal terminal or a wearable device of the first user. In this case, the first user is notified of the notification information by means of an image, sound, or vibration. In contrast, regarding notification information with low priority, the output control unit 155 may postpone posture control and projection until the condition is satisfied (that is, until the projector 121 is located outside the visible range of the second user).
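The routing by priority described above could look roughly like the following sketch. The callback-style interface, the queue, and the function names are assumptions for illustration, not the embodiment's API.

```python
from collections import deque

# Queue of postponed, low-priority notification texts (illustrative representation).
pending = deque()

def route_notification(text, high_priority, projector_blocked, notify_wearable, project):
    """Route a notification while the projector is blocked (within the second
    user's visible range).  High-priority items go to a personal/wearable device
    immediately; low-priority items are queued until the projector is free.

    `notify_wearable` and `project` are caller-supplied callbacks; this is an
    illustrative sketch, not the embodiment's interface.
    """
    if not projector_blocked:
        project(text)
    elif high_priority:
        notify_wearable(text)          # image, sound, or vibration on the user's device
    else:
        pending.append(text)           # postpone until the condition is satisfied

def flush_pending(project):
    """Project queued low-priority notifications once the projector is unblocked."""
    while pending:
        project(pending.popleft())

# Example usage with print-based stand-ins for the real outputs.
route_notification("meeting at 15:00", True, projector_blocked=True,
                   notify_wearable=lambda t: print("wearable:", t),
                   project=lambda t: print("projected:", t))
route_notification("laundry is done", False, projector_blocked=True,
                   notify_wearable=lambda t: print("wearable:", t),
                   project=lambda t: print("projected:", t))
flush_pending(lambda t: print("projected:", t))
```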
Note that the output control unit 155 may determine whether or not to impose a restriction, and the content of the restriction, without taking into consideration a second user who is far from the projector 121. This is because driving of the projector 121 located far away is less likely to be noticed. The distance serving as a criterion for whether or not the second user is taken into consideration can be set on the basis of the visual acuity of the second user and the size of the projector 121.
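As an example of how such a distance criterion could be derived from the visual acuity of the second user and the size of the projector 121, the following sketch compares the projector's apparent angular size with an acuity-scaled threshold. The 120 arcminute base threshold and the scaling rule are assumptions, not values from the embodiment.

```python
import math

def can_ignore_second_user(distance_m, projector_size_m, visual_acuity,
                           threshold_arcmin=120.0):
    """Decide whether a second user is far enough away to be ignored.

    The rule used here, apparent angular size below a fixed threshold scaled by
    the user's decimal visual acuity, is an illustrative assumption.
    """
    # Apparent angular size of the projector as seen by the second user.
    angular_size_rad = 2.0 * math.atan(projector_size_m / (2.0 * distance_m))
    angular_size_arcmin = math.degrees(angular_size_rad) * 60.0

    # A user with lower acuity needs a larger apparent size to notice the driving.
    return angular_size_arcmin < threshold_arcmin / visual_acuity

print(can_ignore_second_user(distance_m=8.0, projector_size_m=0.2, visual_acuity=1.0))  # True
print(can_ignore_second_user(distance_m=1.5, projector_size_m=0.2, visual_acuity=1.0))  # False
```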
Furthermore, the output control unit 155 may determine whether or not to impose a restriction and the content of the restriction without considering the second user with low activity.
Furthermore, in a case where there is another output unit 120 (for example, a display device such as a smartphone) at or near the location set as the projection target area, the output control unit 155 may cause the other output unit 120 to output notification information.
<<4. Process Flow>>
Overall Process Flow
In a case where it is determined that the confidentiality of the notification information is lower than the threshold (step S106/YES), the output control unit 155 sets the projection target area at the location which is most visible to the first user (step S108). Next, the output control unit 155 controls the posture of the projector 121 so that the projection target area is located at the center of the projectable area (step S110). Then, the output control unit 155 projects the notification information on the projection target area (step S126). The case where the notification information is projected in this way corresponds to the first case described above.
In a case where it is determined that the confidentiality of the notification information is higher than the threshold (step S106/NO), the output control unit 155 sets the projection target area at a location visible only to the first user (step S112). Next, the output control unit 155 determines whether or not the projection target area is included in the projectable area of the projector 121 (step S114).
In a case where it is determined that the projection target area is included in the projectable area of the projector 121 (step S114/YES), the output control unit 155 projects the notification information on the projection target area (step S126). The case where the notification information is projected in this way corresponds to the second case described above.
In a case where it is determined that the projection target area is not included in the projectable area of the projector 121 (step S114/NO), the output control unit 155 calculates the minimum posture change amount with which the projection target area can be positioned within the projectable area (step S116). Next, the output control unit 155 determines whether or not the projector 121 is within the visible range of the second user (step S118). In a case where it is determined that the projector 121 is within the visible range of the second user (step S118/YES), the process proceeds to step S124. In contrast, in a case where it is determined that the projector 121 is outside the visible range of the second user (step S118/NO), the output control unit 155 sets a drive parameter for changing the posture on the basis of the posture change amount calculated in step S116 described above (step S120). Next, the output control unit 155 controls the posture of the projector 121 on the basis of the drive parameter (step S122). Thereafter, the output control unit 155 determines whether or not the projection target area has entered the projectable area (step S124). In a case where it is determined that the projection target area is not within the projectable area (step S124/NO), the process returns to step S118. In contrast, in a case where it is determined that the projection target area has entered the projectable area (step S124/YES), the output control unit 155 projects the notification information on the projection target area (step S126). The case where the notification information is projected in this way corresponds to the third case described above.
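Read as code, the branching from step S106 to step S126 can be sketched as follows. The `env` object, its method names, and the toy `DummyEnv` used to exercise the flow are hypothetical stand-ins for the components described above; only the control flow mirrors the steps.

```python
def notification_flow(confidentiality, threshold, env):
    """Sketch of the overall flow (steps S106 to S126); method names are hypothetical."""
    if confidentiality < threshold:                               # S106 / YES
        area = env.most_visible_area_for_first_user()             # S108
        env.center_projector_on(area)                             # S110
        env.project(area)                                         # S126 (first case)
        return "first case"

    area = env.area_visible_only_to_first_user()                  # S112 (S106 / NO)
    if env.in_projectable_area(area):                             # S114 / YES
        env.project(area)                                         # S126 (second case)
        return "second case"

    delta = env.minimum_posture_change_to_reach(area)             # S116
    while True:
        if not env.projector_visible_to_second_user():            # S118 / NO
            env.change_posture(env.drive_parameters_for(delta))   # S120, S122
        if env.in_projectable_area(area):                         # S124 / YES
            break                                                 # S124 / NO: back to S118
    env.project(area)                                             # S126 (third case)
    return "third case"

class DummyEnv:
    """Toy environment: the second user looks away after one polling cycle."""
    def __init__(self):
        self.remaining_visible_cycles = 1
        self.driven = False
    def most_visible_area_for_first_user(self): return "table"
    def area_visible_only_to_first_user(self): return "desk"
    def in_projectable_area(self, area): return self.driven
    def minimum_posture_change_to_reach(self, area): return 30.0
    def projector_visible_to_second_user(self):
        self.remaining_visible_cycles -= 1
        return self.remaining_visible_cycles >= 0
    def drive_parameters_for(self, delta): return {"pan_deg": delta}
    def change_posture(self, params): self.driven = True
    def center_projector_on(self, area): pass
    def project(self, area): print("project on", area)

print(notification_flow(confidentiality=0.9, threshold=0.5, env=DummyEnv()))  # third case
```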
Flow of Posture Control Process in Third Case
<<5. Modifications>>
(1) First Modification
In a case where confidentiality of the notification information is high and the projector 121 is within the visible range of the second user, the output control unit 155 causes another projector 121 to project other notification information whose notification target is the second user in a direction different from the projector 121 as viewed from the second user. That is, in a case where confidentiality of the notification information is high, the output control unit 155 controls two or more projectors 121, first using one projector 121 for the second user and then using another projector 121 for the first user. Since the line-of-sight of the second user is attracted to the notification information whose notification target is the second user, the projector 121 can be moved out of the visible range of the second user. Therefore, it is possible to remove the restriction imposed in a case where confidentiality of the notification information is high and the projector 121 is within the visible range of the second user. This point will be described in detail below.
The process performed in this case proceeds as follows. Specifically, as illustrated in the left diagram of the corresponding figure, the projector 121 that is to project the notification information whose notification target is the first user is located within the visible range of the second user, so a restriction is imposed on the posture change. Next, as illustrated in the center diagram, another projector 121 projects other notification information whose notification target is the second user in a direction different from the projector 121 as viewed from the second user, so that the line-of-sight of the second user is attracted to the other notification information. Then, as illustrated in the right diagram, since the projector 121 is no longer located within the visible range of the second user, the projector 121 changes the posture and projects the notification information whose notification target is the first user on the projection target area.
Note that, in the above, an example has been described in which, of notification information with high confidentiality and notification information with low confidentiality, the notification information with low confidentiality is used to secure the confidentiality of the notification information with high confidentiality. The output control unit 155 may rank the relative confidentiality of pieces of notification information that have been received and have not yet been conveyed, and perform notification in ascending order of confidentiality in the same manner as described above, thereby securing the confidentiality of the notification information with relatively high confidentiality.
Furthermore, in a case where there is a plurality of projectors 121, the output control unit 155 may select a projector located outside the visible range of the second user as the projector 121 that projects the notification information whose notification target is the first user. In this case, it becomes possible to notify the first user of notification information with high confidentiality without performing the process of moving the projector 121 out of the visible range of the second user.
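The selection among a plurality of projectors could be sketched as below. The list-and-predicate interface and the names are hypothetical; preference is given to a projector outside the second user's visible range, and otherwise the two-projector gaze-attracting procedure of this modification would apply.

```python
def choose_projector(projectors, second_user_visible):
    """Pick a projector for confidential notification to the first user.

    `projectors` is a list of projector handles and `second_user_visible(p)` is a
    caller-supplied predicate; both names are hypothetical.
    """
    for p in projectors:
        if not second_user_visible(p):
            return p, False   # no need to attract the second user's gaze first
    return projectors[0], True  # fall back: attract the gaze with another projector

projectors = ["projector_A", "projector_B"]
chosen, needs_gaze_step = choose_projector(projectors,
                                           second_user_visible=lambda p: p == "projector_A")
print(chosen, needs_gaze_step)  # projector_B False
```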
(2) Second Modification
The output control unit 155 may change the posture of the projector 121 and project the notification information, and then return the posture of the projector 121 to a predetermined posture. The predetermined posture may be the posture before the change or may be an initial posture set in advance. As a result, the history of the posture change is erased. Therefore, it is possible to prevent the second user from noticing, after projection of the notification information is finished, that the first user has been notified of the notification information.
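A minimal sketch of this restore-after-projection behavior is shown below, assuming a projector object with hypothetical `get_posture()` and `set_posture()` accessors; the context-manager pattern is an illustrative design choice, not the embodiment's interface.

```python
class PostureGuard:
    """Context manager that restores the projector posture after projection,
    so that no history of the posture change remains."""
    def __init__(self, projector, initial_posture=None):
        self.projector = projector
        self.restore_to = initial_posture  # None: restore the posture before the change

    def __enter__(self):
        if self.restore_to is None:
            self.restore_to = self.projector.get_posture()
        return self.projector

    def __exit__(self, exc_type, exc, tb):
        self.projector.set_posture(self.restore_to)  # erase the posture-change history
        return False

class DummyProjector:
    """Toy projector holding a (pan, tilt) posture in degrees."""
    def __init__(self):
        self.posture = (0.0, 0.0)
    def get_posture(self):
        return self.posture
    def set_posture(self, p):
        self.posture = p

proj = DummyProjector()
with PostureGuard(proj):
    proj.set_posture((45.0, 10.0))       # drive toward the projection target area and project
print(proj.posture)                      # (0.0, 0.0): restored after projection
```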
(3) Third Modification
The output control unit 155 may dim an indicator, such as an LED indicating energization of the projector 121 or the like, while the projector 121 is being driven. As a result, driving of the projector 121 can be made less noticeable to the second user.
(4) Fourth Modification
In projection of the notification information, the output control unit 155 may control the posture of the projector 121 or the environment light around the projector 121 (for example, room lighting) so that an area whose brightness exceeds a predetermined threshold is included in the projectable area of the projector 121. In particular, the output control unit 155 controls the posture of the projector 121 or the environment light so that the brightness of the area other than the projection target area in the projectable area exceeds the predetermined threshold. The projector 121 projects solid black on the portion of the projectable area other than the projection target area, and this solid black portion can be visually recognized by the second user. By performing the control described above, the solid black portion can be made inconspicuous.
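The decision behind this modification can be summarized as in the following small sketch; the 150 lux threshold and the two returned actions are assumptions introduced only to illustrate the brightness check.

```python
def masking_action(non_target_brightness_lux, threshold_lux=150.0):
    """Decide how to make the solid-black portion of the projectable area
    inconspicuous.  The 150 lux threshold and the actions are illustrative."""
    if non_target_brightness_lux >= threshold_lux:
        return "no action"  # the black fill is already washed out by ambient light
    # Either brighten the room lighting or re-aim the projector so that a brighter
    # area surrounds the projection target area.
    return "raise environment light or re-aim projector"

print(masking_action(60.0))   # raise environment light or re-aim projector
print(masking_action(300.0))  # no action
```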
(5) Fifth Modification
The output control unit 155 may drive the projector 121 located within the visible range of the first user instead of performing projection. In this case, it is possible to notify the first user of at least the fact that there is notification information addressed to the first user.
<<6. Summary>>
An embodiment of the present disclosure has been described above in detail with reference to the accompanying drawings. The information processing system 100 according to the embodiment controls driving of the projector 121 in accordance with the confidentiality of the notification information.
More specifically, in a case where confidentiality of the notification information is high, the information processing system 100 controls whether or not to impose a restriction on driving of the projector 121 according to whether or not the projector 121 is located within the visible range of the second user. For example, the output control unit 155 does not change the posture of the projector 121 in a case where the projector 121 is located within the visible range of the second user, and changes the posture of the projector 121 in a case where the projector 121 is located outside the visible range of the second user. Therefore, it is possible to prevent the second user from visually recognizing the state where the posture of the projector 121 is changed, and thus to prevent the eyes of the second user from being attracted to the notification information.
While a preferred embodiment of the present disclosure has been described in detail with reference to the accompanying drawings, the technical scope of the present disclosure is not limited to such examples. It is obvious that a person skilled in the art to which the present disclosure pertains can conceive various modifications and corrections within the scope of the technical idea described in the claims, and it is naturally understood that these also belong to the technical scope of the present disclosure.
For example, the information processing system 100 may be realized as a single device, or part or all of the information processing system 100 may be realized as separate devices. For example, in the functional configuration example of the information processing system 100 described above, part of the functional units may be provided in a device, such as a server, separate from the other functional units.
Note that the series of processes performed by each device described in the present Description may be realized by using any of software, hardware, and a combination of software and hardware. The program configuring the software is stored in advance in a storage medium (non-transitory media) provided inside or outside each device, for example. Then, each program is read into the RAM, for example, when the computer executes the program, and is executed by a processor such as a CPU. The storage medium described above is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. Furthermore, the computer program described above may be distributed, for example, via a network without using a storage medium.
Furthermore, the processes described in the present Description by using the flowcharts and the sequence diagrams do not necessarily have to be executed in the illustrated order. Some process steps may be performed in parallel. In addition, additional process steps may be adopted, and some process steps may be omitted.
Furthermore, the effects described in the present Description are merely illustrative or exemplary, and are not restrictive. That is, the technology according to the present disclosure can exhibit other effects that are apparent to those skilled in the art from the description of the present Description, in addition to or in lieu of the effects described above.
Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus including a control unit configured to control a projection process of notification information including posture control of a projection device on the basis of spatial information of a space where the projection device can perform projection, information indicating the position and the posture of the projection device, information indicating confidentiality of the notification information, information of a first user who is a notification target of the notification information, and information of a second user who is not a notification target of the notification information.
(2)
The information processing apparatus according to (1), in which the control unit imposes a restriction on a change in the posture of the projection device in a case where the information indicating the confidentiality satisfies a predetermined condition.
(3)
The information processing apparatus according to (2), in which the control unit determines whether or not to change the posture of the projection device according to whether or not the projection device is located within a visible range of the second user.
(4)
The information processing apparatus according to (3), in which the control unit does not change the posture of the projection device in a case where the projection device is located within the visible range of the second user, and changes the posture of the projection device in a case where the projection device is located outside the visible range of the second user.
(5)
The information processing apparatus according to (4), in which in a case where a projection target area where the notification information is projected is outside a projectable area of the projection device, the control unit changes the posture of the projection device so that a center of a projectable area of the projection device passes along a straight line connecting a center of a current projectable area of the projection device and the projection target area.
(6)
The information processing apparatus according to (4) or (5), in which in a case where a projection target area where the notification information is projected is outside a projectable area of the projection device, the control unit stops a posture change of the projection device with entrance of the projection target area into the projectable area of the projection device from outside as a trigger.
(7)
The information processing apparatus according to any one of (2) to (6), in which the predetermined condition is that the information indicating the confidentiality indicates that the notification information is information that should be kept confidential.
(8)
The information processing apparatus according to any one of (1) to (7), in which the control unit controls the posture of the projection device in projection of the notification information so that the center of the projectable area of the projection device is located between the first user and the second user.
(9)
The information processing apparatus according to any one of (1) to (8), in which the control unit causes another projection device to project other notification information whose notification target is the second user in a direction different from the projection device as viewed from the second user in a case where the projection device is within a visible range of the second user.
(10)
The information processing apparatus according to any one of (1) to (9), in which in a case where the information indicating the confidentiality satisfies a predetermined condition, the control unit sets posture change speed of the projection device to be slower than posture change speed of the projection device in a case where the information indicating the confidentiality does not satisfy the predetermined condition.
(11)
The information processing apparatus according to any one of (1) to (10), in which in a case where the information indicating the confidentiality satisfies a predetermined condition, the control unit sets a volume of environment sound around the projection device to be louder than a volume of environment sound around the projection device in a case where the information indicating the confidentiality does not satisfy the predetermined condition.
(12)
The information processing apparatus according to any one of (1) to (11), in which in a case where the information indicating the confidentiality satisfies a predetermined condition, the control unit changes the posture of the projection device to project the notification information and then returns the posture of the projection device to a predetermined posture.
(13)
The information processing apparatus according to any one of (1) to (12), in which the control unit controls the posture of the projection device or environment light around the projection device so that an area whose brightness exceeds a predetermined threshold is included in a projectable area of the projection device in projection of the notification information.
(14)
The information processing apparatus according to any one of (1) to (13), in which the control unit sets a projection target area where the notification information is projected within a visible range of the first user and outside a visible range of the second user in a case where the information indicating the confidentiality satisfies a predetermined condition.
(15)
The information processing apparatus according to any one of (1) to (14), in which the control unit causes the projection device to project the notification information without changing or by changing the posture of the projection device.
(16)
The information processing apparatus according to any one of (1) to (15), in which the information of the second user includes information indicating activity of the second user.
(17)
An information processing method including causing a processor to control a projection process of notification information including posture control of a projection device on the basis of spatial information of a space where the projection device can perform projection, information indicating the position and the posture of the projection device, information indicating confidentiality of the notification information, information of a first user who is a notification target of the notification information, and information of a second user who is not a notification target of the notification information.
(18)
A program for causing a computer to function as a control unit configured to control a projection process of notification information including posture control of a projection device on the basis of spatial information of a space where the projection device can perform projection, information indicating the position and the posture of the projection device, information indicating confidentiality of the notification information, information of a first user who is a notification target of the notification information, and information of a second user who is not a notification target of the notification information.
Number | Date | Country | Kind
---|---|---|---
2018-108449 | Jun 2018 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/020704 | 5/24/2019 | WO | 00