The present disclosure relates to an information processing system, a control method, and a control program.
In recent years, from the viewpoint of countermeasures against infectious diseases, cost reduction, and the like, an increasing number of companies have been introducing working from home as a work pattern for employees. An employee working from home performs work on his/her own terminal device by remotely accessing an in-house server and using data or the like stored in the server. In addition, the employee working from home often uses his/her own terminal device for online conferences, meetings, and the like via various applications.
Communication in a remote environment such as working from home is mainly performed using sound and video. For example, for a remote conference, sound picked up by a microphone and video captured by a camera are output to the terminals of the other participants of the conference. Participants confirm the facial expressions of the respective participants in the video, and the conference proceeds on the basis of the words output as sound.
In this way, as a technology for conveying various information to other persons connected online, the following technologies have been proposed. For example, a technology has been proposed to collect relevant data associated with a video event in video data captured by a camera, select a feature parameter based on the collected relevant data according to the type of the video event, and generate tactile effects from the selected feature parameter. In addition, a technology has been proposed to add tactile information to a medium file together with a time stamp, as tag information, reproduce sound data or video content on the basis of the medium file, and output the tactile information added as the tag information.
Patent Literature 1: JP 2018-73424 A
Patent Literature 2: JP 2011-501575 A
However, in a case where the work pattern is shifted from working in office to working from home, it may be difficult to understand what a person is feeling only from the sound and the video captured by the camera. Therefore, the remote environment tends to provide insufficient communication with other persons.
Therefore, the present disclosure provides an information processing system, a control method, and a control program that activate communication with other persons, in the remote environment.
According to the present disclosure, a signal generation unit generates a vibration signal based on vibration measured by a vibration sensor provided at a mat and a result of detection of an object placed on the mat. A signal output unit vibrates a vibration mechanism based on the vibration signal generated by the signal generation unit.
Embodiments of the present disclosure will be described below in detail with reference to the drawings. Note that in the following embodiments, the same portions are denoted by the same reference numerals and symbols, and a repetitive description thereof will be omitted.
For example, in the remote environment, opportunities for chatting tend to decrease. One survey shows that, when working in the office, the largest number of respondents answered that they chatted three times per day, whereas in the remote environment the largest number answered that they chatted approximately once in three days.
Furthermore, in the remote environment, opportunities for a person to start a chat by himself/herself also tend to decrease. For example, in a survey, the largest number of people answered that the ratio of chats they started themselves was approximately 20% in both work patterns, working in office and working from home. However, the survey shows that, compared with working in office, the number of people who answered 20% decreased and the number of people who answered 0% increased in the remote environment.
In this way, there is a large psychological hurdle to having a chat in the remote environment, making it difficult to talk casually with other people. The reduced chats may cause various adverse effects, such as difficulty in asking for advice and difficulty in obtaining various information related to work, possibly leading to a decrease in work efficiency.
Therefore, in order to activate communication with another person in the remote environment, in the remote communication system 1 according to the present embodiment, a mat speaker 10 and a camera 20 are arranged, in addition to a terminal device 30 for communication, as illustrated in
An operator who performs remote communication by using the mat speaker 10 uses the microphone 110 provided at the top plate 101 to transmit sound to a person on the other end of the line. Furthermore, the microphone 110 also functions as a vibration sensor, and detects the weight, arrangement position, contact speed, and the like of an object arranged on the top plate 101.
In addition, when the operator brings his/her finger into contact with the vibrator 120 of the communication mechanism 102, the operator can feel vibration on the finger. Furthermore, even in a state where the finger is separated from the vibrator 120, the operator can feel the vibration of the vibrator 120 due to the vibration of air or the like. An eccentric motor or the like may be added to the vibrator 120 so that the operator can more easily feel the vibration with the finger separated. In addition, when the operator presses the call button provided in the communication mechanism 102, the operator can talk with the person with whom the operator remotely communicates by using the microphone 110 provided at the top plate 101, and the sound is transmitted to that person.
The camera 20 captures an image from a side facing the top plate 101. It is preferable for the camera 20 not to capture an image of a region displaced from the top plate 101, and it is more preferable to have an angle of view for capturing an image of the entire surface of the top plate 101. The image captured by the camera 20 is transmitted to the terminal device 30.
In this way, the remote communication system 1 as the information processing system includes the mat speaker 10 that serves as a mat, the vibrator 120 that is a vibration mechanism, and the terminal device 30. Furthermore, the mat includes the microphone 110 as the vibration sensor. Furthermore, the camera 20 captures an image of an object placed on the top plate 101.
Next, details of the terminal device 30 that is an information processing device will be described with reference to
The input device 306 is, for example, a keyboard or a mouse. Furthermore, the display device 308 is, for example, a monitor. The operator refers to a screen displayed on the display device 308 to input a command to the terminal device 30, with the input device 306.
Furthermore, the operator uses the input device 306 to operate a button 201 for turning on and off transmission and reception of the video, and selects whether to perform mutual transmission and reception of the video captured by the camera 20. When the button 201 is turned on, the image of the camera 20 is transmitted and received between the terminal device 30 operated by the operator and the terminal device 31 operated by the person on the other end of the line. The image of the camera 20 transmitted from the terminal device 31 is displayed as an image 202 on the operation screen 200. In addition, when the button 201 is turned on, vibration generated on the basis of vibration collected by the microphone 110 of the top plate 101 is mutually transmitted and received between the terminal device 30 operated by the operator and the terminal device 31 operated by the person on the other end of the line.
Returning to
Furthermore, the input/output control unit 307 receives, from the communication control unit 305, an input of the image of the camera 20 transmitted from the terminal device 31 via the network. Then, the input/output control unit 307 causes the display device 308 to display the acquired image thereon.
The contact detection unit 301 receives an input of the sound collected by the microphone 110. In addition, the contact detection unit 301 receives, from the microphone 110, an input of pressure vibration caused upon contact of the object placed on the top plate 101.
Returning to
On the other hand, when the contact of the object with the top plate 101 is detected, the contact detection unit 301 determines whether a certain time period, such as 1 to 2 seconds, has elapsed from a previous contact. This prevents unnecessary detection of contact due to chattering.
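The debounce check described above can be sketched as follows. This is a minimal illustration, not the actual implementation: the 1.5-second interval is an assumed value within the 1-to-2-second range mentioned, and the class and method names are hypothetical.

```python
import time

# Assumed debounce interval, within the 1-2 second range described above.
DEBOUNCE_SECONDS = 1.5

class ContactDetector:
    """Tracks contact events and suppresses repeated detections due to chattering."""

    def __init__(self, debounce_seconds=DEBOUNCE_SECONDS):
        self.debounce_seconds = debounce_seconds
        self._last_contact = None

    def register_contact(self, now=None):
        """Return True if this contact should trigger the vibration effect,
        or False if it is treated as chattering (too soon after the last one)."""
        now = time.monotonic() if now is None else now
        is_chatter = (self._last_contact is not None
                      and (now - self._last_contact) < self.debounce_seconds)
        self._last_contact = now
        return not is_chatter
```

A contact arriving within the interval is reported as chattering, and the signal limited to the collected sound would then be generated instead.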
When the certain time period has not elapsed from the previous contact, it is determined that chattering has occurred, and the contact detection unit 301 outputs the sound collected by the microphone 110 to the signal generation unit 303. Then, the contact detection unit 301 instructs the signal generation unit 303 to generate the signal limited to the sound collected by the microphone 110.
On the other hand, when the certain time period has elapsed from the previous contact, the contact detection unit 301 determines that the object is placed on the top plate 101. Then, the contact detection unit 301 outputs the sound collected by the microphone 110 to the signal generation unit 303. Furthermore, the contact detection unit 301 instructs the signal generation unit 303 to add a vibration effect based on the image of the camera 20.
The image analysis unit 304 receives an input of the image captured by the camera 20 from the side facing an object mount surface of the top plate 101. Then, the image analysis unit 304 analyzes the image of the camera 20 to recognize the object placed on the top plate 101.
For example, the image analysis unit 304 acquires a large number of pieces of data each obtained by combining an image showing a specific object arranged on the top plate 101 and a name of the specific object, and performs machine learning using the acquired data as training data to generate an image recognition model. Then, the image analysis unit 304 inputs the image of the camera 20 to the image recognition model, for recognition of the object captured in the image. Thereafter, the image analysis unit 304 outputs a result of the recognition of the object placed on the top plate 101, to the signal generation unit 303. The object recognized by the image analysis unit 304 also includes an object in action, such as “a hand pressing the top plate 101” or “a spinning top”.
At this time, when the object is located at a position outside the angle of view of the camera 20, the image analysis unit 304 outputs a notification indicating absence of the object to the signal generation unit 303, as a result of the recognition. Furthermore, when it is difficult to identify the object, the image analysis unit 304 outputs a notification indicating failure in recognition to the signal generation unit 303, as a result of the recognition.
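The three kinds of recognition results passed to the signal generation unit 303 can be sketched as follows. This is an illustrative sketch only: the model interface and the confidence threshold are assumptions, not part of the disclosure.

```python
from enum import Enum

class RecognitionStatus(Enum):
    RECOGNIZED = "recognized"  # object identified successfully
    ABSENT = "absent"          # object outside the camera's angle of view
    FAILED = "failed"          # object could not be identified

def recognize_object(model, image):
    """Run an image-recognition model on a camera frame and normalize its
    output to the three result kinds described above.

    `model` is assumed to return (label, confidence) or None when no object
    is present in the frame."""
    prediction = model(image)
    if prediction is None:
        return RecognitionStatus.ABSENT, None
    label, confidence = prediction
    if confidence < 0.5:  # assumed confidence threshold
        return RecognitionStatus.FAILED, None
    return RecognitionStatus.RECOGNIZED, label
```

The signal generation unit would then branch on the status: a recognized label selects a vibration pattern, while absence or failure falls back to the sound-only signal.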
For example, in
Furthermore, the image analysis unit 304 outputs the image captured by the camera 20 to the communication control unit 305. However, the image captured by the camera 20 may be directly input to the communication control unit 305.
Returning to
The signal generation unit 303 receives an input of the sound collected by the microphone 110 from the contact detection unit 301. At the same time, the signal generation unit 303 receives either specification of generation of the signal limited to the sound or specification of generation of a signal to which the vibration effect is to be added. Furthermore, the signal generation unit 303 receives an input of the result of the recognition of the image, from the image analysis unit 304.
When the instruction for generation of the signal limited to the sound is given, the signal generation unit 303 converts the acquired sound into a vibration signal. Furthermore, the signal generation unit 303 performs filter processing by passing the converted vibration signal through a low-pass filter to remove vibration due to unnecessary sound data. Then, the signal generation unit 303 outputs the vibration signal to the communication control unit 305.
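The conversion with low-pass filtering can be sketched with a simple single-pole filter, keeping only the low-frequency content that a vibrator reproduces well. The sample rate, cutoff frequency, and filter design here are illustrative assumptions; any low-pass filter would serve the purpose described above.

```python
import math

def sound_to_vibration(samples, sample_rate=8000, cutoff_hz=200.0):
    """Convert sound samples into a vibration signal by low-pass filtering,
    removing high-frequency content that is unnecessary for vibration."""
    # Single-pole IIR low-pass filter: y[n] = y[n-1] + alpha * (x[n] - y[n-1])
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = dt / (rc + dt)
    filtered = []
    prev = 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)
        filtered.append(prev)
    return filtered
```

Low-frequency (slowly varying) input passes through nearly unchanged, while rapidly alternating input is strongly attenuated.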
On the other hand, when the instruction for generation of the signal to which the vibration effect is to be added is given, the signal generation unit 303 determines, from the result of the recognition of the image, whether the recognition has been successfully performed, the object is absent, or the recognition has failed. When the result of the recognition indicates the absence of the object or the failure in recognition, the signal generation unit 303 generates the signal limited to the sound collected by the microphone 110. In other words, the signal generation unit 303 converts the acquired sound into a vibration signal. Furthermore, the signal generation unit 303 performs filter processing by passing the converted vibration signal through a low-pass filter to remove vibration due to unnecessary sound data. Then, the signal generation unit 303 outputs the vibration signal to the communication control unit 305.
On the other hand, when the recognition of the image has been successfully performed, the signal generation unit 303 converts the acquired sound into the vibration signal. Next, the signal generation unit 303 adjusts the vibration signal obtained by converting the sound, by using pitch shifting or the like. Next, the signal generation unit 303 acquires a vibration pattern corresponding to the result of the recognition of the acquired image, from the vibration pattern information. Then, the signal generation unit 303 adds the acquired vibration pattern to the vibration signal obtained by converting the sound, for application of the vibration effect to the vibration signal obtained by converting the sound. Thereafter, the signal generation unit 303 outputs the vibration signal to which the vibration effect has been applied, to the communication control unit 305.
In other words, the signal generation unit 303 generates the vibration signal, on the basis of vibration measured by the vibration sensor provided at the mat and a result of the detection of the object placed on the mat. More specifically, the signal generation unit 303 generates the vibration signal, on the basis of the vibration measured by the vibration sensor and the result of the recognition of the object by the image analysis unit 304. More specifically, the signal generation unit 303 generates a basic vibration signal from the vibration measured by the vibration sensor, generates additional vibration on the basis of the result of the recognition of the object, and generates the vibration signal by adding the additional vibration to the basic vibration signal. Here, the basic vibration signal is a vibration signal generated from the sound collected by the microphone 110, and the additional vibration is a vibration signal having a vibration pattern corresponding to the result of the recognition of the image.
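The composition of the basic vibration signal and the additional vibration can be sketched as a sample-wise addition. This is a simplified illustration under the assumption that both signals are sequences of samples at the same rate; the function name and the gain parameter are hypothetical.

```python
def compose_vibration(basic, pattern, gain=1.0):
    """Add the additional vibration pattern to the basic vibration signal
    (sample-wise addition; the pattern is truncated to fit the basic signal)."""
    out = list(basic)
    for i, p in enumerate(pattern):
        if i >= len(out):
            break  # pattern longer than the basic signal: ignore the excess
        out[i] += gain * p
    return out
```

The result is the vibration signal to which the vibration effect has been applied, which is then output to the communication control unit.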
Here, in the present embodiment, when the object is outside the angle of view of the camera 20, the signal generation unit 303 has generated the signal limited to the sound collected by the microphone 110, but other processing may be performed. For example, when the object is outside the angle of view of the camera 20, the signal generation unit 303 may output nothing as the vibration signal. In addition, the signal generation unit 303 may weaken the vibration signal generated from the sound collected by the microphone 110 and output the weakened vibration signal.
Furthermore, the signal generation unit 303 has generated the vibration signal directly from the sound collected by the microphone 110. However, in addition, an action or the like may be estimated from the sound to apply a vibration pattern corresponding to the estimated action to the vibration signal, as the vibration effect.
Here, in the present embodiment, the vibration pattern information held by the signal generation unit 303 in advance has been described, but the method of acquiring the vibration pattern is not limited thereto. For example, the vibration pattern information may be held by an external device such as a server arranged in a cloud so that the signal generation unit 303 may download each vibration pattern from the external device for use. In addition, the signal generation unit 303 may transmit a result of the recognition to the external device so as to receive a vibration pattern corresponding to the result of the recognition selected by the external device, from the external device, for use as the vibration effect.
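The local lookup with an external fallback described above can be sketched as follows. This is an illustrative sketch only: the class name, the fetcher interface, and the caching behavior are assumptions, not part of the disclosure.

```python
class VibrationPatternStore:
    """Looks up vibration patterns locally, falling back to an external
    fetcher (e.g. a server arranged in a cloud) for patterns not held yet."""

    def __init__(self, local_patterns, fetch_remote=None):
        self._patterns = dict(local_patterns)
        # Assumed interface: a callable mapping a recognition result
        # to a vibration pattern, or None if the external device has none.
        self._fetch_remote = fetch_remote

    def get(self, recognition_result):
        """Return the pattern for a recognition result, downloading and
        caching it from the external device when not held locally."""
        pattern = self._patterns.get(recognition_result)
        if pattern is None and self._fetch_remote is not None:
            pattern = self._fetch_remote(recognition_result)
            if pattern is not None:
                self._patterns[recognition_result] = pattern
        return pattern
```

Sending the recognition result and receiving a pattern selected by the external device, as described above, would correspond to the fetcher doing the selection remotely.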
Returning to
Here, the communication control unit 305 transmitting the image captured by the camera 20 to the person with whom the operator remotely communicates also makes it possible for the operator to convey his/her intention to that person through the image of the camera 20.
Returning to
The signal output unit 302 receives an input of the vibration signal from the communication control unit 305. Then, the signal output unit 302 outputs the vibration signal to the vibrator 120 to vibrate the vibrator 120. In other words, the signal output unit 302 vibrates the vibration mechanism on the basis of the vibration signal transmitted from the other terminal device 31.
With this configuration, the vibrator 120 receives the vibration signal transmitted by the terminal device 31 of the person with whom the operator remotely communicates, and generates vibration to which the vibration effect is applied, according to the object placed on the top plate 101 of the person. The operator can feel the vibration generated by the vibrator 120, recognize the object placed on the top plate 101 by the person with whom the operator remotely communicates and displayed on the display device 308 also through the vibration, and obtain more information about the person with whom the operator remotely communicates.
As described above, the remote communication system 1 as the information processing system includes a plurality of terminal devices 30 as the information processing devices. The terminal device 30 includes the mat, the vibration mechanism, the signal generation unit 303, the communication control unit 305 that transmits the vibration signal generated by the signal generation unit 303 to another information processing device 31, and the signal output unit 302 that vibrates the vibration mechanism on the basis of the vibration signal transmitted from the other information processing device 31.
The contact detection unit 301 receives sound from the microphone 110 (Step S1).
When receiving the sound, the contact detection unit 301 determines whether contact of the object with the top plate 101 has been detected (Step S2).
When contact of the object is detected (Step S2: affirmative), the contact detection unit 301 determines whether the certain time period has elapsed from the previous vibration (Step S3).
When the certain time period has elapsed from the previous vibration (Step S3: affirmative), the signal generation unit 303 receives the instruction for generation of the signal to which the vibration effect is to be added, from the contact detection unit 301, together with the sound. Furthermore, the signal generation unit 303 receives an input of a result of the recognition of the image captured by the camera 20, from the image analysis unit 304. Then, the signal generation unit 303 determines whether the object placed on the top plate 101 has been recognized (Step S4).
When the object placed on the top plate 101 has been recognized (Step S4: affirmative), the signal generation unit 303 converts the sound into the vibration signal and further adjusts the vibration signal (Step S5).
Next, the signal generation unit 303 acquires a vibration pattern corresponding to the result of the recognition of the object placed on the top plate 101 by the image analysis unit 304, from the vibration pattern information. Next, the signal generation unit 303 adds the acquired vibration pattern to the vibration signal to apply the vibration effect to the vibration signal (Step S6). Then, the transmission process for the vibration signal proceeds to Step S9.
On the other hand, when no contact of the object is detected (Step S2: negative), when the certain time period has not yet elapsed from the previous vibration (Step S3: negative), or when recognition of the object is difficult or the object is outside the angle of view of the camera 20 (Step S4: negative), the signal generation unit 303 performs the following processing. The signal generation unit 303 converts the sound collected by the microphone 110 into the vibration signal (Step S7).
Next, the signal generation unit 303 passes the generated vibration signal through the low-pass filter, for filter processing (Step S8). Then, the transmission process for the vibration signal proceeds to Step S9.
The signal generation unit 303 outputs the vibration signal to the communication control unit 305. The communication control unit 305 transmits the acquired vibration signal to the terminal device 31 of the person with whom the operator remotely communicates, via the network (Step S9).
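The decision flow of Steps S2 through S8 can be sketched as a single function. This is a simplified illustration under stated assumptions: the sound-to-vibration conversion is reduced to a placeholder scaling, the low-pass filtering is elided, and all names are hypothetical.

```python
def generate_vibration_signal(sound, contact_detected, debounced, recognition, patterns):
    """Steps S2-S8 as one decision flow: the vibration effect is added only
    when contact was detected (S2), the debounce interval elapsed (S3), and
    the object was recognized (S4); otherwise a sound-only signal is made."""
    # Placeholder sound-to-vibration conversion (Steps S5/S7).
    signal = [0.5 * s for s in sound]
    if contact_detected and debounced and recognition in patterns:
        # Step S6: look up the vibration pattern and add it as the effect.
        pattern = patterns[recognition]
        return [v + (pattern[i] if i < len(pattern) else 0.0)
                for i, v in enumerate(signal)]
    # Steps S7-S8: sound-only signal (low-pass filtering elided in this sketch).
    return signal
```

In the actual process, the resulting signal would then be passed to the communication control unit for transmission (Step S9).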
Here, in the present embodiment, the terminal device 30 has transmitted the vibration signal to the terminal device 31 of the person with whom the operator remotely communicates, but, in addition to this configuration, the sound collected by the microphone 110 may be transmitted to the terminal device 31 of the person with whom the operator remotely communicates. Furthermore, in a case where the terminal device 30 includes a sound collection device other than the microphone 110 or an image acquisition device other than the camera 20, sound or an image obtained by the sound collection device or the image acquisition device may be transmitted to the terminal device 31 of the person with whom the operator remotely communicates.
Furthermore, in the present embodiment, when the object as a source of the vibration signal is arranged on the top plate 101, the terminal device 30 generates the vibration signal and transmits the generated vibration signal to the terminal device 31. However, in the remote communication system, the vibration signal may be generated by another device.
For example, the signal generation unit 303 is arranged at an external device such as a server in a cloud. Then, the terminal device 30 transmits the sound collected by the microphone 110 or the image of the camera 20, to the external device. The signal generation unit 303 of the external device uses the sound collected by the microphone 110 or the image of the camera 20, whichever is received from the terminal device 30, to generate the vibration signal to which the vibration effect corresponding to the result of the recognition is applied. Then, the external device transmits the generated vibration signal to the terminal device 31. Alternatively, the external device may transmit the generated vibration signal to the terminal device 30 so that the terminal device 30 transmits the vibration signal to the terminal device 31.
In addition, for example, the vibration signal may be generated by a signal generation unit 303 of the terminal device 31 that includes a vibrator 120 vibrated by the vibration signal. In this configuration, the terminal device 30 transmits the sound collected by the microphone 110 or the image of the camera 20, to the terminal device 31. The signal generation unit 303 of the terminal device 31 uses the sound collected by the microphone 110 or the image of the camera 20, whichever is received from the terminal device 30, to generate the vibration signal to which the vibration effect corresponding to the result of the recognition is applied. Then, a signal output unit 302 of the terminal device 31 outputs the generated vibration signal to vibrate the vibrator 120.
In other words, the remote communication system 1 as the information processing system includes the signal generation unit 303 that generates a vibration signal on the basis of vibration measured by the vibration sensor provided at the mat and a result of the detection of the object placed on the mat, and the signal output unit 302 that vibrates the vibration mechanism on the basis of the vibration signal generated by the signal generation unit 303.
As described above, when the object is placed on the top plate, the remote communication system according to the present embodiment adds, as the vibration effect, a vibration pattern corresponding to an image of the object placed on the top plate, to the vibration signal generated from the sound picked up by the microphone. Then, the vibration signal to which the vibration effect has been applied is transmitted to the terminal device of the person with whom the operator remotely communicates to vibrate the vibrator of a mat speaker.
In a case where the sound obtained when the object is placed is simply converted into the vibration signal and transmitted to the terminal device of the person with whom the operator remotely communicates to vibrate the vibrator, the resulting vibration is monotonous. Meanwhile, transmitting a vibration pattern according to information about the placed object and its appearance makes it possible to reproduce, for the person with whom the operator remotely communicates, vibration that is difficult to express by the vibration signal based on the sound alone, thus conveying more information about the placed object to that person. Therefore, it is possible to communicate with each other by using more information in remote communication, enabling activated communication.
In the first embodiment described above, the signal generation unit 303 has generated the signal limited to the sound collected by the microphone 110, upon failure in the recognition of the object by the image analysis unit 304, but it is also possible to add the vibration effect using the appearance of the object. Hereinafter, addition of the vibration effect by using the appearance of the object will be described.
Upon failure in the recognition of the object, the image analysis unit 304 recognizes the appearance of the object, such as a color of the object, a size of the object relative to a reference size, or movement of the object, and notifies the signal generation unit 303 of the appearance as appearance information. In addition, upon failure in the recognition of the object, the image analysis unit 304 notifies the signal generation unit 303 of the number of objects on the top plate 101 as the appearance information.
Upon failure in the recognition of the object, the signal generation unit 303 acquires a result of the recognition of the appearance of the object such as the color of the object, the size of the object relative to the reference size, or the movement of the object, from the image analysis unit 304, as the appearance information. Then, the signal generation unit 303 acquires a vibration pattern according to the acquired appearance information, and adds the vibration pattern as the vibration effect, to the vibration signal generated from the sound collected by the microphone 110.
For example, when the object has a black color, the signal generation unit 303 adds a vibration pattern representing sound, such as “thump,” representing heavy weight, as the vibration effect. Meanwhile, when the object has a white color, the signal generation unit 303 adds a vibration pattern representing sound, such as “plock,” representing light weight, as the vibration effect. Furthermore, when the object has a size larger than the reference size, the signal generation unit 303 adds the vibration pattern representing sound, such as “thump,” representing heavy weight, as the vibration effect. Meanwhile, when the object has a size smaller than the reference size, the signal generation unit 303 adds the vibration pattern representing sound, such as “plock,” representing light weight, as the vibration effect. Furthermore, when the object is rotating, the signal generation unit 303 adds a vibration pattern representing rotation such as repetitive short vibration, as the vibration effect. Meanwhile, when the object is traveling straight, the signal generation unit 303 adds a vibration pattern indicating traveling straight such as long continued vibration, as the vibration effect.
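The appearance-to-pattern rules exemplified above can be sketched as a simple selection function. The rules mirror the examples in the text, but the function name, the pattern names, and the precedence of motion over color and size are illustrative assumptions.

```python
def pattern_from_appearance(color=None, size_ratio=None, motion=None):
    """Pick a vibration pattern name from the appearance information:
    rotation maps to repetitive short vibration, straight travel to long
    continued vibration, dark/large objects to a heavy-sounding pattern,
    and light/small objects to a light-sounding pattern."""
    if motion == "rotating":
        return "short_repetitive"      # rotation
    if motion == "straight":
        return "long_continuous"       # traveling straight
    if color == "black" or (size_ratio is not None and size_ratio > 1.0):
        return "thump"                 # heavy weight
    if color == "white" or (size_ratio is not None and size_ratio < 1.0):
        return "plock"                 # light weight
    return None                        # no applicable appearance rule
```

Here `size_ratio` is the size of the object relative to the reference size, so values above 1.0 indicate an object larger than the reference.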
Furthermore, upon failure in the recognition of the object, the signal generation unit 303 acquires information about the number of objects on the top plate 101, from the image analysis unit 304, as the appearance information. In this configuration, the signal generation unit 303 acquires a vibration pattern corresponding to the number of the objects indicated by the acquired appearance information, and adds the vibration pattern as the vibration effect.
For example, when there is a plurality of objects, the signal generation unit 303 adds a vibration pattern representing sound, such as “clack, clack, clack, clack, clack, clack,” representing putting the plurality of objects, as the vibration effect. Meanwhile, when there is one object, the signal generation unit 303 adds a vibration pattern representing sound, such as “clack,” representing putting of one object, as the vibration effect.
In this way, upon failure in the recognition of the object by the image analysis unit 304, the signal generation unit 303 generates the vibration signal on the basis of the appearance of the object.
As described above, upon failure in the recognition of the object, the remote communication system according to the present modification acquires a vibration pattern from the appearance information obtained from the object and adds the vibration pattern as the vibration effect. Accordingly, even when the object cannot be recognized, vibration representing the outline of the object on the top plate can be transmitted to the person with whom the operator remotely communicates, and more information that may be an opportunity for a conversation can be given to the person. Therefore, communication can be activated.
When the sensor-mounted button 40 is pressed by the operator, the acceleration sensor 401 detects the pressing of the sensor-mounted button 40 by the operator. Then, a signal notifying of a result of the detection by the acceleration sensor 401 is output to the signal generation unit 303.
The signal generation unit 303 receives an input of the result of the detection by the acceleration sensor 401, from the sensor-mounted button 40. Then, the signal generation unit 303 acquires a vibration pattern according to the result of the detection by the acceleration sensor 401, from the vibration pattern information. In this configuration, a predetermined vibration pattern indicating the pressing of the sensor-mounted button 40 is registered in the vibration pattern information.
Then, the signal generation unit 303 adds the vibration pattern according to the result of the detection by the acceleration sensor 401, to the vibration signal generated from the sound collected by the microphone 110, as the vibration effect. Thereafter, the signal generation unit 303 outputs the vibration signal to which the vibration effect according to the result of the detection by the acceleration sensor 401 is applied, to the communication control unit 305.
Here, the signal generation unit 303 may mute the vibration signal generated from the sound collected by the microphone 110. In this case, the signal generation unit 303 employs the vibration pattern according to the result of the detection by the acceleration sensor 401, as the vibration signal.
As described above, the remote communication system as the information processing device further includes the sensor-mounted button 40 that is a sensor-mounted mechanism that detects a predetermined action. Then, when the predetermined action is performed, the signal generation unit 303 receives a notification indicating detection of the predetermined action from the sensor-mounted mechanism, and generates a predetermined vibration signal determined in advance.
The communication control unit 305 transmits the vibration signal to which the vibration effect corresponding to the result of the detection by the acceleration sensor 401 is applied, input from the signal generation unit 303, to the terminal device 31 of the person with whom the operator remotely communicates, via the network. Furthermore, the communication control unit 305 receives, via the network, a vibration signal to which the vibration effect corresponding to a result of the detection by the acceleration sensor 401 upon pressing of the sensor-mounted button 40 by the person with whom the operator remotely communicates is applied. Then, the communication control unit 305 outputs this vibration signal to the signal output unit 302.
The signal output unit 302 transmits the vibration signal to which the vibration effect is applied corresponding to the result of the detection by the acceleration sensor 401 upon pressing the sensor-mounted button 40 by the person with whom the operator remotely communicates, to the vibrator 120 and vibrates the vibrator 120. This configuration makes it possible for the operator to feel the vibration generated by pressing the sensor-mounted button 40 by the person with whom the operator remotely communicates.
As described above, in the remote communication system according to the present embodiment, it is possible to transmit the specific vibration to the person with whom the operator remotely communicates, by using the sensor or the like arranged on the button. This configuration makes it possible to convey a specific feeling to the person with whom the operator communicates, by vibration, enabling further activated communication.
When a plurality of objects is placed on the top plate 101, or when work such as writing a sentence in a notebook is performed on the top plate 101, the image analysis unit 304 recognizes a plurality of objects such as a pencil, a paper sheet, the hand performing writing, and the other hand touching the top plate 101. In this case, the image analysis unit 304 outputs the results of the recognition of the plurality of objects to the content selection unit 309.
When the contact detection unit 301 detects the contact of the object with the top plate 101 after the certain time period has elapsed from the previous contact, the content selection unit 309 receives the instruction for generation of the signal to which the vibration effect is to be added, together with an input of the sound collected by the microphone 110. Next, the content selection unit 309 receives an input of a result of the recognition of the plurality of objects obtained from the image of the camera 20, from the image analysis unit 304.
Here, the content selection unit 309 preferentially selects an object having vibration easy to understand for the person with whom the operator communicates, from among the plurality of recognized objects, as a target being a source from which the vibration effect is generated. In the present embodiment, the content selection unit 309 has a priority table in which priorities for the respective objects are assigned in advance. This priority table is created on the basis of, for example, evaluations of vibrations in the past made by the operator. Then, the content selection unit 309 selects an object having the highest priority in the priority table, from among the plurality of recognized objects, as the target being a source from which the vibration effect is generated. Here, the content selection unit 309 may select a predetermined number of objects in descending order of priority. Then, the content selection unit 309 outputs the sound collected by the microphone 110 to the signal generation unit 303 together with the result of the recognition of the selected object.
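The priority-based selection described above might be sketched as follows. The object labels and priority values are assumptions for illustration; in the text, the priorities derive from past evaluations of vibrations by the operator.

```python
# Hypothetical sketch: priority-table-based selection of the object used as
# the source of the vibration effect. Labels and priorities are assumptions.

PRIORITY_TABLE = {
    "hand_writing": 4,   # writing sound is easy to understand for the partner
    "pencil": 3,
    "paper_sheet": 2,
    "hand_resting": 1,
}

def select_objects(recognized, count=1):
    """Return up to `count` recognized objects in descending priority order."""
    ranked = sorted(
        (obj for obj in recognized if obj in PRIORITY_TABLE),
        key=lambda obj: PRIORITY_TABLE[obj],
        reverse=True,
    )
    return ranked[:count]
```

Passing `count` greater than one corresponds to selecting a predetermined number of objects in descending order of priority, as mentioned above.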
For example, if the operator writes a sentence in a notebook on the top plate 101, the vibration pattern representing writing sound is easy to understand for the person with whom the operator communicates. Therefore, the content selection unit 309 selects, from among the pencil, the paper sheet, the hand performing writing, and the other hand touching the top plate 101 which are recognized by the image analysis unit 304, a result of the recognition of the hand performing writing.
In other words, when a plurality of individual objects is detected as the objects placed on the mat, the content selection unit 309 uses priorities assigned in advance to the individual objects to select one or several of the individual objects.
In addition, the content selection unit 309 may use a result of sound spectrum analysis to select an object having vibration easy to understand for the person with whom the operator communicates. For example, the content selection unit 309 may perform spectrum analysis of the sound collected by the microphone 110, exclude steady sound, identify characteristic sound in the time series, and select the object corresponding to that sound as the target being a source from which the vibration effect is generated. Furthermore, the content selection unit 309 may use information from other sensors, such as a vibration sensor, an inertial measurement unit (IMU) sensor, and an acceleration sensor, in addition to the microphone 110, to select an object having vibration easy to understand for the person with whom the operator communicates.
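As a simplified stand-in for the spectrum-analysis step above, steady sound can be excluded by flagging frames whose short-time energy clearly exceeds the recent average (the steady floor). The window size and ratio threshold here are illustrative assumptions, not values from the text.

```python
# Hypothetical stand-in for the spectrum-analysis step: exclude steady sound
# and keep characteristic sound in the time series by comparing each frame's
# energy against the recent baseline. Parameters are assumptions.

def find_characteristic_frames(frame_energies, baseline_window=5, ratio=2.0):
    """Return indices of frames that stand out against the recent baseline."""
    flagged = []
    for i, energy in enumerate(frame_energies):
        start = max(0, i - baseline_window)
        history = frame_energies[start:i]
        baseline = sum(history) / len(history) if history else energy
        if baseline > 0 and energy / baseline >= ratio:
            flagged.append(i)
    return flagged
```

A full implementation would operate on frequency-domain frames rather than raw energies, but the exclusion of the steady component follows the same comparison.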
The signal generation unit 303 receives the input of the sound collected by the microphone 110 together with the results of the recognition of the objects selected by the content selection unit 309. Then, the signal generation unit 303 generates the vibration signal from the acquired sound, adjusts the vibration signal, and adds a vibration pattern corresponding to the result of the recognition of the object, thereby generating a vibration signal to which the vibration effect has been applied. Thereafter, the signal generation unit 303 outputs the generated vibration signal to which the vibration effect has been applied, to the communication control unit 305.
The content selection unit 309 receives the instruction for generation of the signal to which the vibration effect is to be added, together with the input of the sound collected by the microphone 110 (Step S11).
Next, the content selection unit 309 acquires the results of the recognition of the images of the plurality of objects from the image analysis unit 304 (Step S12).
Next, the content selection unit 309 refers to the priority table (Step S13).
The content selection unit 309 selects an object having the highest priority indicated in the priority table, from among the plurality of objects whose results of the recognition have been acquired, as an object being a source from which the vibration effect is generated (Step S14).
Thereafter, the content selection unit 309 outputs the result of the recognition of the selected object together with the sound, to the signal generation unit 303 (Step S15).
As described above, when the plurality of objects is recognized, the remote communication system according to the present embodiment preferentially selects an object determined to have a vibration pattern easy to understand for the person with whom the operator communicates, from among the objects, as the target being the source from which the vibration effect is generated. This configuration makes it easy for the person with whom the operator remotely communicates to further understand a state on the side of the operator, enabling activated communication.
For example, when the operator eats an apple placed on the top plate 101, sound obtained by a microphone at the mouth of the operator more accurately represents the eating of the apple than the sound obtained by the microphone 110 provided at the top plate 101. In this way, it is preferable to select a device by which a vibration pattern desired to be transmitted can be easily obtained, and to generate the vibration pattern by using information from that device. In the present embodiment, a headset 41 worn by the operator and another external microphone 42 are arranged. In other words, the remote communication system 1 as the information processing device includes a plurality of information acquisition devices such as the camera 20, the headset 41, and the external microphone 42.
When the contact detection unit 301 detects contact of an object with the top plate 101 after the certain time period has elapsed from the previous contact, the device selection unit 310 receives the instruction for generation of the signal to which the vibration effect is to be added, together with an input of the sound collected by the microphone 110. Next, the device selection unit 310 receives an input of a result of the recognition of the image from the image analysis unit 304. Furthermore, the device selection unit 310 receives inputs of sounds from the headset 41 and the external microphone 42.
Next, the device selection unit 310 determines whether content to be transmitted is included in the sounds obtained from the microphone 110, the headset 41, and the external microphone 42. For example, when a sound is heard several seconds after the apple placed on the top plate 101 is picked up, the device selection unit 310 determines that there is content to be transmitted. When the content to be transmitted is not included, the device selection unit 310 outputs, to the signal generation unit 303, the instruction for generation of the signal to which the vibration effect is to be added, together with the sound from the microphone 110 and the result of the recognition of the image.
On the other hand, when the content to be transmitted is included, the device selection unit 310 selects a device having a higher sound pressure. For example, in the case of eating sound, the device selection unit 310 selects the headset 41 because the headset 41, which is closest to the mouth of the operator, has the highest sound pressure. Here, in the present embodiment, the device has been selected on the basis of the sound pressure of each device, but the device selection unit 310 may instead select a device configured to acquire the characteristic frequency band of the selected content. Furthermore, the device selection unit 310 may select a plurality of devices. In this way, the device selection unit 310 selects one or a plurality of specific information acquisition devices from among the plurality of information acquisition devices.
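The sound-pressure-based device selection above can be sketched minimally as follows; the device names and pressure values are illustrative assumptions.

```python
# Hypothetical sketch of the device selection step: given a sound-pressure
# reading per capture device, pick the loudest one (e.g. the headset near
# the mouth for eating sound). Names and values are illustrative.

def select_device(sound_pressures):
    """Return the name of the device with the highest sound pressure."""
    return max(sound_pressures, key=sound_pressures.get)
```

A variant could instead rank devices by energy in a characteristic frequency band of the selected content, as the text also suggests.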
Then, the device selection unit 310 outputs the sound from the microphone 110, the sound collected by the headset 41, and the result of the recognition of the image, to the signal generation unit 303. Here, in the present embodiment, the device selection unit 310 has selected the device for obtaining the content being a source from which the vibration pattern is generated from among the sound collection devices, but a plurality of image acquisition devices may be provided in addition to the camera 20, so that a device configured to obtain the maximum information about the content to be transmitted is selected from among the plurality of image acquisition devices.
The signal generation unit 303 acquires, from the device selection unit 310, inputs of the sound from the microphone 110, the sound collected by the headset 41, and the result of the recognition of the image. Then, the signal generation unit 303 generates the vibration signal on the basis of the sound from the microphone 110 and adjusts the vibration signal. Furthermore, the signal generation unit 303 acquires a vibration pattern corresponding to the sound collected by the headset 41 and a vibration pattern corresponding to the result of the recognition of the image. The signal generation unit 303 then adds these vibration patterns to the vibration signal as the vibration effect. In other words, the signal generation unit 303 generates the vibration signal on the basis of acquisition information obtained from the specific information acquisition devices selected by the device selection unit 310.
The device selection unit 310 receives the instruction for generation of the signal to which the vibration effect is to be added, together with the input of the sound collected by the microphone 110 (Step S21).
Next, the device selection unit 310 acquires the result of the recognition of the image, and the sounds from the headset 41 and the external microphone 42 (Step S22).
Next, the device selection unit 310 determines whether the content to be transmitted is included (Step S23). When the content to be transmitted is not included (Step S23: negative), the device selection unit 310 outputs, to the signal generation unit 303, the instruction for generation of the signal to which the vibration effect is to be added, together with the sound collected by the microphone 110 and the result of the recognition of the image. Thereafter, the vibration signal generation process proceeds to Step S25.
On the other hand, when the content to be transmitted is included (Step S23: affirmative), the device selection unit 310 selects a device having the highest sound pressure from among the microphone 110, the headset 41, and the external microphone 42. Then, the device selection unit 310 acquires information collected by the selected device (Step S24). Then, the device selection unit 310 outputs, to the signal generation unit 303, the instruction for generation of the signal to which the vibration effect is to be added, together with the sound from the microphone 110, the result of the recognition of the image, and the information collected by the selected device.
The signal generation unit 303 generates the vibration signal on the basis of the sound from the microphone 110 and adjusts the vibration signal (Step S25).
Next, the signal generation unit 303 acquires a vibration pattern corresponding to the result of the recognition of the image. Furthermore, when receiving the sound from the device selected by the device selection unit 310, the signal generation unit 303 acquires a vibration pattern corresponding to the sound. Then, the signal generation unit 303 adds the acquired vibration pattern to the vibration signal, as the vibration effect (Step S26).
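Steps S25 and S26 above might be sketched as follows, representing the vibration signal as a list of amplitude samples (an assumption) and superimposing the acquired patterns as the vibration effect.

```python
# Hypothetical sketch of Steps S25-S26: superimpose the vibration patterns
# acquired for the image recognition result and for the selected device's
# sound onto the base vibration signal. Representation is an assumption.

def apply_vibration_effects(base_signal, patterns):
    """Superimpose each vibration pattern onto the base signal."""
    out = list(base_signal)
    for pattern in patterns:
        for i, v in enumerate(pattern):
            if i < len(out):
                out[i] += v       # add the effect sample-wise
            else:
                out.append(v)     # pattern extends past the base signal
    return out
```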
As described above, the remote communication system according to the present embodiment selects a device configured to obtain the maximum information about the content to be transmitted, and applies a vibration pattern corresponding to the information obtained from that device, as the vibration effect. This configuration makes it possible to transmit vibration more appropriately representing the action of the operator or the object, to the person with whom the operator communicates, enabling activated communication.
The remote communication system 1 according to the present embodiment is illustrated in the block diagram of
It is considered that the operator may place, on the top plate 101, an object that may cause trouble due to vibration or an object that hinders vibration. For example, an unstable object, a living thing, an object absorbing vibration, or the like can be considered. In addition, it is considered that the operator may be troubled if the top plate 101 is vibrated. For example, the operator may be sleeping at the top plate 101, may be performing work that would be hindered if vibration is generated on the top plate 101, or may be working at a desk shared with another person. Furthermore, continuous transmission of the vibration signal generated from the sound collected by the microphone 110 may cause excessive vibration of the vibrator 120. In these cases, it is preferable to suppress vibration, for example, by not vibrating the top plate 101 or by reducing the vibration.
Therefore, the image analysis unit 304 according to the present embodiment learns in advance, by machine learning or the like, the object that may cause trouble due to vibration, the object that hinders vibration, and the state in which vibration may cause trouble. Then, when it is determined that the object that may cause trouble due to vibration or the object that hinders vibration is positioned on the top plate 101, as a result of recognition of an image captured by the camera 20, the image analysis unit 304 instructs the signal output unit 302 to suppress the vibration. Furthermore, the image analysis unit 304 similarly instructs the signal output unit 302 to suppress the vibration in a state in which the vibration may cause trouble.
Here, in the present embodiment, the image analysis unit 304 has suppressed the vibration by using the result of the recognition of the image of the camera 20, but the trigger to suppress the vibration is not limited thereto. For example, the object that may cause trouble due to vibration, the object that hinders vibration, or the state in which vibration may cause trouble may be determined on the basis of information from the microphone 110 arranged on the top plate 101, a weight sensor additionally arranged, or the like. Furthermore, the image analysis unit 304 may use information from another camera or another microphone arranged in the room, a vibration sensor, an IMU sensor, an acceleration sensor, an application activated on the terminal device 30, and the like to determine a state in which trouble may be caused due to vibration.
When the vibrator 120 is vibrated by the signal from the terminal device 31 of the person with whom the operator remotely communicates, the signal output unit 302 dampens or stops the vibration to suppress the vibration.
Here, in the present embodiment, the vibration is suppressed in the terminal device 30 on the reception side, but the vibration may be suppressed in the terminal device 30 on the transmission side. For example, a request for vibration suppression may be notified from the reception side of the vibration signal whose vibration is desired to be suppressed, to the transmission side of the vibration signal so that the signal generation unit 303 suppresses the signal upon generating the vibration signal in the terminal device 30 on the transmission side.
Furthermore, when the sound collected by the microphone 110 continues for a certain time period or longer, the signal generation unit 303 stops generation of the vibration signal. In addition, when recognition of an image acquired from another camera capturing an image of the room is performed and it is determined, on the basis of a result of the recognition of the image, that the sound will continue for a predetermined period or more, the signal generation unit 303 may stop generation of the vibration signal. For example, when a vacuum cleaner is recognized as a result of the recognition of the image, the signal generation unit 303 determines that the sound of using the vacuum cleaner will continue for the predetermined period or more, and stops generation of the vibration signal based on the sound of the vacuum cleaner.
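The continuation check above might be sketched as follows; the silence threshold and the limit on consecutive sounding frames are illustrative assumptions.

```python
# Hypothetical sketch: stop generating the vibration signal once sound has
# continued beyond a limit of consecutive frames. Parameters are assumptions.

def should_generate(frame_energies, silence_threshold=0.1, max_run=3):
    """Return False once sound persists for more than `max_run` frames."""
    run = 0
    for energy in frame_energies:
        run = run + 1 if energy > silence_threshold else 0
        if run > max_run:
            return False      # e.g. a vacuum cleaner running continuously
    return True
```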
Furthermore, after removing a steady sound or the like from the sound collected by the microphone 110 by using sound source separation technology to extract a sound to be transmitted, the signal generation unit 303 may use the extracted sound to generate the vibration signal.
As described above, the remote communication system according to the present embodiment adjusts the vibration to be received and the vibration to be transmitted according to the object on the top plate, the surrounding environment, or the like. This configuration makes it possible to suppress transmission of excessive vibration or transmission and reception of the vibrations at inappropriate timing, enabling reduction of discomfort in remote communication and smooth communication.
The action determination unit 311 acquires operation information about an operation performed by the operator by using the input device 306, via the input/output control unit 307. Examples of the operations performed by the operator by using the input device 306 include closing a working file, completing e-mail transmission, finishing a remote meeting, cancelling a meeting schedule, surfing the Internet, performing time-consuming work, and not typing.
In addition, the action determination unit 311 is connected to an external sensor 43 and acquires information about the action of the operator acquired by the external sensor 43. The external sensor 43 is, for example, another camera that captures an image of the operator. Examples of the actions of the operator include looking at the mat speaker 10, sitting on a chair for one hour or more, sitting on the chair, having a snack or drinking, and having a memo pad but writing nothing. For example, the action determination unit 311 performs image recognition by using the image of the camera, recognizes the action of the operator, and identifies the recognized action of the operator.
Here, the action determination unit 311 holds, in advance, a weight indicating the desire to talk for each operation performed with the input device 306 by the operator and for each action of the operator. It is assumed that a higher weight indicates a higher desire to talk. For example, when the operator is surfing the Internet, the weight is increased, and when the operator is performing time-consuming work, the weight is reduced. Then, the action determination unit 311 adds up the weights of the operations using the input device 306 and the actions of the operator up to that time, and calculates the current weight of the desire to talk of the operator.
Here, the action determination unit 311 holds in advance a reference value for the desire to talk to determine that the desire to talk is higher when the weight is equal to or larger than the reference value and is lower when the weight is less than the reference value. Next, the action determination unit 311 compares the calculated current weight of the desire to talk and the reference value.
When the weight of the current desire to talk is equal to or larger than the reference value, the action determination unit 311 determines that the desire to talk is high. Then, the action determination unit 311 instructs the communication control unit 305 and the input/output control unit 307 to perform processing of presenting the desire to talk, to the person with whom the operator remotely communicates. On the other hand, when the weight of the current desire to talk is less than the reference value, the action determination unit 311 determines that the desire to talk is low, and does not notify of the desire to talk.
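The weight-summing determination described above might be sketched as follows. All weight values, event labels, and the reference value are illustrative assumptions; the text only specifies that each operation or action carries a weight and that the sum is compared with a reference value.

```python
# Hypothetical sketch of the desire-to-talk determination: each observed
# operation or action contributes a signed weight; the sum is compared with
# a reference value. Weights and the reference are assumptions.

EVENT_WEIGHTS = {
    "surfing_internet": 2,        # increases the desire-to-talk weight
    "closed_working_file": 1,
    "looking_at_mat_speaker": 2,
    "time_consuming_work": -2,    # reduces the weight
    "typing": -1,
}

def desires_to_talk(events, reference=3):
    """True when the summed weight is equal to or larger than the reference."""
    total = sum(EVENT_WEIGHTS.get(event, 0) for event in events)
    return total >= reference
```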
The action determination unit 311 may clearly present the desire to talk, to the person with whom the operator remotely communicates. For example, the action determination unit 311 instructs the communication control unit 305 to display an icon indicating that it is OK to talk, on a screen for the camera 20 of the terminal device 30, displayed on the terminal device 30, or to output a pop-up sound from the terminal device 31 of the person with whom the operator remotely communicates. In addition, the action determination unit 311 instructs the communication control unit 305 to perform display in which the brightness of the screen for the camera 20 of the terminal device 30, displayed on the terminal device 30, changes by blinking.
Furthermore, the action determination unit 311 may gently communicate the desire to talk, to the person with whom the operator remotely communicates. For example, the action determination unit 311 causes the communication control unit 305 and the input/output control unit 307 to display a common topic between the operator who desires to have a talk and the person with whom the operator remotely communicates. Furthermore, the action determination unit 311 instructs the communication control unit 305 to gradually increase a sound volume to be transmitted to the terminal device 31 of the person with whom the operator remotely communicates. Furthermore, the action determination unit 311 notifies the signal generation unit 303 of an instruction for gradually increasing the vibration to be transmitted to the terminal device 31 of the person with whom the operator remotely communicates. Furthermore, the action determination unit 311 instructs the communication control unit 305 to gradually increase the angle of view of the image of the camera 20 of the terminal device 30, for the terminal device 31 of the person with whom the operator remotely communicates. Furthermore, the action determination unit 311 instructs the signal generation unit 303 to generate a signal that vibrates an object on the mat speaker 10 so as to move it, for the terminal device 31 of the person with whom the operator remotely communicates. Furthermore, the action determination unit 311 may cause the communication control unit 305 to transmit an image of a swaying tree in the image of the camera 20 of the terminal device 30, to the terminal device 31 of the person with whom the operator remotely communicates, and may instruct the signal generation unit 303 to generate the vibration signal that vibrates the vibrator 120 in accordance with the swaying of the tree.
Furthermore, the action determination unit 311 may cause the communication control unit 305 to transmit a command to move a predetermined mascot in the image of the camera 20 of the terminal device 30, to the terminal device 31 of the person with whom the operator remotely communicates, and may instruct the signal generation unit 303 to generate the vibration signal for vibration according to the movement of the mascot.
In this way, the action determination unit 311 determines the operation by the operator or the action of the operator, and causes the signal generation unit 303 to transmit a predetermined signal to the other terminal device 31 according to a result of the determination and to generate the vibration signal based on the result of the determination.
In response to the instruction from the action determination unit 311, the signal generation unit 303 generates the vibration signal corresponding to the result of the determination, according to the result of the determination of the operation or action of the operator by the action determination unit 311.
The input/output control unit 307 receives the instruction for presenting the desire to talk from the operator, from the input device 306. Then, the input/output control unit 307 presents the desire to talk, to the person with whom the operator remotely communicates, by using another visual effect or light.
For example, a light source 43 is arranged in the vicinity of the mat speaker 10. Examples of the light source 43 include an LED arranged around the mat speaker 10, lighting for illuminating the entire mat speaker 10, a spotlight for illuminating the mat speaker 10, a multi-light source for illuminating the mat speaker 10 in a divided manner, and the like.
The operator uses the input device 306 to input an instruction for blinking the light source 43 around the mat speaker 10. The input/output control unit 307 receives the instruction from the operator, and blinks the light source 43 around the mat speaker 10 of the terminal device 30 as illustrated in a state 50 of
In addition, the operator uses the input device 306 to input an instruction for setting the brightness or hue of lighting of the light source 43 that illuminates the mat speaker 10 according to the mood of the operator. The input/output control unit 307 receives the instruction from the operator, and changes the brightness or hue of lighting of the light source 43 that illuminates the mat speaker 10. As illustrated in
In addition, the operator uses the input device 306 to input an instruction for putting a spotlight of the light source 43 on a specific position of the mat speaker 10. The input/output control unit 307 receives the instruction from the operator, and controls the light source 43 to illuminate the specific position of the mat speaker 10 with the spotlight. As illustrated in
Furthermore, the light source 43 of the person with whom the operator remotely communicates may be changed by the input from the operator.
The operator uses the input device 306 to select an LED icon 55 in an image of a camera 20 of the person with whom the operator remotely communicates, on a screen 54 displayed on the display device 308 of the terminal device 30 illustrated in
Furthermore, the operator uses the input device 306 to operate a lighting slide bar 57 in the image of the camera 20 of the person with whom the operator remotely communicates, on the screen 54 displayed on the display device 308 of the terminal device 30 illustrated in
In addition, as illustrated in
As described above, the remote communication system according to the present embodiment automatically determines the desire to talk on the basis of the operation of the application by using the input device by the operator or on the basis of the action of the operator, and presents the desire to talk, to the person with whom the operator remotely communicates, if the operator has the desire to talk. In addition, the remote communication system according to the present embodiment controls the light source in the vicinity of the mat speaker of the terminal device 30 or the light source in the vicinity of the mat speaker of the person with whom the operator communicates, according to an instruction using the input device by the operator. This configuration makes it possible to notify the person with whom the operator remotely communicates of the desire to talk or the timing at which the operator desires to talk, making it easy for the partner to talk. Therefore, communication can be activated.
The microphone 110 is disposed at the center of the back side of the top plate 101. The microphone 110 is a vibration sensor.
The vibrators 121 to 124 are each a vibration actuator and are disposed at four corners of the back side of the top plate 101. In the present embodiment, the vibrators 121 to 124 each serve as a pedestal of the mat speaker 10 as well. Regions 131 to 134 on the front side of the top plate 101 are located at positions facing the vibrators 121 to 124 and are each a region corresponding to each of the vibrators 121 to 124.
Different persons with whom the operator remotely communicates are assigned to the regions 131 to 134. The operator uses the input device 306 of the terminal device 30 to activate a region assignment application.
The operator uses the input device 306 to select each of the icons 141 to 144, for selecting the persons with whom the operator remotely communicates to be assigned to the corresponding regions 131 to 134. For example, the operator selects the icon 141 to assign a terminal device 31A operated by a person A to the region 131. Furthermore, the operator selects the icon 142 to assign a terminal device 31B operated by a person B to the region 132. Furthermore, the operator selects the icon 143 to assign a terminal device 31C operated by a person C to the region 133. Furthermore, the operator selects the icon 144 to assign a terminal device 31D operated by a person D to the region 134. Hereinafter, an example in which the terminal devices 31A to 31D are assigned to the regions 131 to 134 will be described.
The input/output control unit 307 receives an input of assignment information about each of the regions 131 to 134 from the input device 306. Then, the input/output control unit 307 outputs the assignment information about each of the regions 131 to 134 to the communication control unit 305.
Furthermore, during remote communication, the input/output control unit 307 causes the display device 308 to display a user interface 150 of
The image analysis unit 304 performs recognition of an image captured by the camera 20. Then, the image analysis unit 304 determines whether the operator touches any one of the regions 131 to 134 of the top plate 101. When the operator touches any one of the regions 131 to 134 of the top plate 101, the image analysis unit 304 outputs information about the region selected through touching by the operator to the signal generation unit 303 and the communication control unit 305.
The signal generation unit 303 acquires the assignment information about each of the regions 131 to 134 from the input/output control unit 307 and stores the information. Then, the signal generation unit 303 receives an instruction for generating the vibration signal from the contact detection unit 301. At this time, when any one of the regions 131 to 134 is selected, the signal generation unit 303 receives an input of information about the selected region from the image analysis unit 304. Then, the signal generation unit 303 selects the one of the terminal devices 31A to 31D corresponding to the region selected from the regions 131 to 134, and sets the selected device as a transmission destination of the vibration signal.
For example, when a notification indicating that the region 131 is selected is received from the image analysis unit 304, the signal generation unit 303 sets the terminal device 31A to which the region 131 is assigned, as the transmission destination of the vibration signal. Furthermore, when a notification indicating that the region 132 is selected is received from the image analysis unit 304, the signal generation unit 303 sets the terminal device 31B to which the region 132 is assigned, as the transmission destination of the vibration signal. Furthermore, when a notification indicating that the region 133 is selected is received from the image analysis unit 304, the signal generation unit 303 sets the terminal device 31C to which the region 133 is assigned, as the transmission destination of the vibration signal. Furthermore, when a notification indicating that the region 134 is selected is received from the image analysis unit 304, the signal generation unit 303 sets the terminal device 31D to which the region 134 is assigned, as the transmission destination of the vibration signal. In this way, the signal generation unit 303 determines the transmission destination of the vibration signal on the basis of a result of the detection, and causes the communication control unit 305 to transmit the vibration signal.
When any one of the regions 131 to 134 is selected upon transmission of the vibration signal, the communication control unit 305 receives an input of the vibration signal to which the transmission destination is set. Then, the communication control unit 305 transmits the vibration signal to the transmission destination set to the vibration signal.
In addition, the communication control unit 305 acquires the assignment information about each of the regions 131 to 134 from the input/output control unit 307 and stores the information. Then, when receiving an input of the vibration signal from any of the terminal devices 31A to 31D, the communication control unit 305 identifies a transmission source of the vibration signal. When the transmission source is the terminal device 31A, the communication control unit 305 outputs the vibration signal to the signal output unit 302 and instructs the signal output unit 302 to vibrate the vibrator 121 corresponding to the region 131. Furthermore, when the transmission source is the terminal device 31B, the communication control unit 305 outputs the vibration signal to the signal output unit 302 and instructs the signal output unit 302 to vibrate the vibrator 122 corresponding to the region 132. When the transmission source is the terminal device 31C, the communication control unit 305 outputs the vibration signal to the signal output unit 302 and instructs the signal output unit 302 to vibrate the vibrator 123 corresponding to the region 133. When the transmission source is the terminal device 31D, the communication control unit 305 outputs the vibration signal to the signal output unit 302 and instructs the signal output unit 302 to vibrate the vibrator 124 corresponding to the region 134.
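The receive-side routing described above reduces to a lookup from the transmission source to the vibrator facing the corresponding region. A sketch under the same illustrative identifiers as above (not taken from the disclosure):

```python
# Transmission source -> vibrator to drive, mirroring the region
# assignment (identifiers are illustrative).
SOURCE_TO_VIBRATOR = {
    "terminal_31A": "vibrator_121",  # region 131
    "terminal_31B": "vibrator_122",  # region 132
    "terminal_31C": "vibrator_123",  # region 133
    "terminal_31D": "vibrator_124",  # region 134
}

def route_received_signal(source: str) -> str:
    """Return the vibrator that should reproduce a signal from `source`."""
    return SOURCE_TO_VIBRATOR[source]
```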
The signal output unit 302 receives an input of the vibration signal and an input of an instruction indicating which of the vibrators 121 to 124 to vibrate. Then, the signal output unit 302 outputs the vibration signal to the vibrator 120 for which the instruction is given, of the vibrators 121 to 124, and vibrates that vibrator 120. The operator can readily understand from which one of the persons A to D, who are the persons with whom the operator remotely communicates, the vibration has arrived, according to the vibrated place from among the regions 131 to 134 on the top plate 101. In this way, on the basis of the vibration signal transmitted from another terminal device 31, the signal output unit 302 vibrates the vibrator 120 corresponding to the other terminal device 31 from among the plurality of vibrators 120.
Here, in the present embodiment, four vibration actuators have been arranged on the top plate 101 and the four regions 131 to 134 have been assigned, but more regions than the regions 131 to 134 can be assigned by using the technology of tactile super-resolution.
Furthermore, in the present embodiment, upon selecting the transmission destination of the vibration signal, the image of the camera 20 has been analyzed to determine the transmission destination. However, the present embodiment is not limited thereto, and for example, a touch pad may be used as the top plate 101 to acquire information about a touched region so that the signal generation unit 303 may determine the transmission destination. In addition, three or more acceleration sensors may be arranged on the top plate 101 so that the signal generation unit 303 may determine the transmission destination on the basis of the balance of vibration intensities acquired from the acceleration sensors. Furthermore, a tilt sensor that detects the inclination of the top plate 101 may be arranged so that the signal generation unit 303 may determine the transmission destination on the basis of an inclination direction.
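The acceleration-sensor alternative mentioned above can be sketched as estimating the touched position from the balance of vibration intensities, for example as an intensity-weighted centroid of the sensor positions, and then picking the nearest region. This is one possible realization under assumed sensor positions and region centers; none of these coordinates appear in the disclosure:

```python
# Estimate the touched position on the top plate as the intensity-weighted
# centroid of the readings of three or more acceleration sensors, then map
# it to the closest region (coordinates and names are illustrative).
def estimate_touch_position(sensors):
    """sensors: list of ((x, y), intensity) tuples."""
    total = sum(i for _, i in sensors)
    x = sum(p[0] * i for p, i in sensors) / total
    y = sum(p[1] * i for p, i in sensors) / total
    return (x, y)

def nearest_region(pos, region_centers):
    """Pick the region whose center is closest to the estimated position."""
    return min(region_centers,
               key=lambda r: (region_centers[r][0] - pos[0]) ** 2
                           + (region_centers[r][1] - pos[1]) ** 2)
```

A touch near one corner yields a much stronger reading at the sensor in that corner, so the centroid falls inside the corresponding region.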
The communication control unit 305 receives the vibration signal transmitted via the network (Step S101).
Next, the communication control unit 305 determines whether the received vibration signal is a signal from the terminal device 31A (Step S102). When the received vibration signal is the signal from the terminal device 31A (Step S102: affirmative), the communication control unit 305 instructs the signal output unit 302 to vibrate the vibrator 121. The signal output unit 302 outputs the vibration signal from the vibrator 121 (Step S103).
On the other hand, when the received vibration signal is not the signal from the terminal device 31A (Step S102: negative), the communication control unit 305 determines whether the received vibration signal is a signal from the terminal device 31B (Step S104). When the received vibration signal is the signal from the terminal device 31B (Step S104: affirmative), the communication control unit 305 instructs the signal output unit 302 to vibrate the vibrator 122. The signal output unit 302 outputs the vibration signal from the vibrator 122 (Step S105).
On the other hand, when the received vibration signal is not the signal from the terminal device 31B (Step S104: negative), the communication control unit 305 determines whether the received vibration signal is a signal from the terminal device 31C (Step S106). When the received vibration signal is the signal from the terminal device 31C (Step S106: affirmative), the communication control unit 305 instructs the signal output unit 302 to vibrate the vibrator 123. The signal output unit 302 outputs the vibration signal from the vibrator 123 (Step S107).
On the other hand, when the received vibration signal is not the signal from the terminal device 31C (Step S106: negative), the communication control unit 305 instructs the signal output unit 302 to vibrate the vibrator 124. The signal output unit 302 outputs the vibration signal from the vibrator 124 (Step S108).
The signal generation unit 303 receives the instruction for generating the vibration signal from the contact detection unit 301 (Step S201).
Next, the signal generation unit 303 generates the vibration signal and determines, using a result of the recognition by the image analysis unit 304, whether the generated vibration signal is a signal to the terminal device 31A (Step S202). When the generated vibration signal is a signal to the terminal device 31A (Step S202: affirmative), the signal generation unit 303 sets the transmission destination of the vibration signal to the terminal device 31A. The communication control unit 305 acquires the vibration signal from the signal generation unit 303 and transmits the vibration signal to the terminal device 31A as the set transmission destination (Step S203).
On the other hand, when the generated vibration signal is not the signal to the terminal device 31A (Step S202: negative), the signal generation unit 303 determines whether the generated vibration signal is a signal to the terminal device 31B (Step S204). When the generated vibration signal is the signal to the terminal device 31B (Step S204: affirmative), the signal generation unit 303 sets the transmission destination of the vibration signal to the terminal device 31B. The communication control unit 305 acquires the vibration signal from the signal generation unit 303 and transmits the vibration signal to the terminal device 31B as the set transmission destination (Step S205).
On the other hand, when the generated vibration signal is not the signal to the terminal device 31B (Step S204: negative), the signal generation unit 303 determines whether the generated vibration signal is a signal to the terminal device 31C (Step S206). When the generated vibration signal is a signal to the terminal device 31C (Step S206: affirmative), the signal generation unit 303 sets the transmission destination of the vibration signal to the terminal device 31C. The communication control unit 305 acquires the vibration signal from the signal generation unit 303 and transmits the vibration signal to the terminal device 31C as the set transmission destination (Step S207).
On the other hand, when the generated vibration signal is not the signal to the terminal device 31C (Step S206: negative), the signal generation unit 303 sets the transmission destination of the vibration signal to the terminal device 31D. The communication control unit 305 acquires the vibration signal from the signal generation unit 303 and transmits the vibration signal to the terminal device 31D as the set transmission destination (Step S208).
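The branch chain of Steps S202 to S208 amounts to a lookup from the selected region to the destination terminal, with the last terminal acting as the fallback of Step S208. A sketch under the same illustrative identifiers as above:

```python
# Selected region -> destination terminal (identifiers are illustrative).
# The fallback mirrors Step S208: signals not matching regions 131 to 133
# are sent to terminal 31D.
REGION_TO_DESTINATION = {
    "region_131": "terminal_31A",  # Step S203
    "region_132": "terminal_31B",  # Step S205
    "region_133": "terminal_31C",  # Step S207
}

def set_destination(selected_region: str) -> str:
    """Return the destination terminal for a vibration signal generated
    after the operator touched `selected_region`."""
    return REGION_TO_DESTINATION.get(selected_region, "terminal_31D")
```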
As described above, the terminal device according to the present embodiment has different positions to be vibrated on the top plate, according to the transmission source of the vibration signal. In addition, the terminal device according to the present embodiment transmits the vibration signal to different transmission destinations according to a specified position on the top plate. This configuration makes it possible to achieve communication with a plurality of persons by using vibration, while recognizing the transmission source of each piece of information. Furthermore, it is possible to selectively communicate with a specific person from among the plurality of persons with whom the operator communicates. Therefore, communication with the individual persons is facilitated, enabling activated communication using remote communication.
Note that in the above embodiments, the vibrator is arranged on the mat speaker, but the arrangement position of the vibrator is not limited thereto, and for example, the vibrator may be arranged in the terminal device such as a smartphone or a smart watch to vibrate the terminal device itself.
A series of the process steps described above can be executed by hardware or software. In a case where the series of process steps is executed by software, programs constituting the software are installed on a computer. Here, examples of the computer include a computer that is incorporated in dedicated hardware, a general-purpose computer that is configured to execute various functions by installing various programs, and the like.
An input/output interface 910 is also connected to the bus 904. An input unit 911, an output unit 912, a storage unit 913, a communication unit 914, and a drive 915 are connected to the input/output interface 910.
The input unit 911 includes, for example, a keyboard, a mouse, a microphone, a touch screen, an input terminal, and the like. The output unit 912 includes, for example, a display, a speaker, an output terminal, and the like. The storage unit 913 includes, for example, a hard disk, a RAM disk, a non-volatile memory, and the like. The communication unit 914 includes, for example, a network interface. The drive 915 drives a removable medium 921 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
In the computer configured as described above, the CPU 901 loads, for example, programs stored in the storage unit 913 into the RAM 903 via the input/output interface 910 and the bus 904, for execution, and the series of process steps described above are executed. The RAM 903 also appropriately stores data necessary to execute various processing by the CPU 901.
For example, a program executed by the CPU 901 can be applied by being recorded in the removable medium 921 as a package medium or the like. In this configuration, when the removable medium 921 is inserted into the drive 915, the program can be installed in the storage unit 913 via the input/output interface 910.
Furthermore, the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, and digital satellite broadcast. In this configuration, the program is allowed to be received by the communication unit 914 and installed in the storage unit 913.
In addition, the program is allowed to be installed in the ROM 902 or the storage unit 913 in advance.
The embodiments of the present disclosure have been described above, but the technical scope of the present disclosure is not strictly limited to the embodiments described above, and various modifications and alterations can be made without departing from the spirit and scope of the present disclosure. Moreover, the component elements of different embodiments and modifications may be suitably combined with each other.
Note that the effects described herein are merely examples and are not limited to the description, and other effects may be provided.
Note that the present technology can also have the following configurations.
(1) An information processing system comprising:
(2) The information processing system according to (1), further comprising
(3) The information processing system according to (1), further comprising
(4) The information processing system according to (3), wherein
(5) The information processing system according to (3) or (4), wherein the signal generation unit generates the vibration signal based on an appearance of the object, upon failure in the recognition of the object by the image analysis unit.
(6) The information processing system according to any one of (1) to (5), wherein
(7) The information processing system according to any one of (1) to (6), further comprising
(8) The information processing system according to any one of (1) to (7), further comprising
(9) The information processing system according to any one of (1) to (8), wherein the signal output unit suppresses vibration of the vibration mechanism caused by the vibration signal, based on the result of the detection of the object placed on the mat.
(10) The information processing system according to any one of (1) to (9), further comprising
(11) The information processing system according to any one of (1) to (10), wherein the signal generation unit determines a transmission destination of the vibration signal based on the result of the detection.
(12) The information processing system according to any one of (1) to (11), wherein
(13) A control method comprising:
(14) A control program causing a computer to execute processing of:
Number | Date | Country | Kind |
---|---|---|---|
2021-167513 | Oct 2021 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2022/037069 | 10/4/2022 | WO |