The present invention relates to a robot and a robot system.
There have been proposed various robots that can communicate with humans. For example, Patent Document 1 listed below proposes a humanoid robot that talks in a recorded voice of a particular individual and that can simultaneously move its members based on previously registered habits of that individual to express affection and the like. On the other hand, as a watching system for people who are living alone, Patent Document 2 listed below proposes receiving detection data both from a human detection sensor installed in the home of a person as a target to be watched and from an acceleration sensor that the resident wears, to make judgments on activities and conditions of the resident and events occurring in the home. Furthermore, Patent Document 3 listed below proposes a mastication movement detection device in which the number of mastication movements is counted based on a detected waveform received from a detector that is placed in an external auditory canal and detects an amount of deformation of the external auditory canal. Also, as for cartilage conduction, which has been discovered as a third conduction route in addition to the long-known air conduction and bone conduction, Patent Document 4 listed below describes that vibration generated by a vibration source contacting an ear cartilage around the entrance part of an external auditory canal causes air-conducted sound to be generated from a cartilage surface inside the external auditory canal, and the generated air-conducted sound then proceeds through the inside of the external auditory canal to reach the tympanic membrane.
Patent Document 1: Japanese Patent Application Publication No. 2010-94799
Patent Document 2: Japanese Patent Application Publication No. 2014-89494
Patent Document 3: Japanese Patent Application Publication No. 2011-10791
Patent Document 4: Japanese Patent Application Publication No. 2013-81047
However, as to robots, and robot systems that utilize robots, there are many problems yet to be addressed.
Against the background discussed above, an object of the present invention is to provide a robot, and a robot system that utilizes a robot, that is capable of appropriate communication with humans.
To achieve the above object, according to one aspect of the present invention, there is provided a robot including: a hand; and a cartilage conduction vibration source which is provided in the hand and which conducts vibration to the ear cartilage of a human. Thus, communication is possible between the robot and the human by cartilage conduction with a natural movement.
According to a specific feature, the robot includes two hands, and the cartilage conduction vibration source is provided in each of the two hands. Thus, communication is possible between the robot and the human by cartilage conduction with a comforting staging in which, for example, the head of the person is held gently in both hands of the robot. In addition, stereophonic hearing is possible. According to another specific feature, the robot includes a finger in the hand, and the cartilage conduction vibration source is provided in the finger. Thus, more efficient cartilage conduction is possible. According to a more specific feature, there is provided a joint mechanism which guides the entire hand to achieve contact with the ear cartilage and which adjusts the finger to guide it to the tragus. Thus, adjustment for appropriate cartilage conduction is possible.
According to another specific feature, the robot includes a control unit which, when the two hands make contact with the ear cartilages of two ears respectively for cartilage conduction, controls the two hands so as not to restrain the movement of the face while maintaining the positions of the two hands relative to each other. Thus, cartilage conduction without a sense of restraint is possible.
According to another specific feature, the robot includes an eye which is movable in exterior appearance, and the eye is moved in coordination such that the line of sight of the eye points between the two hands. Thus, more intimate communication with the robot by cartilage conduction is possible. According to another specific feature, the robot includes a mouth mechanism which is movable in exterior appearance, and the mouth mechanism moves in coordination with the voice conducted by the vibration of the cartilage conduction vibration source. Thus, communication by natural cartilage conduction is possible.
According to another specific feature, the robot includes a limiter which, when the hand makes contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage, adjusts the pressure of the contact. Thus, safe communication by cartilage conduction is possible. According to another specific feature, the robot includes a communicating means for asking for consent when the hand is brought into contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage. Thus, communication by cartilage conduction without a sense of discomfort is possible.
According to another specific feature, the robot includes a control unit which, when the hand is brought into contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage, confirms safety beforehand. Thus, highly safe communication by cartilage conduction is possible. According to another specific feature, the robot includes an abnormality detecting means, and, when the hand is brought into contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage, if the abnormality detecting means detects an abnormality, the hand is inhibited from making contact with the ear cartilage. Thus, even in an unforeseen situation, trouble can be avoided.
According to another specific feature, the robot includes a joint mechanism which holds an arm of the robot so as not to resist the external force which guides the hand to the ear cartilage. Thus, a person who attempts to make contact with the robot can easily guide the hand of the robot to his ear cartilage. According to another specific feature, the cartilage conduction vibration source conducts vibration to one ear of the human, and the robot includes a following means for making the hand follow the movement of the head of the human. Thus, even when vibration is conducted to one ear, contact can be prevented from being broken by the movement of the head of the human.
According to another specific feature, the hand of the robot has: a first finger in which the cartilage conduction vibration source is provided; and a second finger which supports the weight of the head of the human. Thus, while the robot is making a movement of holding and raising the head of a lying person, natural communication by cartilage conduction is possible.
According to another specific feature of the present invention, there is provided a robot including: a hand; and a heater which heats the hand to human body temperature. Thus, contact of the robot with the human is possible without an uncomfortable sense of coldness.
According to another specific feature of the present invention, there is provided a robot system including: a robot which has, provided in a finger, a cartilage conduction vibration source for conducting vibration to the ear cartilage of a human and which is shared among a large number of humans; and accessories which are to be worn by the large number of humans respectively, each of the accessories covering at least part of the ear cartilage. The vibration of the cartilage conduction vibration source is conducted to the ear cartilage of one of the large number of humans indirectly via the corresponding one of the accessories. Thus, despite the sharing of the hand of the robot touched by no one knows whom, it is possible to build a hearing system that provides the benefits of cartilage conduction hygienically. Specifically, the accessories are each, for example, one of an ear warmer, a headband, an ear cuff, and an ear-worn article of character merchandise.
According to another specific feature of the present invention, the accessories each include an information holding unit that holds information for identifying its wearer, and the robot includes a reading means for reading the information. Thus, it is possible to build a robot system that can very adroitly cope with the needs of the wearers of the accessories, and thus it is possible to motivate people to wear the accessories.
According to another specific feature of the present invention, there is provided a robot system including: accessories which are worn by a large number of wearers respectively and which each include an information holding unit that holds information for identifying its wearer; and a robot including a reading means for reading the information. Thus, the robot can handle the individual wearers in manners proper to them respectively.
As described above, according to the present invention, it is possible to provide a robot, and a robot system that utilizes a robot, that is capable of appropriate communication with humans.
The ear-mounted unit 6 functions as a headset for the mobile phone 10 by performing the short-range communication with the mobile phone 10, and allows a phone call to be made with the mobile phone 10 kept in a clothes pocket. The ear-mounted unit 6 also independently functions as a hearing aid. These functions as a headset and as a hearing aid are both achieved by making use of cartilage conduction, which will be described later. The ear-mounted unit 6 further includes a mastication sensor to detect movement of the tragus, etc., or deformation of the external auditory canal, caused by masticatory movement. Here, the ear-mounted unit 6 is ring-shaped with a hole 6a, so that the entrance of the external auditory canal is open even when the ear-mounted unit 6 is fitted in the external auditory canal. This makes it possible to hear external sound via the hole 6a, and contributes to a comfortable wear of the ear-mounted unit 6 without a feeling of blockage in the external auditory canal. Further, by closing the hole 6a with a finger or covering it with a palm as necessary as will be described later, it is possible to obtain an occlusion effect in the cartilage conduction to hear a larger sound.
The in-home monitoring unit 8 has a short-range communication unit 12 for short-range communication with the ear-mounted unit 6 and the mobile phone 10, and a digital communication unit 14 which performs always-on-connection Internet communication with an external device. A control unit 16 controls the entire in-home monitoring unit 8, which includes the short-range communication unit 12 and the digital communication unit 14. A storage unit 18 stores therein a program necessary for the control performed by the control unit 16, and also temporarily stores therein various pieces of data related to the control, etc.
With this configuration, the in-home monitoring unit 8 receives a result of detection of masticatory movement from the ear-mounted unit 6 via the short-range communication. If no masticatory movement expected in daily life has been detected, the in-home monitoring unit 8 judges that there is a possibility of an abnormality, and notifies a watching-service provider to that effect via the digital communication unit 14. Further, the in-home monitoring unit 8 receives information regarding the presence/absence of voice of the watching-target person detected by the headset function of the ear-mounted unit 6. In a case where no voice has been detected within a predetermined period of time, or in a case where a voice signal conveying urgency, such as a scream, has been detected, the in-home monitoring unit 8 judges that there is a possibility of an abnormality, and notifies the watching-service provider to that effect via the digital communication unit 14.
Further, the mobile phone 10 receives a result of detection of masticatory movement from the ear-mounted unit 6 via short-range communication. If no masticatory movement expected in daily life has been detected, the mobile phone 10 judges that there is a possibility of an abnormality, and makes an automatic phone call to a mobile phone of a member of family of the watching-target person or the like who lives remotely and has been registered in advance, and when an answer to the phone call is received, the mobile phone 10 notifies him/her to that effect in a form of an automatic voice message. Further, the mobile phone 10 receives information regarding presence/absence of voice of the watching-target person detected by the headset function of the ear-mounted unit 6. In a case where there is no voice detected within a predetermined period of time, or in a case where a signal of voice conveying urgency, such as a scream, has been detected, the mobile phone 10 judges that there is a possibility of an abnormality, and makes an automatic phone call to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance, and when an answer to the phone call is received, the mobile phone 10 issues a notification to that effect.
Here, also in a case where masticatory movement expected in daily life is detected, the mobile phone 10 makes an automatic phone call to the mobile phone of the member of family of the watching-target person or the like who lives remotely, and when an answer to the phone call is received, the mobile phone 10 notifies him/her, in the form of an automatic voice message, to the effect that no abnormality is occurring. Further, also based on detection of a normal voice of the watching-target person, the mobile phone 10 makes an automatic phone call, as necessary, to the member of family of the watching-target person or the like who lives remotely, and when an answer to the phone call is received, notifies him/her, in the form of an automatic voice message, to the effect that no abnormality is occurring. This makes it possible for the member of family of the watching-target person or the like who lives remotely to know that the watching-target person regularly has three meals a day, and that conversation the watching-target person usually has, or voice regularly uttered in a previously set time zone (for example, conversation during daily shopping or daily sutra chanting), is present, and thus to rest reassured knowing that the watching-target person is all right. In this case, however, the mobile phone 10 makes an automatic phone call even when the watching-target person does not intend it, and the contents of such conversations would thus be heard by the member of family of the watching-target person or the like who lives remotely. Even though it is his or her own family member that would hear the contents of such conversations, this is undesirable to the watching-target person in terms of privacy; therefore, as will be described later, what is notified is only whether or not voice has been uttered, so that the contents of a conversation cannot be heard.
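The watching judgments described so far reduce to a timeout rule: if no mastication or no voice expected in daily life is detected within a predetermined period of time, a possible abnormality is reported, while detection in the normal rhythm may trigger a safety notification instead. The following is a minimal illustrative sketch of that rule in Python; the class name, the callback names, and the concrete timeout values are assumptions introduced here for illustration and are not part of the embodiments described above.

```python
import time

# Assumed concrete values; the description only speaks of "a predetermined period of time".
MASTICATION_TIMEOUT_S = 8 * 60 * 60    # e.g. no meal-related movement for 8 hours
VOICE_TIMEOUT_S = 12 * 60 * 60         # e.g. no voice for 12 hours

class WatchState:
    """Tracks the last mastication/voice detection received from the ear-mounted
    unit and reports a possible abnormality when either timeout expires (sketch)."""

    def __init__(self, notify_family, notify_provider):
        now = time.monotonic()
        self.last_mastication = now
        self.last_voice = now
        self.notify_family = notify_family        # e.g. automatic phone call or e-mail
        self.notify_provider = notify_provider    # e.g. watching-service provider

    def on_mastication_detected(self):
        self.last_mastication = time.monotonic()
        self.notify_family("The watching-target person is safe (mastication detected).")

    def on_voice_detected(self, urgent=False):
        self.last_voice = time.monotonic()
        if urgent:                                # e.g. a scream or a cry for help
            self.notify_family("Possible emergency: urgent voice detected.")

    def periodic_check(self):
        now = time.monotonic()
        if now - self.last_mastication > MASTICATION_TIMEOUT_S:
            self.notify_provider("Possible abnormality: no mastication detected.")
        if now - self.last_voice > VOICE_TIMEOUT_S:
            self.notify_provider("Possible abnormality: no voice detected.")
```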
The ear-mounted unit 6 includes a cartilage conduction vibration source 42 (which is, for example, a piezoelectric bimorph element), which vibrates in accordance with a voice signal of a call partner received from the mobile phone 10 via short-range communication; this vibration is transmitted to the ear cartilage in contact with the ear-mounted unit 6, and this makes it possible to hear the voice of the phone call partner by cartilage conduction, which will be described later. A bone conduction microphone 44 picks up the watching-target person's own voice by bone conduction and transmits a voice signal of that voice to the mobile phone 10 via short-range communication, and this enables conversations to be conducted. In this manner, the ear-mounted unit 6 functions as a headset for the mobile phone 10. An air conduction microphone 46 picks up the air-conducted voice of an outside conversation partner located close to the watching-target person to obtain a voice signal of the conversation partner, and this voice signal makes the cartilage conduction vibration source 42 vibrate. In this manner, the ear-mounted unit 6 also independently functions as a hearing aid. The control unit 40 controls the ear-mounted unit 6 also with respect to the headset and hearing-aid functions. In the headset function, as described above, the bone conduction microphone 44 also functions as a voice sensor for watching whether or not the watching-target person utters voice expected in daily life. A power supply unit 48, which includes a rechargeable storage battery, supplies power to the entire ear-mounted unit 6.
Now, cartilage conduction will be explained. Cartilage conduction is a phenomenon discovered by the present inventors, and denotes the phenomenon in which vibration conducted to the cartilage around an entrance part of the external auditory canal, such as that in the tragus, makes the surface of an external-auditory-canal cartilaginous part vibrate, producing air-conducted sound inside the external auditory canal. The air-conducted sound produced inside the external auditory canal travels on deeper into the external auditory canal and reaches the tympanic membrane. Thus, the greater part of the sound heard by cartilage conduction is the sound heard via the tympanic membrane. Here, however, the sound heard via the tympanic membrane is not ordinary air-conducted sound, i.e., sound that has entered the external auditory canal from outside, but air-conducted sound that is produced inside the external auditory canal.
As is clear from the above description, even when the ear-mounted unit 6 does not have a structure for generating air-conducted sound (such as a vibration plate included in typical earphones), it is possible to obtain sufficient sound pressure by transmitting vibration of the cartilage conduction vibration source 42 to the ear cartilage by bringing the cartilage conduction vibration source 42 into contact with the ear cartilage. As is also clear from the above description, since there is no need of providing a structure for generating air-conducted sound, the ear-mounted unit 6 can be formed in a ring shape having the hole 6a, for example, and this makes it possible to hear outside sound through the hole 6a even when the ear-mounted unit 6 is mounted to an ear, and this contributes to comfortable wear of the ear-mounted unit 6 without a feeling of blockage in the external auditory canal.
In the first embodiment, the occlusion effect as described above can be achieved by closing the hole 6a and increasing the contact pressure of the ear-mounted unit 6 against the cartilage by pushing the ear-mounted unit 6 with a finger placed over the hole 6a. Or, instead, the occlusion effect can be achieved by covering the entire ear 4 with a palm. Thus, clearly, in the first embodiment, too, it is possible to hear a larger sound by closing the hole 6a with a finger or entirely covering the ear with a palm.
Next, in Step S6, it is checked whether or not the mastication sensor 38 has detected a masticatory movement. When a mastication movement is found to have been detected, the process proceeds to Step S8, where a detection signal is transmitted to the mobile phone 10 via short-range communication, and then the process proceeds to Step S12. On the other hand, when no mastication movement is found to have been detected in Step S6, the process proceeds directly to Step S12.
In Step S12, it is checked whether or not the bone conduction microphone 44 has detected voice of the watching-target person. When voice of the watching-target person is found to have been detected, the process proceeds to Step S14, where a detected voice signal is transmitted to the mobile phone 10 via short-range communication, and meanwhile, in Step S16, the detected voice signal is transmitted to the in-home monitoring unit 8 via short-range communication. Although the steps from Step S12 to Step S16 are illustrated in a simplified manner, in these steps, actually, for a predetermined period of time (10 seconds, for example) after voice starts to be detected by the bone conduction microphone 44, the voice signal continues to be transmitted from the bone conduction microphone 44 simultaneously to the mobile phone 10 and the in-home monitoring unit 8. At this time, even when the voice continues to be detected for the predetermined period of time or longer, the transmission is stopped as soon as the predetermined period of time elapses; conversely, even if the voice ceases before the predetermined period of time elapses, the transmission of the output of the bone conduction microphone 44 continues until the predetermined period of time elapses. When the above-described transmission of the voice signal, continued for the predetermined period of time through the steps from Step S12 to Step S16, is finished, the process proceeds to Step S20. On the other hand, when no voice signal is detected in Step S12, the process proceeds directly to Step S20.
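The fixed transmission window described in the preceding paragraph can be restated as follows. This is only an illustrative sketch; the function and argument names are placeholders, and the 10-second value is the example given above.

```python
import time

WINDOW_S = 10.0   # the "predetermined period of time" exemplified above

def transmit_voice_window(read_bone_mic, send_to_phone, send_to_monitor):
    """Once voice is detected, forward the bone conduction microphone output to
    both destinations for exactly WINDOW_S seconds: transmission neither outlasts
    the window nor stops early if the voice ceases sooner (sketch only)."""
    start = time.monotonic()
    while time.monotonic() - start < WINDOW_S:
        frame = read_bone_mic()        # may contain silence once the voice has ceased
        send_to_phone(frame)           # Step S14: to the mobile phone 10
        send_to_monitor(frame)         # Step S16: to the in-home monitoring unit 8
```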
In Step S20, it is checked whether the watching-target person has operated the mobile phone 10 to make a phone call and the other party has answered the phone call, or whether there has been an external incoming call received by the mobile phone 10 and the watching-target person has operated the mobile phone 10 to answer the incoming call. If either of the above is found to have occurred, the process proceeds to Step S22, where the air conduction microphone 46 is turned off and the bone conduction microphone 44 is maintained in an on state, and then the process proceeds to Step S24. Thereby, the ear-mounted unit 6 starts to function as a headset for the mobile phone 10, and ambient noise is prevented from being picked up by the air conduction microphone 46 and disturbing the phone call.
In Step S24, it is checked whether the phone call started in Step S20 has been ended by hanging up of the phone. When it is detected that the phone call has been ended, the process proceeds to Step S26, where the air conduction microphone 46 is turned on and the bone conduction microphone 44 is maintained in an on state, and the process proceeds to Step S28. Thereby, the ear-mounted unit 6 starts to function as a hearing aid again, and the bone conduction microphone 44 is maintained in the standby state in which it stands by for detection of voice of the watching-target person. On the other hand, when it is found in Step S24 that the phone call has not been ended yet, Step S24 is repeated until the end of the phone call is detected. Further, in a case where, in Step S20, neither an outgoing call that has been answered nor an incoming call that has been answered is detected, the process proceeds directly to Step S28.
In Step S28, it is checked whether the storage battery of the power supply unit 48 has been exhausted. When the storage battery is found not to have been exhausted, the process proceeds to Step S30, where it is checked whether the ear-mounted unit 6 has been connected to the charger, which is not illustrated, to be charged. This step is provided to deal with a case where the ear-mounted unit 6 is removed from the ear 4 to be charged even though the storage battery has not been exhausted. When connection for charging is detected in Step S30, the process proceeds to Step S32, where ending processing is performed to end the flow. This is significant in that it helps prevent the ear-mounted unit 6 from being left in an operating state by mistake after it has been removed from the ear 4, a state in which its watching function cannot work. On the other hand, when no connection for charging is detected in Step S30, the process returns to Step S6 to repeat the steps from Step S6 to Step S30 until the storage battery becomes exhausted or connection is achieved for charging, and the ear-mounted unit 6 maintains, as necessary, its hearing-aid function, watching function, and headset function for the mobile phone 10. Here, also in a case where it is detected in Step S28 that the storage battery has been exhausted, the process proceeds to Step S32, where the ending processing is performed to end the flow.
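Steps S6 through S32 amount to a simple polling loop. The sketch below restates that loop in Python for clarity; every method name (for example detect_mastication or charging_connected) is a placeholder assumed here, not an element of the embodiment.

```python
def ear_unit_main_loop(unit):
    """Sketch of Steps S6-S32: report mastication and voice, switch between the
    hearing-aid and headset modes around phone calls, and end the flow when the
    storage battery is exhausted or the charger is connected."""
    while True:
        if unit.detect_mastication():                    # Step S6
            unit.send_mastication_signal()               # Step S8
        if unit.bone_mic_voice_detected():               # Step S12
            unit.transmit_voice_window()                 # Steps S14 and S16
        if unit.call_established():                      # Step S20
            unit.air_mic_off(); unit.bone_mic_on()       # Step S22: headset mode
            while not unit.call_ended():                 # Step S24
                pass
            unit.air_mic_on(); unit.bone_mic_on()        # Step S26: hearing-aid mode again
        if unit.battery_exhausted() or unit.charging_connected():   # Steps S28 and S30
            unit.ending_processing()                     # Step S32
            break
```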
In the flow on the mobile phone 10 side, first, in Step S44, it is checked whether or not pairing with the ear-mounted unit 6 has been set; when the pairing is confirmed, the process proceeds to Step S46.
In Step S46, it is checked whether or not a new mastication detection signal has been received from the ear-mounted unit 6, and when it is found that there has been reception of a new mastication detection signal, the process proceeds to Step S48, where an e-mail notifying that the watching-target person is safe is automatically transmitted to a mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance. It may instead be set in advance that, in Step S48, instead of sending an e-mail, an automatic phone call is made to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance, and on reception of a response from the mobile phone, an automatic voice message is transmitted to notify him/her that the watching-target person is safe. It is also possible to set such that both an e-mail is sent and a phone call is made. Since mastication basically takes place three times a day and such notifications are therefore not too frequent, each time a mastication detection signal is detected, the member of family of the watching-target person or the like who lives remotely is notified that the watching-target person is safe and is thereby reassured. Here, in a case where the member of family of the watching-target person or the like who lives remotely feels annoyed by such safety notifications, it is possible to set in advance such that Step S48 will be omitted.
Next, the process proceeds to Step S50, where the reception history of mastication detection signals stored in the storage unit 32 is updated, together with time and date information, based on the reception of the new mastication detection signal, and a GPS signal at that time point is also stored in the storage unit 32, and then the process proceeds to Step S52. On the other hand, when reception of a new mastication detection signal cannot be confirmed in Step S46, the process proceeds directly to Step S52.
In Step S52, based on the reception history stored in the storage unit 32, it is checked whether or not there has been reception of a new mastication detection signal within a predetermined period of time after the reception of the preceding mastication detection signal. When it is found that there has not been reception of a new mastication detection signal within the predetermined period of time, the process proceeds to Step S54, where an automatic phone call is made to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance, and on reception of a response to the phone call, an automatic voice message is transmitted to the effect that there is a possibility of an abnormality, and the process proceeds to Step S56. Further, in Step S54, another automatic voice message is transmitted to notify a current location of the watching-target person based on GPS information obtained then. On the other hand, in Step S52, when it is confirmed from the reception history that there has been reception of a new mastication detection signal, the process proceeds to Step S56.
In Step S56, it is checked whether or not there has been reception of a voice signal picked up by the bone conduction microphone 44 of the ear-mounted unit 6. When it is found that there has been reception of such a voice signal, the process proceeds to Step S58, where it is checked whether or not the received voice is a scream or begging for help (urgency) based on recognized contents of the voice signal (such as words included in the voice signal), intensity of the voice signal, a tone pattern, etc. When there is a high possibility that the voice is a scream or begging for help (when it is judged that it is a highly urgent situation), the process proceeds to Step S60, where an automatic phone call is made to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance, and on reception of a response to the phone call, the received voice itself is transmitted to the mobile phone, and then the process proceeds to Step S62. On the other hand, when, in Step S58, it is judged that the received voice is not a scream or begging for help but merely voice of an ordinary conversation (of low urgency), the process proceeds directly to Step S62.
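The judgment in Step S58 combines the recognized words, the intensity of the voice signal, and its tone pattern. A rough illustrative stand-in is sketched below; the keyword list and the threshold are assumptions, not values given in the description.

```python
URGENT_WORDS = {"help", "ouch", "fire", "ambulance"}   # illustrative keywords only
INTENSITY_THRESHOLD = 0.8                              # assumed normalized loudness

def is_urgent(recognized_text, normalized_intensity, scream_like_tone):
    """Sketch of Step S58: judge whether the received voice is a scream or a plea
    for help based on its contents, intensity, and tone pattern."""
    words = set(recognized_text.lower().split())
    if words & URGENT_WORDS:
        return True
    return normalized_intensity >= INTENSITY_THRESHOLD and scream_like_tone
```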
In Step S62, it is checked whether or not the received voice signal has been received in a time zone (for example, a time zone when the watching-target person usually goes shopping, or a time zone when the watching-target person usually chants a sutra) previously set based on a regular life pattern. When the result of the check is in the affirmative, the process proceeds to Step S64, where an e-mail is automatically transmitted to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance to notify him/her that the watching-target person is safe, and the process proceeds to Step S66. On the other hand, when, in Step S62, the received voice signal is found not to have been received in the previously set time zone, the process proceeds directly to Step S66. Here, the same setting as in Step S48 is also possible; that is, instead of or together with an e-mail, an automatic phone call may be made and an automatic voice message may be transmitted. Further, in a case where the member of family of the watching-target person or the like who lives remotely feels annoyed by such safety notifications, it is possible to set in advance such that Steps S62 and S64 will be omitted. The message to be transmitted in Step S64 is not the voice signal actually picked up by the bone conduction microphone 44, but a message notifying merely the fact that there has been reception of a voice signal. Thus, in contrast to Step S60, the contents of conversation of the watching-target person are not heard, and the privacy of the watching-target person is preserved.
In Step S66, reception history of voice signals stored in the storage unit 32 is updated, together with time and date information, based on the reception of the new voice signal, and a GPS signal at that time point is also stored in the storage unit 32, and then the process proceeds to Step S68. On the other hand, in a case where reception of a voice signal picked up by the bone conduction microphone 44 has not been confirmed in Step S56, the process proceeds directly to Step S68.
In Step S68, based on the reception history stored in the storage unit 32, it is checked whether or not there has been reception of a new voice signal within a predetermined period of time after the reception of the preceding voice signal. When there has been no reception of a new voice signal within the predetermined period of time, the process proceeds to Step S70, where an automatic phone call is made to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance, and on reception of a response to the phone call, an automatic voice message is transmitted to the effect that there is a possibility of an abnormality, and then the process proceeds to Step S72. In Step S70, too, another automatic voice message is transmitted to notify a current location of the watching-target person based on GPS information obtained then. On the other hand, when it is confirmed in Step S68 that there has been reception of a new voice signal within the predetermined period of time, the process proceeds directly to Step S72. Here, in a case where setting of pairing with the ear-mounted unit 6 is not confirmed in Step S44, the process proceeds directly to Step S72, the steps for watching are not performed, and the mobile phone 10 functions as an ordinary mobile phone.
In Step S72, it is checked whether or not the storage battery of the power supply unit 34 has been exhausted. When the storage battery is found not to have been exhausted, the process returns to Step S44, and then, the steps from Step S44 to Step S72 are repeated until exhaustion of the storage battery is detected, such that the mobile phone 10 deals with various situations in watching. On the other hand, in a case where, in Step S72, the storage battery is found to have been exhausted, the process proceeds to Step S74, where ending processing is performed to end the flow.
In Step S84, it is checked whether or not the state of the short-range communication with the ear-mounted unit 6 has been shifted from an enabled state to a disabled state. This is equivalent to checking whether or not the watching-target person has gone out into a range where the short-range communication is not available. When such shift of the state is found not to have taken place, the process proceeds to Step S86, where it is checked whether or not the state of the short-range communication with the ear-mounted unit 6 has shifted from the disabled state to the enabled state. This is equivalent to checking whether or not the watching-target person has come back into the short-range communication range. When such shift of the state is found to have taken place, the process proceeds to Step S88, where an e-mail is automatically transmitted to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance to notify him/her that the watching-target person has come home.
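The checks in Steps S84 through S88 detect transitions in the availability of the short-range link and interpret them as the watching-target person going out or coming home. A minimal sketch, with assumed names:

```python
def check_link_transition(link_is_up, previous_state, send_mail):
    """Sketch of Steps S84-S88: compare the current short-range link state with the
    previous one and e-mail the remote family member on either transition."""
    if previous_state and not link_is_up:      # enabled -> disabled: has gone out (Step S114)
        send_mail("The watching-target person has gone out.")
    elif not previous_state and link_is_up:    # disabled -> enabled: has come home (Step S88)
        send_mail("The watching-target person has come home.")
    return link_is_up                          # becomes previous_state on the next pass
```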
Further, in Step S90, automatic short-range communication is performed with the mobile phone 10, and processing is performed to confirm that the state of short-range communication has been restored to that of the system configuration described above.
In Step S90, further, a cross check of the history of reception from the ear-mounted unit 6 and an exchange of information are performed between the storage unit 18 of the in-home monitoring unit 8 and the storage unit 32 of the mobile phone 10 to match the information in the storage unit 18 and the information in the storage unit 32 with each other. This applies mainly to a case where the watching-target person is out, during which the in-home monitoring unit 8 cannot receive signals from the ear-mounted unit 6; the information that it could not receive during that period is thus obtained from the mobile phone 10 instead. This helps prevent the inconvenience of, for example, the in-home monitoring unit 8 erroneously recognizing an abnormal state on the ground that no signal has been transmitted from the ear-mounted unit 6 for a predetermined period of time or longer, although a signal has in fact been transmitted from the ear-mounted unit 6. The function of matching the information in the two storage units by the cross check as described above is also useful as a measure to deal with a case where the storage battery of the mobile phone 10 has been exhausted while the mobile phone 10 is in the home 2, so that the mobile phone 10 does not receive information from the ear-mounted unit 6 until the storage battery is recharged.
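The cross check in Step S90 can be thought of as taking the union of the two reception histories, so that a detection received by only one of the two units while the other was unreachable is reflected in both. The sketch below assumes each history is a list of hashable entries such as (time stamp, kind); this data shape is an assumption for illustration.

```python
def match_histories(monitor_history, phone_history):
    """Sketch of the Step S90 cross check: merge the reception histories held in the
    in-home monitoring unit and the mobile phone so that both units see every
    detection and neither falsely judges "no signal for a predetermined period"."""
    merged = sorted(set(monitor_history) | set(phone_history))
    monitor_history[:] = merged    # update the in-home monitoring unit side in place
    phone_history[:] = merged      # update the mobile phone side in place
    return merged
```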
When the processing in Step S90 is completed, the process proceeds to Step S92, where it is checked whether or not there has been reception of a new mastication detection signal from the ear-mounted unit 6. When it is found that there has been reception of a new mastication detection signal, the process proceeds to Step S94, where the reception history of mastication detection signals stored in the storage unit 18 is updated, together with time and date information, based on the reception of the new mastication detection signal, and the process proceeds to Step S96. On the other hand, when reception of a new mastication detection signal cannot be confirmed in Step S92, the process proceeds directly to Step S96.
In Step S96, based on the reception history stored in the storage unit 18, it is checked whether or not there has been reception of a new mastication detection signal within a predetermined period of time after the reception of the preceding mastication detection signal. When there is no reception of a new mastication detection signal within the predetermined period of time, the process proceeds to Step S98, where an automatic notification is issued to the watching service provider, with whom a contract has been made in advance, to the effect that there is a possibility of an abnormality, and then the process proceeds to Step S100. On the other hand, when it is confirmed, in Step S96, from the reception history of mastication detection signals, that there has been reception of a new mastication detection signal within the predetermined period of time, it is judged that there is no abnormality occurring, and the process proceeds directly to Step S100.
In Step S100, it is checked whether or not there has been reception of a voice signal picked up by the bone conduction microphone 44 of the ear-mounted unit 6. When it is found that there has been reception of such a voice signal, the process proceeds to Step S102, where it is checked whether or not the received voice is a scream, a cry for help, or the like, based on the recognized contents of the voice signal (words and the like included therein) as well as the intensity pattern, tone, and the like of the voice signal. When there is a high possibility that the voice is a scream or a cry for help, the process proceeds to Step S104, where the received voice itself is transferred to the watching-service provider, and the process proceeds to Step S106. On the other hand, when, in Step S102, it is judged that the received voice is neither a scream nor a cry for help, but voice of an ordinary conversation, the process proceeds directly to Step S106.
In Step S106, the reception history of voice signals stored in the storage unit 18 is updated, together with time and date information, based on the reception of the new voice signal, and the process proceeds to Step S108. On the other hand, when reception of a voice signal picked up by the bone conduction microphone 44 has not been confirmed in Step S100, the process proceeds directly to Step S108.
In Step S108, based on the reception history of voice signals stored in the storage unit 18, it is checked whether or not there has been reception of a new voice signal within a predetermined period of time after the reception of the preceding voice signal. When it is found that there has been no reception of a new voice signal within the predetermined period of time, the process proceeds to Step S110, where an automatic notification is issued to the watching-service provider to the effect that there is a possibility of an abnormality, and then the process proceeds to Step S112. On the other hand, when it is confirmed in Step S108, based on the reception history, that there has been reception of a new voice signal within the predetermined period of time, the process proceeds directly to Step S112. Here, when it is detected in Step S84 that the state of the short-range communication with the ear-mounted unit 6 has shifted from the enabled state to the disabled state, the process proceeds to Step S114, where an e-mail is automatically transmitted to the mobile phone of the member of family of the watching-target person or the like who lives remotely and has been registered in advance to notify him/her that the watching-target person has gone out, and then the process proceeds to Step S112. In this case, since it is impossible to receive signals from the ear-mounted unit 6 and thus to perform watching, the mobile phone 10 that the watching-target person carries is charged with execution of the watching function, and the in-home monitoring unit 8 does not execute the watching function.
In Step S112, it is checked whether or not the power of the in-home monitoring unit 8 has been turned off. Turning off of the power of the in-home monitoring unit 8 includes power-supply disconnection caused by a power failure or the like. When it is found that the power has not been turned off, the process returns to Step S84, and then the steps from Step S84 to Step S114 are repeated as long as the power is not turned off, so that the in-home monitoring unit 8 deals with various situations in watching. On the other hand, when turning off of the power is detected in Step S112, the process proceeds to Step S116, where ending processing is performed to end the flow.
According to the present invention, a cartilage conduction vibration source, a bone conduction microphone, and a mastication sensor can each be formed with a piezoelectric element, and thus one piezoelectric element can serve as a cartilage conduction vibration source, a bone conduction microphone, and a mastication sensor. In the second embodiment, a single piezoelectric bimorph element accordingly serves these three functions.
The various features of the embodiments described above can be implemented not only in those specific embodiments but also in any other embodiment so long as they provide their advantages. Moreover, the various features of the embodiments can be implemented with various modifications. Modified features can be implemented in appropriate combinations with each other and with unmodified features.
For example, in the configuration of the first embodiment, one piezoelectric bimorph element may be used for the functions of the cartilage conduction vibration source, the bone conduction microphone, and the mastication sensor as in the second embodiment. Or, conversely, in the second embodiment, the cartilage conduction vibration source, the bone conduction microphone, and the mastication sensor may be formed as optimum separate elements to be optimally disposed at scattered positions.
Further, in the above embodiments, a bone conduction microphone is adopted to pick up the voice of the watching-target person, but an air-conducted sound microphone may be used for this purpose (for example, the air conduction microphone 46 may serve this purpose as well).
As in the first embodiment, the robot 206 exchanges information with the in-home monitoring unit 8 described earlier.
Next, cartilage conduction by use of the robot 206 will be described. In a middle finger 213 of a right hand 211 of the robot 206, a cartilage conduction vibration source comprising a piezoelectric bimorph or the like is arranged so that the finger tip of the middle finger 213 vibrates efficiently. The vibration of the cartilage conduction vibration source is conducted to the entire right hand 211, and thus cartilage conduction is possible from any part of the right hand 211.
Furthermore, to prevent the watching-target person 201 from feeling a sense of restraint, once an adequate pressure between the left and right hands is determined, the joints of the right and left arms 215 and 217 of the robot 206 are controlled relative to a trunk 219 of the robot 206 so that, while the relative distance between the left and right hands is maintained, the left and right hands follow, with no resistance, the free movement of the face of the watching-target person 201. Moreover, so that the hands of the robot 206 do not feel cold to the watching-target person 201 when they touch him for the first time, the two hands of the robot 206 are heated to human body temperature before the movement of holding the face of the watching-target person 201 is started. Furthermore, to conform to the staging in which the face of the watching-target person 201 is held in both hands and speech is uttered, the left and right eyes 207 of the robot 206 are movable in exterior appearance so as not to inadvertently avoid the line of sight of the watching-target person 201 but to follow it naturally without giving an intimidating impression.
Utterance of speech by cartilage conduction as described above is useful in cases such as where the watching-target person 201 has impaired hearing due to advanced age or the like and where the ambient noise is loud, and helps avoid a situation in which speech has to be blared loudly from the air-conducted sound speaker. Moreover, in the thumb 221 of the right hand 211 of the robot 206, a bone conduction microphone is provided so as to collect the voice of the watching-target person 201 as bone-conducted sound from the cheek bone or the like in cases such as where the ambient sound level is high. As will be described later, a similar bone conduction microphone is provided also in the left hand of the robot 206, and collects bone-conducted sound in a manner complementary with the bone conduction microphone on the right hand 211 side. As the bone conduction microphones, as in the second embodiment, cartilage conduction vibration sources comprising piezoelectric bimorph elements may be used so as to double as bone conduction microphones.
As shown in the figure, the robot 206 has, in the head part 203, a pair of stereo external air-conducted sound microphones 246, and collects external sound, including the voice of the watching-target person 201, stereophonically. The head part 203 has, at the left and right eyes 207, a 3D camera (a pair of cameras) 238, and takes an image of the watching-target person 201. The direction of the head part 203 and the following of the line of sight of the left and right eyes 207 are controlled through the recognition of the face and the eyes of the watching-target person 201 based on an analysis of the image of the 3D camera 238. Moreover, at the mouth mechanism 209 in the head part 203 of the robot 206, an air-conducted sound speaker 223 is provided, and this makes utterance of speech to the watching-target person 201 possible as described above. The mouth mechanism 209 moves in coordination with the utterance of speech by the robot 206 from the air-conducted sound speaker 223.
The configuration of the left arm 217 and the left hand 229 is similar to that of the right arm 215 and the right hand 211 described above.
During communication with the watching-target person 201 by cartilage conduction, the right and left joint mechanisms 227a and 227b extend the right and left hands 211 and 229 toward the face of the watching-target person 201 as recognized through the analysis of the image by the 3D camera 238. The right and left joint mechanisms 227a and 227b are each provided with a load detection sensor, and detect whether or not any load other than that of movement in a free state is acting, as may result from a collision involving an arm, a hand, a finger, or the like. If any such load is acting, these sensors identify at what part of a joint and with what intensity it is acting.
When the load detection sensors of the right and left joint mechanisms 227a and 227b detect a load and, based on the image by the 3D camera 238, the cause is judged to be the holding of the face of the watching-target person 201 in both hands, the applied pressure limiters operate to limit the pressure with which the face of the watching-target person 201 is held. Then, while the outputs from the tactile sensors 231a and 231b provided at the finger tips of the left and right middle fingers 213a and 213b and the image by the 3D camera 238 are monitored, the positions of the right and left hands 211 and 229 and the curves of the left and right middle fingers 213a and 213b are fine-tuned. In this way, the left and right middle fingers 213a and 213b are brought into contact with the tragi of the watching-target person 201 (the state illustrated in the figure).
Once an adequate pressure between the left and right hands is determined, the right and left joint mechanisms 227a and 227b perform control such that the left and right hands move in a translational manner, keeping the relative distance between them, so as to follow, with no resistance, the free movement of the face of the watching-target person 201. This following is performed relative to the state where the right and left arms 215 and 217, in a free state, are raised and kept at rest against their own weights, and thus the weight of the right and left arms 215 and 217 does not act on the face. In this state, when the load detection sensors of the right and left joint mechanisms 227a and 227b detect a load in an upward, downward, leftward, or rightward direction resulting from the movement of the face, in response, while the relative distance between the left and right hands is maintained, the right and left joint mechanisms 227a and 227b are driven subordinately. Thus, the watching-target person 201, despite his face being held in both hands of the robot 206, can move his face without restraint while the cartilage conduction state is maintained.
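The two behaviours described in the preceding paragraphs, namely limiting the holding pressure and then following the face translationally while keeping the distance between the hands, can be outlined as below. All method names and the concrete force cap are assumptions introduced for illustration only.

```python
MAX_HOLD_FORCE_N = 5.0   # assumed cap; the description only requires "an adequate pressure"

def hold_and_follow(robot, face_tracker):
    """Sketch: cap the holding pressure while closing the hands, then translate both
    hands by the same displacement so they follow the face without resisting its
    movement or changing their relative distance."""
    # Applied pressure limiter: close the hands only while below the cap and until
    # the tactile sensors confirm contact with the tragi.
    while robot.hold_force() < MAX_HOLD_FORCE_N and not robot.tragus_contact_confirmed():
        robot.close_hands_slightly()

    # Subordinate following: whatever face displacement the load detection sensors
    # and the 3D camera report, apply the same translation to both hands.
    while robot.cartilage_conduction_active():
        face_delta = face_tracker.face_displacement()
        robot.move_left_hand_by(face_delta)
        robot.move_right_hand_by(face_delta)
```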
The functions described above are achieved by a control unit 240 based on programs stored in a storage unit 218. The control unit 240 includes a dedicated image processing function unit that is connected to the 3D camera 238. The control unit 240 further includes a dedicated piezoelectric bimorph driving function unit that is connected to the cartilage conduction vibration sources 242a and 242b, and a dedicated sound processing function unit that is connected to the piezoelectric bimorph driving function unit, the air-conducted sound speaker 223, and the bone conduction microphones 244a and 244b. The storage unit 218 temporarily stores various kinds of data for the functioning of the control unit 240. A power supply unit 248 including a rechargeable battery supplies different constituent elements of the robot 206 with voltages that they respectively need. The control unit 240, the storage unit 218, the power supply unit 248, and the short-range communication unit 236 mentioned above can be built in the trunk 219 of the robot 206.
Next, in step S126, it is checked whether or not there is an abnormality, and if there is no abnormality, an advance is made to step S128, where it is checked whether or not now is a timing for regular reporting. If now is not a timing for regular reporting, an advance is made to step S130. On the other hand, if, in step S128, it is confirmed that now is a timing for regular reporting, an advance is made to step S132, where normality reporting processing is performed, and then an advance is made to step S130. The normality reporting processing here is similar to that in the first or second embodiment.
In step S130, it is judged whether or not to start conversation with the watching-target person 201. For example, a determination that conversation should be started can be made when the watching-target person 201 meets the robot 206 after a while, as when the watching-target person 201 has come home from a day-care facility or the robot 206 has come back from repair, when the watching-target person 201 moves toward the robot 206, when the watching-target person 201 talks to the robot 206, or when the robot 206, having observed the watching-target person 201, chooses to talk to him spontaneously.
If, in step S130, a determination to start conversation is made, an advance is made to step S134, where, first, the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 are turned on in preparation for conversation by ordinary air-conducted sound. Next, in step S136, it is checked whether or not the watching-target person 201 is a registered person with whom to conduct conversation by cartilage conduction. Such registration can be performed beforehand in a case where the watching-target person 201 has impaired hearing due to advanced age or the like or by preference of the watching-target person 201 himself. If, in step S136, it cannot be confirmed that the watching-target person 201 is a registered person for cartilage conduction, an advance is made to step S138, where it is checked whether or not the ambient air-conducted sound level is equal to or higher than a predetermined level. In a watching environment, ordinary noise is unlikely to be present; however, in a case where a large number of people are chatting at different places in a room, the ambient air-conducted sound level may be equal to or higher than a predetermined level, making personal conversation between the robot 206 and the watching-target person 201 by air-conducted sound difficult. If, in step S138, it is judged that the ambient air-conducted sound level is equal to or higher than the predetermined level, an advance is made to step S140. On the other hand, if, in step S136, it is judged that the watching-target person 201 is a registered person with whom to conduct conversation by cartilage conduction, an advance is made directly to step S140.
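The branch through steps S136 and S138 reduces to a single decision, sketched below; the threshold figure is an assumption, since the description only speaks of "a predetermined level".

```python
AMBIENT_LEVEL_THRESHOLD_DB = 70.0   # assumed figure for "a predetermined level"

def should_use_cartilage_conduction(person_is_registered, ambient_level_db):
    """Sketch of steps S136/S138: cartilage conduction is chosen when the person is
    registered for it, or when the ambient air-conducted sound level is too high for
    personal conversation by air-conducted sound."""
    return person_is_registered or ambient_level_db >= AMBIENT_LEVEL_THRESHOLD_DB
```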
In step S140, processing for bringing both hands of the robot 206 into contact with both ears of the watching-target person 201 for cartilage conduction is performed. This will be described in detail later. On completion of the both-hand contact processing in step S140, an advance is made to step S142, where it is confirmed whether or not the middle fingers 213a and 213b of both hands of the robot 206 are in contact with the tragi of both ears of the watching-target person 201 and whether or not the thumbs 221a and 221b of both hands are in contact with both cheek bones of the watching-target person 201. When it is so confirmed, an advance is made to step S144, where the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 are turned off, and the bone conduction microphones 244a and 244b and the cartilage conduction vibration sources 242a and 242b are all turned on; then an advance is made to step S146. In this way, conversation by air-conducted sound is switched to conversation by cartilage conduction and the bone conduction microphones.
On the other hand, if, in step S142, no contact with the tragi and the cheek bones as mentioned above is confirmed, an advance is made directly to step S146. As mentioned earlier, cartilage conduction is possible if only any part of the hands and fingers of the robot 206 is in contact with any part of the cartilage of the ears of the watching-target person 201. Thus, if such a state is confirmed in the both-hand contact processing in step S140, while the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 are kept on, the bone conduction microphones 244a and 244b and the cartilage conduction vibration sources 242a and 242b are all turned on. By contrast, if no contact between the hands of the robot 206 and the watching-target person 201 takes place in the both-hand contact processing in step S140, no processing for extending both hands of the robot 206 toward the watching-target person 201 is performed, nor are the bone conduction microphones 244a and 244b and the cartilage conduction vibration sources 242a and 242b turned on; thus, conversation with the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 is continued. The both-hand contact processing in step S140 will be described in detail later.
If, in step S138, it is not judged that the ambient air-conducted sound level is equal to or higher than the predetermined level, an advance is made directly to step S146. In this case, no processing for extending both hands of the robot 206 toward the watching-target person 201 is performed, but conversation with the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 is continued.
In step S146, it is checked whether or not there is an abnormality. This is to cope with an unexpected situation that may arise during the conversation started with no abnormality in step S126. At this stage, cartilage conduction by the hands of the robot 206 may already be underway, and thus the robot 206 comes to have increased sensor functions for watching. Specifically, while the bone conduction microphones 244a and 244b and the tactile sensors 231a and 231b contribute to abnormality detection, in case, for example, the watching-target person 201 falls, an abnormal load acts on the load detection sensors in the right and left joint mechanisms 227a and 227b via the hands holding the face; this, together with the image by the 3D camera 238, makes exact abnormality detection possible.
If, in step S146, no abnormality is detected, an advance is made to step S148, where it is checked whether or not conversation has ended. This check is done comprehensively based on a check of whether or not both sides have remained silent for a predetermined period or longer, an analysis of the contents of conversation, a check for presence of key words suggesting the end of conversation, and the like. If it is not judged that conversation has ended, a return is made to step S136 and thereafter, until it is judged that conversation has ended, steps S136 through S148 are repeated. Owing to this repetition, even while no cartilage conduction is being performed yet, when, in step S138, the ambient sound level becomes equal to or higher than the predetermined level, transition to cartilage conduction is possible. Transition from communication by air-conducted sound to communication by cartilage conduction is one-directional. That is, during the just-mentioned repetition, even when, in step S138, the ambient sound level becomes low, no function is available whereby once-started communication by cartilage conduction is switched back to communication by air-conducted sound. Accordingly, in the middle of a session of conversation, no frequent switching occurs between cartilage conduction communication and air-conducted sound communication. However, in case of an abnormality, leaving the hands of the robot 206 holding the face of the watching-target person 201 is dangerous, and this is dealt with by step S146 in the above-mentioned repetition. This will be described in detail later.
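The one-directional nature of the switch described above can be captured with a latch flag inside the repetition of steps S136 through S148, as in the following sketch; all method names are placeholders assumed for illustration.

```python
import time

def conversation_loop(robot, person):
    """Sketch of the repetition of steps S136-S148: a latch remembers that cartilage
    conduction has started, so the session never falls back to air-conducted sound
    even if the surroundings later become quiet."""
    cartilage_mode = False
    while not robot.conversation_ended() and not robot.abnormality_detected():   # S148, S146
        if not cartilage_mode and (person.registered_for_cartilage()             # S136
                                   or robot.ambient_level_high()):               # S138
            if robot.both_hand_contact_succeeded():                              # S140, S142
                robot.switch_to_cartilage_conduction()                           # S144
                cartilage_mode = True      # latch: no switching back within this session
        time.sleep(0.1)                    # placeholder pacing for the polling loop
    if cartilage_mode:
        robot.release_face()                                                      # S150, S152
```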
If, in step S148, it is judged that conversation has ended, an advance is made to step S150, where it is checked whether or not a cartilage conduction state is in effect. If a cartilage conduction state is in effect, an advance is made to step S152, where both hands of the robot 206 are retracted to release the face of the watching-target person 201, and an advance is made to step S154. On the other hand, if, in step S150, a cartilage conduction state is not detected, it means that conversation is proceeding by ordinary air-conducted sound, and the watching-target person 201 is not restrained; thus an advance is made directly to step S154.
Now, processing in case of an abnormality will be described. If, in step S126, an abnormal state is detected, an advance is made to step S156, where abnormality handling processing is performed, and an advance is made to step S150. The abnormality handling processing here basically is notifying processing similar to that in the first or second embodiment; however, since the third embodiment is configured as a robot 206, in accordance with the condition of the watching-target person 201 as grasped by the robot 206, if possible and if the degree of urgency exceeds the risk, a previously programmed emergency treatment is performed. Also if, in step S146, an abnormality is detected during conversation, an advance is made to step S156, where abnormality handling processing is performed, and an advance is made to step S150. What is important here is that, as mentioned above, in case some abnormal state occurs while conversation is proceeding by cartilage conduction, leaving the hands of the robot 206 holding the face of the watching-target person 201 is dangerous. Here, if, in step S146, an abnormality is detected, then even if it is not judged that conversation has ended, the repetition loop from steps S136 through S148 is exited, and an advance is made, through the abnormality handling processing in step S156, to step S150. Then if, in step S150, it is confirmed that a cartilage conduction state has been in effect, an advance is made to step S152, where both hands of the robot 206 are retracted to release the face of the watching-target person 201.
In step S154, it is checked whether or not, as a result of the main power switch of the robot 206 being turned off or as a result of the rechargeable battery in the power supply unit 248 being depleted, the supply of electric power to the robot 206 has stopped. If it is not confirmed that the supply of electric power has stopped, a return is made to step S126. If, in step S130, it is not detected that conversation has started, a return is made directly to step S126. Thereafter, until, in step S154, it is confirmed that the supply of electric power has stopped, steps S126 to S154 are repeated to cope with various changes in the condition of the robot 206. On the other hand, if, in step S154, it is confirmed that the supply of electric power has stopped, an advance is made to step S158, where predetermined ending processing is performed, and then the flow ends. The ending processing here includes, as a fail-safe measure, a function whereby, in case both hands of the robot 206 remain in contact with the face of the watching-target person, they are retracted.
On the other hand, if, in step S162, contact by the hands of the robot 206 is confirmed, an advance is made to step S164, where rapid heating of the hands to human body temperature by the heaters 225a and 225b is started, and an advance is made to step S166. In step S166, consent-to-contact communication processing is performed. In general, physical contact (in this case, with the robot 206 as a partner) is comfortable to one who is willing to accept it, but can be very annoying in some cases. Accordingly, the robot 206, even if step S140 is reached after the checks in steps S130 through S138 have been gone through, does not act unilaterally but, respecting the intention of the watching-target person 201, asks for consent to contact in step S166 in
In step S168, it is checked whether or not the watching-target person 201 indicates an intention to refuse contact, with a refusing phrase or with a refusing movement such as turning his head away, and if it is judged that there is no indication of intention to refuse contact, an advance is made to step S170. In step S170, it is checked whether or not the watching-target person 201 uses a wheelchair. This is because, with a wheelchair user, the position of the face is comparatively stable, and applying a contact pressure with the hands of the robot 206 is unlikely to lead to a danger such as a fall. If, in step S170, it is not confirmed that the watching-target person 201 is a wheelchair user, an advance is made to seat-taking guidance processing in step S172. This processing involves checking whether or not there is a chair and where it is located, making an announcement recommending the watching-target person 201 to take a seat and guiding him to a chair, confirming whether or not he has taken the seat, and so on. Next, in step S174, a final check is done to see whether or not bringing the hands into contact with the face of the watching-target person 201 is dangerous. This check is best done by confirming whether or not the watching-target person 201 has taken a seat; even when he is standing, however, it may instead be done by confirming, through the seat-taking guidance processing, that the watching-target person 201 shows no weakness in the legs and that applying a certain pressure to his face is therefore unlikely to lead to a risk of a fall.
When, in step S174, safety is confirmed, an advance is made to step S176, where line-of-sight alignment is started whereby the direction in which both hands are extended toward the face is aligned with the direction of the line of sight, and an advance is made to step S178. In step S178, both-hand adjustment processing is performed whereby, to hold the face in both hands, the right and left joint mechanisms 227a and 227b are driven based on information from the 3D camera 238, and an advance is made to step S180. In step S180, it is checked whether or not the load resulting from both hands making contact with the face is detected by the load detection sensors in the right and left joint mechanisms 227a and 227b. If no load is detected, a return is made to step S178, so that, until both hands make contact with the face, steps S178 and S180 are repeated. When, in step S180, a load is detected, an advance is made to step S182, where the applied pressure limiters are turned on to start to limit the pressure with which the face of the watching-target person 201 is held so that it is not excessive.
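The loop of steps S178 through S182 can be pictured with the following minimal Python sketch: the joints are driven step by step until the load sensors report face contact, after which the applied pressure is clamped. The callables, iteration limit, and numeric thresholds are assumptions introduced only for this example.

```python
def close_hands_until_contact(read_load, step_hands_closer, max_iterations=200,
                              contact_load=1.0, pressure_ceiling=6.0):
    """Illustrative version of steps S178-S182 under stated assumptions:
    `read_load` returns the joint load sensor value, `step_hands_closer`
    drives the joint mechanisms a small amount."""
    for _ in range(max_iterations):
        load = read_load()
        if load >= contact_load:                 # step S180: contact detected
            return min(load, pressure_ceiling)   # step S182: limiter keeps pressure bounded
        step_hands_closer()                      # step S178: drive the joints a little more
    raise TimeoutError("face contact was not achieved")

# Usage with dummy callables:
loads = iter([0.0, 0.2, 0.6, 1.3])
print(close_hands_until_contact(lambda: next(loads), lambda: None))  # -> 1.3
```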
In this state, it is possible to recognize which parts of both hands (including fingers) of the robot 206 are in contact with the cartilage of both ears of the watching-target person 201; thus, an advance is made to step S184, where the bone conduction microphones 244a and 244b and the cartilage conduction vibration sources 242a and 242b are turned on, and then an advance is made to step S186. In this state, the air-conducted sound speaker 223 and the bone conduction microphones 244a and 244b in an on state are used together. Moreover, in step S186, in a similar manner as when speech is uttered from the air-conducted sound speaker 223 alone, synchronization is continued so that the mouth mechanism 209 moves in coordination with the voice by the vibration of the cartilage conduction vibration sources 242a and 242b. The synchronization in step S186 is continued even after, in step S144 in
Next, in step S188, hand interval holding/joint slackening processing is started. This is processing whereby the right and left joint mechanisms 227a and 227b are controlled so as to follow, so to speak, in a slackened state such that, as mentioned earlier, both hands move translatorily while keeping the relative distance between them, thereby following, with no resistance, free movement of the face of the watching-target person 201. This processing is continued even after the flow exits step S188.
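The translatory-following idea in step S188 amounts to shifting both hand targets by the same vector as the face, so their relative positions are preserved. The sketch below shows this under the assumption that positions are simple 3-element coordinate lists in an arbitrary robot frame; the function name is hypothetical.

```python
def follow_face_keeping_interval(face_center, left_offset, right_offset):
    """Move both hands translatorily: each hand target is the current face
    position plus a fixed offset, so the hand-to-hand interval never changes."""
    left_target = [c + o for c, o in zip(face_center, left_offset)]
    right_target = [c + o for c, o in zip(face_center, right_offset)]
    return left_target, right_target

# The face moves 3 cm to the right; both hand targets shift by the same vector.
left, right = follow_face_keeping_interval([0.03, 0.0, 1.5],
                                           [-0.09, 0.0, 0.0],
                                           [0.09, 0.0, 0.0])
print(left, right)
```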
Furthermore, in step S190, processing is started whereby, while the curves of the left and right middle fingers 213a and 213b are fine-tuned, the fingertips are brought into contact with the tragi of the watching-target person 201 and, while the curves of the left and right thumbs 221a and 221b are fine-tuned, they are brought into contact with the cheek bones of the watching-target person 201. Next, in step S192, it is checked whether or not, as a result of the just-mentioned processing, the left and right middle fingers 213a and 213b are in contact with the tragi and the left and right thumbs 221a and 221b are in contact with the cheek bones, and if contact is not confirmed, an advance is made to step S194. In step S194, it is checked whether or not a predetermined time has elapsed since step S190 was started, and if the predetermined time has not elapsed, a return is made to step S190. Thereafter, steps S190 through S194 are repeated. During this repetition, if, in step S192, contact with the tragi and the cheek bones is confirmed, the flow ends, and a jump is made to step S142 in
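Steps S190 through S194 form a retry loop with a timeout, which the following Python sketch reproduces; `adjust_once`, `contact_confirmed`, and the time values are injected assumptions rather than elements of the specification.

```python
import time

def fine_tune_fingertips(adjust_once, contact_confirmed, timeout_s=10.0, poll_s=0.05):
    """Rough analogue of steps S190-S194: keep fine-tuning the finger curves until
    tragus/cheek-bone contact is confirmed or the predetermined time elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        adjust_once()                 # step S190: fine-tune the curves of the fingers
        if contact_confirmed():       # step S192: tactile/camera confirmation
            return True
        time.sleep(poll_s)
    return False                      # step S194: predetermined time elapsed

attempts = iter([False, False, True])
print(fine_tune_fingertips(lambda: None, lambda: next(attempts)))  # -> True
```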
On the other hand, if, in step S168, it is judged that there is even a slight indication of intention to refuse contact, an advance is made to step S196, where the heating of the hands to human body temperature by the heaters 225a and 225b is stopped, and the flow ends. Likewise, also if, in step S174, safety is not confirmed, an advance is made to step S196, where the heating of the hands to human body temperature by the heaters 225a and 225b is stopped, and the flow ends. In either case, no cartilage conduction by contact between the hands of the robot 206 and the watching-target person 201 is performed, and communication by air-conducted sound is continued.
The various features of the embodiments described above can be implemented not only in those specific embodiments but also in any other embodiment so long as they provide their advantages. Moreover, the various features of the embodiments can be implemented with various modifications. Modified features can be implemented in appropriate combinations with each other and with unmodified features.
For example, in the configuration of the robot of the third embodiment, bone conduction microphones are provided in the thumbs and cartilage conduction vibration sources are provided in the middle fingers. This, however, is not meant as any limitation; for example, cartilage conduction vibration sources may be provided in the forefingers. For another example, a plurality of cartilage conduction vibration sources may be provided respectively in a plurality of fingers of one hand. In the third embodiment, when cartilage conduction is started, the heating of the hands to human body temperature is started. Instead, the heaters may be kept on all the time to keep the hands of the robot at human body temperature so that, even when the robot touches the watching-target person for other purposes, the touch does not feel cold.
In the flow in
In the third embodiment, the hands of the robot are provided with tactile sensors for confirming contact of the finger tips with the tragi of the watching-target person. These sensors may be replaced with optical proximity sensors, or tactile sensors and optical proximity sensors may be used together.
The fourth embodiment shown in
Next, with reference to
The functions of different fingers in the state in
As will be clear from
Next, the configuration that functions when communicating with a watching-target person 201 lying as in
As shown in
With the configuration described above, in the fourth embodiment, as shown in
The flow in
On the other hand, if, in step S202, contact by the hands of the robot 306 is not confirmed, an advance is made to step S204, where rapid heating of the hands to human body temperature by the heaters 225a and 225b is started, and an advance is made to step S206. In step S206, consent-to-contact communication processing is performed whereby, while the intention of the watching-target person 201 is respected, consent to contact for head-raising is prompted. The communication here is similar to that in the third embodiment. Then, with the 3D camera 238 and the stereo external air-conducted sound microphones 246, while a response from the other side is awaited, his behavior is observed, and an advance is made to step S208.
In step S208, it is checked whether or not the watching-target person 201 indicates an intention to refuse contact, with a refusing phrase or a refusing movement, and if it can be judged that there is no indication of intention to refuse contact, an advance is made to step S210. In step S210, it is checked whether or not the watching-target person 201 is a lying person. If, in step S210, it is confirmed that the watching-target person 201 is a lying person, the first cartilage conduction vibration source 242a, the first tactile sensor 231a, and the first bone conduction microphone 244a are brought into a disabled state, and the second cartilage conduction vibration source 342a, the second tactile sensor 331a, and the second bone conduction microphone 344a are brought into an enabled state. Thus, switching to a state where cartilage conduction is performed by the thumb 321 is complete, and an advance is made to step S214.
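The enable/disable switching between the two sensor sets (middle-finger side for a self-sustained person, thumb side for a lying person) can be summarized by the small sketch below. The dictionary layout and function name are illustrative assumptions; only the reference signs come from the text.

```python
def select_sensor_set(person_is_lying: bool) -> dict:
    """Illustrative switch corresponding to step S210 and step S236: for a lying
    person the thumb-side (second) devices are enabled, otherwise the
    middle-finger-side (first) devices are enabled."""
    first_set = {"vibration_source": "242a", "tactile_sensor": "231a", "bone_mic": "244a"}
    second_set = {"vibration_source": "342a", "tactile_sensor": "331a", "bone_mic": "344a"}
    if person_is_lying:
        return {"enabled": second_set, "disabled": first_set}
    return {"enabled": first_set, "disabled": second_set}

print(select_sensor_set(person_is_lying=True)["enabled"])
```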
In step S214, line-of-sight alignment is performed whereby the direction in which both hands are extended toward the face is aligned with the direction of the line of sight, and an advance is made to step S216. In step S216, based on information from the 3D camera 238, the right and left joint mechanisms 227a and 227b and the hold-and-raise finger adjustment mechanism 355 are driven, then processing is performed whereby the middle to little fingers 313 to 352 of the robot 306 are put behind the back of the head and the neck of the watching-target person 201 to hold and raise him, and an advance is made to step S218. In step S218, processing is performed whereby, in a held and raised state, the curves of the thumb 321 and the forefinger 354 are fine-tuned according to information from the second tactile sensor 331a (as necessary, the first tactile sensor 231a is used together; the same applies in similar cases below) and the 3D camera 238 so that those fingers are brought into contact with the tragus and the mastoid bone respectively, and an advance is made to step S220. In step S220, based on information from the second tactile sensor 331a and the 3D camera 238, it is checked whether or not the above-mentioned contact has been achieved. If no contact is confirmed, a return is made to step S218, so that, until contact is confirmed, steps S218 and S220 are repeated. When, in step S220, contact is confirmed, an advance is made to step S222, where the applied pressure limiters are turned on to start to limit the pressure with which the face of the watching-target person 201 is held on the thumbs 321 and the forefingers 354 of both hands such that it is not excessive.
Subsequently, an advance is made to step S224, where the second bone conduction microphone 344a and the second cartilage conduction vibration source 342a are turned on, and an advance is made to step S226. In step S226, synchronization is performed such that the mouth mechanism 209 moves in coordination with the voice by the vibration of the second cartilage conduction vibration source 342a, and an advance is made to step S228.
In step S228, processing is performed whereby joints are slackened so that, while the distance between the thumbs of both hands and the distance between the forefingers of both hands are maintained, both hands can move translatorily in the horizontal direction. As mentioned earlier, this is processing for performing control such that, without the fingers of the robot 306 restraining the watching-target person 201, both hands follow, in a slackened state, free movement of the head. This processing is continued even after the flow exits step S228.
When the processing in step S228 is started, an advance is made to step S230, where, based on information from the second tactile sensor 331a and the 3D camera 238, it is checked once again whether or not the thumb 321 and the forefinger 354 are in contact with the tragus and the mastoid bone respectively. If, in step S230, contact is confirmed once again, the flow ends. As a result, the flow jumps to step S142 in
On the other hand, if, in step S208, it is judged that the watching-target person 201 shows even a slight indication of intention to refuse contact, an advance is made to step S234, where the heating of the hands to human body temperature by the heater 225a is stopped, and the flow ends. In this case, no cartilage conduction by contact of the hands of the robot 306 with the watching-target person 201 is performed, and communication by air-conducted sound is continued.
If, in step S210, it is not confirmed that the watching-target person 201 is a lying person, an advance is made to step S236, where the second cartilage conduction vibration source 342a, the second tactile sensor 331a, and the second bone conduction microphone 344a are brought into a disabled state, and the first cartilage conduction vibration source 242a, the first tactile sensor 231a, and the first bone conduction microphone 244a are brought into an enabled state. Thus, switching to a state where cartilage conduction is performed by the middle finger 313 is complete, and an advance is made to self-sustained person processing in step S238; when the processing is complete, the flow ends. The self-sustained person processing in step S238 is the same as steps S170 through S196 in
The fifth embodiment shown in
As will be clear from
As shown in
This serves, when the right hand 211 of the robot 406 makes contact with an accessory (described later) distributed to the customer 401, to read the information of an IC tag provided in the accessory. In a case where, as in
Next, in step S248, it is judged whether or not the situation is one in which to start conversation with the customer 401. If, in step S248, it is judged that conversation should be started, an advance is made to step S250, where, first, the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 (see
In step S258, processing for bringing one hand of the robot 406 into contact with one ear of the customer 401 for cartilage conduction and then maintaining the contact is started. The details will be given later. When the single-hand contact processing in step S258 is started, an advance is made to step S260, where it is checked whether or not the middle finger 413 of the robot 406 is in contact with the tragus of the customer 401 and the thumb 421 is in contact with the cheek bone of the customer 401. When that is confirmed, an advance is made to step S262, where the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 (see
Next, in step S264, it is checked whether or not the customer 401 is a registered person (described later) with whom to conduct conversation by cartilage conduction, and if it is confirmed that he is a registered person, an advance is made to step S266, where IC tag management processing is performed, and an advance is made to step S268. What is performed in the IC tag management processing in step S266 will be described later. On the other hand, if, in step S264, it cannot be confirmed that the customer 401 is a registered person, an advance is made to step S270, where regular menu processing for a state (the state in
If, in step S260, no contact with the tragus or the cheek bone as mentioned above is confirmed, an advance is made directly to step S268. Also if, in step S254, it is not judged that the ambient air-conducted sound level is equal to or higher than the predetermined level or if, in step S256, no consent of the customer 401 to the robot 406 making contact with his ear for cartilage conduction is confirmed, an advance is made to step S268. In these cases, no processing for bringing one hand of the robot 406 to an ear of the customer 401 is performed, nor are the bone conduction microphone 244a and the cartilage conduction vibration source 442a turned on, but conversation by the stereo external air-conducted sound microphones 246 and the air-conducted sound speaker 223 is continued.
In step S268, it is checked whether or not conversation-based handling of the customer 401 by the robot 406 has ended. This check is done, as in the third embodiment, comprehensively based on a check of whether or not both sides have remained silent for a predetermined period or longer, an analysis of the contents of conversation, a check for presence of key words suggesting the end of conversation, and the like. If, in step S268, it is not judged that conversation has ended, a return is made to step S252, and thereafter, until it is judged that conversation-based customer handling has ended, steps S252 through S268 are repeated. Owing to this repetition, even while no cartilage conduction is being performed yet, when, for example, in step S254, the ambient sound level becomes equal to or higher than the predetermined level, transition to cartilage conduction is possible.
On the other hand, if, in step S268, it is judged that conversation-based customer handling has ended, an advance is made to step S272, where it is checked whether or not a cartilage conduction state is in effect. If a cartilage conduction state is in effect, an advance is made to step S274, where the one hand of the robot 406 is retracted, and an advance is made to step S276. At this time, as will be described later, if the customer 401 is holding the hand of the robot 406 to his ear with his own hand, a gentle retracting action is performed while the load from the hand of the customer 401 is monitored, and the hand is not withdrawn until the customer 401 spontaneously removes his hand, so that the hand of the customer 401 is not pushed aside. On the other hand, if, in step S272, it is confirmed that a cartilage conduction state is not in effect, an advance is made directly to step S276.
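The gentle retraction just described can be sketched as a polling loop that only completes the withdrawal once the load applied by the customer's hand has disappeared. The callables, polling limit, and release threshold below are hypothetical and serve only to illustrate the behaviour.

```python
def retract_hand_gently(hand_load_from_customer, retract_step, poll_limit=1000,
                        release_threshold=0.05):
    """Sketch of the retracting behaviour described for step S274: while the
    customer is holding the robot's hand to his ear, the measured load is
    monitored and the hand is fully withdrawn only after the customer
    spontaneously lets go, so that his hand is never pushed aside."""
    for _ in range(poll_limit):
        if hand_load_from_customer() <= release_threshold:  # customer has let go
            retract_step(full=True)                          # complete the retraction
            return True
        retract_step(full=False)                             # yield gently, keep waiting
    return False

loads = iter([0.4, 0.3, 0.0])
print(retract_hand_gently(lambda: next(loads), lambda full: None))  # -> True
```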
In step S276, it is checked whether or not, as a result of the main power switch of the robot 406 being turned off or as a result of the rechargeable battery in the power supply unit 248 being depleted, the supply of electric power to the robot 406 has stopped. If it is not confirmed that the supply of electric power has stopped, a return is made to step S248. Thereafter, until, in step S276, it is confirmed that the supply of electric power has stopped, steps S248 through S276 are repeated to cope with various changes in the condition of the robot 406. On the other hand, if, in step S276, it is confirmed that the supply of electric power has stopped, the flow reaches step S278, where predetermined ending processing is performed, and then the flow ends. The ending processing here includes, as a fail-safe measure, a function whereby, in case one hand of the robot 406 remains extended toward the face of the customer 401, it is retracted. When it is retracted, in a similar manner as described in connection with step S274, safety measures are taken so that, for example, the hand of the customer 401 will not be pushed aside.
In step S288, it is checked whether or not the customer 401 has chosen a method in which the customer 401 takes in his own hand a hand of the robot 406 and puts it to his ear. This check is done based on not only a choice response from the customer 401 but also detection of the customer 401 having abruptly started an action of taking a hand of the robot 406 and putting it to his ear. If, in step S288, it is judged that the customer 401 has not chosen contact started by hand-taking, an advance is made to step S290, where line-of-sight alignment is started whereby the direction in which the robot 406 extends one hand toward the face of the customer 401 is aligned with the direction of the line of sight of the robot 406, and an advance is made to step S292.
In step S292, single-hand adjustment processing is performed whereby the right joint mechanism 227a (or the left joint mechanism 227b) is driven based on information from the 3D camera 238 so that one hand is extended automatically to bring a finger into contact with an ear of the customer 401, and an advance is made to step S294. In step S294, it is checked whether or not a load resulting from one hand making contact with the face is detected by the load detection sensor in the right joint mechanism 227a. If no load is detected, a return is made to step S292, and then, until one hand makes contact with the face, steps S292 and S294 are repeated. If, in step S294, a load is detected, the applied pressure limiter is turned on to start to limit the pressure with which the hand of the robot 406 pushes the face of the customer 401 from one side such that it is not excessive, and an advance is made to step S296. The applied pressure limiter prevents an accident such as one in which the robot 406 pushes down the customer 401 sideways.
On the other hand, if, in step S288, it is judged that the customer 401 has chosen a method in which the customer 401 takes in his own hand a hand of the robot 406 and puts it to his ear, an advance is made to joint slackening processing in step S298, where the hand of the robot 406 is left just to follow, without resistance, the movement of the hand of the customer 401. In this slackening processing, to prevent the load of the weight of the hand of the robot 406 from acting on the hand of the customer 401, the arm of the robot 406 is balanced in a gravity-free state in the up-down direction and is supported on the trunk 219 (see
In step S296, the bone conduction microphone 244a and the cartilage conduction vibration source 442a are turned on, and an advance is made to step S300. In this state, the air-conducted sound speaker 223 (see
Next, in step S302, finger adjustment processing is started. Started here is processing whereby, while the curve of the middle finger 413 is fine-tuned, the fingertip is brought into contact with the tragus of the customer 401 and, while the curve of the thumb 421 is fine-tuned, it is brought into contact with the cheek bone of the customer 401. Next, in step S304, it is checked whether or not, as a result of the just-mentioned processing, the middle finger 413 is in contact with the tragus and the thumb 421 is in contact with the cheek bone, and if contact is confirmed, an advance is made to step S306, where the normal contact state is registered as a pressure of the tactile sensor 431a and an image of the 3D camera 238 (
Then, in step S312, it is checked whether or not there is a change in the outputs of the acceleration sensor 456 and the tactile sensor 431a or a change in the face recognition position by the 3D camera 238. If there is any change, an advance is made to step S314, where following processing is performed whereby the hand of the robot 406 is made to follow the detected movement of the head, and the flow ends. In the processing here, as mentioned earlier, the control unit 440 comprehensively determines the movement of the head based on the acceleration, detected by the acceleration sensor 456, of the middle finger 413 which is dragged by the movement of the head through contact friction, on the change in contact pressure detected by the tactile sensor 431a (the pressure decreasing as the head moves away from the hand of the robot 406 and increasing as the head moves in the direction in which it pushes the hand of the robot 406), and on information on the movement of the head detected by the 3D camera 238; the control unit 440 then makes the hand of the robot 406 follow in such a direction as to compensate for the movement, so that, even when the head of the customer 401 moves, the contact of the middle finger 413 with the tragus and the contact of the thumb 421 with the cheek bone are maintained. If, in step S312, neither a change in the outputs of the acceleration sensor 456 and the tactile sensor 431a nor a change in the face recognition position by the 3D camera 238 is detected, the head is considered to be at rest, and thus the flow ends immediately.
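As a minimal, one-axis sketch of the sensor fusion just described, the function below weights the three cues (finger acceleration, contact-pressure change, camera-observed head displacement) into a single estimate and returns a compensating hand correction. The weights, gain, and sign convention are assumptions made for illustration; they are not taken from the specification.

```python
def head_following_correction(finger_accel, pressure_delta, camera_shift,
                              w_accel=0.3, w_pressure=0.3, w_camera=0.4, gain=1.0):
    """Illustrative fusion of the three cues named in the text. The axis is the
    horizontal direction away from the robot's hand (positive = head moving away);
    pressure falls when the head moves away, so its change enters with a minus sign.
    The returned value is the hand correction that compensates the head movement."""
    estimated_head_motion = (w_accel * finger_accel
                             + w_pressure * (-pressure_delta)
                             + w_camera * camera_shift)
    return gain * estimated_head_motion  # move the hand in the same direction

# Head drifts away from the hand: positive acceleration, pressure drop, +2 cm in the camera.
print(head_following_correction(finger_accel=0.015, pressure_delta=-0.4, camera_shift=0.02))
```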
On the other hand, if, in step S304, no contact between the middle finger 413 and the tragus or between the thumb 421 and the cheek bone is confirmed, an advance is made to step S316, where it is checked whether or not a predetermined time has elapsed since step S302 was started. If the predetermined time has not elapsed, a return is made to step S302, and then, until, in step S304, contact is detected or, in step S316, the lapse of the predetermined time is detected, the loop of steps S302, S304, and S316 is repeated. If, in step S316, the lapse of the predetermined time is detected, the flow ends immediately. When, via any of these steps, the flow in
On the other hand, in
The accessories shown in
In
Next, when the customer 401 wearing the distributed accessory 461 as shown in
The flow starts when the bank starts business on a given day, and in step S320, it is checked whether or not customer handling related to customer registration is requested. If there is a request, an advance is made to step S322, where it is checked whether or not the request is for new registration of a customer. If the request is for registration of a new customer, then, in step S324, mutual agreement processing as to privacy protection is performed, and an advance is made to step S326. In step S326, a customer ID is issued.
Next, in step S328, it is checked whether or not entry of customer information is complete, and if it is complete, then, in step S330, it is checked whether or not entry of the customer's subject information is complete. If this too is complete, an advance is made to step S332, where, for each ID issued, the customer information and the subject information are stored in the customer database 480, and an advance is made to step S334. In step S334, it is checked whether or not writing-in of the ID to an IC tag in the accessory to be distributed to the registered customer is complete, and when it is confirmed to be complete, an advance is made to step S336. At this time, the accessory is distributed to the customer. On the other hand, if, in step S322, no new registration is being requested, it is judged that registration is complete and that an accessory has been distributed, and an advance is made directly to step S336.
In step S336, it is checked whether or not the robot 406 has read out, with the IC tag reader 457, the customer ID of the customer 401 currently in contact and has transmitted it. If it has been transmitted, an advance is made to step S338, where the customer information corresponding to the received customer ID is read out from the customer database 480. Then an advance is made to step S340, where the subject information corresponding to the received customer ID is read out from the customer database 480. Then, in step S342, the thus read-out data is transmitted to the robot 406.
Next, in step S344, it is checked whether or not a conversation record based on cartilage conduction has been received from the robot 406. If there is a received conversation record, an advance is made to step S346, where, for each ID, it is analyzed and organized as conversation information and is written into the customer database 480; then an advance is made to step S348. If, in step S344, no reception is confirmed, an advance is made directly to step S348. In step S348, it is checked whether or not handling of the customer 401 by the robot 406 has ended, and if it has not ended, a return is made to step S344, so that, until the end of customer handling, steps S344 through S348 are repeated. When, in step S348, the end of customer handling is detected, the flow ends.
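The database-side processing of steps S326 through S346 can be pictured with the toy in-memory class below: an ID is issued on registration, customer and subject information are stored per ID, information is read out by ID, and analysed conversation records are appended. The class, method names, and dictionary layout are assumptions for illustration only.

```python
class CustomerDatabase:
    """Toy in-memory analogue of the customer database 480 (illustrative only)."""

    def __init__(self):
        self._records = {}
        self._next_id = 1

    def register(self, customer_info: dict, subject_info: dict) -> int:
        customer_id = self._next_id            # cf. step S326: issue a customer ID
        self._next_id += 1
        self._records[customer_id] = {         # cf. step S332: store both kinds of info
            "customer": customer_info,
            "subject": subject_info,
            "conversations": [],
        }
        return customer_id

    def lookup(self, customer_id: int) -> dict:
        # cf. steps S338/S340: read out customer and subject information for the ID
        rec = self._records[customer_id]
        return {"customer": rec["customer"], "subject": rec["subject"]}

    def append_conversation(self, customer_id: int, summary: str) -> None:
        # cf. step S346: organise a received conversation record and write it in
        self._records[customer_id]["conversations"].append(summary)

db = CustomerDatabase()
cid = db.register({"name": "Taro"}, {"interest": "time deposits"})
db.append_conversation(cid, "asked about interest rates")
print(db.lookup(cid))
```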
On the other hand, if, in step S336, it is not confirmed that the robot 406 has read out a customer ID and has transmitted it, the flow ends immediately. If, in step S320, no request for customer handling related to customer registration is confirmed, an advance is made to step S350, where regular customer handling processing not involving processing by an IC tag 470 is performed, and the flow ends.
A detailed description follows with reference to
On the other hand,
With the structure described above, a customer can, while keeping a posture in which he views bank information displayed on the large-screen display unit 705, pull out the cartilage conduction unit 724 and put it to his ear with his hand to hear bank information conveyed by voice even in a noisy bank environment. On the other hand, the customer's own voice can be collected by the microphone 723 which, in such a state, is located close to the mouth. The cartilage conduction unit 724 is so balanced that it does not fall under its own weight or put a load on the customer, and is supported in a slack state so as not to resist the movement of the customer's hand. These features are seen also in the fifth embodiment.
As shown in
The various features of the embodiments described above can be implemented not only in those specific embodiments but also in any other embodiment so long as they provide their advantages. Moreover, the various features of the embodiments can be implemented with various modifications. Modified features can be implemented in appropriate combinations with each other and with unmodified features. For example, a configuration where an accessory like any shown in
In the fifth embodiment, an IC tag is used as a means for holding a customer ID.
However, what is usable as such a means for holding information is not limited to an IC tag; a bar code or a two-dimensional bar code may instead be used. Nor is a configuration where a customer ID is held directly in the means for holding information meant as any limitation. Instead, an accessory ID by which an accessory can be identified may be held there, in which case, by holding in a database information associating each accessory with the customer to whom it was distributed, it is possible, by reading out the accessory ID, to identify the wearer indirectly.
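The direct and indirect identification paths described above are contrasted in the short sketch below; the tag payload formats, table contents, and names are invented purely for illustration.

```python
# Two ways of resolving who is wearing an accessory, as discussed above.
accessory_to_customer = {"ACC-0007": "CUST-0123"}   # association held in a database
customer_names = {"CUST-0123": "Hanako"}

def identify_wearer(tag_payload: str) -> str:
    """If the tag holds a customer ID directly, use it as-is; if it holds an
    accessory ID, resolve it through the association table to identify the
    wearer indirectly."""
    if tag_payload.startswith("CUST-"):
        customer_id = tag_payload
    else:
        customer_id = accessory_to_customer[tag_payload]
    return customer_names[customer_id]

print(identify_wearer("ACC-0007"))  # -> Hanako
```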
Needless to say, the single-hand contact/maintaining processing described with reference to
Conclusive Descriptions: The following are conclusive descriptions of the features of the embodiments disclosed herein.
According to one embodiment disclosed herein, there is provided a watching system including a watching detection device and a watching notification device. The watching detection device has a cartilage conduction vibration source and a watching detection sensor, and is mountable to an ear with an entrance of an external auditory canal open. The watching notification device receives watching information from the watching detection sensor by performing short-range communication with the watching detection device. This contributes to comfortable wear of the watching detection device.
According to a specific feature, the watching detection device has an air conduction microphone, and functions as a hearing aid by vibrating the cartilage conduction vibration source in accordance with a voice signal picked up by the air conduction microphone. This makes it possible to perform watching by using a hearing aid which is used daily. According to another specific feature, the watching detection device makes the cartilage conduction vibration source vibrate in accordance with a voice signal received from the watching notification device via short-range communication. This makes it possible to perform watching by using a device, such as a mobile phone, through which it is possible to hear a voice signal received from another device.
According to another specific feature, the watching detection sensor is a masticatory movement sensor. According to another specific feature, the watching detection sensor is a voice sensor. For example, the voice sensor is a bone conduction microphone or an air-conducted sound microphone.
According to another specific feature, the watching notification device issues a notification when it has been impossible to receive a detection signal for a predetermined period of time.
According to another embodiment disclosed herein, there is provided a watching system including a watching detection device, and a plurality of watching notification devices which each receive watching information from the watching detection device via short-range communication with the watching detection device. The plurality of watching notification devices exchange with each other the watching information received. This makes it possible to deal with a missing part in the watching information received by one watching notification device by sharing the watching information received by the other watching notification devices, and thus to prevent confusion from occurring among the plurality of watching notification devices.
According to another embodiment disclosed herein, there is provided a watching system including a watching detection device, and a plurality of watching notification devices which each receive watching information from the watching detection device via short-range communication with the watching detection device. The plurality of watching notification devices issue different notifications based on the watching information. This makes it possible to perform watch in a manner suitable to each of the plurality of watching notification devices, which are different from each other in properties. According to a specific feature, the plurality of watching notification devices include a mobile phone and a notification device placed in a home.
According to another embodiment disclosed herein, there is provided a watching system including a watching detection device having a voice sensor, and a watching notification device that receives watching information from the watching detection sensor via short-range communication with the watching detection device. The watching notification device issues a notification of whether a voice signal picked up by the voice sensor is present, without issuing any notification of the contents of the voice signal. This helps protect the privacy of the watching-target person. According to a specific feature, the watching notification device makes a judgment on the urgency of the voice signal picked up by the voice sensor, and when the urgency is high, the contents of the voice signal are exceptionally notified. This makes it possible to obtain a specific notification in raw voice in a case where a scream or a cry for help has been received.
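A minimal sketch of this privacy-preserving notification rule is given below: normally only the presence of a voice signal is reported, and the contents are forwarded only when an urgency cue is detected. The urgency markers and function name are assumptions for illustration.

```python
URGENT_MARKERS = ("help", "scream", "emergency")  # illustrative urgency cues

def build_notification(voice_detected: bool, transcript: str = "") -> str:
    """Normally only the presence or absence of a voice signal is notified; the
    contents are forwarded exceptionally when the signal is judged urgent
    (for example a scream or a cry for help)."""
    if not voice_detected:
        return "no voice signal detected"
    if any(marker in transcript.lower() for marker in URGENT_MARKERS):
        return f"URGENT - contents forwarded: {transcript}"
    return "voice signal detected (contents withheld for privacy)"

print(build_notification(True, "Everything is fine today."))
print(build_notification(True, "Help, I have fallen!"))
```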
According to another embodiment disclosed herein, there is provided a watching detection device including a cartilage conduction vibration source and a watching detection sensor, and the watching detection device is mountable to an ear with an entrance of an external auditory canal open. This contributes to comfortable wear of the watching detection device.
According to a specific feature, the watching detection device has an air conduction microphone, and functions as a hearing aid by vibrating the cartilage conduction vibration source in accordance with a voice signal picked up by the air conduction microphone. According to another specific feature, the watching detection device vibrates the cartilage conduction vibration source in accordance with a voice signal received from the watching notification device via short-range communication, to thereby function as a device, such as a mobile phone, through which it is possible to hear a voice signal received from another device.
According to another specific feature, the watching detection sensor is a masticatory movement sensor. According to a more specific feature, the masticatory movement sensor can serve also as the cartilage conduction vibration source. According to another specific feature, the watching detection sensor is a voice sensor. More specifically, the voice sensor is a bone conduction microphone. Still more specifically, the bone conduction microphone can serve also as the cartilage conduction vibration source.
According to another specific feature, the watching detection sensor includes an air-conducted sound microphone for a hearing aid, and the air-conducted sound microphone is turned off when the bone conduction microphone is used. According to another specific feature, the voice sensor is an air-conducted sound microphone.
According to another embodiment disclosed herein, there is provided a watching notification device having an acquisition unit that acquires watching information from a voice sensor and a notification unit that issues a notification of whether a voice signal acquired by the acquisition unit is present, without issuing any notification of the contents of the voice signal. This helps protect the privacy of the watching-target person. According to a specific feature, the notification unit makes a judgment on the urgency of the voice signal picked up by the voice sensor, and the contents of the voice signal are notified exceptionally when the urgency is high.
According to one embodiment disclosed herein, there is provided a robot including: a hand; and a cartilage conduction vibration source which is provided in the hand. Thus, communication is possible between the robot and the human by cartilage conduction with a natural movement.
According to a specific feature, the robot includes two hands, and the cartilage conduction vibration source is provided in each of the two hands. Thus, communication is possible between the robot and the human by cartilage conduction with a comforting staging in which, for example, the head of the person is held gently in both hands of the robot. In addition, stereophonic hearing is possible.
According to another specific feature, the robot includes a finger in the hand, and the cartilage conduction vibration source is provided in the finger. Thus, more efficient cartilage conduction is possible.
According to a more specific feature, there is provided a joint mechanism which guides the entire hand to achieve contact with the ear cartilage and which adjusts the finger to guide it to the tragus. Thus, adjustment for appropriate cartilage conduction is possible.
According to another specific feature, the robot includes a control unit which, when the two hands make contact with the ear cartilages of two ears respectively for cartilage conduction, controls the two hands so as not to restrain the movement of the face while maintaining the positions of the two hands relative to each other. Thus, cartilage conduction without a sense of restraint is possible.
According to another specific feature, the robot includes an eye which is movable in exterior appearance, and the eye is moved in coordination such that the line of sight of the eye points between the two hands. Thus, more intimate communication with the robot by cartilage conduction is possible.
According to another specific feature, the robot includes a mouth mechanism which is movable in exterior appearance, and the mouth mechanism moves in coordination with the voice conducted by the vibration of the cartilage conduction vibration source. Thus, communication by natural cartilage conduction is possible.
According to another specific feature, the robot includes a limiter which, when the hand makes contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage, adjusts the pressure of the contact. Thus, safe communication by cartilage conduction is possible.
According to another specific feature, the robot includes a communicating means for asking for consent when the hand is brought into contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage. Thus, communication by cartilage conduction without a sense of discomfort is possible.
According to another specific feature, the robot includes a control unit which, when the hand is brought into contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage, confirms safety beforehand. Thus, highly safe communication by cartilage conduction is possible.
According to another specific feature, the robot includes an abnormality detecting means, and, when the hand is brought into contact with the ear cartilage to conduct the vibration of the cartilage conduction vibration source to the ear cartilage, if the abnormality detecting means detects an abnormality, the hand is inhibited from making contact with the ear cartilage. Thus, even in an unforeseen situation, trouble can be avoided.
According to another feature of an embodiment disclosed herein, there is provided a robot including: a hand; and a heater which heats the hand to human body temperature. Thus, it is possible to achieve comfortable physical contact.
According to another embodiment disclosed herein, there is provided a cartilage conduction hearing system including: a cartilage conduction unit which conducts vibration to the ear cartilage of a human; and an accessory which is worn by the human and which covers at least part of the ear cartilage. The vibration of the cartilage conduction unit is conducted to the ear cartilage indirectly via the accessory. Thus, problems that arise from the cartilage conduction unit being in direct contact with the ear cartilage can be solved. According to a specific feature, the cartilage conduction unit is shared among a large number of people, and different accessories are worn by the large number of people respectively. Thus, despite the sharing of the cartilage conduction unit touched by no one knows whom, it is possible to build a hearing system that provides the benefits of cartilage conduction hygienically.
According to another specific feature, the accessory is configured as one of an ear warmer, a headband, an ear cuff, and an ear-worn article of character merchandise. Thus, it is possible to motivate the person to wear the accessory spontaneously.
According to another specific feature, the cartilage conduction hearing system of this embodiment is configured as a customer handling system in a bank. Thus, despite the sharing of the cartilage conduction unit used by no one knows whom, it is possible to make suitable use of the benefits of the hearing system according to the present invention which provides the benefits of cartilage conduction hygienically.
According to another specific feature, in the cartilage conduction hearing system of this embodiment, the accessory includes an information holding unit that holds information for identifying a wearer thereof, and there is provided a reading means for reading the information in a state where the cartilage conduction unit can conduct its vibration to the ear cartilage via the accessory. Thus, it is possible to build a hearing system that can meet the wearer's needs adroitly, and thus to motivate the person to wear the accessory. According to a more specific feature, the information is information on the wearer. According to another more specific feature, the information is information for identifying the accessory, and there is provided a related information holding means for holding information on the relationship between the wearer and the accessory.
According to another specific feature of this embodiment, the cartilage conduction unit is provided in a finger of the robot. Thus, the effect of contact with the ear cartilage is enhanced, and cartilage conduction can be used effectively.
According to a more specific feature, the robot includes a joint mechanism which, when the human guides the finger of the robot to the ear cartilage, holds an arm of the robot so as not to resist it. Thus, the cartilage conduction unit can be guided to the ear cartilage reliably, and smooth cooperation for cartilage conduction is possible between the human and the robot.
According to another more specific feature, the cartilage conduction unit conducts vibration to one ear of the human, and there is provided a following means for making the finger of the robot follow the movement of the head of the human. Thus, despite vibration being conducted to one ear, the conduction of the vibration can be prevented from being broken by the movement of the head of the human.
According to another feature of this embodiment, there is provided a cartilage conduction hearing system including: a cartilage conduction unit which conducts vibration to the ear cartilage of a human; and a support unit which movably supports the cartilage conduction unit. When the human guides the cartilage conduction unit to the ear cartilage, the support unit supports the cartilage conduction unit so as not to resist it. Thus, the cartilage conduction unit can be guided to the ear cartilage reliably without giving a sense of discomfort to the user. According to a specific feature, the cartilage conduction unit is provided in a finger of the robot. According to another specific feature, the cartilage conduction hearing system of this embodiment includes a display screen, and the cartilage conduction unit is supported on the display screen by the support unit.
According to another feature of this embodiment, there is provided a cartilage conduction hearing system including: a cartilage conduction unit which is provided in a finger of a robot to conduct vibration to the ear cartilage of a human; and a following means which is provided in the robot to make the finger of the robot follow the movement of the head of the human when the cartilage conduction unit conducts vibration to one ear of the human. Thus, despite vibration being conducted to one ear, the conduction of the vibration can be prevented from being broken by the movement of the head of the human.
According to another feature of this embodiment, there is provided a cartilage conduction hearing system including: a cartilage conduction unit provided in a first finger of a robot to conduct vibration to the ear cartilage of a human. The head of the human is supported by a second finger of the robot. Thus, for example, a lying person can be raised naturally to conduct communication.
According to a specific feature, the cartilage conduction unit is provided in the first finger of each of a left and a right hand of the robot to conduct vibration to the ear cartilages of the left and right ears, respectively, of the human, and the head of the human is supported by the second fingers of the left and right hands of the robot. According to another specific feature, the first finger is a thumb, and the second finger is a middle finger.
According to another feature of this embodiment, there is provided a robot suitable for use in various cartilage conduction hearing systems as described above.
The present invention is applicable to a robot that can communicate with a human, and to a customer handling system in a bank or the like.
211, 229 hand
242a, 242b cartilage conduction vibration source
206 robot
213 finger
227a, 227b joint mechanism
227a, 227b, 240 control unit
207 eye
209 mouth mechanism
225a, 225b heater
240 limiter
223, 215, 217, 240 communicating means
240 control unit
246, 238, 227a, 227b abnormality detection unit
321, 313, 413, 724 cartilage conduction unit
461, 462, 463, 464 accessory
470 information holding unit
457 information reading means
480 related information holding unit
227a joint mechanism
431a, 456, 238, 440 following means
227a support unit
705 display unit
321 first finger
313 second finger
Priority data: JP 2015-141168 (national), filed Jul. 2015; JP 2016-138187 (national), filed Jul. 2016
International filing: PCT/JP2016/070848 (WO), filed Jul. 14, 2016
Publication: WO 2017/010547 A, published Jan. 19, 2017
20110158425 | Hayakawa | Jun 2011 | A1 |
20110159855 | Cheng | Jun 2011 | A1 |
20110169622 | Billmaier | Jul 2011 | A1 |
20110170718 | Fukuda et al. | Jul 2011 | A1 |
20110180542 | Drollinger et al. | Jul 2011 | A1 |
20110201301 | Okada et al. | Aug 2011 | A1 |
20110237306 | Kamii | Sep 2011 | A1 |
20110254616 | Kawano | Oct 2011 | A1 |
20110263200 | Thornton et al. | Oct 2011 | A1 |
20110267551 | Yokote et al. | Nov 2011 | A1 |
20110280416 | Abolfathi et al. | Nov 2011 | A1 |
20110281617 | Kim et al. | Nov 2011 | A1 |
20110293105 | Arie et al. | Dec 2011 | A1 |
20110293133 | Yan | Dec 2011 | A1 |
20110299695 | Nicholson | Dec 2011 | A1 |
20110301729 | Heiman et al. | Dec 2011 | A1 |
20110316289 | Trimarche | Dec 2011 | A1 |
20110319016 | Gormley et al. | Dec 2011 | A1 |
20120008793 | Knox et al. | Jan 2012 | A1 |
20120008807 | Gran | Jan 2012 | A1 |
20120010735 | Gilboa | Jan 2012 | A1 |
20120020503 | Endo et al. | Jan 2012 | A1 |
20120028679 | Ozasa | Feb 2012 | A1 |
20120082329 | Neumeyer | Apr 2012 | A1 |
20120082335 | Duisters et al. | Apr 2012 | A1 |
20120105192 | Norieda | May 2012 | A1 |
20120130660 | Neumeyer | May 2012 | A1 |
20120133213 | Borke et al. | May 2012 | A1 |
20120139750 | Hayakawa et al. | Jun 2012 | A1 |
20120140917 | Nicholson et al. | Jun 2012 | A1 |
20120162143 | Kai et al. | Jun 2012 | A1 |
20120182429 | Forutanpour et al. | Jul 2012 | A1 |
20120183163 | Apfel | Jul 2012 | A1 |
20120219161 | Amada | Aug 2012 | A1 |
20120221329 | Harsch | Aug 2012 | A1 |
20120237075 | East et al. | Sep 2012 | A1 |
20120238908 | Osako et al. | Sep 2012 | A1 |
20120244917 | Hosoi et al. | Sep 2012 | A1 |
20120249223 | Neugebauer | Oct 2012 | A1 |
20120253236 | Snow | Oct 2012 | A1 |
20120283746 | Hu et al. | Nov 2012 | A1 |
20120289162 | Hosoi et al. | Nov 2012 | A1 |
20120298441 | Lin et al. | Nov 2012 | A1 |
20120300956 | Horii | Nov 2012 | A1 |
20120301859 | Rastatter et al. | Nov 2012 | A1 |
20120330654 | Angell | Dec 2012 | A1 |
20130039508 | Chen et al. | Feb 2013 | A1 |
20130051585 | Karkkainen | Feb 2013 | A1 |
20130100596 | Yokote et al. | Apr 2013 | A1 |
20130111346 | Little | May 2013 | A1 |
20130120311 | Ichikawa | May 2013 | A1 |
20130129121 | Yamashita | May 2013 | A1 |
20130133424 | Donaldson | May 2013 | A1 |
20130136279 | Brown | May 2013 | A1 |
20130142348 | Weisman | Jun 2013 | A1 |
20130169352 | Kawano | Jul 2013 | A1 |
20130169829 | Homma et al. | Jul 2013 | A1 |
20130177188 | Apfel | Jul 2013 | A1 |
20130180033 | Uemoto et al. | Jul 2013 | A1 |
20130191114 | Gim | Jul 2013 | A1 |
20130236043 | Abolfathi et al. | Sep 2013 | A1 |
20130242262 | Lewis | Sep 2013 | A1 |
20130242809 | Tone et al. | Sep 2013 | A1 |
20130252675 | Nicholson | Sep 2013 | A1 |
20130259221 | Shusaku et al. | Oct 2013 | A1 |
20130281152 | Nishimura | Oct 2013 | A1 |
20130293373 | Gegner et al. | Nov 2013 | A1 |
20130301860 | Neumeyer et al. | Nov 2013 | A1 |
20130308799 | Lin et al. | Nov 2013 | A1 |
20130316691 | Forutanpour et al. | Nov 2013 | A1 |
20130324193 | Hosoi et al. | Dec 2013 | A1 |
20130335210 | Arai et al. | Dec 2013 | A1 |
20130336507 | Gran | Dec 2013 | A1 |
20140003641 | Neumeyer et al. | Jan 2014 | A1 |
20140086417 | Hansen et al. | Mar 2014 | A1 |
20140120834 | Gormley et al. | May 2014 | A1 |
20140205131 | Azmi et al. | Jun 2014 | A1 |
20140201889 | Pietrzak | Jul 2014 | A1 |
20140233356 | Pattikonda | Aug 2014 | A1 |
20140305714 | Huang | Oct 2014 | A1 |
20140313280 | Takuno et al. | Oct 2014 | A1 |
20140342783 | Suzuki et al. | Nov 2014 | A1 |
20140355792 | Nabata | Dec 2014 | A1 |
20140378191 | Hosoi et al. | Dec 2014 | A1 |
20150022438 | Hong | Jan 2015 | A1 |
20150023527 | Sato | Jan 2015 | A1 |
20150043748 | Sudo | Feb 2015 | A1 |
20150043758 | Yamada | Feb 2015 | A1 |
20150054779 | Horii et al. | Feb 2015 | A1 |
20150065057 | Hosoi et al. | Mar 2015 | A1 |
20150070083 | Kawano | Mar 2015 | A1 |
20150078569 | Magrath et al. | Mar 2015 | A1 |
20150086047 | Horii et al. | Mar 2015 | A1 |
20150110318 | Miyano | Apr 2015 | A1 |
20150110322 | Andersson | Apr 2015 | A1 |
20150131816 | Inagaki | May 2015 | A1 |
20150131838 | Horii | May 2015 | A1 |
20150141088 | Hosoi et al. | May 2015 | A1 |
20150156295 | Kazama | Jun 2015 | A1 |
20150172588 | Homma et al. | Jun 2015 | A1 |
20150180547 | Gormley et al. | Jun 2015 | A1 |
20150181338 | Hosoi | Jun 2015 | A1 |
20150208153 | Hosoi et al. | Jul 2015 | A1 |
20150256656 | Horii | Sep 2015 | A1 |
20150256946 | Neumeyer et al. | Sep 2015 | A1 |
20150289052 | Takeda et al. | Oct 2015 | A1 |
20150320135 | Lowe | Nov 2015 | A1 |
20160007109 | Neumeyer et al. | Jan 2016 | A1 |
20160018892 | Gu | Jan 2016 | A1 |
20160058091 | Sasaki | Mar 2016 | A1 |
20160062392 | Townsend et al. | Mar 2016 | A1 |
20160073202 | Nabata et al. | Mar 2016 | A1 |
20160086594 | Asada et al. | Mar 2016 | A1 |
20160100262 | Inagaki | Apr 2016 | A1 |
20160150328 | Inagaki | May 2016 | A1 |
20160205233 | Hosoi et al. | Jul 2016 | A1 |
20160248894 | Hosoi et al. | Aug 2016 | A1 |
20160261299 | Hosoi et al. | Sep 2016 | A1 |
20160286296 | Hosoi et al. | Sep 2016 | A1 |
20160337760 | Suenaga | Nov 2016 | A1 |
20160349803 | Dusan | Dec 2016 | A1 |
20170006144 | Hosoi et al. | Jan 2017 | A1 |
20170013338 | Wong | Jan 2017 | A1 |
20170026727 | Hosoi et al. | Jan 2017 | A1 |
20170213452 | Brunolli | Jul 2017 | A1 |
20170230754 | Dusan | Aug 2017 | A1 |
20170295269 | Hosoi | Oct 2017 | A1 |
20170302320 | Hosoi et al. | Oct 2017 | A1 |
20170353797 | Hosoi et al. | Dec 2017 | A1 |
20180124222 | Hosoi et al. | May 2018 | A1 |
20180199127 | Hosoi et al. | Jul 2018 | A1 |
20180262839 | Hosoi et al. | Jul 2018 | A1 |
20180259915 | Hosoi | Sep 2018 | A1 |
20180332152 | Hosoi et al. | Nov 2018 | A1 |
20180352061 | Hosoi | Dec 2018 | A1 |
20190028580 | Hosoi et al. | Jan 2019 | A1 |
20200050269 | Gu | Feb 2020 | A1 |
20200068308 | Hosoi et al. | Feb 2020 | A1 |
Number | Date | Country |
---|---|---|
2198618 | May 1995 | CN |
1110857 | Oct 1995 | CN |
1276142 | Dec 2000 | CN |
1311942 | Sep 2001 | CN |
1411253 | Apr 2003 | CN |
2575916 | Sep 2003 | CN |
1141856 | Mar 2004 | CN |
1627864 | Jun 2005 | CN |
1672114 | Sep 2005 | CN |
1679371 | Oct 2005 | CN |
1723733 | Jan 2006 | CN |
1791283 | Jun 2006 | CN |
2800681 | Jul 2006 | CN |
1843019 | Oct 2006 | CN |
1984505 | Jun 2007 | CN |
101022678 | Aug 2007 | CN |
201035260 | Mar 2008 | CN |
101267463 | Sep 2008 | CN |
101277331 | Oct 2008 | CN |
101321196 | Dec 2008 | CN |
101355823 | Jan 2009 | CN |
101360140 | Feb 2009 | CN |
101390438 | Mar 2009 | CN |
101390440 | Mar 2009 | CN |
201216023 | Apr 2009 | CN |
101513081 | Aug 2009 | CN |
101594161 | Dec 2009 | CN |
101795143 | Aug 2010 | CN |
101874410 | Oct 2010 | CN |
101897198 | Nov 2010 | CN |
101978704 | Feb 2011 | CN |
102075633 | May 2011 | CN |
201845183 | May 2011 | CN |
102670206 | Sep 2012 | CN |
202652216 | Jan 2013 | CN |
102959930 | Mar 2013 | CN |
103053147 | Apr 2013 | CN |
203039851 | Jul 2013 | CN |
103281953 | Sep 2013 | CN |
203181220 | Sep 2013 | CN |
103891308 | Jun 2014 | CN |
103999480 | Aug 2014 | CN |
1705875 | Mar 2005 | EP |
1705075 | Sep 2006 | EP |
1705874 | Sep 2006 | EP |
1783919 | May 2007 | EP | 
1970792 | Sep 2008 | EP |
2388981 | Nov 2011 | EP |
2544430 | Jan 2013 | EP |
S51-94220 | Aug 1976 | JP |
S5236894 | Mar 1977 | JP |
S55-088497 | Jul 1980 | JP |
S56-17780 | Feb 1981 | JP |
S5690018 | Jul 1981 | JP |
S56089086 | Jul 1981 | JP |
S57162611 | Oct 1982 | JP |
S57169312 | Oct 1982 | JP |
S58-182398 | Oct 1983 | JP |
S60116800 | Aug 1985 | JP |
S62-208680 | Sep 1987 | JP |
S63-115728 | Jul 1988 | JP |
63-142981 | Sep 1988 | JP |
S63140753 | Sep 1988 | JP |
H0212099 | Jan 1990 | JP |
H02-62199 | Mar 1990 | JP |
2-182098 | Jul 1990 | JP |
H02-248121 | Oct 1990 | JP |
3-29424 | Feb 1991 | JP |
H03117995 | Dec 1991 | JP |
4-90298 | Mar 1992 | JP |
H04-303815 | Oct 1992 | JP |
H0573073 | Mar 1993 | JP |
H05-41297 | Jun 1993 | JP |
H05-183618 | Jul 1993 | JP |
H05-207579 | Aug 1993 | JP |
H05-292167 | Nov 1993 | JP |
06-030494 | Feb 1994 | JP |
3003950 | Aug 1994 | JP |
3009206 | Jan 1995 | JP |
07-107146 | Apr 1995 | JP |
07-131268 | May 1995 | JP |
H7-039150 | Jul 1995 | JP |
H07210176 | Aug 1995 | JP |
08-033026 | Feb 1996 | JP |
H8-79338 | Mar 1996 | JP | 
8-102780 | Apr 1996 | JP |
H08-090986 | Apr 1996 | JP |
H08111703 | Apr 1996 | JP |
08-237185 | Sep 1996 | JP |
H08-256080 | Oct 1996 | JP |
H09-023256 | Jan 1997 | JP |
H10-042021 | Feb 1998 | JP |
3050147 | Apr 1998 | JP |
10-136480 | May 1998 | JP |
H10-200608 | Jul 1998 | JP |
10-227 | Sep 1998 | JP |
H11112672 | Apr 1999 | JP |
H11-163980 | Jun 1999 | JP |
3064055 | Sep 1999 | JP |
11-298595 | Oct 1999 | JP |
H11-352138 | Dec 1999 | JP |
2000-013294 | Jan 2000 | JP |
2000-031858 | Jan 2000 | JP |
2000-49935 | Feb 2000 | JP |
3066305 | Feb 2000 | JP |
3070222 | Apr 2000 | JP |
2000217015 | Aug 2000 | JP |
2000-295696 | Oct 2000 | JP |
2000-322186 | Nov 2000 | JP |
2000-324217 | Nov 2000 | JP |
2000-339793 | Dec 2000 | JP |
2001125742 | May 2001 | JP |
2001-177809 | Jun 2001 | JP |
2001169016 | Jun 2001 | JP |
2001-268211 | Sep 2001 | JP |
2001-287183 | Oct 2001 | JP |
2001-320790 | Nov 2001 | JP |
2001-333161 | Nov 2001 | JP |
2001-339504 | Dec 2001 | JP |
2001-352395 | Dec 2001 | JP |
2002-016720 | Jan 2002 | JP |
2002023115 | Jan 2002 | JP |
2002-036158 | Feb 2002 | JP |
2002-041411 | Feb 2002 | JP |
2002051111 | Feb 2002 | JP |
2002-84575 | Mar 2002 | JP |
2002-111822 | Apr 2002 | JP |
2002-149312 | May 2002 | JP |
2002-164986 | Jun 2002 | JP |
2002-171321 | Jun 2002 | JP |
2002-223475 | Aug 2002 | JP |
2002-238262 | Aug 2002 | JP |
2002-262377 | Sep 2002 | JP |
3090729 | Oct 2002 | JP |
2002-359889 | Dec 2002 | JP |
2002-368839 | Dec 2002 | JP |
2003-032768 | Jan 2003 | JP |
2003032343 | Jan 2003 | JP |
2003-037651 | Feb 2003 | JP |
2003-037885 | Feb 2003 | JP |
2003-102094 | Apr 2003 | JP |
2003-103220 | Apr 2003 | JP |
2003-111175 | Apr 2003 | JP |
2003-125473 | Apr 2003 | JP |
2003101625 | Apr 2003 | JP |
2003-143253 | May 2003 | JP |
2003-145048 | May 2003 | JP |
2003-169115 | Jun 2003 | JP |
2003-173375 | Jun 2003 | JP |
2003-179988 | Jun 2003 | JP |
2003-188985 | Jul 2003 | JP |
2003-211087 | Jul 2003 | JP |
2003-218989 | Jul 2003 | JP |
2003198719 | Jul 2003 | JP |
2003-274376 | Sep 2003 | JP |
2003-274470 | Sep 2003 | JP |
2003-300015 | Oct 2003 | JP |
2003-304308 | Oct 2003 | JP |
2003-319022 | Nov 2003 | JP |
2003-348208 | Dec 2003 | JP |
2004-064457 | Feb 2004 | JP |
2004-094389 | Mar 2004 | JP |
2004-128915 | Apr 2004 | JP |
2004-157873 | Jun 2004 | JP |
2004-158961 | Jun 2004 | JP |
2004-166174 | Jun 2004 | JP |
2004-173018 | Jun 2004 | JP |
2004-173264 | Jun 2004 | JP |
2004-187031 | Jul 2004 | JP |
2004-205839 | Jul 2004 | JP |
2004190699 | Jul 2004 | JP |
2004208220 | Jul 2004 | JP |
2004233316 | Aug 2004 | JP |
2004-252626 | Sep 2004 | JP |
2004-266321 | Sep 2004 | JP |
2004-274438 | Sep 2004 | JP |
2004-357198 | Dec 2004 | JP |
2005-020234 | Jan 2005 | JP |
2005-020730 | Jan 2005 | JP |
2005072643 | Mar 2005 | JP |
2005074257 | Mar 2005 | JP |
2005-311125 | Apr 2005 | JP |
2005-512440 | Apr 2005 | JP |
2005-142835 | Jun 2005 | JP |
2005-159969 | Jun 2005 | JP |
2005142729 | Jun 2005 | JP |
2005151292 | Jun 2005 | JP |
2005184267 | Jul 2005 | JP |
2005-223717 | Aug 2005 | JP |
2005-229324 | Aug 2005 | JP |
2005-237026 | Sep 2005 | JP |
2005-244968 | Sep 2005 | JP |
2005-328125 | Nov 2005 | JP |
2005-534269 | Nov 2005 | JP |
2005-340927 | Dec 2005 | JP |
2005-341543 | Dec 2005 | JP |
2005-348193 | Dec 2005 | JP |
2005-352024 | Dec 2005 | JP |
2006-007342 | Jan 2006 | JP |
2006-007919 | Jan 2006 | JP |
2006-011591 | Jan 2006 | JP |
2006-019812 | Jan 2006 | JP |
2006005625 | Jan 2006 | JP |
2006-050056 | Feb 2006 | JP |
2006-051300 | Feb 2006 | JP |
2006-066972 | Mar 2006 | JP |
2006-067049 | Mar 2006 | JP |
2006-074671 | Mar 2006 | JP |
2006-086581 | Mar 2006 | JP |
2006-109326 | Apr 2006 | JP |
2006-115060 | Apr 2006 | JP |
2006-115476 | Apr 2006 | JP |
2006094158 | Apr 2006 | JP |
2006-129117 | May 2006 | JP |
2006-129404 | May 2006 | JP |
2006-148295 | Jun 2006 | JP |
2006-155734 | Jun 2006 | JP |
2006-157226 | Jun 2006 | JP |
2006-157318 | Jun 2006 | JP |
2006-165702 | Jun 2006 | JP |
2006-166128 | Jun 2006 | JP |
2006-166300 | Jun 2006 | JP |
2006186691 | Jul 2006 | JP |
2006-197404 | Jul 2006 | JP |
2006197267 | Jul 2006 | JP |
2006-211317 | Aug 2006 | JP |
2006-226506 | Aug 2006 | JP |
2006-229647 | Aug 2006 | JP |
2006217088 | Aug 2006 | JP |
2006217321 | Aug 2006 | JP |
2006-238072 | Sep 2006 | JP |
2006-295786 | Oct 2006 | JP |
2006283541 | Oct 2006 | JP |
2006303618 | Nov 2006 | JP |
2006-333058 | Dec 2006 | JP |
2006-345025 | Dec 2006 | JP |
2006-345471 | Dec 2006 | JP |
2006339914 | Dec 2006 | JP |
2007-003702 | Jan 2007 | JP |
2007-006369 | Jan 2007 | JP |
2007010518 | Jan 2007 | JP |
2007-019898 | Jan 2007 | JP |
2007-019957 | Jan 2007 | JP |
2007-020051 | Jan 2007 | JP |
2007-028469 | Feb 2007 | JP |
2007-051395 | Mar 2007 | JP |
2007-072015 | Mar 2007 | JP |
2007-081276 | Mar 2007 | JP |
2007-051007 | Mar 2007 | JP | 
2007074663 | Mar 2007 | JP |
2007505540 | Mar 2007 | JP |
2007-096386 | Apr 2007 | JP |
2007-103989 | Apr 2007 | JP |
2007-104548 | Apr 2007 | JP |
2007-104603 | Apr 2007 | JP |
2007-129740 | May 2007 | JP |
2007-133698 | May 2007 | JP |
2007-142920 | Jun 2007 | JP |
2007-165938 | Jun 2007 | JP |
2007-180827 | Jul 2007 | JP |
2007-189578 | Jul 2007 | JP |
2007-195239 | Aug 2007 | JP |
2007-214883 | Aug 2007 | JP |
2007-228508 | Sep 2007 | JP |
2007-268028 | Oct 2007 | JP |
2007-275819 | Oct 2007 | JP |
2007281916 | Oct 2007 | JP |
2007-306465 | Nov 2007 | JP |
2007-307124 | Nov 2007 | JP |
2007-330560 | Dec 2007 | JP |
2007-336418 | Dec 2007 | JP |
2008-000709 | Jan 2008 | JP |
2008-006558 | Jan 2008 | JP |
2008-017327 | Jan 2008 | JP |
2008-017398 | Jan 2008 | JP |
2008-042324 | Feb 2008 | JP |
2008-046844 | Feb 2008 | JP |
2008-092164 | Apr 2008 | JP |
2008-092313 | Apr 2008 | JP |
2008-511217 | Apr 2008 | JP |
2008085417 | Apr 2008 | JP |
2008-121796 | May 2008 | JP |
2008-135991 | Jun 2008 | JP |
2008-141589 | Jun 2008 | JP |
2008-141687 | Jun 2008 | JP |
2008-148086 | Jun 2008 | JP |
2008-149427 | Jul 2008 | JP |
2008-153783 | Jul 2008 | JP |
2008-177705 | Jul 2008 | JP |
2008177629 | Jul 2008 | JP |
3144392 | Aug 2008 | JP |
2008-227123 | Sep 2008 | JP |
2008-227806 | Sep 2008 | JP |
2008-229531 | Oct 2008 | JP |
2008-263383 | Oct 2008 | JP |
2008-301071 | Dec 2008 | JP |
2009010593 | Jan 2009 | JP |
2009-044510 | Feb 2009 | JP |
2009-077260 | Apr 2009 | JP |
2009-094986 | Apr 2009 | JP |
2009088942 | Apr 2009 | JP |
2009-117953 | May 2009 | JP |
2009-118396 | May 2009 | JP |
200922261 | May 2009 | JP |
2009111820 | May 2009 | JP |
2009-147680 | Jul 2009 | JP |
2009-159402 | Jul 2009 | JP |
2009-159577 | Jul 2009 | JP |
2009-166213 | Jul 2009 | JP |
2009171249 | Jul 2009 | JP |
4307488 | Aug 2009 | JP |
2009-207056 | Oct 2009 | JP |
2009-232443 | Oct 2009 | JP |
2009-246954 | Oct 2009 | JP |
2009-260883 | Nov 2009 | JP |
2009-542038 | Nov 2009 | JP |
2009267616 | Nov 2009 | JP |
2010-010945 | Jan 2010 | JP |
2010011117 | Jan 2010 | JP |
2010-068299 | Mar 2010 | JP |
2010054731 | Mar 2010 | JP |
2010-094799 | Apr 2010 | JP |
2010087810 | Apr 2010 | JP |
2010-109795 | May 2010 | JP |
2010-124287 | Jun 2010 | JP |
2010-147727 | Jul 2010 | JP |
2010-166406 | Jul 2010 | JP |
2010-524295 | Jul 2010 | JP |
4541111 | Jul 2010 | JP |
2010-528547 | Aug 2010 | JP |
2010-207963 | Sep 2010 | JP |
2010232755 | Oct 2010 | JP |
2010245854 | Oct 2010 | JP |
2010-258701 | Nov 2010 | JP |
2010-268336 | Nov 2010 | JP |
2010283541 | Dec 2010 | JP |
2011-004195 | Jan 2011 | JP |
2011-008503 | Jan 2011 | JP |
2011-010791 | Jan 2011 | JP |
2011-015193 | Jan 2011 | JP |
2011-017969 | Jan 2011 | JP |
2011-035560 | Feb 2011 | JP |
2011-048697 | Mar 2011 | JP |
2011-053744 | Mar 2011 | JP |
2011-059376 | Mar 2011 | JP |
2011-087142 | Apr 2011 | JP |
2011-512745 | Apr 2011 | JP |
2011-130334 | Jun 2011 | JP |
2011114454 | Jun 2011 | JP |
2011-139439 | Jul 2011 | JP |
2011-139462 | Jul 2011 | JP |
2011135489 | Jul 2011 | JP |
2011-212167 | Oct 2011 | JP |
2011-223556 | Nov 2011 | JP |
2011-223824 | Nov 2011 | JP |
2011-233971 | Nov 2011 | JP |
2011-234323 | Nov 2011 | JP |
2012-028852 | Feb 2012 | JP |
2012-034064 | Feb 2012 | JP |
2012-508499 | Apr 2012 | JP |
2012070245 | Apr 2012 | JP |
2012-109663 | Jun 2012 | JP |
2012-138770 | Jul 2012 | JP |
2012-515574 | Jul 2012 | JP |
2012142679 | Jul 2012 | JP |
2012-156781 | Aug 2012 | JP |
2012150266 | Aug 2012 | JP |
2012-169817 | Sep 2012 | JP |
2012-178695 | Sep 2012 | JP |
2012-196455 | Oct 2012 | JP |
5108161 | Oct 2012 | JP |
2012-249097 | Dec 2012 | JP |
2012-257072 | Dec 2012 | JP |
2012244515 | Dec 2012 | JP |
2013-005212 | Jan 2013 | JP |
2013-055492 | Mar 2013 | JP |
2013-078116 | Apr 2013 | JP |
2013-514737 | Apr 2013 | JP |
5246695 | Apr 2013 | JP |
2013061176 | Apr 2013 | JP |
2013-081047 | May 2013 | JP |
2013105272 | May 2013 | JP |
2013-115800 | Jun 2013 | JP |
2013115638 | Jun 2013 | JP |
2013-128896 | Jul 2013 | JP |
2013130402 | Jul 2013 | JP |
2013-162167 | Aug 2013 | JP |
2013-198072 | Sep 2013 | JP |
2013-201560 | Oct 2013 | JP |
2013-232860 | Nov 2013 | JP |
2013-235316 | Nov 2013 | JP |
2013-236396 | Nov 2013 | JP |
2013-255091 | Dec 2013 | JP |
2013-255212 | Dec 2013 | JP |
2014-003488 | Jan 2014 | JP |
2014-068346 | Apr 2014 | JP |
2014-089494 | May 2014 | JP |
2014-116972 | Jun 2014 | JP |
2014116755 | Jun 2014 | JP |
3193583 | Sep 2014 | JP |
2014165692 | Sep 2014 | JP |
2014190965 | Oct 2014 | JP |
2014-216861 | Nov 2014 | JP |
2014-229991 | Dec 2014 | JP |
2014-232905 | Dec 2014 | JP |
5676003 | Feb 2015 | JP |
2015-053640 | Mar 2015 | JP |
2015-061285 | Mar 2015 | JP |
2015082818 | Apr 2015 | JP |
2015-084801 | May 2015 | JP |
2015089016 | May 2015 | JP |
2015-139132 | Jul 2015 | JP |
3200747 | Oct 2015 | JP |
2015222908 | Dec 2015 | JP |
970008927 | May 1997 | KR |
10-1998-0022845 | Jun 1998 | KR |
20-0389666 | Jul 2005 | KR |
10-2005-0086378 | Aug 2005 | KR |
20060121606 | Nov 2006 | KR |
10-2007-0109323 | Nov 2007 | KR |
10-2008-0006514 | Jan 2008 | KR |
10-2008-0009602 | Jan 2008 | KR |
10-2008-0040962 | May 2008 | KR |
10-2009-0033564 | Apr 2009 | KR |
1020090082879 | Jul 2009 | KR |
10-2009-0120951 | Nov 2009 | KR |
10-2010-0034906 | Apr 2010 | KR |
10-2010-0041386 | Apr 2010 | KR |
20110006838 | Jan 2011 | KR |
20110121012 | Nov 2011 | KR |
20120015209 | Feb 2012 | KR |
101358881 | Feb 2014 | KR |
20150010087 | Jan 2015 | KR |
20160003340 | Jan 2016 | KR |
10-2017755 | Sep 2019 | KR |
200423682 | Nov 2004 | TW |
200536415 | Nov 2005 | TW |
200539664 | Dec 2005 | TW |
200605621 | Feb 2006 | TW |
1353164 | Mar 2009 | TW |
I391880 | | TW | 
200912814 | Mar 2009 | TW | 
201018982 | May 2010 | TW |
201119339 | Jun 2011 | TW |
201233119 | Aug 2012 | TW |
M452360 | May 2013 | TW |
201332333 | Aug 2013 | TW |
201342313 | Oct 2013 | TW |
201513629 | Apr 2015 | TW |
199627253 | Sep 1996 | WO |
199805148 | Feb 1998 | WO |
2001087007 | Nov 2001 | WO | 
2003055183 | Jul 2003 | WO |
2004034734 | Apr 2004 | WO |
2005067339 | Jul 2005 | WO |
2005069586 | Jul 2005 | WO |
2005086522 | Sep 2005 | WO |
2005091670 | Sep 2005 | WO |
2005096599 | Oct 2005 | WO |
2005096664 | Oct 2005 | WO |
2006006313 | Jan 2006 | WO |
2006021133 | Mar 2006 | WO |
2006028045 | Mar 2006 | WO |
2006075440 | Jul 2006 | WO |
2007034739 | Mar 2007 | WO |
2007046269 | Apr 2007 | WO |
2007099707 | Sep 2007 | WO |
2008007666 | Jan 2008 | WO |
2008029515 | Mar 2008 | WO |
2009104437 | Aug 2009 | WO |
2009105115 | Aug 2009 | WO | 
2009133873 | Nov 2009 | WO |
2009136498 | Nov 2009 | WO |
2009141912 | Nov 2009 | WO |
2010005045 | Jan 2010 | WO |
2010050154 | May 2010 | WO |
2010060323 | Jun 2010 | WO |
2010116510 | Oct 2010 | WO |
2010140087 | Dec 2010 | WO |
2011007679 | Jan 2011 | WO |
2011023672 | Mar 2011 | WO |
2011090944 | Jul 2011 | WO |
2011121740 | Oct 2011 | WO |
2011153165 | Dec 2011 | WO |
2011159349 | Dec 2011 | WO |
2002021881 | Mar 2012 | WO |
2012090947 | Jul 2012 | WO |
2012097314 | Jul 2012 | WO |
2012114772 | Aug 2012 | WO |
2012114917 | Aug 2012 | WO | 
2013047609 | Apr 2013 | WO |
2013121631 | Aug 2013 | WO | 
2013168628 | Nov 2013 | WO |
2014156534 | Oct 2014 | WO |
2015064340 | May 2015 | WO |
2015122879 | Aug 2015 | WO |
2015187092 | Dec 2015 | WO | 
2015033677 | Mar 2017 | WO | 
2017099938 | Jun 2017 | WO | 
Entry |
---|
SIPO, Office Action dated Aug. 8, 2018 for Chinese application No. 201580044713.0 (with English translation). |
Japan Patent Office, Office Action dated Oct. 23, 2018 for Japanese application No. 2015-012282 (with English translation). |
Japanese Patent Office, International Search Report for PCT/JP2013/067781 dated Oct. 1, 2013 (with English translation). |
Japanese Patent Office, International Search Report for International Patent Application PCT/JP2012/066376 (dated Oct. 30, 2012). |
International Search Report for International Patent Application PCT/JP2011/080099 (dated Apr. 3, 2012). |
Taiwanese Patent Office, search report in application 100148983 (2 pages) (dated Jan. 17, 2013). |
U.S. Patent and Trademark Office, Office Action in U.S. Appl. No. 13/556,367 (dated Oct. 19, 2012). |
European Patent Office, official communication in Application No. EP 11 85 3718 (dated May 14, 2014). |
U.S. Patent and Trademark Office, Office Action in U.S. Appl. No. 13/489,971 (dated Oct. 24, 2012). |
Isaka et al., “Development of Bone Conduction Speaker by Using Piezoelectric Vibration,” The Japan Society of Mechanical Engineers (No. 04-5) Dynamics and Design Conference 2004 CD-Rom Compilation (Sep. 27-30, 2004; Tokyo) (and English translation). |
Japanese Patent Office, International Search Report for International Patent Application PCT/JP2012/053231 (dated Mar. 13, 2012). |
Extended European Search Report in European patent application No. 12866397.8 dated Jul. 20, 2015. |
Japanese Patent Office, International Search Report for PCT/JP2014/071607 dated Nov. 11, 2014 (with English translation). |
Japan Patent Office, International Search Report for PCT/JP2014/077792 dated Dec. 16, 2014 (with English translation). |
Extended European Search Report for PCT/JP2013/067781 dated Feb. 19, 2016. | 
Japanese Patent Office, official communication in Japanese Patent Application No. 2012-054308 dated Jun. 7, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2015-056466 dated Jul. 12, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2015-217427 dated Jul. 19, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2015-217421 dated Jul. 19, 2016 (and machine translation). |
SIPO of People's Republic of China, official communication for Chinese Patent Application No. 201180031904.5 dated Jul. 20, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2012-120173 dated Jul. 26, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2015-048052 dated Aug. 2, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2012-147753 dated Aug. 23, 2016 (and machine translation). |
Japanese Patent Office, official communication in Japanese Patent Application No. 2015-231478 dated Aug. 30, 2016 (and machine translation). |
News Release, “New Offer of Smartphone Using Cartilage Conduction”, Rohm Semiconductor, Kyoto, Japan, Apr. 23, 2012 (with English translation). |
European Patent Office, Partial Search Report for EP 11 85 3443 dated Oct. 27, 2016. |
Japan Patent Office, International Search Report for PCT/JP2015/071490 dated Nov. 2, 2015 with English translation. |
U.S. Patent and Trademark Office, Office Action in U.S. Appl. No. 15/049,403 dated Nov. 23, 2016. |
U.S. Patent and Trademark Office, Office Action in U.S. Appl. No. 15/174,746 dated Nov. 25, 2016. |
Smartphone BlackBerry Bold 9700, Operation guide (2010). | 
Office Action for JP Patent Application No. 2016-013411 dated Nov. 22, 2016 with English Translation. |
Office Action for KR Patent Application No. 10-2016-7004740 dated Nov. 28, 2016 with English Translation. |
Office Action for JP Patent Application No. 2012-252203 dated Dec. 20, 2016 with English Translation. |
Office Action for JP Patent Application No. 2012-243480 dated Dec. 20, 2016 with English Translation. |
Office Action for JP Patent Application No. 2012-229176 dated Dec. 27, 2016 with English Translation. |
Office Action for JP Patent Application No. 2012-268649 dated Jan. 31, 2017 with English Translation. |
Office Action for JP Patent Application No. 2012-054308 dated Feb. 7, 2017 with English Translation. |
Final Office Action for JP Patent Application No. 2012-120173 dated Feb. 7, 2017 with English translation. |
Japanese Office Action in Japanese Application No. 2016-051347, dated Feb. 14, 2017, 6 pages (English Translation). |
Korean Office Action in Korean Application No. 10-2015-7005518, dated Mar. 20, 2017, 12 pages (English Translation). |
Japanese Office Action in Japanese Application No. 2015-217421, dated Feb. 28, 2017, 6 pages (English Translation). |
Japanese Office Action in Japanese Application No. 2013-028997, dated Mar. 21, 2017, 8 pages (English Translation). |
International Search Report for International Application No. PCT/JP2017/000787, dated Mar. 28, 2017, 1 page. |
Japanese Office Action in Japanese Application No. 2016-087027, dated Mar. 28, 2017, 9 pages (English Translation). |
Japanese Office Action in Japanese Application No. 2016-097777, dated Mar. 21, 2017, 8 pages (English Translation). |
Chinese Office Action in Chinese Application No. 201510148247.2, dated May 3, 2017, 39 pages (English Translation). |
Japanese Office Action in Japanese Application No. 2016-114221, dated Jun. 13, 2017, English Translation. |
Japanese Office Action in Japanese Application No. 2012-150941, dated May 9, 2017, English Translation. |
Shimomura et al., “Vibration and Acoustic Characteristics of Cartilage Transducer,” Acoustical Society of Japan, 2010 with Partial English Translation. |
Rion Co. Ltd., “New-generation Vibration Level Meter Model VM-51,” Acoustical Society of Japan, 1990 with Partial English Translation. |
Japanese Office Action in Japanese Application No. 2013-106416, dated May 30, 2017, English Translation. |
Japanese Office Action in Japanese Application No. 2012-197484, dated Jun. 13, 2017, English Translation. |
Japanese Office Action in Japanese Application No. 2013-126623, dated Jun. 13, 2017, English Translation. |
Office Action for Japanese Patent Application No. 2016-185559 dated Jul. 25, 2017 with English translation. |
Office Action for Japanese Patent Application No. 2016-195560 dated Aug. 22, 2017 with English translation. |
Office Action for Japanese Patent Application No. 2016-197219 dated Aug. 22, 2017 with English translation. | 
Office Action for Japanese Patent Application No. 2016-197225 dated Aug. 22, 2017 with English translation. |
Office Action for Japanese Patent Application No. 2013-186424 dated Sep. 26, 2017 with English translation. | 
Office Action for Japanese Patent Application No. 2013-195756 dated Sep. 26, 2017 with English translation. | 
Office Action for Japanese Patent Application No. 2013-173595 dated Oct. 10, 2017 (with English translation). |
Fukumoto, M. and Sugimura, T., "Fulltime-wear Interface Technology," NTT Technical Review, 8(1):77-81 (2003) (with English Translation). | 
Sasaki C, Crusoe Supplementary Class note Which Condensed the Function Called For, ASCII, 12 pages (2001) (Partial English Translation). |
Japanese Patent Office, Office Action mailed in counterpart Japanese Patent Application No. 2017-004233 dated Nov. 21, 2017 (with English-language translation). | 
Office Action mailed for KR Patent Application No. 10-2017-7019074 dated Oct. 13, 2017 with English Translation. |
Office Action mailed for Japanese Patent Application No. 2013-227279 dated Oct. 17, 2017 with English translation. |
Office Action for Japanese Patent Application No. 2013-221303 dated Oct. 17, 2017 with English Translation. |
Office Action for Japanese Patent Application No. 2013-237963 dated Nov. 7, 2017 with English Translation. |
Office Action for Japanese Application No. 2017-004233 dated Nov. 21, 2017 with English Translation. |
Office Action for Japanese Patent Application No. 2016-236604 dated Nov. 21, 2017 with English Translation. |
Office Action for Japanese Patent Application No. 2014-010271 dated Nov. 28, 2017 with English Translation. |
Office Action for Japanese Patent Application No. 2017-000580 dated Dec. 19, 2017 with English Translation. |
Office Action for Korean Application No. 10-2016-7004740 dated Dec. 19, 2017 with English Translation. |
Office Action for Japanese Patent Application No. 2013-221303 dated Dec. 26, 2017 with English Translation. |
Office Action for Japanese Patent Application No. 2013-237963 dated Dec. 26, 2017 with English Translation. |
International Search Report for International Application No. PCT/JP2016/070848, dated Sep. 9, 2016, 5 pages. |
European Patent Office, EESR for EP Application No. 16824527.2 dated Feb. 28, 2019. |
European Patent Office, EESR for EP Application No. 16846372.7 dated Feb. 19, 2019. | 
Korean Intellectual Property Office, Office Action for Korean Application No. 10-2018-7014722 dated Dec. 26, 2018 with English Translation. |
Korean Intellectual Property Office, Office Action for Korean Application No. 10-2018-7006763 dated Jan. 30, 2019 with English Translation. |
Korean Intellectual Property Office, Office Action for Korean Application No. 10-2018-7034989 dated Mar. 4, 2019 with English Translation. |
SIPO, Office Action for Chinese Application No. 201610520280.8 dated Jan. 3, 2019 with English Translation. |
China Intellectual Property Office, Office Action for China Appln. No. 201510131342.1, dated Nov. 4, 2019, with English Translation. |
European Patent Office, Summons to attend oral proceedings for EP Appln. No. 11853443.7, dated Oct. 10, 2019. |
Korea Intellectual Property Office, Office Action for Korean Appln No. 10-2018-7020853, dated Sep. 16, 2019, with English Translation. |
Korea Intellectual Property Office, Office Action for Korean Application No. 10-2019-7025296, dated Sep. 20, 2019, with English Translation. |
Japan Patent Office, International Search Report for PCT/JP2016/070848 dated Sep. 6, 2016, with English translation. |
Japan Patent Office, International Search Report for PCT/JP2017/000787 dated Mar. 28, 2017 (with English translation). |
Japan Patent Office, Office Action for JP Application No. 2015-082557 dated Mar. 19, 2019 (with English translation). | 
Japan Patent Office, International Search Report for PCT/JP2016/076494 dated Nov. 29, 2016, with English translation. |
International Search Report and Written Opinion in PCT Application No. PCT/JP2019/037808, dated Nov. 12, 2019, 10 pages. |
Extended European Search Report for EP Application No. 18179998.2 dated Oct. 26, 2018. | 
Korean Intellectual Property Office, Office Action for counterpart KR Application No. 10-2017-7016517 dated Oct. 31, 2018 with English translation. |
Japan Patent Office, Office Action for JP Application No. 2014-256091 dated Oct. 30, 2018 with English translation. |
SIPO, Office Action for Chinese Patent Application No. 2014800584218 dated Jan. 3, 2018, with English translation. | 
Japan Patent Office, Office Action for Japanese Patent Application No. 2013-106416 dated Jan. 9, 2018 with English translation. |
European Patent Office, EESR for European Patent Application No. 15834516 dated Mar. 12, 2018. |
Japan Patent Office, Office Action for Japanese Patent Application No. 2016-202733 dated Mar. 13, 2018 with English translation. |
Office Action in Chinese Appln. No. 201810640135.2, dated Jan. 21, 2020, 14 pages (with English translation). |
Office Action in Japanese Appln. No. 2016-120820, dated Jan. 21, 2020, 6 pages (with English translation). |
Office Action in Japanese Appln. No. 2016-202836, dated Mar. 24, 2020, 8 pages (with English translation). |
Korea Intellectual Property Office, Office Action for Korean Appln No. 10-2019-7011539, dated Dec. 25, 2019, 10 pages (with English translation). |
Japan Patent Office, Office Action for JP 2015-141168, dated Jun. 4, 2019 with English Translation. |
Korean Intellectual Property Office, Office Action for KR 10-2019-7011539 dated Jun. 20, 2019 with English Translation. |
Japan Patent Office, Office Action for JP 2015-204396, dated Jul. 16, 2019 with English Translation. |
Japan Patent Office, Office Action for JP 2015-082557 dated Jul. 30, 2019 with English Translation. |
Japan Patent Office, Office Action for JP 2015-238764 dated Aug. 20, 2019 with English Translation. |
CN Office Action in Chinese Application No. 201680041763.8, dated Apr. 3, 2020, 18 pages (with English translation). |
TW Office Action in Taiwanese Application No. 10920470180, dated May 20, 2020, 6 pages. |
Office Action in Chinese Application No. 201810260704, dated Jul. 1, 2020, 12 pages (with English translation). |
Search Report in European Application No. 20166767.2, dated Jun. 29, 2020, 12 pages. |
Office Action in Taiwanese Application No. 10920697260, dated Jul. 23, 2020, 4 pages. |
Number | Date | Country | |
---|---|---|---|
20190381672 A1 | Dec 2019 | US |