This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2017-100273, filed May 19, 2017, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an electronic apparatus and a method.
As conventional technologies, various types of talking systems, such as transceivers, intercoms and portable phones, have been developed. In general, in a calling device which originates a call, a telephone number or an address is input in some way to determine the responding (partner) device.
Recent years have seen the development of an intercom system in which, when a plurality of partner devices is installed, their priorities are determined in advance. When a call is made, the partner device is selected in descending order of priority. This system has been developed to solve the following problem: when an individual call is made, the calling party may not be able to accurately specify the partner device because the room in which the intended person is present is unclear.
In recent years, devices that integrally comprise a communication function, a speaker and a microphone, and that are capable of controlling surrounding electronic devices through the communication function, have been developed. When these devices are used, a voice message can be delivered through the speaker. Further, sound can be picked up by the microphone and transmitted.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
In general, according to one embodiment, a system comprises a receiver and electronic circuitry. The receiver is configured to receive a request of communication with a first user. The electronic circuitry is configured to: determine whether the first user is present near a first electronic apparatus of electronic apparatuses; determine whether the first user is present near a second electronic apparatus of the electronic apparatuses; establish a first communication route with the first electronic apparatus for the communication with the first user in response to the request, if it is determined that the first user is present near the first electronic apparatus; and establish a second communication route with the second electronic apparatus for the communication with the first user in response to the request, if it is determined that the first user is present near the second electronic apparatus.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
<Example of Communication System in Family>
The family includes, for example, a son 12, a daughter 13, a mother 14, a grandfather 15 and a father 16. The house of the family includes at least the room of the son, the bedroom of the daughter, a kitchen, the bedroom of the mother and father (not shown), a living room (not shown) and a yard. Each member of the family carries a portable electronic apparatus associated with him or her in a one-to-one relationship, such as a smartphone, portable phone, tablet computer or wearable device.
It is assumed that the son 12 is currently present in the room of the son. The daughter 13 is currently present in the bedroom of the daughter. The mother 14 is currently present in the kitchen. The grandfather 15 is currently present in the yard. The father 16 is currently outside the country on a business trip.
In the room of the son 12, an electronic apparatus (in other words, a communication device or an Internet-of-Things [IoT] device) 11A is provided. A communication device (electronic apparatus) 11B is provided in the bedroom of the daughter. A communication device (electronic apparatus) 11C is provided in the kitchen. Each communication device 11 (A, B or C; the alphabetical letters may be omitted unless necessary) may be referred to as an electronic apparatus for talking, a thing (multifunctional device) or an electronic apparatus for monitoring. The configuration of each communication device 11 (A, B or C) is explained with reference to
Each of communication devices 11A, 11B and 11C is capable of performing near-field communication with a router 40. This communication is, for example, communication in accordance with Wi-Fi (registered trademark). The router 40 is capable of communicating with portable terminals in accordance with Wi-Fi. The router 40 is also capable of communicating with a server 50 via the Internet 51.
Each of communication devices 11A, 11B and 11C is capable of communicating with a portable terminal at a short distance (or a nearby portable terminal). This communication is, for example, communication in accordance with Bluetooth (registered trademark).
In the example of
The son 12, the daughter 13 and the mother 14 can call portable terminal 24 of the father 16 by means of the respective communication devices 11A, 11B and 11C or their portable terminals. At this time, the communication route is established (formed) by the router 40, the Internet 51, the server 50, an access point 52 and portable terminal 24 of the father. The father 16 can connect the communication route of his portable terminal 24 to communication devices 11A, 11B and 11C of each of the son 12, the daughter 13 and the mother 14.
Each communication device 11 comprises a controller 100. The controller 100 is capable of outputting audio data. An audio output module 110 applies digital-analog (D-A) conversion to the audio data such that the volume, etc., is controlled. The audio data is supplied to a speaker 111.
The video signal captured by a camera module 121 is digitized and encoded in a video capture module 122, and is input to the controller 100. The controller 100 is capable of controlling an illumination power supply 123 to control an illumination device 124. The controller 100 is also capable of obtaining the surrounding sound via a microphone 131 and an audio acquisition module 132. The illumination device 124 is used to illuminate the area to be captured by the camera module 121 when the surroundings are dark.
The controller 100 communicates with a nearby portable terminal or communication device via a transceiver 141, using a Wi-Fi function (a communication function within an area [a wireless LAN function]). The controller 100 is capable of communicating with a nearby portable terminal via a transceiver 142 in accordance with near-field communication such as Bluetooth.
Further, detection signals from various sensors 143 may be input to the controller 100. The controller 100 may control the operation of the sensors 143 (for example, power-on, power-off or changeover of characteristics). The camera module, microphone and speaker may be turned off or on by the user's operation. In each device 11, at least one of the camera module, speaker and microphone may use the data obtained from elements already provided. Each communication device 11 may be, for example, a computer equipped with a camera module. As the near-field communication function, for example, ZigBee (registered trademark) or Z-Wave (registered trademark) may be used.
A specifying module 302 is capable of specifying the first device 311 present near the first user or owned by the first user from a plurality of devices based on the call signal received by the receiver 301. A communication module 303 establishes a communication route based on the result specified by the specifying module 302 such that the first device 311 is allowed to perform communication based on the request.
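As a rough illustration of this division of roles, the following Python sketch shows a receiver passing a call request to a specifying module, which selects the first device, after which a communication module establishes the route. All class, method and key names here (for example, `SpecifyingModule`, `detects_user`, `caller_device`) are hypothetical and are not taken from the embodiment.

```python
# Minimal sketch of the receiver / specifying module / communication module
# split described above. Names are illustrative only.

class SpecifyingModule:
    def __init__(self, devices):
        # devices: objects exposing detects_user(user_id) -> bool
        self.devices = devices

    def specify(self, user_id):
        """Return the first device that reports the user nearby, or None."""
        for device in self.devices:
            if device.detects_user(user_id):
                return device
        return None


class CommunicationModule:
    def establish_route(self, caller_device, target_device):
        """Form a communication route between the two devices (placeholder)."""
        print(f"route: {caller_device} <-> {target_device}")


def handle_call_request(request, specifying, communication):
    """Receiver side: specify the target device, then connect to it."""
    target = specifying.specify(request["intended_user"])
    if target is not None:
        communication.establish_route(request["caller_device"], target)
    return target
```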
The configuration shown in
In this system, when the person who begins the conversation (the utterer) says the name (for example, "mom" or "dad") of the intended person to the communication device located near the utterer, voice (calling voice) or a ringtone is automatically output from the communication device present near the intended person. The method (user interface [UI]) for identifying (specifying) the intended person may be, in addition to a voice instruction and authentication of the result of video recognition, for example, a graphical user interface (GUI) displayed on the screen of the terminal held by the utterer, a button displayed on the screen, or a button (hardware button) of the kind widely used on remote controllers. For example, when a voice instruction is used, a dictionary covering the various ways of calling a person is prepared so that the same person can be specified from any of them, such as the name of the person, terms of address ("big brother" and "nanny") that identify the individual from relationship and age information (brother and sister, grandfather and grandmother, father and mother), and nicknames. Further, when a voice instruction is used, the utterer can be identified by speech recognition, so that the person the term of address refers to relative to the utterer can be specified. For example, when the daughter 13 says "Dad", the father 16 is specified. When the father 16 says "Dad", the grandfather 15 is specified.
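The utterer-relative resolution described above could be sketched as follows. This is a minimal illustration; the dictionary contents and the function name are hypothetical and only mirror the example of the daughter and the father saying "Dad".

```python
# Sketch of resolving a spoken term of address to a registrant, taking the
# identified utterer into account. Entries are illustrative only.
CALL_DICTIONARY = {
    # (utterer, spoken term) -> intended registrant
    ("daughter", "dad"): "father",
    ("son", "dad"): "father",
    ("father", "dad"): "grandfather",   # same word, different target
    ("son", "mom"): "mother",
    ("daughter", "big brother"): "son",
}

def resolve_intended_person(utterer, spoken_term):
    """Map a way of calling (name, nickname, relative term) to a registrant."""
    return CALL_DICTIONARY.get((utterer, spoken_term.lower()))

# The daughter saying "Dad" resolves to the father, while the father
# saying "Dad" resolves to the grandfather.
assert resolve_intended_person("daughter", "Dad") == "father"
assert resolve_intended_person("father", "Dad") == "grandfather"
```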
For example, when the son 12 shown in
In this case, whether the intended person (for example, the mother) is present near communication device 11C is determined by the human recognition (face recognition) function of the camera module attached to communication device 11C, or by a camera module provided in the room or kitchen in which the mother is present, operating in cooperation with communication device 11C.
For example, when the son 12 says "I want to speak to dad" in the child's room, the father may not be detected via the router by the Wi-Fi of the communication devices or portable terminals in the house (in other words, it is determined that the father is not present in the house). In this case, the communication route is automatically connected to portable terminal 24 of the father. As shown in
As another example, it is assumed that the mother 14 says “Grampa” to communication device 11C. In the example of
Whether the intended person is present near the communication devices provided in the house may be determined by the communication devices when near-field communication is performed between the portable terminal (device) held by the intended person and the communication devices. When the device of a room detects the smartphone of the intended person in accordance with near-field communication, the communication route is established for the device of the room.
When a communication device detects the portable terminal of the intended person in accordance with near-field communication, the communication device on the utterer side is connected to the communication device which detects the portable terminal of the intended person. For example, as shown in
Whether the intended person is present near a communication device and brings the portable terminal with him/her is determined based on, for example, the output of the acceleration sensor information of the portable terminal. In the example of
The method for detecting the presence of the intended person or a specific person in the vicinity (or the presence or absence of a visitor) is not limited to the detection, by each communication device 11, of the portable terminal held by the partner (user); any method may be used as long as a person can be detected. For example, a person may be recognized by a camera module. Sound may be obtained by a microphone (including the use of a speaker as a microphone when the speaker is not in use), and an individual may be specified based on the feature amount of the obtained sound. The presence of a person may be detected by a motion (thermal) sensor, and an individual may be specified by audio or video data. It should be noted that, for example, pets are preferably registered in advance as exclusions from the detection targets.
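Returning to the acceleration-based judgement mentioned above, one way the carried/not-carried decision could look in code is the following sketch. The variance measure, the threshold value and the function names are assumptions made for illustration; the embodiment only states that a change in the acceleration sensor information is used.

```python
import statistics

def terminal_is_carried(acceleration_samples, variance_threshold=0.05):
    """Guess whether a portable terminal is being carried.

    acceleration_samples: recent acceleration magnitudes reported by the
    terminal. A terminal left on a table shows almost no variation, while
    a carried terminal does. The threshold is illustrative only.
    """
    if len(acceleration_samples) < 2:
        return False
    return statistics.variance(acceleration_samples) > variance_threshold

def person_present_with_terminal(device_detects_terminal, acceleration_samples):
    """Combine near-field detection with the carried/not-carried guess."""
    return device_detects_terminal and terminal_is_carried(acceleration_samples)
```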
When a communication device is associated with a nearby portable terminal as in example 4, and it is further determined that the portable terminal is not held by the owner, the device on the utterer side is not connected to the portable terminal (in other words, when it is determined that the portable terminal is not held by the owner, the device on the utterer side is not connected to the communication device 11 provided in the room).
In the example of
Even when it is determined that a portable terminal is not being carried, as in example 5, if the portable terminal or the room is specially registered, the communication device on the utterer side is connected to the portable terminal.
For example, since a portable terminal may be left on a bedside table in a bedroom, the portable terminal may be registered as a limited (special) device (it is determined that the person is present in the room even when the person does not bring the portable terminal with him/her). In this way, a connection route can be formed between the portable terminal in the bedroom and the communication device of the calling party. For example, in many cases, a person who cannot freely or easily move around the room (a sick person or a person in need of nursing care) leaves his/her portable terminal on a bedside table for a long time. Thus, such a portable terminal is preferably registered as a limited (special) device. As the conditions for registration, the time periods in which the person is presumably not present in the room, such as meal times or bath times, are preferably excluded.
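A registration of such a limited (special) device, including excluded time periods such as meal or bath times, might be represented as follows. All field names and the specific times are hypothetical and are used only to show the idea of exclusion windows.

```python
from datetime import time, datetime

# Hypothetical record for a terminal registered as a limited (special) device:
# the person is treated as present in the bedroom even when the terminal is
# judged not to be carried, except during the excluded time windows.
SPECIAL_DEVICE = {
    "terminal_id": "grandfather_phone",
    "room": "bedroom",
    "excluded_periods": [(time(12, 0), time(13, 0)),   # meal time
                         (time(19, 0), time(20, 0))],  # bath time
}

def treat_as_present(record, now=None):
    """Return True if the special-device rule applies at the given time."""
    now = (now or datetime.now()).time()
    return not any(start <= now <= end
                   for start, end in record["excluded_periods"])
```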
For example, when the communication connection destination is unclear (when the communication connection destination communicable with the calling party is not found) in example 1, the user is notified of the fact from the communication device (for example, by sound).
For example, when the son 12 says “I want to speak to Mr. Yamada” in
In the above examples, this specification mainly explains a communication route connected from a communication device to a portable terminal. However, a portable terminal may directly receive a call from outside (calling request [incoming call]). In this case, a communication route may be constructed in the following manner. The portable terminal may be connected to a nearby communication device.
Subsequently, the communication device may be connected to another communication device present near the owner of the portable terminal.
It is assumed that, in
When, regardless of whether an individual holds a portable terminal (brings a portable terminal with him/her), a call is received, for example, from a family member via a communication line outside the area, and the location of the partner associated with the portable terminal cannot be specified by any communication device, the portable terminal of the partner (communication target) is notified (informed) of the reception of the call via a public telecommunication network. For example, when the calling party is unclear, a message or data indicating that the intended person is unavailable is sent back. The priority with which the portable terminal of the intended person is notified that a call has been received may be set in advance. For example, calls from family members and friends are preferably reported even at midnight, whereas to calls from work-related calling parties, a request to call back in working hours (typical business hours) may be sent back. Restrictions may also be placed on the case where, as in example 6, the reception of a call to a portable terminal present in a registered place is reported, and on the case where it is clear that the call is made by a family member.
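Such per-caller notification rules could be expressed as a small policy table. This is only a sketch under assumed categories; the category names, business hours and reply texts are illustrative, not part of the embodiment.

```python
from datetime import datetime

# Illustrative policy: family and friends are always notified, while
# work-related callers outside business hours receive an automatic reply.
NOTIFY_POLICY = {
    "family": {"always_notify": True},
    "friend": {"always_notify": True},
    "work":   {"always_notify": False,
               "reply": "Please call back during business hours."},
}

def handle_incoming_call(caller_category, now=None):
    """Decide whether to notify the intended person or send a reply back."""
    now = now or datetime.now()
    policy = NOTIFY_POLICY.get(caller_category,
                               {"always_notify": False, "reply": "Unavailable."})
    in_business_hours = 9 <= now.hour < 18
    if policy["always_notify"] or in_business_hours:
        return "notify"
    return policy.get("reply", "Unavailable.")
```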
<Reference Tables and Data>
Now, this specification explains reference tables and data used to dynamically construct each of the above communication routes.
The data shown in
To accurately operate the above communication system, as shown in
In the registration information table shown in
In the situation table shown in
The communication connection destination (move destination) determination table shown in
In case 1, the face of the intended person is recognized by a communication device. In case 1, the portable terminal of the intended person is not detected by any communication device in accordance with near-field communication, and the router does not detect the portable terminal of the intended person in accordance with Wi-Fi, and further, the acceleration sensor information of the portable terminal of the intended person is not obtained. In this case, it is determined that the intended person is present in the room in which the communication device which recognizes the face is provided. As a result, a communication route with the communication device which detects the face is established for the called person in case 1.
Case 2 is an example in which the face is not recognized, the communication device associated with the intended person detects the portable terminal of the intended person in accordance with near-field communication, and further, it is clear that the intended person brings the portable terminal with him/her. In this case, it is determined that the intended person is present in the room in which the communication device is provided. In case 2, a communication route is established such that the device on the utterer side is connected to the communication device provided in that room of the house.
Case 3 is an example in which the face is not recognized, the communication device provided in the specific room associated with the intended person detects the portable terminal of the intended person in accordance with near-field communication, and further, it is clear that the intended person does not bring the portable terminal with him/her. This example corresponds to the environment explained in example 6. In this case, it is determined that the intended person is present in the specific room. In case 3, a communication route is established such that the device on the utterer side is connected to the communication device provided in the limited (specific) room of the house.
Case 4 is an example in which the face is not recognized, the communication device provided in the specific room associated with the intended person detects the portable terminal of the intended person in accordance with near-field communication, and further, it is clear that the intended person does not bring the portable terminal with him/her. In this case, it is determined that the location of the intended person is unclear. An exception to this case 4 is the above case 3. In case 4, the communication device notifies the user that the communication connection destination communicable with the calling party is not found (by sound, etc.), as explained in example 7.
Case 5 is an example in which the face is not recognized, the communication device provided in the specific room associated with the intended person does not detect the portable terminal of the intended person in accordance with near-field communication, and the router detects the portable terminal in accordance with Wi-Fi communication and recognizes that the intended person brings the portable terminal with him/her. In this case, it is determined that the intended person is present near the house but in a place where no communication device (IoT device) is provided in the vicinity.
Case 5 corresponds to the case where the mother 14 says "Grampa" to communication device 11C, as explained in example 3. In this case, the device of the utterer is connected to the portable terminal in accordance with Wi-Fi communication.
Case 6 is an example in which the face is not recognized, the communication device provided in the specific room associated with the intended person does not detect the portable terminal of the intended person in accordance with near-field communication, the router detects the portable terminal of the intended person in accordance with Wi-Fi communication, and the intended person does not bring the portable terminal with him/her. In case 6, the communication device notifies the user that the communication connection destination communicable with the calling party is not found (by sound, etc.), as explained in example 7.
Case 7 is an example in which the face is not recognized, the communication device provided in the limited (specific) room associated with the intended person does not detect the portable terminal of the intended person in accordance with near-field communication, and the router does not detect the portable terminal of the intended person in accordance with Wi-Fi communication. In this case, the communication system determines that the intended person is away from home. A telephone connection is made to the portable terminal of the intended person using the number of the portable terminal.
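The seven cases amount to a decision procedure over the registration and situation data described above. The following Python sketch is an illustration only: the table layout, field names and action labels are assumptions made for readability, not the actual tables of the embodiment.

```python
# Illustrative, simplified table rows (field names are assumptions).
registration = {"names": ["mother", "mom"], "phone": "+81-90-0000-0000",
                "bluetooth_id": "BT-0001", "mac": "AA:BB:CC:DD:EE:01",
                "special_room_device": None}

situation = {"face_seen_by": "11C",     # device that recognized the face, or None
             "nearfield_device": None,  # device that detected the terminal, or None
             "wifi_detected": False,    # router sees the terminal over Wi-Fi
             "terminal_carried": False} # acceleration change indicates carrying

def decide_destination(situation, special_room_registered):
    """Return (case number, action) following cases 1-7 above."""
    if situation["face_seen_by"]:                                   # case 1
        return 1, ("connect_device", situation["face_seen_by"])
    if situation["nearfield_device"]:
        if situation["terminal_carried"]:                           # case 2
            return 2, ("connect_device", situation["nearfield_device"])
        if special_room_registered:                                 # case 3
            return 3, ("connect_device", situation["nearfield_device"])
        return 4, ("notify_not_found", None)                        # case 4
    if situation["wifi_detected"]:
        if situation["terminal_carried"]:                           # case 5
            return 5, ("connect_terminal_wifi", None)
        return 6, ("notify_not_found", None)                        # case 6
    return 7, ("telephone_call", None)                              # case 7

# With the sample row above, the face was recognized by device 11C (case 1).
print(decide_destination(situation, special_room_registered=False))
```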
As explained above, this communication system is capable of determining the communication route based on the situation of the intended person. To realize this configuration, the communication system uses the registration information table shown in
The registration information table shown in
For example, a communication device is connected to the portable terminal of a user. As the connection method, for example, the communication device is set to a registration mode, and the portable terminal of the user is set to an operation mode in accordance with, for example, Bluetooth. The ID of the communication device is input to the portable terminal. In this way, the communication device can communicate with the portable terminal. The message "Enter registrant ID" is displayed on the screen of the portable terminal. The registrant ID may be arbitrary; in this case, the registrant IDs of the family members are preferably different from each other. Subsequent to the registrant IDs, nominal designations such as "Katsuo", "Wakame", "dad", "mom" and "grampa" may be entered.
At this time, the telephone number, Bluetooth ID and MAC address of the portable terminal are automatically transmitted to and registered in the communication device. The message "Do you want to register face image data?" is displayed on the screen of the portable terminal. To register face image data, the user says, for example, "yes" or "please" while facing the front side of the camera module of the communication device; the face image of the user is then obtained by the communication device. When the user says "no", a face image is not obtained.
While the communication device is in registration mode, it is also possible to register a communication device (a communication device in a specific room) to which a communication route is formed and a call is made even when it is determined that the intended person does not bring the portable terminal with him/her. This registration is performed to establish the communication route explained in case 3. In the menu of registration mode, for example, the message "Do you want to register a communication device in a specific room?" is prepared. The user may select the message, enter the ID of the communication device of the specific room in the portable terminal and select a confirmation button.
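A rough outline of that registration exchange, using the same illustrative table layout as above, could look like this. The prompts, field names and function name are hypothetical.

```python
def register_user(registration_table, registrant_id, nominal_designations,
                  terminal, face_image=None, special_room_device_id=None):
    """Sketch of the registration-mode exchange described above.

    terminal: the values the portable terminal transmits automatically.
    face_image: image data captured by the device's camera if the user agreed.
    special_room_device_id: ID entered for case-3 routing, or None.
    """
    registration_table[registrant_id] = {
        "names": nominal_designations,        # e.g. ["Wakame", "big sister"]
        "phone": terminal["phone_number"],
        "bluetooth_id": terminal["bluetooth_id"],
        "mac": terminal["mac_address"],
        "face_image": face_image,
        "special_room_device": special_room_device_id,
    }
    return registration_table[registrant_id]
```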
In the above communication system, the data of the situation table shown in
This change is detected in accordance with, for example, Bluetooth communication between the communication device and the portable terminal (is the portable terminal detected by Bluetooth?) or Wi-Fi communication between the router 40 and the portable terminal (is the portable terminal detected by Wi-Fi?). Communication using Bluetooth or Wi-Fi is initiated by a controller provided in the communication device or a controller provided in the router. The obtained information regarding the change in the acceleration sensor information of the portable terminal is used to update the column of the acceleration sensor information in
The above communication process clarifies whether there is a change in the acceleration sensor information, and identifies the ID of the communication device which communicates with the portable terminal or the IP address of the router which communicates with the portable terminal. Thus, the room in which the portable terminal is present, or the absence of the portable terminal from any room, is confirmed. In this way, the location or situation of each registrant can be updated (Sa5).
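A sketch of this update step (Sa5), assuming the detection results arrive as simple flags and using the same illustrative field names as above:

```python
def update_situation(situation_table, registrant_id,
                     nearfield_device=None, wifi_detected=False,
                     acceleration_changed=False):
    """Update one registrant's row of the situation table (block Sa5 sketch).

    nearfield_device: ID of the communication device that detected the
    terminal by Bluetooth, or None; wifi_detected: whether the router sees
    the terminal; acceleration_changed: whether the acceleration sensor
    information of the terminal has changed (terminal presumably carried).
    """
    row = situation_table.setdefault(registrant_id, {})
    row["nearfield_device"] = nearfield_device
    row["wifi_detected"] = wifi_detected
    row["terminal_carried"] = acceleration_changed
    # Location summary: the room of the detecting device, "near house" when
    # only Wi-Fi sees the terminal, otherwise away or unknown.
    if nearfield_device:
        row["location"] = f"room of device {nearfield_device}"
    elif wifi_detected:
        row["location"] = "near house (no device nearby)"
    else:
        row["location"] = "away or unknown"
    return row
```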
The faces of users who register face image data in
When the obtained face image data matches the registered face image data of a registrant, the presence of the registrant in the room of the communication device from which the data is obtained is confirmed. In accordance with the result of confirmation, the location and situation of the registrant can be updated (Sa6).
Further, as each communication device comprises a microphone, a speech recognition function may be used. For example, the voice data of the registrants is registered in
In block Sb2 shown in
In block Sb3 shown in
In block Sb4 shown in
When the registrant ID is detected, the situation of the intended person is determined based on the registrant ID of the intended person with reference to
The process for specifying the communication route is performed in accordance with various situations. When it is determined that the device of the intended person (registrant ID) is a communication device with reference to
When it is determined that the device of the intended person (registrant ID) is a portable terminal using Wi-Fi communication for a call (communication connecting to IP address for Wi-Fi communication) with reference to
When it is determined that the device of the intended person (registrant ID) is a portable terminal using its telephone number for a call with reference to
When the location of the intended person cannot be specified with reference to
Further, the communication system is capable of outputting sound from each communication device or portable terminal based on the situation of the receiver (the intended person).
Each communication device is capable of detecting an incoming call to a portable terminal in accordance with near-field communication (Sd1). At this time, the communication device obtains the Bluetooth ID of the portable terminal (Sd2). Subsequently, the registrant ID of the owner of the detected portable terminal is obtained from the obtained Bluetooth ID with reference to the registration information shown in
Subsequently, the communication system specifies the location of the owner of the detected portable terminal based on the obtained registrant ID with reference to the situation data shown in
When, in block Sd6-No, the owner of the portable terminal is present near a communication device different from the communication device which detects the call, the communication system forms a communication route with that different communication device. The communication system causes the different communication device to output the calling sound and voice received by the portable terminal (Sd8 and Sd9).
When the above communication route is formed, the communication system determines whether or not the call is stopped (Sd10). When the communication route is maintained (Sd10-No), the communication system determines whether or not conversation ("I will answer the phone" recognized?) is started (Sd11). If conversation is started, the communication system determines that the call is in progress (Sd12). If the communication system determines that the call is stopped (Sd10-Yes), the communication is terminated.
When the communication system determines in block Sd5-No that the owner of the portable terminal is not present in the room in which the communication device is provided, the process moves to block Sc1 shown in
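Condensing blocks Sd1 through Sd9 into the same illustrative table layout used above gives the following sketch. The lookup keys and return strings are assumptions; the actual handling, including the fallback to the route-specification process, follows the flow described in the text.

```python
def redirect_incoming_call(detecting_device_id, bluetooth_id,
                           registration_table, situation_table):
    """Route an incoming call on a portable terminal to the communication
    device nearest its owner (sketch of blocks Sd1-Sd9)."""
    # Sd2-Sd3: identify the owner from the Bluetooth ID of the terminal.
    owner_id = next((rid for rid, reg in registration_table.items()
                     if reg["bluetooth_id"] == bluetooth_id), None)
    if owner_id is None:
        return "unknown terminal: let it ring normally"
    # Sd4-Sd5: look up where the owner currently is.
    owner_device = situation_table.get(owner_id, {}).get("nearfield_device")
    if owner_device is None:
        return "owner not near any device: move to route specification (Sc1)"
    # Sd6-Sd9: output the calling sound and voice on the device near the owner.
    if owner_device == detecting_device_id:
        return f"output call on {detecting_device_id}"
    return f"output call on {owner_device} (owner is in another room)"
```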
When the communication device comprises the above function, the utterer can communicate with the partner by displaying sign language and/or characters, etc.
In the electronic apparatus and method shown in
Thus, it is possible to provide, for example, sound and signals based on specific rules by specifying an area, instead of making an announcement over the whole building. For example, in a department store, it is possible to contact the employees or play BGM (background music) unique to each shop. Also, in a plant, it is possible to independently send a message as an operation instruction to the employees in each area.
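As a loose illustration of such area-based delivery, the sketch below maps areas to the sound or message to be output on the devices in that area only. The area names, contents and function name are hypothetical.

```python
# Hypothetical mapping from areas to the sound or message to be delivered.
AREA_OUTPUT = {
    "shop_a": {"bgm": "shop_a_theme.mp3"},
    "shop_b": {"bgm": "shop_b_theme.mp3"},
    "assembly_line_1": {"message": "Operation instruction for this area."},
}

def deliver_to_area(area, devices_by_area):
    """Send the configured BGM or instruction only to devices in one area."""
    output = AREA_OUTPUT.get(area, {})
    for device in devices_by_area.get(area, []):
        print(f"{device}: {output}")
```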
The example shown in
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.