This application claims priority to Taiwanese Patent Application No. 105101985, filed on Jan. 22, 2016, the contents of which are incorporated by reference herein.
The subject matter herein relates to a system and a method of controlling a robot by brain electrical signals.
A brain-machine interface is a direct communication and control path established between the human brain and a computer or other electronic device. Through this path, a person can express himself or operate an apparatus directly by his brain activity, without speech or motion. Therefore, it is useful for paralyzed patients or elderly people.
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the relevant features being described. The drawings are not necessarily to scale and the proportions of certain parts may be exaggerated to better illustrate details and features. The description is not to be considered as limiting the scope of the embodiments described herein.
The term “comprising” means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in a so-described combination, group, series, and the like.
The manipulator 30 is mounted on the desk 10. The screen 40 is mounted on a rear side of the desk 10. The camera 50 is mounted above the screen 40. The camera 50 is used to capture an image of a user. The screen 40 shows icons for selection by the user. The dinner plate 60 is placed on the desk 10. The dinner plate 60 includes a plurality of food receiving areas 61. The plurality of food receiving areas 61 can receive a plurality of different foods. The manipulator 30 fetches foods from the plurality of food receiving areas 61 to the user. The host computer 70 is connected to the manipulator 30, the screen 40, and the camera 50. The brain electrical signal detection device 80 is worn on the head of the user to detect electrical signals generated by the user. The brain electrical signal detection device 80 communicates with the host computer 70 wirelessly, such as by BLUETOOTH, Wi-Fi, infrared light, and so on.
The storage module 73 stores a plurality of reference parameters which represent brain electrical signals of a normal person.
At block 501, the method comprises the display control module 75 controlling an icon to flash at a certain frequency on the screen 40.
At block 502, the method comprises the brain electrical signal detection device 80 detecting a brain electrical signal when the user stares at an icon, and sending the brain electrical signal to the brain electrical signal receiving module 71 of the host computer 70.
At block 503, the method comprises the brain electrical signal character capture module 72 processing the brain electrical signal, for example, filtering the brain electrical signal, transforming the brain electrical signal by a Fourier transform, and calculating energy of the brain electrical signal at a certain frequency. This process includes detecting instant readings, and revising the plurality of reference parameters based on the detected instant readings to obtain a plurality of personal reference parameters, which are stored in the storage module 76.
The above steps are repeated to obtain personal reference parameters in relation to each icon.
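The calibration at blocks 501–503 can be sketched as follows. The patent does not specify an implementation; the Goertzel algorithm for measuring energy at the icon's flash frequency, the sampling rate, and the blending weight used to revise a normative reference parameter into a personal one are all illustrative assumptions.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Estimate the energy of a signal at target_freq (Hz) using the
    Goertzel algorithm, a cheap single-bin alternative to a full FFT."""
    n = len(samples)
    k = int(0.5 + n * target_freq / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Squared magnitude of the DFT bin (the "energy at a certain frequency")
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def revise_reference(normal_reference, instant_reading, weight=0.5):
    """Blend the stored normal-person reference parameter with the user's
    detected instant reading to obtain a personal reference parameter.
    The blending weight is a hypothetical tuning choice."""
    return (1.0 - weight) * normal_reference + weight * instant_reading
```

Repeating this for each icon's flash frequency yields one personal reference parameter per icon, as described above.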
At block 601, the method comprises the display control module 75 controlling an icon to flash at a certain frequency on the screen 40.
At block 602, the method comprises the brain electrical signal detection device 80 detecting a brain electrical signal, and sending the brain electrical signal to the brain electrical signal receiving module 71 of the host computer 70.
At block 603, the method comprises the brain electrical signal character capture module 72 processing the brain electrical signal to obtain parameters of use.
At block 604, the method comprises the determination module 74 comparing the obtained instant reading (parameter of use) with the personal reference parameters of each icon, and accordingly choosing one icon when the parameter of use is the same as or close to the personal reference parameters of that icon.
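The comparison at block 604 can be sketched as a nearest-parameter match. The relative tolerance used to decide whether the parameter of use is "close to" a personal reference parameter is a hypothetical threshold, not one specified in the disclosure.

```python
def choose_icon(parameter_of_use, personal_references, tolerance=0.2):
    """Return the icon whose personal reference parameter is the same as or
    closest to the parameter of use, within a relative tolerance.
    Returns None when no icon is close enough (no selection is made)."""
    best_icon, best_dist = None, float("inf")
    for icon, reference in personal_references.items():
        dist = abs(parameter_of_use - reference)
        if dist < best_dist:
            best_icon, best_dist = icon, dist
    if best_icon is not None and best_dist <= tolerance * abs(personal_references[best_icon]):
        return best_icon
    return None
```

For example, with hypothetical personal reference parameters `{"feed": 10.0, "video": 6.0, "call": 2.0}`, a parameter of use of 9.5 selects the feeding icon, while a reading far from every reference selects nothing.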
At block 701, the method comprises the screen 40 showing a plurality of first level icons.
At block 702, the method comprises the brain electrical signal detection device 80 detecting a brain electrical signal when the user stares at one first level icon, and processing the brain electrical signal to obtain parameters of use.
At block 703, the method comprises the determination module 74 comparing the parameters of use with the personal reference parameters of each first level icon, choosing one first level icon when the parameter of use is the same as or close to the personal reference parameters of that first level icon, and executing the function of the chosen first level icon. Corresponding second level icons are then shown.
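The two-level menu at blocks 701–703 can be sketched as a simple mapping from each first level icon to the second level icons shown after it is chosen. The icon names and groupings below are illustrative assumptions; the disclosure does not enumerate the menu contents.

```python
# Hypothetical two-level icon menu: each first level icon maps to the
# second level icons displayed once that icon is chosen.
MENU = {
    "feed": ["food area 1", "food area 2", "food area 3"],
    "media": ["watch video", "listen to music"],
    "contact": ["notify", "call other people"],
}

def show_second_level(first_level_choice, menu=MENU):
    """Execute the chosen first level icon's function by returning the
    corresponding second level icons to display next on the screen."""
    if first_level_choice not in menu:
        raise KeyError(f"unknown first level icon: {first_level_choice}")
    return menu[first_level_choice]
```

Selection at the second level would then repeat the detection and comparison of blocks 702–703 against the personal reference parameters of the displayed second level icons.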
In the system and method of controlling a robot by brain electrical signals, the robot can be controlled simply by brain electrical signals to achieve different functions, such as feeding, watching video, notifying, and calling other people.
The embodiments shown and described above are only examples. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the details, including in matters of shape, size, and arrangement of the parts within the principles of the present disclosure, up to and including the full extent established by the broad general meaning of the terms used in the claims.
Number | Date | Country | Kind |
---|---|---|---|
105101985 A | Jan 2016 | TW | national |
Number | Date | Country | |
---|---|---|---|
20170210010 A1 | Jul 2017 | US |