Embodiments of the disclosure relate to computing devices with improved interactive animated conversational interface systems.
Provided herein are exemplary systems and methods including an interactive, text-based conversational form system (ECG_Forms) and a three-dimensional Electronic Caregiver Image (ECI) avatar that allow a user to complete various forms using voice conversation and cloud-based talk-to-text technology. Through the system, the ECI avatar may communicate in multiple languages. The system provides the user with a choice of data input methods: traditional typed data entry or voice communication data entry.
Following user input, the system uses cloud-based database connectivity to review the input and provide redundancy against data entry errors. When the system discovers an error, it provides feedback to the user so the error can be corrected. To assess data for accuracy in real time, the system consults a catalogue of inputs to determine whether the data type input by the user matches a defined catalogue data type. Through the use of cloud-based applications, the system completes the data assessment, executes the continuation decision process, and returns a response to the user in less than 1.0 second.
Once the data have been assessed for accuracy and all user data are entered into the system, the system encrypts the user input data and transmits them to a cloud-based, primary-key-design database for storage. The system also provides a company web browser comprising the three-dimensional ECI avatar for interactive communication with the user. The ECI avatar provides the user with an interactive experience in which the user is guided through completion of the process; as the process is completed, the avatar provides real-time feedback in conversational form to simplify and streamline form completion.
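By way of non-limiting illustration, the following Python sketch shows how a catalogue-based type check of the kind described above might be implemented. The field names, catalogue contents, and feedback strings are illustrative assumptions rather than the disclosure's actual implementation, and the catalogue is modeled as a local dictionary even though the disclosure describes a cloud-hosted catalogue.

```python
import time

# Hypothetical catalogue mapping form fields to expected data types.
# The disclosure's catalogue is cloud-hosted; this local dict is a stand-in.
CATALOGUE = {
    "first_name": str,
    "age": int,
    "phone": str,
}

def validate_input(field: str, raw_value: str) -> tuple[bool, str]:
    """Check a user's raw input against the catalogue's expected type.

    Returns (ok, feedback), where feedback is a conversational message
    the avatar can relay back to the user when an error is discovered.
    """
    expected = CATALOGUE.get(field)
    if expected is None:
        return False, f"I don't recognize the field '{field}'."
    try:
        expected(raw_value)  # attempt the type coercion
    except (TypeError, ValueError):
        return False, f"'{raw_value}' doesn't look like a valid {field}. Could you try again?"
    return True, "Got it, thanks!"

start = time.perf_counter()
ok, feedback = validate_input("age", "sixty-two")
elapsed = time.perf_counter() - start
assert elapsed < 1.0  # the disclosure targets a sub-second response
print(ok, feedback)
```

In this sketch the check is trivially fast because it runs locally; in the disclosure, the sub-second budget covers the round trip through the cloud-based applications that perform the assessment and the continuation decision process.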
The accompanying drawings, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed disclosure, and explain various principles and advantages of those embodiments.
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosure. It will be apparent, however, to one skilled in the art, that the disclosure may be practiced without these specific details. In other instances, structures and devices may be shown in block diagram form only in order to avoid obscuring the disclosure.
Various exemplary embodiments described and illustrated herein relate to a computing device comprising a display screen, the computing device being configured to dynamically display a specific, structured interactive animated conversational graphical user interface paired with prescribed functionality directly related to that interface's structure. Accordingly, a user is provided with an interactive conversational interface comprising Electronic Caregiver Forms (ECG_Forms), text-based conversation, and the Electronic Caregiver Image (ECI), a three-dimensional avatar paired with voice-driven interaction, all of which may be presented within a web browser.
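One minimal, non-limiting way to express such a pairing of interface structure with prescribed functionality is a declarative definition in which each display element is bound to its handler. The element names, handler hooks, and language codes below are illustrative assumptions, not the disclosure's API.

```python
# A sketch pairing each interface element with its prescribed functionality.
INTERFACE = {
    "ecg_forms": {                           # text-based conversational form
        "render": "chat_transcript",         # structure: scrolling text pane
        "on_input": "validate_and_advance",  # functionality bound to that structure
    },
    "eci_avatar": {                          # three-dimensional ECI avatar
        "render": "webgl_canvas",            # structure: 3-D canvas in the browser
        "on_input": "speech_to_text_then_validate",
        "languages": ["en", "es"],           # the avatar may converse in multiple languages
    },
}
```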
User entry of data into document fields is typically tedious for the user and highly prone to human error. As such, text-based conversational “chatbots” have become an increasingly popular interactive replacement for simple keystroke-based text entry paradigms.
As chatbot programs have developed in recent years, they have been used to effectively simulate logical conversation during human/computer interaction. These chatbots have been implemented via textual and/or auditory methods, providing human users with practical functionality for information acquisition. In most cases today, however, chatbots function simply to provide a conversational experience while data are obtained from a user.
As chatbot programs have progressed, the knowledge bases associated with their capabilities have become increasingly complex, but the ability to validate user responses in real time remains limited. Additionally, the capability of chatbot programs to be functionally incorporated across vast networks is significantly lacking. As such, most chatbot programs cannot be incorporated across multiple systems in a manner that allows them to collect user data while simultaneously verifying the type of data input by the user, transmit the data to various storage sites for further validation, store the data offsite in cloud-based storage solutions, and overwrite existing stored data based on new user inputs, all while providing a virtual avatar that guides the user through the data entry process.
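As a non-limiting illustration of the encrypt-then-store and overwrite-on-new-input behavior the present system provides, the following Python sketch uses DynamoDB as a stand-in for the cloud-based primary-key-design database and Fernet symmetric encryption as a stand-in for the disclosure's encryption step. The table name, key handling, and attribute names are illustrative assumptions; the disclosure does not specify these services.

```python
import json

import boto3                      # AWS SDK; DynamoDB stands in for the
                                  # cloud-based primary-key database
from cryptography.fernet import Fernet

# Illustrative symmetric key; a production system would manage keys securely
# (e.g., in a key management service) rather than generating one inline.
KEY = Fernet.generate_key()
fernet = Fernet(KEY)

table = boto3.resource("dynamodb").Table("ecg_form_submissions")  # hypothetical table

def store_submission(user_id: str, form_data: dict) -> None:
    """Encrypt validated form data and write it keyed by user_id.

    Because user_id is the table's primary key, a repeat submission for the
    same user overwrites the stored record, giving the overwrite-on-new-input
    behavior described above.
    """
    ciphertext = fernet.encrypt(json.dumps(form_data).encode("utf-8"))
    table.put_item(Item={"user_id": user_id, "payload": ciphertext})
```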
According to various exemplary embodiments, a three-dimensional Electronic Caregiver Image (ECI) avatar, as depicted in the accompanying drawings, provides interactive communication with the user.
While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. The descriptions are not intended to limit the scope of the technology to the particular forms set forth herein. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments. It should be understood that the above description is illustrative and not restrictive. To the contrary, the present descriptions are intended to cover such alternatives, modifications, and equivalents as may be included within the spirit and scope of the technology as defined by the appended claims and otherwise appreciated by one of ordinary skill in the art. The scope of the technology should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.
This application claims the priority benefit of U.S. Provisional Patent Application Ser. No. 62/618,550 filed on Jan. 17, 2018 and titled “Interactive Animated Conversational Interface System,” which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5211642 | Clendenning | May 1993 | A |
5475953 | Greenfield | Dec 1995 | A |
6665647 | Haudenschild | Dec 2003 | B1 |
7233872 | Shibasaki et al. | Jun 2007 | B2 |
7445086 | Sizemore | Nov 2008 | B1 |
7612681 | Azzaro et al. | Nov 2009 | B2 |
7971141 | Quinn | Jun 2011 | B1 |
8206325 | Najafi et al. | Jun 2012 | B1 |
8771206 | Gettelman et al. | Jul 2014 | B2 |
9317916 | Hanina et al. | Apr 2016 | B1 |
9591996 | Chang et al. | Mar 2017 | B2 |
9972187 | Srinivasan et al. | May 2018 | B1 |
10387963 | Leise | Aug 2019 | B1 |
10417388 | Han et al. | Sep 2019 | B2 |
10628635 | Carpenter, II | Apr 2020 | B1 |
10761691 | Anzures et al. | Sep 2020 | B2 |
10813572 | Dohrmann et al. | Oct 2020 | B2 |
10943407 | Morgan et al. | Mar 2021 | B1 |
11113943 | Wright et al. | Sep 2021 | B2 |
11213224 | Dohrmann et al. | Jan 2022 | B2 |
20020062342 | Sidles | May 2002 | A1 |
20020196944 | Davis et al. | Dec 2002 | A1 |
20040109470 | Derechin | Jun 2004 | A1 |
20040189708 | Larcheveque | Sep 2004 | A1 |
20050035862 | Wildman et al. | Feb 2005 | A1 |
20050055942 | Maelzer et al. | Mar 2005 | A1 |
20070032929 | Yoshioka | Feb 2007 | A1 |
20070238936 | Becker | Oct 2007 | A1 |
20080010293 | Zpevak | Jan 2008 | A1 |
20080186189 | Azzaro et al. | Aug 2008 | A1 |
20090094285 | Mackle | Apr 2009 | A1 |
20100124737 | Panzer | May 2010 | A1 |
20110126207 | Wipfel et al. | May 2011 | A1 |
20110145018 | Fotsch et al. | Jun 2011 | A1 |
20110232708 | Kemp | Sep 2011 | A1 |
20120025989 | Cuddihy et al. | Feb 2012 | A1 |
20120075464 | Derenne et al. | Mar 2012 | A1 |
20120120184 | Fornell et al. | May 2012 | A1 |
20120121849 | Nojima | May 2012 | A1 |
20120154582 | Johnson et al. | Jun 2012 | A1 |
20120165618 | Algoo | Jun 2012 | A1 |
20120179067 | Wekell | Jul 2012 | A1 |
20120179916 | Staker et al. | Jul 2012 | A1 |
20120229634 | Laett et al. | Sep 2012 | A1 |
20120253233 | Greene et al. | Oct 2012 | A1 |
20130000228 | Ovaert | Jan 2013 | A1 |
20130060167 | Dracup | Feb 2013 | A1 |
20130123667 | Komatireddy | May 2013 | A1 |
20130127620 | Siebers et al. | May 2013 | A1 |
20130145449 | Busser et al. | Jun 2013 | A1 |
20130167025 | Patri | Jun 2013 | A1 |
20130204545 | Solinsky | Aug 2013 | A1 |
20130212501 | Anderson | Aug 2013 | A1 |
20130237395 | Hjelt et al. | Sep 2013 | A1 |
20130289449 | Stone et al. | Oct 2013 | A1 |
20130303860 | Bender et al. | Nov 2013 | A1 |
20140074454 | Brown et al. | Mar 2014 | A1 |
20140128691 | Olivier | May 2014 | A1 |
20140148733 | Stone et al. | May 2014 | A1 |
20140171039 | Bjontegard | Jun 2014 | A1 |
20140171834 | DeGoede et al. | Jun 2014 | A1 |
20140214441 | Young | Jul 2014 | A1 |
20140232600 | Larose et al. | Aug 2014 | A1 |
20140243686 | Kimmel | Aug 2014 | A1 |
20140257852 | Walker et al. | Sep 2014 | A1 |
20140267582 | Beutter et al. | Sep 2014 | A1 |
20140278605 | Borucki | Sep 2014 | A1 |
20140317502 | Brown et al. | Oct 2014 | A1 |
20140330172 | Jovanov et al. | Nov 2014 | A1 |
20140337048 | Brown et al. | Nov 2014 | A1 |
20140358828 | Phillipps et al. | Dec 2014 | A1 |
20140368601 | deCharms | Dec 2014 | A1 |
20150005674 | Schindler | Jan 2015 | A1 |
20150019250 | Goodman et al. | Jan 2015 | A1 |
20150109442 | Derenne et al. | Apr 2015 | A1 |
20150142704 | London | May 2015 | A1 |
20150169835 | Hamdan et al. | Jun 2015 | A1 |
20150359467 | Tran | Dec 2015 | A1 |
20160026354 | McIntosh et al. | Jan 2016 | A1 |
20160117470 | Welsh et al. | Apr 2016 | A1 |
20160117484 | Hanina et al. | Apr 2016 | A1 |
20160154977 | Jagadish | Jun 2016 | A1 |
20160217264 | Sanford | Jul 2016 | A1 |
20160253890 | Rabinowitz et al. | Sep 2016 | A1 |
20160267327 | Franz et al. | Sep 2016 | A1 |
20160314255 | Cook et al. | Oct 2016 | A1 |
20170000387 | Forth et al. | Jan 2017 | A1 |
20170000422 | Moturu et al. | Jan 2017 | A1 |
20170024531 | Malaviya | Jan 2017 | A1 |
20170055917 | Stone et al. | Mar 2017 | A1 |
20170140631 | Pietrocola et al. | May 2017 | A1 |
20170147154 | Steiner et al. | May 2017 | A1 |
20170192950 | Gaither et al. | Jul 2017 | A1 |
20170193163 | Melle et al. | Jul 2017 | A1 |
20170197115 | Cook et al. | Jul 2017 | A1 |
20170213145 | Pathak et al. | Jul 2017 | A1 |
20170223176 | Anzures et al. | Aug 2017 | A1 |
20170273601 | Wang et al. | Sep 2017 | A1 |
20170336933 | Hassel | Nov 2017 | A1 |
20170337274 | Ly | Nov 2017 | A1 |
20170344706 | Torres et al. | Nov 2017 | A1 |
20170344832 | Leung et al. | Nov 2017 | A1 |
20180005448 | Choukroun et al. | Jan 2018 | A1 |
20180075558 | Hill, Sr. et al. | Mar 2018 | A1 |
20180096504 | Valdivia et al. | Apr 2018 | A1 |
20180154514 | Angle et al. | Jun 2018 | A1 |
20180165938 | Honda et al. | Jun 2018 | A1 |
20180182472 | Preston et al. | Jun 2018 | A1 |
20180189756 | Purves | Jul 2018 | A1 |
20180322405 | Fadell et al. | Nov 2018 | A1 |
20180360349 | Dohrmann et al. | Dec 2018 | A9 |
20180365383 | Bates | Dec 2018 | A1 |
20180368780 | Bruno et al. | Dec 2018 | A1 |
20190029900 | Walton et al. | Jan 2019 | A1 |
20190042700 | Alotaibi | Feb 2019 | A1 |
20190043474 | Kingsbury | Feb 2019 | A1 |
20190057320 | Docherty et al. | Feb 2019 | A1 |
20190090786 | Kim et al. | Mar 2019 | A1 |
20190116212 | Spinella-Mamo | Apr 2019 | A1 |
20190130110 | Lee et al. | May 2019 | A1 |
20190156575 | Korhonen | May 2019 | A1 |
20190164015 | Jones, Jr. et al. | May 2019 | A1 |
20190176043 | Gosine et al. | Jun 2019 | A1 |
20190196888 | Anderson et al. | Jun 2019 | A1 |
20190259475 | Dohrmann et al. | Aug 2019 | A1 |
20190282130 | Dohrmann et al. | Sep 2019 | A1 |
20190286942 | Abhiram et al. | Sep 2019 | A1 |
20190311792 | Dohrmann et al. | Oct 2019 | A1 |
20190318165 | Shah et al. | Oct 2019 | A1 |
20190385749 | Dohrmann et al. | Dec 2019 | A1 |
20200101969 | Natroshvili et al. | Apr 2020 | A1 |
20200236090 | De Beer et al. | Jul 2020 | A1 |
20200251220 | Chasko | Aug 2020 | A1 |
20200357256 | Wright et al. | Nov 2020 | A1 |
20200357511 | Sanford | Nov 2020 | A1 |
20210007631 | Dohrmann et al. | Jan 2021 | A1 |
20210110894 | Shriberg et al. | Apr 2021 | A1 |
20210273962 | Dohrmann et al. | Sep 2021 | A1 |
20210358202 | Tveito et al. | Nov 2021 | A1 |
20210398410 | Wright et al. | Dec 2021 | A1 |
20220022760 | Salcido et al. | Jan 2022 | A1 |
20220199252 | Dohrmann et al. | Jun 2022 | A1 |
20220319696 | Dohrmann et al. | Oct 2022 | A1 |
20220319713 | Dohrmann et al. | Oct 2022 | A1 |
20220319714 | Dohrmann et al. | Oct 2022 | A1 |
Number | Date | Country |
---|---|---|
2019240484 | Nov 2021 | AU |
2949449 | Nov 2015 | CA |
104361321 | Feb 2015 | CN |
106056035 | Oct 2016 | CN |
106940692 | Jul 2017 | CN |
107411515 | Dec 2017 | CN |
111801645 | Oct 2020 | CN |
111801939 | Oct 2020 | CN |
111867467 | Oct 2020 | CN |
113795808 | Dec 2021 | CN |
3703009 | Sep 2020 | EP |
3740856 | Nov 2020 | EP |
3756344 | Dec 2020 | EP |
3768164 | Jan 2021 | EP |
3773174 | Feb 2021 | EP |
3815108 | May 2021 | EP |
3920797 | Dec 2021 | EP |
3944258 | Jan 2022 | EP |
3966657 | Mar 2022 | EP |
202027033318 | Oct 2020 | IN |
202027035634 | Oct 2020 | IN |
202127033278 | Aug 2022 | IN |
2000232963 | Aug 2000 | JP |
2002304362 | Oct 2002 | JP |
2005228305 | Aug 2005 | JP |
2008062071 | Mar 2008 | JP |
2008123318 | May 2008 | JP |
2008229266 | Oct 2008 | JP |
2010172481 | Aug 2010 | JP |
2012232652 | Nov 2012 | JP |
2016137226 | Aug 2016 | JP |
2016525383 | Aug 2016 | JP |
2017187914 | Oct 2017 | JP |
1020160040078 | Apr 2016 | KR |
20170069501 | Jun 2017 | KR |
1020200105519 | Sep 2020 | KR |
1020200121832 | Oct 2020 | KR |
1020200130713 | Nov 2020 | KR |
WO2000005639 | Feb 2000 | WO |
WO2014043757 | Mar 2014 | WO |
WO2014210344 | Dec 2014 | WO |
WO2017118908 | Jul 2017 | WO |
WO2018032089 | Feb 2018 | WO |
WO2019143397 | Jul 2019 | WO |
WO2019164585 | Aug 2019 | WO |
WO2019182792 | Sep 2019 | WO |
WO2019199549 | Oct 2019 | WO |
WO2019245713 | Dec 2019 | WO |
WO2020163180 | Aug 2020 | WO |
WO2020227303 | Nov 2020 | WO |
Entry |
---|
Leber, Jessica, “The Avatar will See You Now”, Sep. 17, 2013, MIT Technology Review (Year: 2013). |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2018/057814, Jan. 11, 2019, 9 pages. |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2018/068210, Apr. 12, 2019, 9 pages. |
Bajaj, Prateek, “Reinforcement Learning”, GeeksForGeeks.org [online], [retrieved on Mar. 4, 2020], Retrieved from the Internet :<URL:https://www.geeksforgeeks.org/what-is-reinforcement-learning/>, 7 pages. |
Kung-Hsiang, Huang (Steeve), “Introduction to Various RL Algorithms. Part I (Q-Learning, SARSA, DQN, DDPG)”, Towards Data Science, [online], [retrieved on Mar. 4, 2020], Retrieved from the Internet :<URL:https://towardsdatascience.com/introduction-to-various-reinforcement-learning-algorithms-i-q-learning-sarsa-dqn-ddpg-72a5e0cb6287>, 5 pages. |
Bellemare et al., “A Distributional Perspective on Reinforcement Learning,” Proceedings of the 34th International Conference on Machine Learning, Sydney, Australia, Jul. 21, 2017, 19 pages. |
Friston et al., “Reinforcement Learning or Active Inference?” Jul. 29, 2009, [online], [retrieved on Mar. 4, 2020], Retrieved from the Internet :<URL:https://doi.org/10.1371/journal.pone.0006421 PLoS One 4(7): e6421>, 13 pages. |
Zhang et al., “DQ Scheduler: Deep Reinforcement Learning Based Controller Synchronization in Distributed SDN” ICC 2019—2019 IEEE International Conference on Communications (ICC), Shanghai, China, doi: 10.1109/ICC.2019.8761183, pp. 1-7. |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2020/016248, May 11, 2020, 7 pages. |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2019/021678, May 24, 2019, 12 pages. |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2019/025652, Jul. 18, 2019, 11 pages. |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2019/034206, Aug. 1, 2019, 11 pages. |
Rosen et al., “Slipping and Tripping: Fall Injuries in Adults Associated with Rugs and Carpets,” Journal of Injury & Violence Research, 5(1), 61-69. (2013). |
“Office Action”, India Patent Application No. 202027035634, Jun. 30, 2021, 10 pages. |
“Office Action”, India Patent Application No. 202027033121, Jul. 29, 2021, 7 pages. |
“Office Action”, Canada Patent Application No. 3088396, Aug. 6, 2021, 7 pages. |
“Office Action”, China Patent Application No. 201880089608.2, Aug. 3, 2021, 8 pages. |
“Office Action”, Japan Patent Application No. 2020-543924, Jul. 27, 2021, 3 pages [6 pages with translation]. |
“Office Action”, Australia Patent Application No. 2019240484, Aug. 2, 2021, 3 pages. |
“Office Action”, Canada Patent Application No. 3089312, Aug. 19, 2021, 3 pages. |
“Office Action”, Australia Patent Application No. 2019240484, Nov. 13, 2020, 4 pages. |
“Office Action”, Australia Patent Application No. 2018403182, Feb. 5, 2021, 5 pages. |
“Office Action”, Australia Patent Application No. 2018409860, Feb. 10, 2021, 4 pages. |
“Extended European Search Report”, European Patent Application No. 18907032.9, Oct. 15, 2021, 12 pages. |
Marston et al., “The design of a purpose-built exergame for fall prediction and prevention for older people”, European Review of Aging and Physical Activity 12:13, <URL:https://eurapa.biomedcentral.com/track/pdf/10.1186/s11556-015-0157-4.pdf>, Dec. 8, 2015, 12 pages. |
Ejupi et al., “Kinect-Based Five-Times-Sit-to-Stand Test for Clinical and In-Home Assessment of Fall Risk in Older People”, Gerontology (vol. 62), <URL:https://www.karger.com/Article/PDF/381804>, May 28, 2015, 7 pages. |
Festl et al., “iStoppFalls: A Tutorial Concept and prototype Contents”, <URL:https://hcisiegen.de/wp-uploads/2014/05/isCtutorialdoku.pdf>, Mar. 30, 2013, 36 pages. |
“Notice of Allowance”, Australia Patent Application No. 2019240484, Oct. 27, 2021, 4 pages. |
“Extended European Search Report”, European Patent Application No. 19772545.0, Nov. 16, 2021, 8 pages. |
“Office Action”, Australia Patent Application No. 2018409860, Nov. 30, 2021, 4 pages. |
“Office Action”, Korea Patent Application No. 10-2020-7028606, Oct. 29, 2021, 7 pages [14 pages with translation]. |
“Office Action”, India Patent Application No. 202027033318, Nov. 18, 2021, 6 pages. |
“Office Action”, Australia Patent Application No. 2018403182, Dec. 1, 2021, 3 pages. |
Dubois et al., “A Gait Analysis Method Based on a Depth Camera for Fall Prevention,” Proc. of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), Aug. 30, 2014, pp. 4515-4518 (Abstract only). |
Marston et al., “The design of a purpose-built exergame for fall prediction and prevention for older people,” European Review of Aging and Physical Activity, Dec. 8, 2015, vol. 12, pp. 1-12. |
“Office Action”, Japan Patent Application No. 2020-543924, Nov. 24, 2021, 3 pages [6 pages with translation]. |
“Extended European Search Report”, European Patent Application No. EP19785057, Dec. 6, 2021, 8 pages. |
“Office Action”, Australia Patent Application No. 2020218172, Dec. 21, 2021, 4 pages. |
“Extended European Search Report”, European Patent Application No. 21187314.6, Dec. 10, 2021, 10 pages. |
“Notice of Allowance”, Australia Patent Application No. 2018403182, Jan. 20, 2022, 4 pages. |
“Office Action”, Australia Patent Application No. 2018409860, Jan. 24, 2022, 5 pages. |
“Office Action”, China Patent Application No. 201880089608.2, Feb. 8, 2022, 6 pages (15 pages with translation). |
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2021/056060, Jan. 28, 2022, 8 pages. |
“Extended European Search Report”, European Patent Application No. 19822930.4, Feb. 15, 2022, 9 pages. |
“Office Action”, Japan Patent Application No. 2020-550657, Feb. 8, 2022, 8 pages. |
“Office Action”, Singapore Patent Application No. 11202008201P, Apr. 4, 2022, 200 pages. |
“Office Action”, India Patent Application No. 202127033278, Apr. 20, 2022, 7 pages. |
Wasenmuller et al., “Comparison of Kinect V1 and V2 Depth Images in Terms of Accuracy and Precision”, Computer Vision—ACCV 2016 Workshops (Taipei, Taiwan, Nov. 20-24, 2016), Revised Selected Papers, Part II, Mar. 16, 2017 (Mar. 16, 2017), XP055942856, DOI: 10.1007/978-3-319-54427-4, ISBN: 978-3-319-54427-4 Retrieved from the Internet: URL: https://link.springer.com/content/pdf/10.1007/978-3-319-54427-4_3.pdf>, pp. 1-12. |
Stone et al., “Evaluation of an Inexpensive Depth Camera for In-Home Gait Assessment,” Journal of Ambient Intelligence and Smart Environments, Jan. 2011, 3(4), pp. 349-361. |
Number | Date | Country | |
---|---|---|---|
20190220727 A1 | Jul 2019 | US |
Number | Date | Country | |
---|---|---|---|
62618550 | Jan 2018 | US |