MULTIFUNCTION MOBILE UNITS

Information

  • Publication Number
    20170364828
  • Date Filed
    June 14, 2017
  • Date Published
    December 21, 2017
Abstract
A network of multitier user's devices (mobile units) that perform household chores and function as mobile general-purpose autonomous intelligent machines, having some primary functions and a plurality of additional secondary functions. The primary functions include one or more of vacuum cleaning, lawn mowing and typical drone functionalities, while the additional autonomous functions include wardrobe management, premises safety, sentry duty, rendering personalized music, self-recharging from a typical electric outlet, self-learning to predict the users' behaviors and planning and scheduling the functionalities (decision making) accordingly, providing AI (Artificial Intelligence) assistance and support, Day-In-Life recording and support, and so forth. The Day-In-Life recording functionalities involve predicting users' behaviors; identifying routines, salient features and important events ahead of time; and electronically recording and storing the routine events, salient features and important events, with the support of a cloud based central support and services server system.
Description
BACKGROUND
1. Technical Field

The present invention relates generally to artificial intelligence based assistant devices and, more specifically, to devices that perform household chores and function as mobile general-purpose autonomous intelligent machines.


2. Related Art

Many household and business-related everyday mobile, functional devices that typically perform a single chore are widely in use today. Examples include vacuum cleaners that function only as vacuum cleaners, lawnmowers that accomplish only lawn mowing (even though they might be robotic, in some sense), and drones that merely photograph and video record aerial views of the house or business premises (though some drones may have alternative and/or additional functionalities), and so forth.


All the above-mentioned devices are mobile and possess wheels or rotor blades, as appropriate. Many of these devices (vacuum cleaners, lawn mowers, drones and such) are autonomous to a certain degree, but require constant attention (for refueling or recharging, for example). These devices exhibit limited human interaction capabilities (that is, they possess some form of user interface) and limited personalization or customizability.


The vacuum cleaners, lawnmowers and drones, as they are available today, do possess a certain Internet-based updating capability. This makes it possible for them to update themselves with the latest available firmware from their manufacturer's server, though in most cases with human assistance.


Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of ordinary skill in the art through comparison of such systems with the present invention.


BRIEF SUMMARY OF THE INVENTION

The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description of the Invention, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective block diagram of a mobile unit infrastructure that performs a plurality of additional household chores and functions as a mobile general purpose autonomous intelligent machine;



FIG. 2 is a perspective block diagram detailing the artificial intelligence analyst (decision making) of FIG. 1;



FIG. 3 is a perspective block diagram detailing the user behavior pattern module of FIG. 1;



FIG. 4 is a perspective block diagram detailing the Day-In-Life recorder module of FIG. 1;



FIG. 5 is a perspective block diagram detailing the tier 0-2 user's device of FIG. 1;



FIG. 6 is a perspective block diagram illustrating a few additional functionalities of the tier 3 user and manufacturer cloud based systems and services, and the artificial intelligence analyst module, of FIG. 1;



FIG. 7 is a perspective block diagram illustrating functionalities of an exemplary wardrobe management role of the mobile units of FIG. 1;



FIG. 8 is a flowchart illustrating the processes involved in the autonomous functioning and decision making of the mobile units; and



FIG. 9 is a flowchart illustrating the processes involved in the Day-In-Life functionality of the mobile units.





DETAILED DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective block diagram of a mobile unit infrastructure 105 that performs a plurality of additional household chores and functions as a mobile general purpose autonomous intelligent machine. Whereas the primary functions may include one or more of vacuum cleaning, lawn mowing and typical drone functionalities, the additional autonomous functions include wardrobe management, premises safety, sentry duty, rendering personalized music, self-recharging from a typical electric outlet, self-learning to predict the behaviors of the user, the rest of the family members and guests, planning and scheduling the functionalities (decision making) accordingly, providing AI (Artificial Intelligence) assistance and support, Day-In-Life recording and support, and so forth.


As mentioned above, the mobile units 183, 185, 187 and 189 support wardrobe management functionalities. These functionalities essentially involve clothes sizing, inner garments sizing, socks and shoes sizing, weight measurements and predicting weight gain or loss, height, body volume and BMI (Body Mass Index) measurements, taking stock of clothes in the wardrobe, suggesting clothes for different types of occasions and interfacing with online retailers for new clothes or shoes purchasing (the mobile unit 183, 185, 187 or 189 provides the interior dimensions of the clothes or shoes and the user filters the selections), among many other wardrobe related functionalities. In addition, database records of all the measurements taken are stored in the tier 3 user and manufacturer cloud based systems and services 111. To perform these functions, the tier 3 user and manufacturer cloud based systems and services 111 contains wardrobe management module 145. Similarly, for the wardrobe management functionalities, the mobile units 183, 185, 187 and 189 contain user interface module (audio/visual and keyboard interfaces) 193, artificial intelligence module 195 and user personalization module 197.
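
As a minimal illustration of how such measurement records might be structured before being mirrored to the tier 3 cloud, consider the following Python sketch. It is not taken from the specification; the data model, names and the BMI computation shown here are assumptions for illustration only.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class BodyMeasurement:
        """One measurement session taken by a mobile unit (hypothetical schema)."""
        taken_on: date
        height_cm: float
        weight_kg: float
        waist_cm: float
        body_volume_l: float

        @property
        def bmi(self) -> float:
            # BMI = weight (kg) divided by height (m) squared
            height_m = self.height_cm / 100.0
            return self.weight_kg / (height_m ** 2)

    @dataclass
    class WardrobeProfile:
        """Per-user measurement history, as might be stored in the tier 3 cloud."""
        user_id: str
        history: List[BodyMeasurement] = field(default_factory=list)

        def add(self, measurement: BodyMeasurement) -> None:
            self.history.append(measurement)

        def volume_change_percent(self) -> float:
            """Percent change in body volume between the last two sessions."""
            if len(self.history) < 2:
                return 0.0
            prev, last = self.history[-2], self.history[-1]
            return (last.body_volume_l - prev.body_volume_l) / prev.body_volume_l * 100.0

A history kept this way would let the unit generate dialogue such as "Your body volume is up by 22% since last month" directly from volume_change_percent().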


For example, the user (or a family member or guest) may come out of the shower and say “Bot, measure me.” Then, the mobile unit 183, 185, 187 or 189 circles around the user and says “Your body volume is up by 22% since last month, mostly to your waist line . . . . Would you like clothes sizing information updated?” Alternatively, the user may have instructed the mobile unit 183, 185, 187 or 189 to measure height, weight and BMI (Body Mass Index) periodically, once a week or once a month, for example. The mobile unit 183, 185, 187 or 189 may say “Your periodic measurements are being taken, could you please stand in the center of the carpet?” and then start measuring all the above-mentioned sizes, weights and other dimensions.


With a strong interface to the online retailer's systems 160, the mobile unit 183, 185, 187 or 189 provides accurate sizes, weights and other dimensions to the online retailer's systems 160, resulting in far fewer returns. Orders via the online retailer's systems 160 also take into consideration fluctuations in sizes, weight and other dimensions, anticipating the user putting on weight, losing weight or yo-yoing between gaining and losing weight, based on the fitness records, for example.


In addition, the mobile unit 183, 185, 187 or 189 also observes the contents of the wardrobe based on what the user wears, figures out the wear and tear on the items (for example, out-of-style materials, worn holes in socks or undergarments, worn-out knees, rips or missing buttons) and suggests updates or offers items in new colors that coordinate. The mobile unit 183, 185, 187 or 189, in conjunction with the online retailer's systems 160 for example, also recommends a wardrobe makeover or update. For example, the user may put on a shirt from the wardrobe and the mobile unit 183, 185, 187 or 189 may say “Do you like that dirty shirt?” The user may answer, “It is my favorite, Bot.” The mobile unit 183, 185, 187 or 189 then responds by saying “Ok then, can I offer you some sweater options to hide it from the public?”


The mobile units 183, 185, 187 and 189 also support premises safety functionalities. These functionalities include smoke detection, fire inspection, alarm triggering, making sure that the smoke and fire have been brought to everyone's attention, extinguishing the fire, alerting the user about unusual sounds (which may involve destruction of property) or odors, and so forth. Besides smoke detection, the mobile unit 183, 185, 187 or 189 also identifies the source of the fire and figures out the extent of the damage. Furthermore, the mobile unit 183, 185, 187 or 189 also informs the user, family members and guests about the smoke, fire or odor, its source and the extent of damage. The mobile unit 183, 185, 187 or 189 informs the user (or a predetermined responsible person) about the smoke, fire or odor even when there is nobody at home, via a communication network 191, which can involve mobile (cell) phones (via SMS, for example) or computers (via email, for example). To support these functionalities, the tier 3 user and manufacturer cloud based systems and services 111 contains safety management module 129.
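
The specification leaves the alerting mechanism at the level of "SMS or email over the communication network 191." The sketch below shows the email half of that behavior using Python's standard library; the sender address, SMTP host and message fields are placeholders, not details from the patent.

    import smtplib
    from email.message import EmailMessage

    def alert_absent_user(event: str, source: str, damage: str,
                          recipient: str, smtp_host: str = "localhost") -> None:
        """Email the user (or a predetermined responsible person) about a
        smoke, fire or odor event detected while nobody is at home."""
        msg = EmailMessage()
        msg["Subject"] = f"Premises alert: {event}"
        msg["From"] = "mobile-unit@premises.example"    # placeholder address
        msg["To"] = recipient
        msg.set_content(
            f"Event: {event}\n"
            f"Identified source: {source}\n"
            f"Estimated extent of damage: {damage}"
        )
        with smtplib.SMTP(smtp_host) as smtp:           # placeholder SMTP host
            smtp.send_message(msg)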


Another additional autonomous function that the mobile unit 183, 185, 187 or 189 performs is rendering personalized music to the user, family member(s) or guest(s). This functionality involves observing and analyzing the behavior of the user, family member(s) or guest(s) as it pertains to their likes and dislikes (studied over a prolonged period), and identifying their present moods and the current contexts and circumstances. For example, in the past, the user might have said “I love this song,” while coming back from work. Alternatively, during a stressful circumstance, he or she might have said “Bot, please play another song, I don't like this kind of song when stressed out . . . ” The mobile unit 183, 185, 187 or 189 identifies the current mood and contexts and renders personalized music accordingly. Over a period, the mobile unit 183, 185, 187 or 189 begins to make fewer and fewer mistakes. To support the personalized music rendering functionalities, the tier 3 user and manufacturer cloud based systems and services 111 contains user behavior pattern module 139 and behavior observation module 147.
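
A toy sketch of this learn-from-feedback loop follows. The patent describes only the behavior (observe reactions over a prolonged period, identify mood and context, make fewer mistakes over time); the scoring scheme and all names here are invented for illustration.

    import random
    from collections import defaultdict

    class MusicPersonalizer:
        """Mood- and context-aware song selection that improves with feedback."""

        def __init__(self):
            # (mood, context) -> {song: score}; positive scores mean "liked".
            self.scores = defaultdict(lambda: defaultdict(int))

        def record_feedback(self, mood, context, song, liked):
            """Called when the user says e.g. 'I love this song' or asks to skip."""
            self.scores[(mood, context)][song] += 1 if liked else -1

        def pick_song(self, mood, context, library):
            """Prefer the best-scoring song for this mood/context; with no
            history yet, explore randomly (mistakes shrink as feedback accrues)."""
            known = self.scores[(mood, context)]
            if known:
                return max(library, key=lambda song: known.get(song, 0))
            return random.choice(library)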


Beyond the primary functions for which they were originally meant, the mobile unit 183, 185, 187 or 189 also solves many everyday problems, such as serving as a wandering WiFi hub, turning lights on and off, acting as a mobile place to integrate preexisting artificial intelligence engines, working with RoboApps via a Robo SDK (Software Development Kit), integrating with presently available cloud artificial intelligence engine tools, and so forth. To perform as a wandering WiFi hub, the mobile unit 183, 185, 187 or 189 first verifies the WiFi signal strength near the user (who may be working on a device that needs an Internet connection to function, for example) and, if the signal strength is too low, it amplifies the signal and rebroadcasts it within the vicinity of the user. Similarly, when the light is low, the mobile unit 183, 185, 187 or 189 offers to switch on the light, and it switches off lights that are no longer needed, thereby saving electricity. For example, the mobile unit 183, 185, 187 or 189 may follow the user, while trying to make out a noise coming from outside the home, and switch on a built-in flashlight or switch on the lights along that path. Moreover, the mobile unit 183, 185, 187 or 189 can position itself precisely to have a look at things, assuming exact locations and viewing angles/configurations.
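
For the wandering WiFi hub role, the decision reduces to a signal-strength threshold test. The sketch below assumes an RSSI reading is available from the unit's radio; the -70 dBm cutoff is a common rule of thumb, not a figure from the specification.

    def should_rebroadcast(rssi_dbm: float, threshold_dbm: float = -70.0) -> bool:
        """Return True when the WiFi signal near the user is too weak,
        so the unit should move in, amplify and rebroadcast it."""
        return rssi_dbm < threshold_dbm

    # Example: a weak reading near the user triggers the repeater behavior.
    if should_rebroadcast(rssi_dbm=-78.0):
        print("Repositioning near user and rebroadcasting WiFi...")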


One more application (controlled via a cell phone app) that the mobile unit 183, 185, 187 or 189 performs is that of a sentry. Some of the mobile units 183, 185, 187 and 189 are designed to function indoors, some others outdoors, and many others perform both indoors and outdoors. When more than one mobile unit 183, 185, 187 or 189 is functioning within a household or business premises, they network together to share responsibilities. Thus, the sentry job involves both indoor and outdoor functionalities, and these devices network together to perform a plurality of functionalities. For example, a vacuum cleaner 185 does the sentry job inside the house during nights to keep a vigil on the house, while a lawn mower 189 investigates sounds or flashes of light outside the house and responds appropriately. The lawn mower 189 patrols around the house periodically, in an unpredictable manner (to catch intruders off-guard), and communicates with the vacuum cleaner 185 about any intruders. The network of vacuum cleaner 185, lawn mower 189 and drone 187 together identify the source of the intrusion and inform the user, family members and guests about the intrusion and any damage done, during nights, for example, by waking them up. The mobile units 183, 185, 187 and 189 also assemble topography details and share them among themselves, to determine the path to follow within the house or building area, garden area and backyard, to perform some specific functionality, such as the sentry job.
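
The "periodic yet unpredictable" patrolling can be expressed as a jittered schedule. The interval and jitter values below are illustrative assumptions; the patent specifies only that patrols should be hard for intruders to anticipate.

    import random

    def patrol_offsets(base_interval_min=30.0, jitter_min=10.0, rounds=5):
        """Yield patrol start offsets (minutes) with random jitter so the
        lawn mower's rounds are periodic but not predictable."""
        offset = 0.0
        for _ in range(rounds):
            offset += base_interval_min + random.uniform(-jitter_min, jitter_min)
            yield offset

    for minutes in patrol_offsets():
        print(f"Next patrol starts {minutes:.1f} minutes from activation")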


As an example of further additional functionalities of the mobile units 183, 185, 187 and 189, the artificial intelligence module 195 is adaptable to follow worn paths, roadways, sidewalks, furrows and ground cover transition edges, and can be fully automated, semi-automated or fully driven remotely (via Internet interaction). Similarly, while in motion, the mobile units 183, 185, 187 and 189 can select the nearest charger to recharge themselves. Alternatively, instead of using charging stations, some of the mobile units 183, 185, 187 and 189, via vision systems and artificial intelligence, simply identify the nearest electric outlet and plug their charging cables into it. This is done by a flexible telescopic system that raises itself from the ground toward the electric outlet and plugs in. Alternatively, the same goal can be accomplished via a vertical rail system, along which the plug rises toward the electric socket and plugs itself in.


The mobile units 183, 185, 187 and 189 can predict and forecast the results of many of the actions to be taken. This allows them to compare the present reality with a possible reality (if certain actions are taken), via their VR (Virtual Reality) capabilities. This can be done for various prospective actions, for example, before-and-after views for any repairs, furniture layout, lawn repairs, roof repairs, laying out carpets and so forth. This allows the user to correctly and confidently experiment with new possibilities for his or her home, garden and premises. For example, the mobile unit 183, 185, 187 or 189 (the lawn mower 189, for example), via a VR headset, demonstrates the look and feel of the lawn and garden with many new layouts, showing a variety of possibilities to try. This saves the user money and allows him or her to move ahead confidently with new arrangements for the garden, with less likelihood of disappointment with the money spent on it.


One more important additional autonomous functionality of the mobile units 183, 185, 187 and 189 involves their capability to self-learn, by observing, identifying and analyzing the behaviors of the user, family members and guests. The mobile unit 183, 185, 187 or 189 collects data by observing the patterns in the behaviors of the user, family members and guests, following them closely during the initial stages. For example, the vacuum cleaner 185 may begin to follow the user at all times after purchase and collect data about what he or she does. It observes that as soon as the user wakes up, he or she heads straight to the bathroom to brush his or her teeth, then takes a shower, has breakfast and leaves home for work. This also involves behavior analysis, by sending this information to the tier 3 user and manufacturer cloud based system and services 111 and comparing it with the stored database information to take appropriate and relevant actions. This type of data, collected over a prolonged period, allows the vacuum cleaner 185 to assist the user in a variety of ways. Moreover, such collected data, in aggregation from many users, also allows the tier 3 user and manufacturer cloud based system and services 111 to predict behaviors of users in general.
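
One simple way to turn such observations into a routine is frequency counting over daily event logs, as in the sketch below. This is a deliberately naive stand-in for the cloud-side pattern analysis; the event names and the 80% support threshold are assumptions.

    from collections import Counter

    def mine_routine(daily_logs, min_support=0.8):
        """Return event sequences repeated on at least min_support of days.
        Each daily log is an ordered tuple of observed events, e.g.
        ('wake_up', 'bathroom', 'shower', 'breakfast', 'leave_for_work')."""
        logs = list(daily_logs)
        counts = Counter(logs)
        return [seq for seq, n in counts.items() if n / len(logs) >= min_support]

    days = [("wake_up", "bathroom", "shower", "breakfast", "leave_for_work")] * 9
    days += [("wake_up", "breakfast", "leave_for_work")]
    print(mine_routine(days))  # the 9-of-10-days sequence qualifies at 80% support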


Yet another important additional autonomous functionality of the mobile units 183, 185, 187 and 189 involves Day-In-Life recording. The mobile units 183, 185, 187 and 189 communicate with each other to determine the salient features (as well as the mundane features) of the day's events of the user, family members and guests, and offer to store recorded audio/video clips of them. The vacuum cleaner 185 and drone 187, for example, record the events of a party held at home by the user, family members and guests, and store them in the tier 3 user and manufacturer cloud based system and services 111, with the user's permission. The mobile units 183, 185, 187 and 189 also offer to store the weather conditions, other (news) happenings of the day and so forth.


To perform all the abovementioned functionalities, which include both the primary and secondary autonomous functionalities, the tier 0-2 user's device 181 contains user interface module 193, artificial intelligence module 195, user personalization module 197 and premises safety module 199. Similarly, the tier 3 user and manufacturer cloud based system and services 111 contains operational control support 113, user/app command interface 115, voice recognition and synthesis 117, WiFi/Bluetooth control module, repair support module 121, Day-In-Life recorder module 143, artificial intelligence engines 123, sensor processing support 125, voice, security, recognition 127, safety management module 129, prediction/forecasting module 131, wardrobe management module 145, sentry support module 133, user personalization module 135, music rendering module 137, user behavior pattern module 139, artificial intelligence analyst (decision making) 141 and behavior observation module 147. Finally, the communication between the tier 0-2 user's device 181, the tier 3 user and manufacturer cloud based systems and services 111 and the online retailer's systems 160 occurs via communication networks 191, which include the Internet, intranets and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.



FIG. 2 is a perspective block diagram detailing the artificial intelligence analyst (decision making) of FIG. 1. The tier 0-2 user's devices 281 (mobile units) are smart machines; they can learn from their environment and make decisions autonomously. Over a period after activation (that is, from the day they are put to work), they begin to improve and perfect their responses to environmental inputs, which include specifically the user's inputs, but also those of family members and guests. The mobile units 283, 285, 287 and 289 learn, with functional support from artificial intelligence analyst 211, by collecting data on the conditions in which they operate and then identifying patterns in these environmental inputs. Further, they plan and schedule the repetition of such operations in the future, based on the patterns detected.


In addition, the artificial intelligence based operations of the mobile units 283, 285, 287 and 289 (with support from the artificial intelligence analyst 211) in a user's premises stretch from automating mundane tasks (vacuuming or lawn mowing, for example) to predicting user, family member or guest behaviors and suggesting different possible actions to them. For example, the vacuum cleaner 285 follows the user as soon as he or she returns from work (in the beginning of its life as a mobile unit) and begins to ask questions, such as “What kind of music do you like as soon as you return from work?” and “What kind of drink would you like to be prepared now?”, and in conjunction with the artificial intelligence analyst 211 analyzes the answers to figure out the best possible actions to take after the user returns from work. In addition to questioning, the vacuum cleaner 285 also follows the user to figure out the house layout and where everything is placed. Then, it begins to offer many possibilities and lets the user choose what they want right then.


Artificial intelligence based operations of the mobile units 283, 285, 287 and 289 (with support from the artificial intelligence analyst 211) supplement the user's own regular scheduling and planning. The user receives images and graphics showing weather patterns, lawn mowing and repairs to be conducted to mitigate weather impact (including what-if scenarios for repairs), and rescheduling of outdoor sprinkler and lawnmowing operations; for example, lowering the grass height to be trimmed by lawnmowing while simultaneously increasing sprinkler watering time, upon a determination that the next week is anticipated to be hotter and drier than usual.


The artificial intelligence analyst 211 employs proprietary algorithms that weigh hundreds of variables to produce probabilistic forecasts of weather impact on a given user's or residence's lawnmowing schedule, repair schedule, home package delivery schedule, vacuum cleaning schedule and so forth. To perform this functionality, the artificial intelligence analyst 211 contains probabilistic forecast module 219. The artificial intelligence analyst 211 then recommends ways to lessen disruptions or save money and time if rescheduling is inevitable. The artificial intelligence analyst 211 presents different possible outcomes of various decisions, and it ranks possible scenarios associated with anticipated weather patterns. If the user's inputs are not available at that time, the artificial intelligence analyst 211 autonomously takes the decision and responds to the calamity by mitigating its impacts. Part of the artificial intelligence analyst 211 is stand-alone software, such as decision support software in the tier 0-2 user's devices 281; the other part is embedded within larger software systems in the artificial intelligence analyst (decision making) 211.
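
The specification calls these algorithms proprietary, so the following is only a conceptual sketch: rescheduling options are ranked by expected disruption cost under a probabilistic weather forecast. The scenario names, probabilities and costs are invented for illustration.

    def rank_schedule_options(options, weather_scenarios):
        """Rank (re)scheduling options by expected disruption cost.
        options: {option_name: {scenario: disruption_cost}}
        weather_scenarios: {scenario: probability}, summing to 1."""
        expected = {
            name: sum(weather_scenarios[s] * cost for s, cost in costs.items())
            for name, costs in options.items()
        }
        return sorted(expected.items(), key=lambda item: item[1])

    scenarios = {"hot_dry": 0.6, "normal": 0.3, "rain": 0.1}
    options = {
        "mow_tuesday":  {"hot_dry": 5.0, "normal": 1.0, "rain": 8.0},
        "mow_saturday": {"hot_dry": 2.0, "normal": 1.0, "rain": 3.0},
    }
    for name, cost in rank_schedule_options(options, scenarios):
        print(f"{name}: expected disruption {cost:.2f}")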


The autonomous decision support module 213 provides support to the tier 0-2 user's devices 281 (such as telescopic image bot 283, vacuum cleaner 285, indoor/outdoor drone 287 and lawnmower 289) as and when needed; when offline, the tier 0-2 user's devices 281 perform these functions all by themselves. It interacts with most of the modules of the artificial intelligence analyst 211, as well as the tier 0-2 user's devices 281, to provide autonomous response capabilities to the tier 0-2 user's devices 281. For example, the autonomous decision support module 213 assists the tier 0-2 user's devices 281 in constructing a schedule of autonomous operation for themselves.


The artificial intelligence analyst 211 conducts data collection and management for behavioral tracking, prediction and recommendation. To perform this, the artificial intelligence analyst 211 contains user behavior prediction module 215 and guest user behavior prediction module 217. The robotic movement and decision making module 221 assists the household or business premises mobile units 283, 285, 287 and 289 in networking together, by sharing necessary information and assisting in making decisions (for all cooperating smart tier 0-2 user's devices 281), and assists in robotic movements as well. The situational and threat analysis module 223 provides the mobile units 283, 285, 287 and 289 with the capability to analyze based on contexts (situation based) and respond accordingly, and to analyze threats to the people living in the premises (that is, the user, family members and guests) and pets. The natural language processing module 225 provides the capability of processing incoming spoken language and responding accordingly (user interaction capabilities).


The medical diagnostics (and symptoms) module 233 supports the tier 0-2 user's devices' 281 capability of diagnosing physical and mental illnesses based on symptoms (though not as a substitute for a licensed practitioner's diagnosis). For example, heart disease symptoms (Coronary Artery Disease (CAD), or Ischemic Heart Disease, for example, is the most common disease in the world) can be identified and diagnosed by the tier 0-2 user's devices 281, with the assistance of the medical diagnostics (and symptoms) module 233. The most common heart disease symptoms include chest pain (angina, generally triggered by physical or emotional stress), shortness of breath (which occurs when the heart cannot pump enough blood to meet the body's needs), extreme fatigue with exertion and heart attack. These symptoms are observed by the tier 0-2 user's devices 281 and sent to the medical diagnostics (and symptoms) module 233 for further analysis. In addition, a medical patient treatment module 235 manages the tier 0-2 user's devices' 281 capability of providing patients (who are already diagnosed with a disease by a medical practitioner) living in the premises with treatment and therapy. For example, the tier 0-2 user's devices 281 have the capability of providing therapeutic assistance for the user's or a family member's diabetes by attempting to restore carbohydrate metabolism to a normal state. (Patients with diabetes tend to have an absolute deficiency of insulin.) This goal is achieved by the tier 0-2 user's devices 281 by providing insulin replacement therapy (as prescribed by the therapist), which is given through injections or an insulin pump.


The interactive computer avatar module 231 provides chat bots, giving the tier 0-2 user's devices 281 the capability of having different avatars talk with different residents and guests of the premises. The pattern recognition module 237 provides image, signal or sequence based pattern recognition capabilities (in a complex stream of data). Similarly, the next sequence/games prediction module 239 makes the predictions used in games and behavior prediction possible. The identity systems module 241 is utilized in identifying residents, handymen and others authorized to work at the premises, as well as intruders within the premises. The advanced decision simulation module 243 provides the tier 0-2 user's devices 281 with the capability to perform advanced decision simulation for entertainment purposes, such as anticipating the movies residents would want to watch, the music residents might prefer to listen to at a given time and so forth.


Furthermore, the anticipatory operations module 227 provides the tier 0-2 user's devices 281 with the capability to perform anticipatory operations for the defense of the premises. Automated service & management 229 performs automated services of the tier 0-2 user's devices 281 and management of service activities for the tier 0-2 user's devices 281. The surveillance systems monitoring module 245 provides prediction and analysis capabilities to the tier 0-2 user's devices 281. The cognitive assistance module 247 reasons, learns and accepts guidance to provide effective and personalized support to the users, family members and guests. Finally, believable and intelligent non-player characters (not shown) are also utilized to enhance the user's, family members' and guests' gaming experiences.


The communication between the systems occurs via communication networks 291, which include the Internet, intranets and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.



FIG. 3 is a perspective block diagram detailing the user behavior pattern module of FIG. 1. The user behavior pattern module 311 contains user behavior observation support module 313, which provides support (higher level behavioral observation processing support) to the basic level observations made by behavior observation module 393 of the tier 0-2 user's devices 381. The user behavior observation starts with the audiovisual recording of the face and facial muscle movements, and a determination that the face belongs to the user (owner) of the mobile unit 383, 385, 387 or 389. The user behavior observation also involves video and audio recording of bodily movements. Similarly, the environmental context in which the user currently exists is also recorded. This information is kept stored, for further processing, in the server (tier 3 user and manufacturer cloud based system and services 111 of FIG. 1). The family member/guest behavior observation support module 319, in conjunction with behavior observation module 393 of the tier 0-2 user's devices 381, performs similar recording and processing with regard to each of the family members and guests (though with somewhat less storage and power consumption than for the user or owner).


The user behavior pattern module 311 also contains user behavior pattern identification support module 315 and family member/guest behavior identification support module 321. The user behavior pattern identification support module 315 works in conjunction with behavior identity module 395 of the tier 0-2 user's devices 381. The most important function of the user behavior pattern identification support module 315 is to find patterns in the recorded audiovisual streams. For example, the incoming stream of audiovisual information (coming in live, from the sensors) is compared with the stored information to find patterns of behavior. The contexts (coming in live, from the sensors) are also compared with the stored information to find patterns, in an analogous manner. The family member/guest behavior identification support module 321 processes the family members and guests in a similar manner (and works in conjunction with behavior identity module 395 of the tier 0-2 user's devices 381 as well). These patterns are stored in the tier 3 user and manufacturer cloud based system and services 111 of FIG. 1 for future use (which assists in the autonomous functioning of the mobile units 383, 385, 387 and 389).
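
The specification does not fix a comparison method for the live and stored streams; a naive set-overlap similarity, sketched below, is enough to convey the idea. The 0.75 overlap threshold and the event encoding are assumptions.

    from typing import Optional

    def match_pattern(live_events, stored_patterns, min_overlap=0.75) -> Optional[str]:
        """Compare live observed events against stored behavior patterns and
        return the best-matching pattern name, or None if nothing matches."""
        best_name, best_score = None, 0.0
        for name, pattern in stored_patterns.items():
            overlap = len(set(live_events) & set(pattern)) / len(set(pattern))
            if overlap >= min_overlap and overlap > best_score:
                best_name, best_score = name, overlap
        return best_name

    patterns = {"weekday_morning": ["wake_up", "bathroom", "shower", "breakfast"]}
    print(match_pattern(["wake_up", "shower", "breakfast"], patterns))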


These identified patterns are further processed by the user behavior pattern analysis support module 317, together with the behavior analysis module 397 of the tier 0-2 user's devices 381. The family member/guest behavior analysis support module 323 performs similar processing for the family members and guests (and works in conjunction with the behavior analysis module 397 of the tier 0-2 user's devices 381). These analyses are done mostly by the user behavior pattern analysis support module 317 or the family member/guest behavior analysis support module 323, based largely upon statistical methodologies. These processes of analysis assist in the generation of autonomous operations of the mobile units 383, 385, 387 and 389. The communication between the systems occurs via communication networks 391, which include the Internet, intranets and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.



FIG. 4 is a perspective block diagram detailing the Day-In-Life recorder module of FIG. 1. Day-In-Life recording, as a concept, essentially involves keeping a record of the salient features of each day of the user (and the user's family members), as a chronicle. These records include an audiovisual record of the user and family members, news happenings of the day and the user's and family members' reactions to them, weather conditions of the day, and the user's and family members' emotions throughout the day. All these records are kept linked to each other, and are kept with the user's (and family members') permission. The user can edit the Day-In-Life records if they wish. Furthermore, the days for which records are kept need not necessarily be every day; records may be kept only for days on which special events are going to take place, only for prescheduled days (once a week, once a month or once a year, for example), only for prescheduled times during each day (only after returning from work, for a couple of hours, for example) and so forth.


The Day-In-Life recorder module 411 contains user behavior observation module 413, which assists in recording the user in his or her environment and storing the recordings. Specifically, the user behavior observation involves audiovisual recording of the user's face and body, after determining to whom the face belongs (user, family member, guest or intruder), and recording the family members and guests and the environmental context in which the user, family members and guests exist. This information is kept stored, as a record of events, in the server (tier 3 user and manufacturer cloud based system and services 111 of FIG. 1). The purpose of the user behavior observation module 413 is just to identify the user, family members, guests and intruders (if any) and to identify the events that might be important (to be able to record the salient features of the day), not necessarily to take actual pictures or audiovisual clips of high quality.


The Day-In-Life recorder module 411 also contains camera management support module 415 and video clip management support module 419. The user behavior observation module 413 identifies user-determined or self-determined moments (either periods at which the images/audiovisual clips are to be taken, or periods determined based upon information collected from various sources), and the camera management support module 415 and/or video clip management support module 419 instruct the mobile units to take high quality images/audiovisual clips and store them in photo storage database 417 and/or video clip storage database 421. The special events identification module 431 identifies special moments in the user's and family members' lives and stores them for use, as mentioned above, by the user behavior observation module 413.
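
The two-stage capture described here (cheap continuous observation, then a high quality capture only for important moments) can be sketched as follows. The camera and database interfaces are hypothetical placeholders, not APIs from the specification.

    from datetime import datetime

    class DayInLifeCapture:
        """Stores a high-quality photo or clip only for moments flagged
        as important; routine observation stores nothing."""

        def __init__(self, camera, photo_db, video_db):
            self.camera, self.photo_db, self.video_db = camera, photo_db, video_db

        def on_moment(self, kind: str, important: bool) -> None:
            if not important:
                return  # low-quality observation only; nothing is stored
            stamp = datetime.now().isoformat()
            if kind == "photo":
                self.photo_db.store(stamp, self.camera.capture_photo(high_quality=True))
            else:
                self.video_db.store(stamp, self.camera.capture_clip(high_quality=True))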


Calendar/diary management support module 433 maintains written-text records of opinions, suggestions and simple explanations of the events (based on the inputs from the user, family members and guests), for example. Similarly, weather conditions record module 439 stores recorded weather events outside the premises. The Day-In-Life search module 437 makes searching through the photo storage database 417 and/or video clip storage database 421 possible. Finally, the tier 0-2 user's devices 481 (such as mobile units 483, 485, 487 and 489) contain camera photo/video clip record module 493, event record scheduling module 495, calendar/diary record module 497 and a weather conditions identification module, which contain the same processing features as the Day-In-Life recorder module 411 at very basic levels, so that the mobile unit 483, 485, 487 or 489 can continue to function in emergencies (Internet unavailability, for example).


For example, upon a command such as “Bot, Cheese” or “Bot, Action,” the mobile unit 483, 485, 487 or 489 recognizes the command and, via a Day-In-Life recorder module 411 service, the Day-In-Life recorder module 411 directs the mobile unit 483, 485, 487 or 489 to capture the video/image/audio.


Finally, the communication between the systems occurs via communication networks 491, which include the Internet, intranets and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.



FIG. 5 is a perspective block diagram detailing the tier 0-2 user's device of FIG. 1. The functionalities available with the tier 0-2 user's device 581 are not much different from those available at the tier 3 user and manufacturer cloud based systems and services 511 (as described with reference to FIG. 1, FIG. 2, FIG. 3 and FIG. 4), but they are available at basic levels, so that they are resource-efficient and can function autonomously (even when an Internet connection is not available, for example). Typical tier 0-2 user's devices 581 include telescopic image bot 583, vacuum cleaner 585, indoor/outdoor drone 587 and lawnmower 589, among many other specific designs that serve a plurality of purposes. They have basic level autonomous capabilities, as well as advanced capabilities when they are connected to the tier 3 user and manufacturer cloud based systems and services 511 or other custom built servers.


Artificial intelligence module 525 provides some basic AI functionalities to the mobile units 583, 585, 587 and 589, but also interacts with the tier 3 user and manufacturer cloud based systems and services 511, functioning cooperatively to provide advanced AI functionalities. For example, the user or a family member might say, “Bot, play me some music . . . ” The vacuum cleaner 585 checks the mood of the user or family member, by taking a video clip of the face and sending it to the tier 3 user and manufacturer cloud based systems and services 511, and plays music that they like for such a mood. This decision accounts for the experience gained by the vacuum cleaner 585 in interacting with them over a period.


User interface module 521 manages gesture, voice, visual, keyboard and remote control based interactions with the user, family members or anyone else. Similarly, wireless communication (WiFi/BT) module 525 manages wireless communications, by identifying possibilities for wireless connections (via WiFi and Bluetooth, for example) and logging on to the Internet autonomously. Lighting control module 527 manages the lighting of the house and premises, while music rendering module 531 handles the music delivery aspects of the tier 0-2 user's devices 581. Live camera support module 533 handles the built-in camera for live communications or live broadcast. User personalization module 541 manages the user's, family members' and guests' personalization information, gathered either via a voice/video/screen-keyboard based questionnaire or via everyday interactions with them. This information is utilized for future interactions with the user, family members and guests.


Premises safety module 543 is also built into the vacuum cleaners 585, lawn mowers 589 and drones 587, which network together to perform safety related functionalities. They identify the source of an intrusion, for example, and inform the user, family members and guests, especially during the nights. The tier 0-2 user's devices 581 wake them up and inform them about the intrusion and any damage done. They also provide emergency medical support to the users, family members and guests.


GPS based device positioning module 545 assists in positioning the mobile unit 583, 585, 587 or 589 at a specific position and angle. This functionality is essential, for example, for taking pictures or audiovisual clips from a clear and best possible angle in a party environment (with many guests), to get that perfect picture or video clip. Furthermore, self-recharging support module 547 manages the aspects related to the mobile units 583, 585, 587 and 589 recharging themselves. When the power level drops below a preset level, the mobile unit 583, 585, 587 or 589 identifies a charging station or electric outlet and plugs itself in to recharge, without human assistance.
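
The self-recharging decision is essentially a battery threshold check followed by navigation and plugging, which the sketch below abstracts behind two callables. The threshold value and the callable interfaces are assumptions, not details from the specification.

    class SelfRecharger:
        """Below a preset battery level, find the nearest charging station
        or electric outlet and plug in without human assistance."""

        def __init__(self, threshold_percent, find_nearest_outlet, plug_in):
            self.threshold = threshold_percent
            self.find_nearest_outlet = find_nearest_outlet  # () -> location
            self.plug_in = plug_in                          # (location) -> None

        def check(self, battery_percent) -> bool:
            """Initiate a recharge if below threshold; return True if started."""
            if battery_percent >= self.threshold:
                return False
            self.plug_in(self.find_nearest_outlet())
            return True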


AR house and garden mapping module 549 manages the mapping of the entire house or business premises and keeps the maps stored for future use. The AR (Augmented Reality) functionality requires that this map be available, for example, for the user to view remotely, or to fathom how the house would appear when repairs are done, the carpet is changed or the walls are painted. Lab-on-a-chip diagnostics module 551 handles a plurality of diagnosis related functions of the mobile units 583, 585, 587 and 589. Clothing and wardrobe support module 553 handles the networking of some of the mobile units (such as vacuum cleaner 585 and drone 587) in performing the wardrobe related functionalities (as described with reference to FIG. 1).


The communication between the tier 3 user and manufacturer cloud based systems and services 511 and the mobile units 583, 585, 587 and 589 occurs via communication networks 591, which include the Internet, intranets and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.



FIG. 6 is a perspective block diagram illustrating a few additional functionalities of the tier 3 user and manufacturer cloud based systems and services, and the artificial intelligence analyst module, of FIG. 1. The tier 3 user and manufacturer cloud based systems and services 611, and artificial intelligence analyst module 613, support a plurality of user devices, gain experience from interactions with them (by collecting data about their preferences, choices and reactions, under a variety of contexts) and apply that experience to new and emerging situations. That is, when the contexts are similar, the tier 3 user and manufacturer cloud based systems and services 611, and in specific the artificial intelligence analyst module 613, attempt to figure out the response to a given input and then apply the best possible response.


For example, by interacting with thousands of users, the tier 3 user and manufacturer cloud based systems and services 611, and artificial intelligence analyst module 613, learn that when friends gather in the evening, there is a likelihood of a party happening. Since this situation deserves a Day-In-Life recording, the user device 661, 663 or 665, assisted by the tier 3 user and manufacturer cloud based systems and services 611 and artificial intelligence analyst module 613, takes the decision autonomously (with the support of a decision support server 671, in cases of emergencies).


The most important function of the decision support server 671 is to assist the user devices 661, 663 and 665 in functioning autonomously, by taking swift decisions. Its functionality is the same as that of the autonomous decision support module 213 (of FIG. 2); in this embodiment, however, it is implemented as a separate server.


For example, when intruders trespass on the business property, especially during nights, the decision support server 671 supports autonomous decisions of the user device 661, 663 or 665 (supported by the tier 3 user and manufacturer cloud based systems and services 611, and artificial intelligence analyst module 613) in emergencies. As a result, the user device 661, 663 or 665 responds rapidly, by switching on all the lights and alarms (for example), thus waking up the users and family members.


Finally, the communication between the tier 3 user and manufacturer cloud based systems and services 611, the user devices 661, 663 and 665 and the decision support server 671 is implemented via communication networks 691, which include the Internet, intranets and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.



FIG. 7 is a perspective block diagram illustrating functionalities of an exemplary wardrobe management role of the mobile units of FIG. 1. In accordance with the present embodiment, the mobile unit (a vacuum cleaner or a custom-built unit) 727 or 715 has a built-in drone 723 or 721 mounted on it. The drone 723 or 721 parks on top of the vacuum cleaner 727 or 715 and charges itself. The drone 723 or 721 and the vacuum cleaner 727 or 715 network with each other and share information, via a wired connection while docked, or via a Bluetooth® or WiFi connection while the drone 723 or 721 is flying, for example.


The drone 723 or 721 has a plurality of functionalities, such as taking stock of clothes in the wardrobe 719, suggesting clothes for different types of occasions and interfacing with online retailers for new clothes or shoes purchasing. Similarly, the vacuum cleaner 727 or 715 also has a plurality of functionalities, which include clothes sizing, inner garments sizing, socks and shoes sizing, weight measurements and predicting weight gain or loss, and height, body volume and BMI (Body Mass Index) measurements, among many other wardrobe 719 related functionalities.


For example, the user (or a family member or guest) 717 may come out of the shower and say “Bot, measure me.” Then, the vacuum cleaner 715 circles around the user 717 and, via infrared light reflections, for example, measures the body volume. The vacuum cleaner 715 may then reply “Your body volume is up by 10% since last month, mostly to your waist line and thighs. Your waist line volume is up by 26% . . . . Would you like clothes sizing information updated?” The vacuum cleaner then updates the clothes and undergarments sizing and, if the user 717 requests, purchases clothes and undergarments for the user 717. Simultaneously, with that updated information sent to the drone 721, the drone 721 checks for any clothes that fit the user 717 and suggests them to the user 717.


Alternatively, if the user instructs the vacuum cleaner 715 to measure height, weight and BMI (Body Mass Index) periodically, the vacuum cleaner 715 responds by saying “Your periodic measurements are being taken, could you please stand still?” and then measures the height, weight and BMI.



FIG. 8 is a flowchart illustrating the processes involved in the autonomous functioning and decision making of the mobile units. The processes begin at a block 807, when the user first puts a brand-new mobile unit to work, in residential or business premises. Then, at a next block 809, the mobile unit begins to follow the user and observe the user's behavior. That is, the mobile unit records the user throughout the day, whenever he or she is present in the residential or business premises, and stores the recordings in the cloud.


At a next block 811, the mobile unit analyzes the user's behavioral patterns, by identifying routines in the user's everyday behavior. For the mobile unit, these patterns in behavior are essential for sketching out a plan, and for its autonomous functioning and decision making. At a next block 813, the mobile unit analyzes the scheduled activity. This process involves figuring out the preexisting (or even manufacturer-set) scheduled activity of the mobile unit, either stored locally or existing in the cloud.


At a next block 815, the weather forecast is factored into the process of scheduling the activity. This is important because the execution of some of the scheduled activity is entirely dependent on the weather conditions.


Then, at a next block 817, the mobile unit develops a schedule of autonomous operation for itself. Developing such a fully finalized autonomous activity schedule may take up to a few weeks. For example, in a simplified logic, the user may wake up in the morning, finish his or her routine bathroom tasks and switch on the television at about 7 AM. Then, he or she might have breakfast, during which time the house might be vacuumed. So, the morning scheduled activity for the mobile unit would be to switch on the television at 7 AM, and vacuum the house after that.


At a next block 819, the mobile unit begins to follow its fully developed scheduled activities and makes decisions based on this logic where needed (for example, to switch on the television only when the user is at the premises). The process ends at the block 821.
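
Rendered as code, the FIG. 8 flow is a straight pipeline from observation to operation. Each helper below stands in for one flowchart block and is assumed to be provided by the mobile unit's modules; none of these method names come from the specification.

    def autonomous_scheduling(unit):
        """FIG. 8, blocks 807-821, as a sketch."""
        observations = unit.follow_and_observe_user()                 # block 809
        patterns = unit.analyze_behavior_patterns(observations)       # block 811
        preset = unit.fetch_scheduled_activity()                      # block 813
        forecast = unit.fetch_weather_forecast()                      # block 815
        schedule = unit.develop_schedule(patterns, preset, forecast)  # block 817
        unit.operate(schedule)                                        # block 819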



FIG. 9 is a flowchart illustrating the processes involved in the Day-In-Life functionality of the mobile units. The processes begin at a start block 907. Then, at a next block 909, the mobile unit fetches behavioral analysis and activity schedule data from the processes of autonomous functioning and decision making (described with reference to FIG. 8). Then, at a next block 911, the mobile unit develops a schedule for predictable Day-In-Life events.


Further, at a next block 913, the mobile unit predicts the behavioral patterns of the user. The mobile unit then attempts to identify present behaviors and their contexts, at a next block 915. Once the behavioral patterns are predicted, the present behaviors and their contexts identified, and the two compared, the mobile unit decides whether the event taking place is important to the user and offers to record video/audio clips and/or take pictures, at a next block 917. Along with that, the mobile unit also records the salient weather features of the day, at a next block 919.
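
Blocks 913 through 919 amount to a compare-and-decide step, sketched below. The structure of the predicted and present inputs and the importance rule are illustrative assumptions only; the specification does not define them.

    def maybe_record(unit, predicted, present) -> bool:
        """Compare predicted patterns with the present behavior and context;
        when the moment looks important, offer to record (block 917) and log
        salient weather features (block 919)."""
        important = (present.context in predicted.special_events
                     or present.behavior not in predicted.routine)
        if important:
            unit.offer_recording()
            unit.record_weather_salients()
        return important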


As an example of the abovementioned processes, consider a user who has a habit of waking up at 7 AM, brushing teeth, switching on the television and watching the news (on most days). The mobile unit predicts this from the behavioral analysis and activity schedule data, and occasionally, or on days of significant news (detected based on social media activity, and keyword and context based searches, for example), it offers to record an audio/video clip or take pictures (along with the news clip). Similarly, when the mobile unit identifies the user and family members as excited on a Saturday evening, and identifies friends and relatives arriving, it decides that the ongoing event is important to the user, and starts audio/video recording and taking pictures (along with weather conditions). At the end of the day, or the next day, the mobile unit allows the user to edit the recordings, before uploading the audio/video, image, news clip and weather condition clip contents to the cloud.


As one of ordinary skill in the art will appreciate, the terms “operably coupled” and “communicatively coupled,” as may be used herein, include direct coupling and indirect coupling via another component, element, circuit, or module where, for indirect coupling, the intervening component, element, circuit, or module does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As one of ordinary skill in the art will also appreciate, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two elements in the same manner as “operably coupled” and “communicatively coupled.”


Although the present invention has been described in terms of GPS coordinates and navigational information communication involving mobile phones and computers, it must be clear that the present invention also applies to other types of devices, including mobile devices, laptops with a browser, handheld devices such as PDAs, televisions, set-top boxes, home media centers, robots, robotic devices, vehicles capable of navigation, and computers communicatively coupled to the network.


The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.


The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention.


One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.


Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, the present invention is not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.

Claims
  • 1. A mobile electronic system network, comprising: a plurality of user's mobile devices; a cloud based central support and services server system; and the plurality of user's mobile devices performs a multitude of autonomous functionalities, comprising: wardrobe management, comprising: clothes, garments and shoes sizing; waist, height, body volume and BMI (Body Mass Index) measurements; predicting weight gain or loss; taking stock of clothes in the wardrobe; suggesting clothes for different types of occasions; and interfacing with online retailers for new clothes or shoes purchasing; and the cloud based central support and services server system provides support and services to the plurality of user's mobile devices.
  • 2. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising: premises safety, comprising: smoke detection; fire inspection; alarm triggering; identifying the source of smoke, unusual odor and fire; alerting about smoke, unusual odor and fire; and extinguishing the fire.
  • 3. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising: sentry job, comprising: keeping a vigil on indoors of the premises; identifying intruders, indoors and outdoors; patrolling indoor and outdoor, within the premises; investigating sounds or flash lights outdoors of the premises; identifying the source of intrusion; and waking up and alerting the users.
  • 4. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising: self-recharging, comprising: identifying typical electric outlet; plugging itself in; identifying recharging station; and plugging itself in.
  • 5. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising: self-learning, to predict the users' behaviors; and applying the learnt knowledge in the future interactions.
  • 6. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising providing AI (Artificial Intelligence) assistance.
  • 7. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising rendering personalized music.
  • 8. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising diagnosing the users' illnesses.
  • 9. The mobile electronic system network of claim 1, wherein the cloud based central support and services server system further comprising an artificial intelligence analyst module that collects data about a plurality of users' preferences, choices and reactions, under a variety of contexts, and applies this knowledge to new and emerging situations.
  • 10. The mobile electronic system network of claim 1, wherein the user's mobile devices comprising vacuum cleaners.
  • 11. The mobile electronic system network of claim 1, wherein the user's mobile devices comprising lawn mowers.
  • 12. The mobile electronic system network of claim 1, wherein the user's mobile devices comprising drones.
  • 13. A mobile electronic network infrastructure, comprising: a plurality of user's mobile devices; a cloud based central support and services server system; and the plurality of user's mobile devices performs Day-In-Life recording functionalities, comprising: predicting users' behaviors; identifying routine and salient features of the day ahead of time; identifying important events ahead of time, by observing the user contexts; and electronically recording and storing the routine events, salient features and important events; and the cloud based central support and services server system provides support and services to the plurality of user's mobile devices.
  • 14. The mobile electronic network infrastructure of claim 13, wherein the predicting user's behavior comprising: observing the user's behavior through the day; developing a schedule for the observed routine behaviors; and developing a schedule for anniversaries and repeating events.
  • 15. The mobile electronic network infrastructure of claim 13, wherein the Day-In-Life recording functionalities further comprising recording and storing news of the day.
  • 16. The mobile electronic network infrastructure of claim 13, wherein the Day-In-Life recording functionalities further comprising recording and storing weather conditions of the day.
  • 17. The mobile electronic network infrastructure of claim 13, wherein the Day-In-Life recording functionalities further comprising time stamping the recordings.
  • 18. A method performed by a user's device, to produce its autonomous functioning and decision making, the method comprising: following the user and observing the pattern of behaviors; analyzing the pattern of behaviors; analyzing the scheduled activities; factoring in the weather conditions; developing a schedule of autonomous operation; and operating autonomously, in accordance with the scheduled activities.
  • 19. The method of claim 18, wherein the observing the pattern of behaviors comprising recording the user behavior and comparing it with the present behavior for any similarities.
  • 20. The method of claim 18, wherein the observing the pattern of behaviors comprising recording the user behavior contexts and comparing it with the present behavior contexts for any similarities.
CROSS REFERENCES TO RELATED APPLICATIONS

The present U.S. Utility Patent Application claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application No. 62/350,187, entitled “Modular Mobile Units,” filed Jun. 15, 2016, which is hereby incorporated herein by reference in its entirety and made part of the present U.S. Utility Patent Application for all purposes.

Provisional Applications (1)
Number Date Country
62350187 Jun 2016 US