The present invention relates to a method and computer program product for graphical user interface (GUI) organization control for robotic floor-cleaning devices.
Robotic floor-cleaning devices are an increasingly popular solution for keeping floors clean in residential and commercial settings. Many robotic floor-cleaning systems use sensors to generate maps of their environments so that they can navigate more effectively. However, such maps often contain errors and may not accurately represent the areas that a user wants the robotic floor-cleaning device to service. Further, users may want to customize operation of a robotic floor-cleaning device based on location within a map. For example, a user might want a robotic floor-cleaning device to service a first room with a steam cleaning function but service a second room without the steam cleaning function. A need exists for a method that allows users to adjust the map generated by a robotic floor-cleaning device and to control operations of the device based on location within the map.
The present invention proposes a method and computer program product for graphical user interface (GUI) organization control of robotic floor-cleaning devices.
A map of a workspace is generated from data acquired by sensors positioned on a robotic floor-cleaning device. The map is sent to a user interface on a device such as a smartphone, computer, tablet, dedicated remote control, or any device that may display outputs from the system and receive inputs from a user. Through the user interface, a user may make changes to the map boundaries and select settings for the robotic floor-cleaning device to carry out in user-identified areas of the workspace. User adjustments are sent from the user interface to the robotic floor-cleaning device to implement the changes.
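As a rough illustration only, the following Python sketch shows one way the round trip described above might be represented in code. The class and function names (WorkspaceMap, UserAdjustment, apply_adjustment) and their fields are assumptions made for the example, not part of the disclosed system.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical data structures for the round trip described above: the device
# builds a map from its sensor data, the user interface returns boundary
# adjustments and per-area settings, and the device adopts them.

@dataclass
class WorkspaceMap:
    # Workspace boundary as an ordered list of (x, y) vertices.
    boundary: List[Tuple[float, float]]

@dataclass
class UserAdjustment:
    # Boundary as corrected or redrawn by the user, plus settings
    # (e.g. {"steam": "on"}) keyed to user-identified areas.
    new_boundary: List[Tuple[float, float]]
    area_settings: Dict[str, Dict[str, str]] = field(default_factory=dict)

def apply_adjustment(current: WorkspaceMap, adjustment: UserAdjustment) -> WorkspaceMap:
    """Return the map the device should navigate by after the user's changes."""
    return WorkspaceMap(boundary=adjustment.new_boundary)
```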
The present invention will now be described in detail with reference to a few embodiments thereof as illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well known process steps and/or structures have not been described in detail in order to not unnecessarily obscure the present invention.
The terms “certain embodiments”, “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean one or more (but not all) embodiments unless expressly specified otherwise. The terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise. The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
Various embodiments are described hereinbelow, including methods and techniques. It should be kept in mind that the invention might also cover articles of manufacture that include a computer-readable medium on which computer-readable instructions for carrying out embodiments of the inventive technique are stored. The computer-readable medium may include, for example, semiconductor, magnetic, opto-magnetic, optical, or other forms of computer-readable medium for storing computer-readable code. Further, the invention may also cover apparatuses for practicing embodiments of the invention. Such apparatus may include circuits, dedicated and/or programmable, to carry out tasks pertaining to embodiments of the invention. Examples of such apparatus include a general-purpose computer and/or a dedicated computing device when appropriately programmed, and may include a combination of a computer/computing device and dedicated/programmable circuits adapted for the various tasks pertaining to embodiments of the invention.
The term “user interface” as used herein refers to an interface between a human user or operator and one or more devices that enables communication between the user and the device(s). Examples of user interfaces that may be employed in various implementations of the present invention include, but are not limited to, switches, buttons, dials, sliders, a mouse, keyboard, keypad, game controllers, track balls, display screens, various types of graphical user interfaces (GUIs), touch screens, microphones and other types of sensors that may receive some form of human-generated stimulus and generate a signal in response thereto.
Various methods currently exist for generating maps of an environment. Simultaneous localization and mapping (SLAM) techniques, for example, may be used to create a map of a workspace and keep track of a robotic device's location within the workspace. The mapping of a device's environment is not within the scope of the invention; therefore, a detailed description thereof is not provided.
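Although the mapping method itself is outside the present scope, a minimal sketch of one possible map representation, an occupancy grid, may help fix ideas about what is later sent to the user interface. The helper names below are assumptions chosen for illustration.

```python
# Assumed example only: a coarse occupancy grid of the kind a SLAM pipeline
# might hand off, with 1 marking a detected boundary or obstacle cell and 0
# marking free floor. The actual mapping method is outside the present scope.

def empty_grid(width, height):
    """Create a width x height grid with every cell marked free."""
    return [[0] * width for _ in range(height)]

def mark_obstacle(grid, x, y):
    """Record an obstacle reported by the device's sensors at cell (x, y)."""
    grid[y][x] = 1

grid = empty_grid(8, 5)
mark_obstacle(grid, 3, 2)  # e.g. a wall segment detected by a distance sensor
```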
Once a map is established, it may be sent to a user interface. Maps may be sent to a user interface at any stage; they do not need to be complete. Through the interface, a user may view the map and take any of a variety of actions. A user interface may be provided through a software application on a computer, tablet, smartphone, or a dedicated remote control. In the preferred embodiment, a user may adjust or correct the map boundaries within the user interface by selecting all or part of a boundary line using a cursor, pointer, stylus, mouse, the user's finger, a button or buttons, or other input device on the user interface. Once a boundary line is selected, a user may be provided with various options, such as, but not limited to, deleting, trimming, rotating, elongating, redrawing, moving in a left direction, moving in a right direction, moving in an upward direction, moving in a downward direction, etc. A user may be given the option to redraw a boundary line using a cursor, pointer, stylus, mouse, the user's finger, a button or buttons, or other input devices.
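As one hypothetical way to realize a few of the edits listed above, a boundary could be held as an ordered list of vertices and modified as follows. The function names and the polygon representation are assumptions of this sketch, not the claimed implementation.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def move_segment(boundary: List[Point], start: int, end: int,
                 dx: float, dy: float) -> List[Point]:
    """Shift the selected vertices (indices start..end, inclusive) by (dx, dy);
    'moving in a left direction' would correspond to a negative dx."""
    return [(x + dx, y + dy) if start <= i <= end else (x, y)
            for i, (x, y) in enumerate(boundary)]

def redraw(boundary: List[Point], new_points: List[Point]) -> List[Point]:
    """Replace the selected boundary line with points drawn by the user."""
    return list(new_points)

room = [(0.0, 0.0), (5.0, 0.0), (5.0, 4.0), (0.0, 4.0)]
# Extend the right-hand wall outward where the generated map fell short.
room = move_segment(room, 1, 2, 0.5, 0.0)
```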
Maps generated by robotic devices may contain errors, be incomplete, or simply not reflect the areas that a user wishes a robotic floor-cleaning device to service. By adjusting the map, a user may perfect the information that the robotic device has about its environment, thereby improving the device's ability to navigate through the environment. A user may, for example, extend the boundaries of a map in areas where the actual boundaries are further than those identified by the system, or trim boundaries where the system identified boundaries further than the actual or desired boundaries. Even in cases where a system creates an accurate map of an environment, a user may prefer to adjust the map boundaries to keep the device from entering some areas.
Data may be sent between the robotic floor-cleaning device and the user interface through one or more network communication connections. Any type of wireless network signals may be used, including, but not limited to, radio signals, Wi-Fi signals, or Bluetooth signals. Map data collected by sensors of the robotic floor-cleaning device is sent to the user interface, where a user may make adjustments and/or apply or adjust settings. Changes made by a user in the user interface are sent to the robotic floor-cleaning device through the one or more network communication connections.
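A minimal sketch of what such an exchange might carry is shown below, assuming a JSON payload. The field names are illustrative, and the transport (Wi-Fi, Bluetooth, or radio) is left abstract.

```python
import json

# Sketch of the data exchanged over the wireless link described above. The
# message fields are assumptions chosen for illustration; any serialization
# carried over Wi-Fi, Bluetooth, or radio signals would serve equally well.

def encode_map_update(boundary, area_settings):
    """Device -> user interface: current map data for display."""
    return json.dumps({"type": "map", "boundary": boundary,
                       "area_settings": area_settings})

def decode_user_changes(payload):
    """User interface -> device: boundary edits and settings to apply."""
    message = json.loads(payload)
    if message.get("type") != "user_changes":
        raise ValueError("unexpected message type")
    return message

outgoing = encode_map_update([[0, 0], [5, 0], [5, 4], [0, 4]], {})
incoming = decode_user_changes(json.dumps({
    "type": "user_changes",
    "boundary": [[0, 0], [6, 0], [6, 4], [0, 4]],
    "area_settings": {"kitchen": {"mop": "on"}},
}))
```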
Robotic floor-cleaning devices may have a plurality of tools that can be used concurrently or independently, such as, but not limited to, a suction tool, a mopping tool, and a UV light for killing bacteria. Robotic floor-cleaning devices may also have various settings, such as a deep cleaning setting, a regular cleaning setting, speed settings, movement pattern settings, cleaning frequency settings, etc. In the preferred embodiment, a user may adjust all of these settings through the user interface. A user may select any portion of the workspace with a cursor, pointer, stylus, mouse, the user's finger, a button or buttons, a keyboard, or other input devices and select one or more settings to be applied to that area.
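One hedged sketch of how such per-area selections could be kept on the device side follows. The specific area names, tools, and modes are assumptions made for the example, not a prescribed set.

```python
# Illustrative only: one way the per-area settings selected through the user
# interface might be stored and consulted on the device. The area, tool, and
# mode names below are assumptions made for the sketch.

area_settings = {
    "kitchen": {"suction": True, "mop": True, "uv_light": False, "mode": "deep"},
    "bedroom": {"suction": True, "mop": False, "uv_light": True, "mode": "regular"},
}

DEFAULTS = {"suction": True, "mop": False, "uv_light": False, "mode": "regular"}

def settings_for(area):
    """Return the settings to apply while servicing the named area, falling
    back to defaults when the user has not customized that portion of the map."""
    return {**DEFAULTS, **area_settings.get(area, {})}

print(settings_for("kitchen"))  # deep-clean mode with the mopping tool engaged
print(settings_for("hallway"))  # no customization, so the defaults apply
```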
Referring to the accompanying drawings, in a first step, a user selects, through the user interface, the area of the workspace map in which customized operation is desired. In a next step 201, the user selects the desired settings for the selected area. The functions and settings available may depend on the capabilities of the particular robotic floor-cleaning device. For example, in some embodiments, a user may select any of: cleaning modes, frequency of cleaning, intensity of cleaning, navigation methods, driving speed, etc. In a next step 202, the selections made by the user are sent to the robotic floor-cleaning device. In a next step 203, a processor of the robotic floor-cleaning device processes the received data and applies the user changes.
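A compressed sketch of the numbered steps above is given below. Function and field names are assumptions; the point is only the order of operations: selections made in the user interface are transmitted to the device, whose processor then records and applies them.

```python
def package_selection(selected_area, chosen_settings):
    """User interface side: bundle the selected area with its settings."""
    return {"area": selected_area, "settings": chosen_settings}

def transmit(selection, send):
    """Hand the bundled selection to the wireless link (send callback)."""
    send(selection)

def device_apply(selection, active_settings):
    """Device side: the processor records the changes for that area."""
    active_settings[selection["area"]] = selection["settings"]

active = {}
transmit(package_selection("living room",
                           {"frequency": "daily", "intensity": "high"}),
         lambda message: device_apply(message, active))
print(active)  # {'living room': {'frequency': 'daily', 'intensity': 'high'}}
```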
Additionally, a real-time robotic floor-cleaning device manager may be provided on the user interface to allow a user to direct the real-time operation of the robotic floor-cleaning device regardless of the device's location within the two-dimensional map. Instructions may include any of: turning on or off a mop tool, turning on or off a UV light tool, turning on or off a suction tool, turning on or off an automatic shutoff timer, increasing speed, decreasing speed, driving to a user-identified location, turning in a left or right direction, driving forward, driving backward, stopping movement, commencing one or a series of movement patterns, or any other preprogrammed action.
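A small, assumed sketch of how such real-time instructions might be enumerated and dispatched is shown below. The command set and speed scale are illustrative rather than taken from the disclosure; driving and navigation commands would be handled analogously.

```python
from enum import Enum, auto

# Sketch of a real-time command channel like the manager described above. The
# command names mirror the listed instructions; the dispatcher itself is an
# assumed illustration rather than the actual control software.

class Command(Enum):
    MOP_ON = auto()
    MOP_OFF = auto()
    UV_ON = auto()
    UV_OFF = auto()
    SUCTION_ON = auto()
    SUCTION_OFF = auto()
    SPEED_UP = auto()
    SLOW_DOWN = auto()
    STOP = auto()

def dispatch(command, state):
    """Apply an immediate user instruction to the device's running state."""
    if command in (Command.MOP_ON, Command.MOP_OFF):
        state["mop"] = command is Command.MOP_ON
    elif command in (Command.UV_ON, Command.UV_OFF):
        state["uv_light"] = command is Command.UV_ON
    elif command in (Command.SUCTION_ON, Command.SUCTION_OFF):
        state["suction"] = command is Command.SUCTION_ON
    elif command is Command.SPEED_UP:
        state["speed"] = min(state["speed"] + 1, 5)
    elif command is Command.SLOW_DOWN:
        state["speed"] = max(state["speed"] - 1, 1)
    elif command is Command.STOP:
        state["speed"] = 0
    return state

state = {"mop": False, "uv_light": False, "suction": True, "speed": 2}
dispatch(Command.MOP_ON, state)    # user taps 'mop on' in the interface
dispatch(Command.SPEED_UP, state)  # user increases speed in real time
```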
This application claims benefit of provisional patent application Ser. No. 62/235,408 filed Sep. 30, 2015 and provisional patent application Ser. No. 62/272,004 filed Dec. 28, 2015 by the present inventors.
Number | Date | Country
---|---|---
62/235,408 | Sep 2015 | US
62/272,004 | Dec 2015 | US