This application is a § 371 National Phase of PCT/AT2016/060108, filed Nov. 11, 2016, the entirety of which is incorporated by reference and which claims priority to German Patent Application No. 10 2015 119 501.1, filed Nov. 11, 2015.
The present description relates to the field of autonomous mobile robots, in particular to the sectoring of maps of an area of robot operation through which the robot moves and by means of which it orients itself.
There are numerous autonomous mobile robots available for a wide variety of domestic or commercial uses, for example, for processing or cleaning floor surfaces, transporting objects or inspecting an environment. Simple devices manage without compiling and using a map of the robot's area of operation, for example, by randomly moving over a floor surface intended for cleaning (cf. e.g. the publication EP 2287697 A2 of iRobot Corp.). More complex robots make use of a map of the area of robot operation which they compile themselves or which is made available to them in electronic form.
The map of such an area of robot operation is, as a rule, quite complex and not designed to be readable for a human user. Reading the map, however, may be necessary in order to plan the tasks that the robot is to carry out. In order to simplify the task planning, the area of robot operation can be automatedly sectored into zones. Numerous methods exist for this purpose. Known from the relevant academic literature are various abstract methods for sectoring an area intended for cleaning by means of which, for example, the planning of the robot's path through the area may be simplified or it may be ensured that the floor surface is evenly covered. Due to the fact that these abstract methods do not take into consideration typical characteristics of a human environment (e.g. an apartment), they generally suffer the deficiency of not being geared towards the needs of a user and as a result exhibit a behavior that may be difficult for the human user to understand.
One very simple method is to sector the area of robot operation into several small uniform zones of a predefined shape and size. The zones are then processed according to a standard, previously defined procedure (e.g. they are cleaned). Sectoring the area of robot operation (e.g. an apartment) into rooms such as, for example, living room, corridor, kitchen, bedroom, bathroom, etc. is easy for the user to understand. In order to achieve this, the robot (or a processor connected thereto) will try to determine the position of doors and walls with the aid of, for example, a ceiling camera, a distance sensor aimed at the ceiling, or based on typical geometric features such as, e.g. the width of a door. Another known method is to sector the area of robot operation along the borders of floor coverings, which can be detected by the robot with the aid of sensors. Sectoring the area of operation in this way makes it possible, for example, to select special cleaning methods depending on the kind of floor covering. The map and its zones can be shown to the human user, who may then correct the sectoring or adapt it to his/her needs, e.g. by shifting the zone borders or adding new ones.
The application improves upon known methods for sectoring a map of an area of robot operation, and also improves and, in particular, renders more flexible the use of the map itself. The disclosure provides a method and a robot having the features and structures recited herein. Various embodiments and further developments of the present disclosure are also recited herein.
A method for the automatic sectoring of a map of an area of robot operation of an autonomous mobile robot is described. In accordance with one embodiment, the method comprises the following: the detection of obstacles, as well as of their size and position on the map, by means of sensors arranged on the robot; the analysis of the map by means of a processor in order to detect an area in which there is a cluster of obstacles; and the definition of a first zone by means of the processor such that this first zone contains a detected cluster.
In accordance with a further embodiment, the method comprises the following: the detection of obstacles, as well as the determination of their size and position on the map by means of sensors arranged on the robot; the analysis of the map by means of a processor, wherein, based on at least one specifiable criterion, hypotheses concerning possible zone borders and/or the function of individual detected obstacles are automatedly generated; and the sectoring of the area of robot operation into zones based on the generated hypotheses.
In accordance with still a further embodiment, the method comprises the detection of obstacles in the form of boundary lines, as well as the determination of their size and position on the map, by means of sensors arranged on the robot; the sectoring of the area of robot operation into numerous zones based on the detected boundary lines and on specifiable parameters; and the depiction of the map, including the zones and the detected doorways, on a human-machine interface, wherein a user input concerning the specifiable parameters, the sectoring of the area of robot operation into zones, the position of doors and/or the designation of the functions of the detected zones is received, and the sectoring of the area of robot operation is altered in dependence on the user input.
In accordance with a further example of the embodiment, the method comprises the detection of obstacles in the form of boundary lines, as well as the determination of their size and position on the map, by means of sensors arranged on the robot, and the sectoring, in numerous hierarchical stages, of the area of robot operation into numerous zones based on the detected boundary lines and specifiable parameters. In the course of this, during a first stage of the hierarchical stages, the area of robot operation is sectored into numerous first stage zones. During a second stage of the hierarchical stages, the numerous first stage zones are further sectored into second stage zones.
A further method described here for the automatic sectoring of a map of an area of robot operation of an autonomous mobile robot comprises the detection of obstacles in the form of boundary lines, as well as the determination of their size and position on the map by means of sensors arranged on the robot. The method further comprises overlaying the boundary lines with a first rectangle such that every point that is accessible by the robot lies within the rectangle, as well as dividing the first rectangle into at least two adjacent second rectangles, wherein the border line(s) between two adjacent second rectangles traverse(s) boundary lines that are determined in accordance with specifiable criteria.
Further, a method for the automatic planning of the operation of an autonomous mobile robot in numerous zones of an area of robot operation is described. In accordance with one embodiment, the method comprises the receiving of a target processing duration by the robot, as well as the automated selection of the zones that are to be processed within the target processing duration and of their order of processing, based on the attributes assigned to the zones (for example, priorities and/or an anticipated processing duration of the individual zones) and on the target processing duration.
A further example of a method for the automatic sectoring of a map of an area of robot operation of an autonomous mobile robot comprises the detection of obstacles, as well as the determination of their size and position on the map by means of sensors arranged on the robot, as well as the sectoring of the area of robot operation based on the detected obstacles, wherein movable obstacles are identified as such and are not considered when sectoring the area of robot operation so that the sectoring is independent of a specific position of the movable obstacle.
A further example of the method comprises the detection of obstacles, as well as the determination of their size and position on the map by means of sensors arranged on the robot, as well as the sectoring of the area of robot operation based on the detected obstacles, wherein, for the determination of the position of at least one of the detected obstacles, positions are used that were detected at various points in time in the past.
Further, a method for the localization of an autonomous mobile robot on the map of the area of robot operation is described, wherein at least one zone in which there is at least one cluster of obstacles is marked on the map. In accordance with one embodiment, the method comprises the detection of obstacles, as well as the determination of their size and position by means of a sensor arranged on the robot, the determination of a position of the robot relative to an area containing a cluster of obstacles, as well as the determination of a position of the robot on the map based on the relative position and on a position of the area containing the cluster of obstacles on the map.
Further embodiments concern a robot that is connected with an internal and/or external data processing device, the device being configured to execute a software program which, when executed by the data processing device, causes the robot to carry out the method described here.
In the following, the disclosure will be described in detail by means of the examples shown in the figures. The illustrations are not necessarily true to scale and the application is not limited to only the illustrated aspects. Instead, importance is given to illustrating the principles underlying the application.
A technical device is of the greatest use for a human user in daily life when, on the one hand, the behavior of the device is logical and understandable for the user and, on the other hand, when its operation can be intuitively carried out. The user expects of an autonomous mobile robot, such as a floor cleaning robot (“robot vacuum cleaner”), for example, that it will adapt itself (with regard to its mode of operation and behavior) to the user. For this purpose, the robot should, by means of technical methods, interpret its area of operation and sector it into zones (e.g. living room, bedroom, corridor, kitchen, dining area, etc.) in a way in which the human user would also do so. This allows for a simple communication between the user and the robot, for example in the form of commands given to the robot (e.g. “clean the bedroom”) and/or in the form of information given to the user (e.g. “cleaning of the bedroom is completed”). In addition to this, the mentioned zones may be used to depict a map of the area of robot operation that itself may be used to operate the robot.
A sectoring of the area of robot operation into zones can be carried out by a user, on the one hand, in accordance with known conventions and, on the other hand, in accordance with personal preferences (and thus user-specifically, e.g. dining area, children's play corner, etc.). One example of a known convention is the sectoring of an apartment into various rooms such as, e.g. bedroom, living room and corridor (cf.
One technical prerequisite for this is that the autonomous mobile robot possess a map of its area of operation, so that it can orient itself by means of the map. The map can, for example, be compiled by the robot independently and permanently saved. In order to achieve the goal of a sectoring of the area of robot operation that is intuitive for the user, technical methods are needed that (1) independently carry out a sectoring of the map of the area of robot operation (such as, for example, an apartment) in accordance with given rules, (2) allow for a simple interaction with the user, so that the sectoring can be adapted to the user's preferences that are not known a priori, (3) preprocess the automatically generated sectoring so that it can be simply and understandably depicted to the user on a map, and (4) allow certain features to be inferred from the thereby generated sectoring as autonomously as possible, these features being suitable for achieving a behavior that is expected by the user.
Simplifying the interaction with a human user is not the sole reason for the robot to first automatedly sector its area of operation into zones; the sectoring also enables the robot to “work through” the area of operation logically (from the viewpoint of the user). Such a sectoring into zones allows the robot to carry out its task in its area of operation more easily, more discriminatingly and (from the viewpoint of the user) more logically, and it improves interaction with the user. In order to achieve an appropriate sectoring, the robot must relate various sensor data to each other. In particular, it can make use of information on the accessibility (difficult/easy) of a sector of its area of operation to define a zone. Further, the robot may operate on the (disprovable) assumption that rooms, as a rule, are rectangular. The robot can learn that some alterations to the sectoring lead to more logical results (such as, e.g. that with a certain degree of probability, certain obstacles will lie within a certain zone).
As shown in
In order to solve the above mentioned problems and to allow for an automated sectoring of the area of robot operation into various zones (e.g. rooms), the robot generates, based on the sensor data, “hypotheses” about its environment that can be tested using various methods. If a hypothesis can be proved false, it is rejected. If two boundary lines (e.g. the lines A-A′ and O-O′ in
Various sensor measurements are combined when a hypothesis is tested by the robot. In the case of a doorway, for example, the tested measurements include the passage width, the passage depth (given by the thickness of the wall), the existence of a wall to the right and to the left of the doorway or of an open door extending into the room. All this information can be detected, for example, by the robot with the aid of a distance sensor. By means of an acceleration sensor or of a position sensor (e.g. a gyroscopic sensor), the possible existence of a door threshold can be detected when the robot passes over it. With the use of image processing and by measuring the height of the ceiling, additional information can be gathered.
A further example of a possible hypothesis concerns the course of walls in the area of robot operation. Walls may be indicated, among other things, by two parallel boundary lines at a distance from each other which corresponds to the typical thickness of a wall (see
To test and evaluate hypotheses, a degree of plausibility may be assigned to them. In one simple embodiment, a hypothesis is credited with a previously specified number of points for every confirming sensor measurement. When, in this manner, a certain hypothesis achieves a minimum number of points, it is regarded as plausible. A negative total number of points could result in the hypothesis being rejected. In a further developed embodiment, a probability of being correct is assigned to a given hypothesis. This requires a probability model that takes into account the correlations between various sensor measurements, but it also allows complex probability statements to be generated with the aid of stochastic calculation models, resulting in a more reliable prediction of the user's expectations. For example, in certain regions (i.e. countries) in which the robot is operated, the width of doors may be standardized. If the robot measures such a standardized width, then the measurement most probably relates to a door. Deviations from the standard width reduce the probability that it relates to a door. For this purpose, for example, a probability model based on a normal distribution may be used. A further possibility for the generation and evaluation of hypotheses is the use of “machine learning” to generate suitable models and measurement functions (see, e.g. Trevor Hastie, Robert Tibshirani, Jerome Friedman: “The Elements of Statistical Learning”, 2nd edition, Springer, 2008). For this purpose, for example, map data is gathered by one or more robots in various living environments. The data can then be supplemented with floor plans or further data input by the user (e.g. regarding the positions of doors or doorways or regarding a desired sectoring) and evaluated by a learning algorithm.
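The point-based plausibility model described above can be illustrated with a short sketch. The following Python fragment is a minimal example, not the implementation of any particular robot; the point threshold, the standard door width and its spread are hypothetical values chosen for illustration:

```python
import math
from dataclasses import dataclass

# Hypothetical constants; the text fixes no concrete values.
STANDARD_DOOR_WIDTH_M = 0.85   # assumed regional standard door width
DOOR_WIDTH_SIGMA_M = 0.05      # assumed spread of real door widths
PLAUSIBLE_MIN_POINTS = 3       # minimum score for a hypothesis to be accepted

@dataclass
class Hypothesis:
    """A tentative statement about the environment, e.g. 'this gap is a doorway'."""
    description: str
    points: int = 0

    def add_evidence(self, confirming: bool, weight: int = 1) -> None:
        # Every confirming sensor measurement credits the hypothesis with a
        # previously specified number of points; contradicting ones debit it.
        self.points += weight if confirming else -weight

    def is_plausible(self) -> bool:
        return self.points >= PLAUSIBLE_MIN_POINTS

    def is_rejected(self) -> bool:
        # A negative total number of points results in rejection.
        return self.points < 0

def door_width_likelihood(measured_width_m: float) -> float:
    """Normal-distribution model: widths near the standard are most door-like."""
    z = (measured_width_m - STANDARD_DOOR_WIDTH_M) / DOOR_WIDTH_SIGMA_M
    return math.exp(-0.5 * z * z)

# Example: evaluating a candidate doorway from three measurements.
h = Hypothesis("gap between two wall segments is a doorway")
h.add_evidence(door_width_likelihood(0.87) > 0.5)  # width close to the standard
h.add_evidence(True)  # wall detected to the right and left (distance sensor)
h.add_evidence(True)  # door threshold felt (acceleration/gyroscopic sensor)
print(h.is_plausible())  # -> True
```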
A further method that may be employed alternatively or additionally to the hypotheses described above is the sectoring of the area of robot operation (e.g. an apartment) into numerous rectangular zones (e.g. rooms). This approach is based on the assumption that rooms, as a rule, are rectangular or are composed of several rectangles. On a map compiled by a robot, this rectangular shape of the rooms is generally not directly recognizable, because numerous obstacles with complex outlines, e.g. furniture, limit the robot's area of operation within the rooms.
Based on the assumption of rectangular rooms, the area of robot operation is overlaid with rectangles of various sizes that are intended to represent the rooms. In particular, the rectangles are selected in such a manner that every point on the map of the area of robot operation that is accessible to the robot can be clearly assigned to one rectangle. This means that the rectangles generally do not overlap. Nevertheless, a rectangle may contain points that are not accessible to the robot (e.g. because pieces of furniture make access impossible). The area designated by the rectangles may thus be larger and of a simpler geometric form than the actual area of robot operation. In order to determine the orientation and size of the individual rectangles, long boundary lines in the map of the area of robot operation are employed, for example, such as those that run along a wall (see, e.g.,
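The orientation and sizing step can be sketched as follows. This is a minimal illustration under the assumption that the detected boundary lines are available as 2-D point pairs; the function names and the 1° angle histogram are choices made for this sketch, not taken from the source:

```python
import math

def dominant_wall_angle(lines):
    """Estimate the dominant wall orientation from the boundary lines.
    `lines` is a list of ((x1, y1), (x2, y2)) segments. Long segments
    (typically walls) dominate the length-weighted angle histogram; angles
    are folded into [0, 90) degrees because walls meet at right angles."""
    bins = [0.0] * 90
    for (x1, y1), (x2, y2) in lines:
        length = math.hypot(x2 - x1, y2 - y1)
        angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 90.0
        bins[int(angle) % 90] += length
    return max(range(90), key=lambda a: bins[a])

def first_rectangle(lines):
    """Overlay rectangle: rotate the map so that the dominant walls are
    axis-parallel, then take the bounding box of all boundary-line points."""
    theta = math.radians(dominant_wall_angle(lines))
    cos_t, sin_t = math.cos(-theta), math.sin(-theta)
    pts = [(x * cos_t - y * sin_t, x * sin_t + y * cos_t)
           for seg in lines for (x, y) in seg]
    xs, ys = zip(*pts)
    return (min(xs), min(ys), max(xs), max(ys))  # in rotated coordinates
```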
Based on the assumption that rooms are generally rectangular, the robot can supplement the outer boundary lines of the map of boundary lines (see
The room 300 can be sectored based on the sensor data recorded by the robot (see
Until now, comparatively small obstacles have not been taken into account, wherein “comparatively small” means that the obstacles are of a similar size to that of the robot or smaller. In the upper right hand part of
In the present example (
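The detection of such a cluster of small obstacles (e.g. chair and table legs) can be sketched with a simple single-linkage grouping. The following is an illustrative fragment; the robot diameter, the passage threshold and the minimum cluster size are assumed values:

```python
import math

def obstacle_clusters(points, robot_diameter=0.35):
    """Group small obstacles whose mutual gaps are too narrow for comfortable
    passage: two obstacles belong to the same cluster if they are closer
    than an (assumed) passage threshold."""
    threshold = 2.0 * robot_diameter
    clusters = []
    for p in points:
        # Merge all existing clusters that lie within reach of point p.
        near = [c for c in clusters
                if any(math.hypot(p[0] - q[0], p[1] - q[1]) < threshold for q in c)]
        for c in near:
            clusters.remove(c)
        clusters.append(sum(near, []) + [p])
    return [c for c in clusters if len(c) >= 3]  # ignore isolated obstacles

# Four table legs form one cluster; a single distant obstacle is ignored.
legs = [(0.0, 0.0), (0.6, 0.0), (0.0, 0.6), (0.6, 0.6)]
print(obstacle_clusters(legs + [(5.0, 5.0)]))
```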
The borders of “difficult to pass” zones are geometrically not clearly defined (as opposed to, e.g. the borders between different floor coverings). However, hypotheses concerning their positions can be generated on the basis of desired features of the zone to be created and of a complementary zone (see
In the present example, by demarcating the difficult-to-pass zone 320 from the zone 301 (see
In order to avoid the creation of narrow, difficult-to-pass areas in the complementary zone (cf. above rule 6), narrow areas between the (outer) wall and the rectangular zone (
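The rule of avoiding narrow strips between a demarcated zone and the wall can be sketched as a small post-processing step. A hypothetical helper, assuming axis-aligned rectangles given as (x0, y0, x1, y1) and an illustrative minimum corridor width:

```python
def snap_zone_to_wall(zone, room, min_corridor=0.8):
    """Expand a rectangular zone to the room border wherever the remaining
    strip between the zone and the (outer) wall would be narrower than
    `min_corridor` (e.g. roughly twice the robot width), so that no
    difficult-to-pass slivers remain in the complementary zone."""
    zx0, zy0, zx1, zy1 = zone
    rx0, ry0, rx1, ry1 = room
    if zx0 - rx0 < min_corridor: zx0 = rx0  # strip on the left too narrow
    if rx1 - zx1 < min_corridor: zx1 = rx1  # strip on the right too narrow
    if zy0 - ry0 < min_corridor: zy0 = ry0  # strip below too narrow
    if ry1 - zy1 < min_corridor: zy1 = ry1  # strip above too narrow
    return (zx0, zy0, zx1, zy1)
```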
A further possibility for the sectoring of an area of robot operation is illustrated by means of
In order to generate a simplified model of the area of robot operation, based on the map of the boundary lines detected by the robot (see
With regard to the example above, a rectangle can thus be divided along sector lines that are determined, e.g. on the basis of previously adjusted boundary lines running parallel or perpendicular to the edges of the rectangle being divided. In the present example, these are the numerous boundary lines running along a straight line (see
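One way to sketch this recursive division is shown below. It assumes that candidate cuts have already been extracted from the map as axis-parallel lines, each weighted with the accumulated boundary-line (wall) length supporting it; the data format and the minimum zone size are assumptions of this sketch:

```python
def split_rectangle(rect, candidates, min_size=1.0):
    """Recursively divide `rect` = (x0, y0, x1, y1) into smaller rectangles.
    `candidates` is a list of (support, axis, coord) tuples: axis 0 cuts
    vertically at x = coord, axis 1 horizontally at y = coord; `support`
    is the wall length backing that cut."""
    usable = [(s, a, c) for (s, a, c) in candidates
              if rect[a] + min_size < c < rect[a + 2] - min_size]
    if not usable:
        return [rect]                     # nothing left to divide sensibly
    _, axis, coord = max(usable)          # cut with the strongest wall evidence
    lo, hi = list(rect), list(rect)
    lo[axis + 2] = coord                  # part left of / below the cut
    hi[axis] = coord                      # part right of / above the cut
    return (split_rectangle(tuple(lo), candidates, min_size)
            + split_rectangle(tuple(hi), candidates, min_size))

# A 10 m x 8 m outer rectangle with one strong vertical and one weaker
# horizontal wall line yields four sub-rectangles.
print(split_rectangle((0.0, 0.0, 10.0, 8.0), [(7.5, 0, 4.0), (3.0, 1, 5.0)]))
```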
As the rules mentioned above stipulate, there are no relevant sectoring possibilities in rectangle 501 (
In accordance with the criteria that are used to divide the rectangles, the rectangles 507 and/or 508 can be further divided. This results, for example, in zones as shown in
A human user expects that a sectoring of an area of robot operation that he knows well, once generated, will not fundamentally change during or after a use of the robot. Nevertheless, the user will probably accept a few optimizations aimed at improving the robot's functioning. The basis for the sectoring of a map can, however, be altered from one deployment to the next by displaceable objects. The robot may therefore have to “learn” a sectoring that is not disrupted by these displacements. Examples of movable objects include doors, chairs or furniture on rollers. Image recognition methods, for example, may be used to detect, classify and later recognize such objects again. The robot can mark objects identified as displaceable as such and, if needed, recognize them at a different location during a later deployment.
Only objects with a permanent location (such as, for example, walls and larger pieces of furniture) should be used for a long-term sectoring of the area of robot operation. Objects that are constantly found at varying locations can be ignored for the sectoring. In particular, a singular alteration in the robot's environment should not lead to a renewed compilation and sectoring of the map. Doors may be used, for example, to specify the border between two rooms (zones) by comparing open and closed states. Furthermore, the robot should take note of a zone that is temporarily inaccessible due to a closed door and, if needed, inform the user of this. A zone that was not accessible during a first exploratory run of the robot because of a closed door but which is newly identified during a subsequent run is added to the map as a new zone. As mentioned above, table and chair legs may be used to demarcate a difficult-to-pass zone. Chair legs, however, may change their position during use, which may result in variations in the zone borders at different points in time. Therefore, the border of a difficult-to-pass zone can be adjusted over time so that, in all probability, all chairs will be found within the zone determined to be difficult to pass. This means that, based on the previously saved data on the position and size of obstacles, the frequency, and thus the probability (i.e. the parameters of a probability model), of finding an obstacle at a particular location can be determined. For example, the frequency with which a chair leg appears in a certain area can be measured. Additionally or alternatively, the density of chair legs in a given area, determined by means of numerous measurements, can be evaluated. Here, based on the probability model, the zone determined to have a cluster of obstacles can be adapted such that obstacles will be found, with a specified degree of probability, within this zone. This will result, as the case may be, in an adjustment of the borders of the “difficult-to-pass” zone containing the cluster of (probably present) obstacles.
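The frequency model sketched in the preceding paragraph can be made concrete with a coarse grid of observation counts. The cell size and the probability threshold below are illustrative assumptions, not values from the source:

```python
from collections import Counter

class ObstacleStatistics:
    """Counts, per grid cell, in how many runs an obstacle (e.g. a chair leg)
    was observed there, yielding an empirical probability per cell."""
    def __init__(self, cell=0.25):
        self.cell = cell       # grid resolution in meters (assumed)
        self.hits = Counter()  # cell -> number of runs with an obstacle there
        self.runs = 0

    def record_run(self, obstacle_positions):
        self.runs += 1
        # A set ensures each cell is counted at most once per run.
        for c in set((int(x / self.cell), int(y / self.cell))
                     for x, y in obstacle_positions):
            self.hits[c] += 1

    def likely_cells(self, p_min=0.5):
        """Cells in which an obstacle is found with probability >= p_min; the
        border of the difficult-to-pass zone is adapted to cover them."""
        return {c for c, n in self.hits.items() if n / self.runs >= p_min}
```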
In addition to this there are objects such as, for example, television armchairs, that are generally located at a similar position in a room, but whose specific position may (insignificantly) vary under the influence of a human user. At the same time, such an object may be of a size that allows it to be used to define a zone. This may be, for example, “the area between the sofa and the armchair”. For these objects (e.g. the television armchair), the most likely expected position is determined over time (e.g. on the basis of the median, the expectation value or the mode). This is then used for a long-term sectoring of the map and for interaction with the user (e.g. when the user operates and controls the robot).
In broad terms, the user can be provided with the possibility to review the automatedly generated sectoring of the map and, if necessary, to modify it. The goal of the automated sectoring is nevertheless to obtain, automatedly and without interaction with the user, a sectoring of the map that is as realistic as possible.
Broadly speaking, when reference is made here to a sectoring of the map “by the robot”, the sectoring can be carried out by a processor (with suitable software) arranged in the robot, as well as by a device that is connected with the robot and to which the measurement data gathered by the robot is sent (e.g. via radio). The map sectoring calculations may therefore also be carried out on a personal computer or on an Internet server. This generally makes little difference to the human user.
By appropriately sectoring its area of operation into zones, a robot can carry out its tasks “more intelligently” and with greater efficiency. In order to better adapt the robot's behavior to the expectations of the user, various characteristics (also called attributes) of a zone can be detected, assigned to a zone or used to create a zone. For example, there are characteristics that make it easier for the robot to localize itself within its area of operation after, for example, being carried to a soiled area by the user. The localization allows the robot to independently return to its base station after completing the cleaning. Characteristics of the environment that can be used for this localization include, for example, the type of floor covering, a distinctive (wall) color, and the field strength of the WLAN or other characteristics of electromagnetic fields. In addition to this, small obstacles may provide the robot with an indication of its position on the map. In this case, the exact position of the obstacles need not be used, but only their (frequent) appearance in a given area. Other characteristics either influence the behavior of the robot directly, or they allow it to make suggestions to the user. For example, information concerning the average degree of dirtiness may be used to make suggestions regarding the frequency with which a zone is cleaned (or to automatically determine a cleaning interval). If a zone regularly exhibits an uneven distribution of dirtiness, the robot can suggest to the user that the zone be further sectored (or it can carry out this further sectoring automatically).
Information concerning the floor type (tiles, wood, carpet, anti-slip, smooth, etc.) can be used to automatically choose a cleaning program that is suited to the floor type or to suggest to the user a cleaning program. In addition to this, the robot's movement (e.g. its maximum speed or minimum curve radius) can be automatically adjusted, for example, to correct excess slippage on a carpet. Zones or areas within zones in which the robot frequently becomes stuck (e.g. lying cables or the like), requiring a user's aid to free it, can also be saved as a characteristic of a given zone. Thus, in future, such zones can be avoided, cleaned with less priority (e.g. at the end of a cleaning operation) or only cleaned when the user is present.
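Such an attribute-driven choice can be sketched as a simple lookup table. The program names and parameter values below are hypothetical and only illustrate the mechanism:

```python
# Hypothetical mapping from detected floor type to cleaning parameters.
CLEANING_PROGRAMS = {
    "tiles":  {"strategy": "wet_wipe",    "max_speed": 0.40},
    "wood":   {"strategy": "dry_brush",   "max_speed": 0.40},
    "carpet": {"strategy": "vacuum_only", "max_speed": 0.25},  # limit slippage
}

def suggest_program(floor_type):
    # Fall back to plain vacuuming for unknown floor coverings.
    return CLEANING_PROGRAMS.get(floor_type,
                                 {"strategy": "vacuum_only", "max_speed": 0.30})
```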
As mentioned above, a designation (bedroom, corridor, etc.) can be assigned to a zone identified as a room. This can either be done by the user or the robot can automatedly select a designation. Based on the designation of a zone, the robot can adapt its behavior. For example, the robot—depending on the designation assigned to the zone—can suggest to the user a cleaning procedure that is adapted to the designation and thus simplify the adjustment of the robot to the user's needs. For example, the designation of a room may be considered in a calendar function, in accordance with which, e.g. for a zone designated as a bedroom a time frame (e.g. 10 pm-8 am) can be specified, during which the robot may not enter the given area. Another example is the designation of a zone as dinette (cf.
In order to accelerate the cleaning of numerous zones, this may be carried out sequentially, wherein unnecessary transitional movement between the zones should be avoided. This can be achieved by selecting the end point of one cleaning run such that it lies near the starting point of the cleaning run in the next zone. Freely accessible zones can be cleaned very efficiently along a meandering path. The spacing of the straight path segments of the meandering path can be adapted to the width of the zone to be cleaned, both in order to achieve a cleaning result that is as even as possible and in order to end the cleaning operation of the zone at a desired location. For example, a rectangular zone is to be cleaned, starting in the upper left hand corner and ending in the lower right hand corner, along a meandering path whose straight path segments 11 run horizontally (similar to
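The adaptation of the segment spacing can be sketched as follows, assuming horizontal lanes, a start in the upper left corner and the desired end in the lower right corner; the effective tool width is an illustrative parameter:

```python
import math

def meander_lanes(zone_height, tool_width):
    """Choose the number and spacing of horizontal meander lanes so that the
    zone is covered evenly (spacing never exceeds the tool width) and the
    run ends in the lower right corner: with a start in the upper left,
    an odd lane count makes the last lane end on the right-hand side."""
    n = max(2, math.ceil(zone_height / tool_width) + 1)  # lanes incl. both edges
    if n % 2 == 0:
        n += 1                      # odd count -> last lane ends on the right
    return n, zone_height / (n - 1)

print(meander_lanes(3.0, 0.35))     # -> (11, 0.3)
```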
During a cleaning run of the robot through a zone along a meandering path, the straight path segments 11 can be passed through relatively quickly, whereas the curves 12 (turns through 180°) are passed through relatively slowly. In order to clean a zone as quickly as possible, it can be advantageous to pass through as few curves 12 as possible. When cleaning a rectangular zone, the meandering path can be oriented such that the straight path segments 11 lie parallel to the longest edge of the rectangular zone. If the zone exhibits a geometric form of greater complexity than a rectangle (in particular if it is a non-convex polygon), the orientation of the meandering path can be decisive as to whether the area can be completely cleaned in one run. In the case of a U-shaped zone as per the example of
A further example in which the direction the robot takes when moving along a planned path is relevant is the cleaning of a carpet. In the case of a deep-pile carpet, the direction of movement can have an impact on the cleaning results, and alternating directions of movement produce a pattern of stripes on the carpet that may not be desired. In order to avoid this pattern of stripes it is possible, for example, to clean only in a preferred direction while moving along a meandering path. While moving back (against the preferred direction), the cleaning can be deactivated by switching off the brushes and the vacuuming unit. The preferred direction can be determined with the aid of sensors or with input from the user, assigned to the zone in question and then saved for this zone.
As explained above, it is often not desirable for the robot to move slowly through a difficult-to-pass zone (and thereby risk collisions with obstacles). Instead, the robot should move around the difficult-to-pass zone. Therefore, the attribute “zone-to-be-avoided” can be assigned to a zone that is identified as difficult to pass (see above). The robot will then not enter the difficult-to-access zone except for a planned cleaning. Other areas as well, such as a valuable carpet, may be designated either by the user or independently by the robot as a “zone-to-be-avoided”. As a consequence, when planning the path, the given zone will only be considered for a transition run (without cleaning) from one point to another when no other possibility exists. Furthermore, the numerous obstacles of a difficult-to-access zone present a disturbance while cleaning along a meandering path. In a difficult-to-pass zone, therefore, a specially adapted cleaning strategy may be employed (instead of the meandering path), adapted in particular so as to prevent, to the greatest extent possible, any isolated areas from being left uncleaned in the course of the numerous runs around the obstacles. If zones are left uncleaned, the robot can record whether access to the uncleaned zone exists and, if so, where it is, or whether the zone is completely blocked off by closely standing obstacles (e.g. table and chair legs). In this case the user can be informed of the zones left uncleaned (because of their inaccessibility).
When employing a robot it may be that not enough time is available to completely clean the area of robot operation. In such a case it may be advantageous for the robot to independently plan the cleaning time according to certain stipulations—e.g. taking into consideration an allowed time—and to carry out the cleaning in accordance with this time planning (scheduling). When scheduling the cleaning, e.g. (1.) the time anticipated for cleaning each of the zones designated for cleaning, (2.) the time needed to move from one zone to the next, (3.) the priority of the zones, (4.) the elapsed amount of time since the last cleaning of a zone and/or (5.) the degree of dirtiness of one or more zones, detected during one or more of the previous exploratory and cleaning runs, can be taken into consideration.
In order to be able to predict the cleaning time needed for a specific zone, the robot may make use of empirical values gathered during previous cleaning runs, as well as theoretical values determined with the aid of simulations. For example, the anticipated cleaning time for small, geometrically simple zones can be determined (e.g. the number of meandering segments multiplied by the length of a segment, divided by the speed plus the turning time required by the robot), in order to generate a prediction for more complex zones (those comprised of the combined simple zones). When determining the anticipated cleaning time for numerous zones, the cleaning time for the individual zones and the time required by the robot to move from one zone to the next will be considered. The automatically generated cleaning schedule can be shown to the human user, who may alter it if needed. Alternatively, the user may ask the robot to suggest numerous cleaning schedules, choose one of these and alter it as deemed necessary. In a further example, the robot may, automatically and without further interaction with the user, begin the cleaning according to an automatically generated cleaning schedule. If a target operating duration is set, the robot can compile a schedule based on the attributes assigned to the zones. The attributes in this case may be, for example: priorities, the anticipated cleaning times for individual zones and/or the expected degree of dirtiness of individual zones. After the target operating time has expired, the robot can interrupt the cleaning, finish the zone it is currently working on or extend the operating time until the user terminates it. The principles mentioned above will be explained in the following using two examples.
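A minimal scheduling sketch under a set target operating duration is shown below. It assumes each zone already carries a priority and an estimated cleaning duration as attributes; the greedy selection rule and the constant transition time are simplifying assumptions, and `estimated_cleaning_time` is one possible reading of the estimate described above:

```python
def estimated_cleaning_time(lane_count, lane_length, speed, turn_time):
    """Cleaning-time estimate for a simple zone: number of meander segments
    times segment length, divided by the speed, plus the turning time."""
    return lane_count * lane_length / speed + (lane_count - 1) * turn_time

def plan_cleaning(zones, target_duration, transition=60.0):
    """Greedy schedule: take zones in order of decreasing priority and skip
    any zone whose inclusion would exceed the target duration. `zones` is
    a list of dicts with 'name', 'priority' and 'duration' in seconds;
    `transition` is an assumed constant travel time between zones."""
    schedule, total = [], 0.0
    for z in sorted(zones, key=lambda z: z["priority"], reverse=True):
        cost = z["duration"] + (transition if schedule else 0.0)
        if total + cost <= target_duration:
            schedule.append(z["name"])
            total += cost
    return schedule, total

zones = [{"name": "entrance", "priority": 3, "duration": 900.0},
         {"name": "aisle",    "priority": 1, "duration": 1200.0},
         {"name": "checkout", "priority": 2, "duration": 600.0}]
print(plan_cleaning(zones, 1800.0))  # -> (['entrance', 'checkout'], 1560.0)
```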
Example I (cleaning until termination by the user): In this example, the example apartment of
Example II (set target time): In a second example the robot is employed, for example, in a department store that is cleaned only outside of business hours. The time available for cleaning is thus limited and cannot be exceeded. The area of operation of the robot is in this case so large that it cannot be cleaned in the allotted amount of time. It may therefore be advantageous to assign priorities to the various zones in the area of robot operation. For example, the zone that encompasses the entrance way should be cleaned daily, while other zones that are generally frequented by fewer customers need only be cleaned every third day and, consequently, have a lower priority. On this basis, the robot can compile a weekly work schedule. It may also be advantageous for the robot to dynamically adapt its schedule to actual needs. For example, the anticipated degree of dirtiness of a zone can be taken into consideration. This can be determined based on the empirical degree of dirtiness or on the (measurable) number of actual customers entering the zone. The number of customers is, for example, saved in a database, wherein the data is input manually by the department store staff or is determined automatically by means of sensors such as, for example, motion sensors, light barriers or cameras in combination with an image processor. Alternatively, the department store management may spontaneously demand that a zone not previously included in the plan be cleaned if, for example, it is especially heavily soiled due to an accident. The robot can then automatedly include the new zone in the cleaning plan and—in order to meet the set target time—postpone the cleaning of another zone until the following day.
In the following it will be described how a robot map that has been sectored into numerous zones may be used to improve the robot-user communication and interaction. As mentioned above, a sectoring of the area of robot operation would generally be carried out by a human user intuitively. For machines, however, this is, in general, a very difficult task that does not always produce the desired results (the robot lacks human intuition). Therefore, the user should be able to adapt the robot-generated sectoring of the apartment to his/her needs. The possibilities available to the user range from altering the sectoring by shifting the borders between adjoining zones, through the further sectoring of existing zones, to the creation of new, user-defined zones. These zones may be, for example, so-called “keep-out areas” that the robot is not allowed to enter on its own. The zones (suggested by the robot and, if needed, altered by the user) may thus be furnished by the user with additional attributes that can also influence the behavior of the robot when in operation (in the same manner as the attributes described above that may be automatically assigned to a zone). Possible attributes are, for example, (1.) priority (how important it is for the user that the zone be cleaned), (2.) floor type (which cleaning strategy—dry brushing, wet brushing or only vacuuming, etc.—should be employed?), (3.) accessibility (whether the robot is allowed to enter the given zone at all).
A further possibility is for the user to influence the automatic sectoring process by confirming, for example, or rejecting hypotheses made by the robot. In this case the user may “instruct” the robot to carry out the sectoring of an area of operation. After this the user may influence the sectoring, for example, by designating doors on the map or by eliminating doors incorrectly identified by the robot. Following this the robot, based on the additional information entered by the user, can automatedly carry out a new sectoring of the map. Further, the user can adjust relevant parameters (e.g. typical door widths, thickness of the inner walls, floor plan of the apartment, etc.) so that the robot can use these parameters to generate an adjusted sectoring of its area of operation.
Often, certain parts of the area of robot operation (e.g. of an apartment) will be similar in a given culture, for example the bedroom. If the user furnishes a zone recognized as a room with the designation “bedroom”, various criteria—in particular the probability models used to generate hypotheses—can be adapted to those of a typical bedroom during the further automated sectoring of the bedroom. In this manner, an object found in the bedroom having the dimensions of one by two meters may with relative reliability be interpreted to be a bed. In a room designated as a “kitchen”, an object of similar dimensions might possibly be determined to be a kitchen island. The designation of a zone is carried out by first selecting the zone on the map of the area of robot operation and by then choosing a designation from a list offered by the robot. It may also be possible for the user to freely choose a designation for the room. To simplify the user's orientation on a map generated by the robot, when designating a zone the user may select the zone and then instruct the robot to enter it. In this manner the user is able to recognize a direct connection between the depicted zone and the actual position of the robot in the apartment, thus enabling the user to assign an appropriate designation to the zone.
One prerequisite to the designation of a zone on the map of the robot's area of operation by the user is that the robot has already generated a sectoring of its area of operation that is good enough for the user to be able to recognize, for example, the bedroom (e.g. a rough sectoring as shown in
In accordance with a further embodiment, the robot is configured to carry out the sectoring of the map or to improve the characteristics of a detected zone by asking the user direct questions regarding the hypotheses generated during a first exploratory run. This communication between the robot and the user is made possible quite easily, e.g. by means of a software application installed on a mobile device (e.g. tablet computer, telephone, etc.). This can, in particular, take place before a first version of the map is shown to the user, in order to improve the quality of the map displayed by the robot. In this manner, e.g. the robot may ask the user whether a zone that is difficult to access because of a table with chairs is a regularly used dinette. If the question is answered affirmatively, the robot can automatedly draw conclusions from this answer and assign certain attributes to the zone in question. In the case of a dinette, the robot can assign a higher cleaning priority to the zone, as it must be assumed that this area is dirtier than others. The user may confirm, reject or alter the assigned priority.
If the robot is to be employed in a new, unknown area it may direct specific questions to the user in order to gather preliminary information about the area such as, for example, the size of the apartment (the area of robot operation) and the number of rooms. In particular, the user can inform the robot of any deviations from the usual sectoring of a living area or that it will be employed in a commercially used area, such as a floor of offices. This information allows the robot to adapt several parameters that are relevant for the sectoring of its area of operation (such as, e.g. the probability models used to make hypotheses) in order to be able to generate a better sectoring of the map during a following exploratory run and/or to assign the appropriate attributes (e.g. regarding the cleaning strategy) to the recognized zones.
For the interaction between a human user and the robot, information (such as, e.g., a map of the area of robot operation) can be displayed to the user via a human-machine interface (HMI), and user input for controlling the robot can be relayed back to it. An HMI may be implemented on a tablet computer (or a personal computer, a mobile telephone, etc.) by means of a software application. A robot-generated map is generally quite complex and difficult for an inexperienced user to interpret (cf. e.g.
A human user generally has no problem identifying in a floor plan display in accordance with
This simplified map of the area of robot operation may then be automatedly completed by adding elements that are easy for the user to identify such as inner walls, doors and prominent items of furniture, thus obtaining a simple floor plan of the apartment. The sensor data (e.g. the above mentioned boundary lines, see
When in daily use, a user places certain demands on the robot. He may require, for example, the cleaning of the entire apartment, the cleaning of one room of the apartment such as, for example, the living room (
In order that the user be able to easily navigate the robot generated map, first a greatly simplified floor plan, as shown in
Number | Date | Country | Kind |
---|---|---|---|
10 2015 119 501.1 | Nov 2015 | DE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/AT2016/060108 | 11/11/2016 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2017/079777 | 5/18/2017 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4674048 | Okumura | Jun 1987 | A |
4740676 | Satoh et al. | Apr 1988 | A |
4777416 | George, II et al. | Oct 1988 | A |
5109566 | Kobayashi et al. | May 1992 | A |
5260710 | Omamyuda et al. | Nov 1993 | A |
5284522 | Kobayashi et al. | Feb 1994 | A |
5377106 | Drunk et al. | Dec 1994 | A |
5402051 | Fujiwara et al. | Mar 1995 | A |
5696675 | Nakamura et al. | Dec 1997 | A |
5787545 | Colens | Aug 1998 | A |
5995884 | Allen et al. | Nov 1999 | A |
6366219 | Hoummady | Apr 2002 | B1 |
6389329 | Colens | May 2002 | B1 |
6532404 | Colens | Mar 2003 | B2 |
6594844 | Jones | Jul 2003 | B2 |
6605156 | Clark et al. | Aug 2003 | B1 |
6615108 | Peless et al. | Sep 2003 | B1 |
6667592 | Jacobs et al. | Dec 2003 | B2 |
6690134 | Jones et al. | Feb 2004 | B1 |
6764373 | Osawa et al. | Jul 2004 | B1 |
6781338 | Jones et al. | Aug 2004 | B2 |
6809490 | Jones et al. | Oct 2004 | B2 |
6965209 | Jones et al. | Nov 2005 | B2 |
6972834 | Oka et al. | Dec 2005 | B1 |
7155308 | Jones | Dec 2006 | B2 |
7173391 | Jones et al. | Feb 2007 | B2 |
7196487 | Jones et al. | Mar 2007 | B2 |
7302345 | Kwon et al. | Nov 2007 | B2 |
7388343 | Jones et al. | Jun 2008 | B2 |
7389156 | Ziegler et al. | Jun 2008 | B2 |
7448113 | Jones et al. | Nov 2008 | B2 |
7483151 | Zganec et al. | Jan 2009 | B2 |
7507948 | Park et al. | Mar 2009 | B2 |
7539557 | Yamauchi | May 2009 | B2 |
7571511 | Jones et al. | Aug 2009 | B2 |
7636982 | Jones et al. | Dec 2009 | B2 |
7656541 | Waslowski et al. | Feb 2010 | B2 |
7761954 | Ziegler et al. | Jul 2010 | B2 |
7801676 | Kurosawa et al. | Sep 2010 | B2 |
8438695 | Gilbert et al. | May 2013 | B2 |
8594019 | Misumi | Nov 2013 | B2 |
8739355 | Morse et al. | Jun 2014 | B2 |
8855914 | Alexander et al. | Oct 2014 | B1 |
8892251 | Dooley et al. | Nov 2014 | B1 |
8921752 | Iizuka | Dec 2014 | B2 |
8982217 | Hickman | Mar 2015 | B1 |
9002511 | Hickerson et al. | Apr 2015 | B1 |
9026302 | Stout et al. | May 2015 | B2 |
9037294 | Chung et al. | May 2015 | B2 |
9043017 | Jung et al. | May 2015 | B2 |
9149170 | Ozick et al. | Oct 2015 | B2 |
9220386 | Gilbert, Jr. et al. | Dec 2015 | B2 |
9298183 | Artés et al. | Mar 2016 | B2 |
9486924 | Dubrovsky et al. | Nov 2016 | B2 |
9717387 | Szatmary et al. | Aug 2017 | B1 |
10228697 | Yoshino | Mar 2019 | B2 |
10661433 | Angle | May 2020 | B2 |
20020016649 | Jones | Feb 2002 | A1 |
20020103575 | Sugawara | Aug 2002 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030030398 | Jacobs et al. | Feb 2003 | A1 |
20030034441 | Kang et al. | Feb 2003 | A1 |
20030142925 | Melchior et al. | Jul 2003 | A1 |
20040020000 | Jones | Feb 2004 | A1 |
20040049877 | Jones et al. | Mar 2004 | A1 |
20040187457 | Colens | Sep 2004 | A1 |
20040207355 | Jones et al. | Oct 2004 | A1 |
20050000543 | Taylor et al. | Jan 2005 | A1 |
20050010331 | Taylor et al. | Jan 2005 | A1 |
20050041839 | Saitou et al. | Feb 2005 | A1 |
20050067994 | Jones et al. | Mar 2005 | A1 |
20050156562 | Cohen et al. | Jul 2005 | A1 |
20050171636 | Tani | Aug 2005 | A1 |
20050171644 | Tani | Aug 2005 | A1 |
20050204717 | Colens | Sep 2005 | A1 |
20050212680 | Uehigashi | Sep 2005 | A1 |
20050256610 | Orita | Nov 2005 | A1 |
20060012493 | Karlsson et al. | Jan 2006 | A1 |
20060020369 | Taylor | Jan 2006 | A1 |
20060095158 | Lee et al. | May 2006 | A1 |
20060237634 | Kim | Oct 2006 | A1 |
20070027579 | Suzuki | Feb 2007 | A1 |
20070061041 | Zweig | Mar 2007 | A1 |
20070234492 | Svendsen et al. | Oct 2007 | A1 |
20070266508 | Jones et al. | Nov 2007 | A1 |
20070282484 | Chung | Dec 2007 | A1 |
20080046125 | Myeong et al. | Feb 2008 | A1 |
20080140255 | Ziegler et al. | Jun 2008 | A1 |
20080155768 | Ziegler et al. | Jul 2008 | A1 |
20080192256 | Wolf et al. | Aug 2008 | A1 |
20080307590 | Jones et al. | Dec 2008 | A1 |
20090048727 | Hong et al. | Feb 2009 | A1 |
20090051921 | Masahiko | Feb 2009 | A1 |
20090143912 | Wang | Jun 2009 | A1 |
20090177320 | Lee et al. | Jul 2009 | A1 |
20090182464 | Myeong et al. | Jul 2009 | A1 |
20090281661 | Dooley et al. | Nov 2009 | A1 |
20100030380 | Shah et al. | Feb 2010 | A1 |
20100049365 | Jones et al. | Feb 2010 | A1 |
20100082193 | Chiappetta | Apr 2010 | A1 |
20100257690 | Jones et al. | Oct 2010 | A1 |
20100257691 | Jones et al. | Oct 2010 | A1 |
20100263158 | Jones et al. | Oct 2010 | A1 |
20100324731 | Letsky | Dec 2010 | A1 |
20100324736 | Yoo et al. | Dec 2010 | A1 |
20110054689 | Nielsen et al. | Mar 2011 | A1 |
20110137461 | Kong | Jun 2011 | A1 |
20110194755 | Jeong et al. | Aug 2011 | A1 |
20110211731 | Lee et al. | Sep 2011 | A1 |
20110224824 | Lee et al. | Sep 2011 | A1 |
20110231018 | Iwai | Sep 2011 | A1 |
20110236026 | Yoo et al. | Sep 2011 | A1 |
20110238214 | Yoo et al. | Sep 2011 | A1 |
20110264305 | Choe et al. | Oct 2011 | A1 |
20110278082 | Chung et al. | Nov 2011 | A1 |
20110295420 | Wagner | Dec 2011 | A1 |
20120008128 | Bamji | Jan 2012 | A1 |
20120013907 | Jung et al. | Jan 2012 | A1 |
20120022785 | DiBernardo et al. | Jan 2012 | A1 |
20120060320 | Lee et al. | Mar 2012 | A1 |
20120069457 | Wolf et al. | Mar 2012 | A1 |
20120169497 | Schmittman et al. | Jul 2012 | A1 |
20120173070 | Schmittman | Jul 2012 | A1 |
20120215380 | Fouillade et al. | Aug 2012 | A1 |
20120223216 | Flaherty et al. | Sep 2012 | A1 |
20120265370 | Kim et al. | Oct 2012 | A1 |
20120271502 | Lee | Oct 2012 | A1 |
20120283905 | Nakano et al. | Nov 2012 | A1 |
20130001398 | Wada et al. | Jan 2013 | A1 |
20130024025 | Hsu | Jan 2013 | A1 |
20130166134 | Shitamoto | Jun 2013 | A1 |
20130206177 | Burlutskiy | Aug 2013 | A1 |
20130217421 | Kim | Aug 2013 | A1 |
20130221908 | Tang | Aug 2013 | A1 |
20130261867 | Burnett et al. | Oct 2013 | A1 |
20130265562 | Tang et al. | Oct 2013 | A1 |
20130317944 | Huang et al. | Nov 2013 | A1 |
20140005933 | Fong et al. | Jan 2014 | A1 |
20140098218 | Wu et al. | Apr 2014 | A1 |
20140100693 | Fong et al. | Apr 2014 | A1 |
20140115797 | Duenne | May 2014 | A1 |
20140124004 | Rosenstein et al. | May 2014 | A1 |
20140128093 | Das | May 2014 | A1 |
20140156125 | Song et al. | Jun 2014 | A1 |
20140207280 | Duffley et al. | Jul 2014 | A1 |
20140207281 | Angle et al. | Jul 2014 | A1 |
20140207282 | Angle et al. | Jul 2014 | A1 |
20140218517 | Kim et al. | Aug 2014 | A1 |
20140257563 | Park et al. | Sep 2014 | A1 |
20140257564 | Sun et al. | Sep 2014 | A1 |
20140257565 | Sun et al. | Sep 2014 | A1 |
20140303775 | Oh et al. | Oct 2014 | A1 |
20140316636 | Hong et al. | Oct 2014 | A1 |
20140324270 | Chan et al. | Oct 2014 | A1 |
20140343783 | Lee | Nov 2014 | A1 |
20150006289 | Jakobson | Jan 2015 | A1 |
20150115138 | Heng et al. | Apr 2015 | A1 |
20150115876 | Noh et al. | Apr 2015 | A1 |
20150120056 | Noh et al. | Apr 2015 | A1 |
20150151646 | Noiri | Jun 2015 | A1 |
20150166060 | Smith | Jun 2015 | A1 |
20150168954 | Hickerson et al. | Jun 2015 | A1 |
20150173578 | Kim et al. | Jun 2015 | A1 |
20150202772 | Kim | Jul 2015 | A1 |
20150212520 | Artes et al. | Jul 2015 | A1 |
20150223659 | Han et al. | Aug 2015 | A1 |
20150260829 | Wada | Sep 2015 | A1 |
20150265125 | Lee et al. | Sep 2015 | A1 |
20150269823 | Yamanishi et al. | Sep 2015 | A1 |
20150314453 | Witelson et al. | Nov 2015 | A1 |
20150367513 | Gettings et al. | Dec 2015 | A1 |
20160008982 | Artes et al. | Jan 2016 | A1 |
20160026185 | Artes et al. | Jan 2016 | A1 |
20160037983 | Hillen et al. | Feb 2016 | A1 |
20160041029 | T'ng et al. | Feb 2016 | A1 |
20160066759 | Miele et al. | Mar 2016 | A1 |
20160103451 | Vicenti | Apr 2016 | A1 |
20160123618 | Hester et al. | May 2016 | A1 |
20160132056 | Yoshino | May 2016 | A1 |
20160150933 | Duenne et al. | Jun 2016 | A1 |
20160165795 | Balutis et al. | Jun 2016 | A1 |
20160166126 | Morin et al. | Jun 2016 | A1 |
20160200161 | Van Den Bossche et al. | Jul 2016 | A1 |
20160209217 | Babu et al. | Jul 2016 | A1 |
20160213218 | Ham et al. | Jul 2016 | A1 |
20160214258 | Yan | Jul 2016 | A1 |
20160229060 | Kim et al. | Aug 2016 | A1 |
20160271795 | Vicenti | Sep 2016 | A1 |
20160282873 | Masaki et al. | Sep 2016 | A1 |
20160297072 | Williams et al. | Oct 2016 | A1 |
20160298970 | Lindhe et al. | Oct 2016 | A1 |
20170001311 | Bushman et al. | Jan 2017 | A1 |
20170083022 | Tang | Mar 2017 | A1 |
20170147000 | Hoennige et al. | May 2017 | A1 |
20170164800 | Arakawa | Jun 2017 | A1 |
20170177001 | Cao et al. | Jun 2017 | A1 |
20170197314 | Stout et al. | Jul 2017 | A1 |
20170231452 | Saito et al. | Aug 2017 | A1 |
20170273528 | Watanabe | Sep 2017 | A1 |
20170364087 | Tang et al. | Dec 2017 | A1 |
20180004217 | Biber et al. | Jan 2018 | A1 |
20180074508 | Kleiner et al. | Mar 2018 | A1 |
20200348666 | Han | Nov 2020 | A1 |
Number | Date | Country |
---|---|---|
2015322263 | Apr 2017 | AU |
2322419 | Sep 1999 | CA |
1381340 | Nov 2002 | CN |
1696612 | Nov 2005 | CN |
101920498 | Dec 2010 | CN |
101945325 | Jan 2011 | CN |
101972129 | Feb 2011 | CN |
102407522 | Apr 2012 | CN |
102738862 | Oct 2012 | CN |
102866706 | Jan 2013 | CN |
103885444 | Jun 2014 | CN |
203672362 | Jun 2014 | CN |
104115082 | Oct 2014 | CN |
104460663 | Mar 2015 | CN |
104634601 | May 2015 | CN |
104765362 | Jul 2015 | CN |
105045098 | Nov 2015 | CN |
105334847 | Feb 2016 | CN |
105467398 | Apr 2016 | CN |
105527619 | Apr 2016 | CN |
105593775 | May 2016 | CN |
105990876 | Oct 2016 | CN |
4421805 | Aug 1995 | DE |
10204223 | Aug 2003 | DE |
10261787 | Jan 2004 | DE |
60002209 | Mar 2004 | DE |
69913150 | Aug 2004 | DE |
102007016802 | May 2008 | DE |
102008028931 | Jun 2008 | DE |
102008014912 | Sep 2009 | DE |
102009059217 | Feb 2011 | DE |
102009041362 | Mar 2011 | DE |
102009052629 | May 2011 | DE |
102010000174 | Jul 2011 | DE |
102010000317 | Aug 2011 | DE |
102010000607 | Sep 2011 | DE |
102010017211 | Dec 2011 | DE |
102010017689 | Jan 2012 | DE |
102010033768 | Feb 2012 | DE |
102011050357 | Feb 2012 | DE |
102012201870 | Aug 2012 | DE |
102011006062 | Sep 2012 | DE |
102011051729 | Jan 2013 | DE |
102012211071 | Nov 2013 | DE |
102012105608 | Jan 2014 | DE |
102012109004 | Mar 2014 | DE |
202014100346 | Mar 2014 | DE |
102012112035 | Jun 2014 | DE |
102012112036 | Jun 2014 | DE |
102013100192 | Jul 2014 | DE |
102014110265 | Jul 2014 | DE |
102014113040 | Sep 2014 | DE |
102013104399 | Oct 2014 | DE |
102013104547 | Nov 2014 | DE |
102015006014 | May 2015 | DE |
102014012811 | Oct 2015 | DE |
102015119501 | Nov 2015 | DE |
102014110104 | Jan 2016 | DE |
102016102644 | Feb 2016 | DE |
102016114594 | Feb 2018 | DE |
102016125319 | Jun 2018 | DE |
142594 | May 1985 | EP |
0402764 | Dec 1990 | EP |
0769923 | May 1997 | EP |
1062524 | Dec 2000 | EP |
1342984 | Sep 2003 | EP |
1533629 | May 2005 | EP |
1553536 | Jul 2005 | EP |
1557730 | Jul 2005 | EP |
1621948 | Feb 2006 | EP |
1942313 | Jul 2008 | EP |
1947477 | Jul 2008 | EP |
1983396 | Oct 2008 | EP |
2027806 | Feb 2009 | EP |
2053417 | Apr 2009 | EP |
2078996 | Jul 2009 | EP |
2287697 | Feb 2011 | EP |
2327957 | Jun 2011 | EP |
1941411 | Sep 2011 | EP |
2407847 | Jan 2012 | EP |
2450762 | May 2012 | EP |
2457486 | May 2012 | EP |
2498158 | Sep 2012 | EP |
2502539 | Sep 2012 | EP |
2511782 | Oct 2012 | EP |
2515196 | Oct 2012 | EP |
2571660 | Mar 2013 | EP |
2573639 | Mar 2013 | EP |
2595024 | May 2013 | EP |
2740013 | Jun 2014 | EP |
2741159 | Jun 2014 | EP |
2853976 | Apr 2015 | EP |
2752726 | May 2015 | EP |
2870852 | May 2015 | EP |
3079030 | Nov 2015 | EP |
3156873 | Apr 2017 | EP |
3184013 | Jun 2017 | EP |
2509989 | Jul 2014 | GB |
2509990 | Jul 2014 | GB |
2509991 | Jul 2014 | GB |
2513912 | Nov 2014 | GB |
H04338433 | Nov 1992 | JP |
2001125641 | May 2001 | JP |
2002085305 | Mar 2002 | JP |
2003330543 | Nov 2003 | JP |
2003345437 | Dec 2003 | JP |
2004133882 | Apr 2004 | JP |
2005205028 | Aug 2005 | JP |
2008047095 | Feb 2008 | JP |
2008140159 | Jun 2008 | JP |
2009509673 | Mar 2009 | JP |
2009123045 | Jun 2009 | JP |
2009238055 | Oct 2009 | JP |
2009301247 | Dec 2009 | JP |
2010066932 | Mar 2010 | JP |
2010227894 | Oct 2010 | JP |
2012011200 | Jan 2012 | JP |
2013077088 | Apr 2013 | JP |
2013146302 | Aug 2013 | JP |
2013537651 | Oct 2013 | JP |
2014176260 | Sep 2014 | JP |
201541203 | Mar 2015 | JP |
2015536489 | Dec 2015 | JP |
2016192040 | Nov 2016 | JP |
2016201095 | Dec 2016 | JP |
20060081131 | Jul 2006 | KR |
100735565 | Jul 2007 | KR |
100815545 | Mar 2008 | KR |
1020090013523 | Feb 2009 | KR |
20110092158 | Aug 2011 | KR |
20140073854 | Jun 2014 | KR |
20140145648 | Dec 2014 | KR |
20150009413 | Jan 2015 | KR |
20150050161 | May 2015 | KR |
20150086075 | Jul 2015 | KR |
20150124011 | Nov 2015 | KR |
20150124013 | Nov 2015 | KR |
20150124014 | Nov 2015 | KR |
20150127937 | Nov 2015 | KR |
101640706 | Jul 2016 | KR |
20160097051 | Aug 2016 | KR |
9523346 | Aug 1995 | WO |
9928800 | Jun 1999 | WO |
200004430 | Jan 2000 | WO |
2005074362 | Aug 2005 | WO |
2007028667 | Mar 2007 | WO |
2011074165 | Jun 2011 | WO |
2012099694 | Jul 2012 | WO |
2012157951 | Nov 2012 | WO |
2013116887 | Aug 2013 | WO |
2014017256 | Jan 2014 | WO |
2014043732 | Mar 2014 | WO |
2014055966 | Apr 2014 | WO |
2014113091 | Jul 2014 | WO |
2014138472 | Sep 2014 | WO |
2015018437 | Feb 2015 | WO |
2015025599 | Feb 2015 | WO |
2015072897 | May 2015 | WO |
2015082017 | Jun 2015 | WO |
2015090398 | Jun 2015 | WO |
2015158240 | Oct 2015 | WO |
2015181995 | Dec 2015 | WO |
2016019996 | Feb 2016 | WO |
2016027957 | Feb 2016 | WO |
2016028021 | Feb 2016 | WO |
2016031702 | Mar 2016 | WO |
2016048077 | Mar 2016 | WO |
2016050215 | Apr 2016 | WO |
2016091312 | Jun 2016 | WO |
2016095966 | Jun 2016 | WO |
Entry |
---|
English translation of reference EP2407847A2 (Year: 2012). |
English translation of reference KR20090013523 (Year: 2009). |
English translation of reference KR20060081131 (Year: 2006). |
English translation of reference JP2003345437 (Year: 2003). |
Patent Cooperation Treaty, “International Search Report,” and English-language translation thereof issued in International Application No. PCT/AT2016/060108, by European Searching Authority, document of 13 pages, dated Mar. 2, 2017. |
Thrun et al., “Probabilistic Robotics”; 1999, 492 pages. |
Choset et al., “Principles Of Robot Motion”, Theory, Algorithms, and Implementations, Chapter 6—Cell Decompositions, 2004, document of 41 pages. |
Konolige et al., “A Low-Cost Laser Distance Sensor,” 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008, document of 7 pages. |
Oh et al., “Autonomous Battery Recharging for Indoor Mobile Robots,” Massachusetts Institute Of Technology Press, Aug. 30, 2000, document of 6 pages, XP055321836. |
Siegwart, “Introduction to Autonomous Mobile Robots”, Massachusetts, ISBN 978-0-26-219502-7, (2004), pp. 104-115, 151-163, 250-251, document of 37 pages. http://www.robotee.com/EBooks/Introduction_to_Autonomous_Mobile_Robots.pdf, XP055054850. |
Lymberopoulos et al., “A Realistic Evaluation and Comparison of Indoor Location Technologies: Experiences and Lessons Learned,” IPSN '15, Apr. 14-16, 2015, Seattle, WA, USA, document of 12 pages. http://dx.doi.org/10.1145/2737095.27. |
Durrant-Whyte et al., “Simultaneous Localization and Mapping (SLAM): Part I The Essential Algorithms”, in: IEEE Robotics and Automation Magazine, vol. 13, No. 2, pp. 99-108, Jun. 2006. |
Mahyuddin et al., “Neuro-fuzzy algorithm implemented in Altera's FPGA for mobile robot's obstacle avoidance mission”, TENCON 2009—2009 IEEE Region 10 Conference, IEEE, Piscataway, NJ, USA, Jan. 23, 2009; document of 6 pages. |
Kim et al., “User-Centered Approach to Path Planning of Cleaning Robots: Analyzing User's Cleaning Behavior.” Proceedings of the 2007 ACM/IEEE Conference on Human-Robot Interaction, Mar. 8-11, 2007, pp. 373-380. |
Neto et al., “Human-Machine Interface Based on Electro-Biological Signals for Mobile Vehicles,” 2006, IEEE, pp. 2954-2959 (Year: 2006). |
Forlizzi, “How robotic products become social products: An ethnographic study of cleaning in the home,” 2007, IEEE, pp. 129-136 (Year: 2007). |
Sick Sensor Intelligence, “LMS200/211/221/291 Laser Measurement Systems”, Jan. 2007, pp. 1-48, XP055581229, http://sicktoolbox.sourceforge.net/docs/sick-lms-technical-description.pdf. |
Vasquez-Gomez et al., “View planning for 3D object reconstruction with a mobile manipulator robot,” 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sep. 14, 2014, IEEE, pp. 4227-4423. |
Office Action issued in Japanese Patent Application No. 2021-209429 dated Feb. 3, 2023, with translation. |
Number | Date | Country | |
---|---|---|---|
20190025838 A1 | Jan 2019 | US |