The present disclosure relates to an autonomous mobile cleaning apparatus, a cleaning method, and a recording medium.
Japanese Unexamined Patent Application Publication No. 2006-277121 discloses a movement path creation apparatus that creates a movement path in accordance with a movement area. Specifically, the movement path creation apparatus creates a movement path using information about an area across which a mobile robot is unable to move.
Japanese Unexamined Patent Application Publication No. 2006-277121 thus discloses a technique for creating a movement path using information about an area across which the mobile robot is unable to move.
However, it does not disclose a technique for creating a movement path that takes into consideration the course of movement to an area in which the mobile robot has difficulty in moving.
One non-limiting and exemplary embodiment provides an autonomous mobile cleaning apparatus, a cleaning method, and a recording medium with which movement modes that take into consideration the order of cleaning of a target object having a possibility of creating a stuck state and cleaning of a portion other than the target object can be generated and provided to a user.
In one general aspect, the techniques disclosed here feature an autonomous mobile cleaning apparatus including: a main body; a suction unit included in the main body; a driver included in the main body and driving movement of the main body; a controller included in the main body; and a display included in the main body. The controller (a) obtains information about a first target object having a possibility of putting the main body into a stuck state, and the controller (b) accepts information indicating that the first target object is to be set as a cleaning target object, and thereafter, causes the display to display a first display screen that allows selection of a first movement mode or a second movement mode. The autonomous mobile cleaning apparatus is caused to climb over the first target object in a first process, the autonomous mobile cleaning apparatus is caused to clean a first area except for the first target object in a second process, the second process is performed prior to the first process in the first movement mode, and the first process is performed prior to the second process in the second movement mode.
According to the present disclosure, movement modes that take into consideration the order of cleaning of a target object having a possibility of creating a stuck state and cleaning of a portion other than the target object can be generated and provided to a user.
It should be noted that general or specific embodiments may be implemented as a system, a method, an integrated circuit, a computer program, a computer-readable recording medium, or any selective combination thereof. Examples of a computer-readable recording medium include a non-volatile recording medium, such as a compact disc read-only memory (CD-ROM).
Additional benefits and advantages of the disclosed embodiments will become apparent from the specification and drawings. The benefits and/or advantages may be individually obtained by the various embodiments and features of the specification and drawings, which need not all be provided in order to obtain one or more of such benefits and/or advantages.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings.
Before detailed descriptions of embodiments of the present disclosure are given with reference to the drawings, various aspects of the present disclosure are described below.
A first aspect of the present disclosure provides an autonomous mobile cleaning apparatus including: a main body; a suction unit included in the main body; a driver included in the main body and driving movement of the main body; a controller included in the main body; and a display included in the main body. The controller (a) obtains information about a first target object having a possibility of putting the main body into a stuck state, and the controller (b) accepts information indicating that the first target object is to be set as a cleaning target object, and thereafter, causes the display to display a first display screen that allows selection of a first movement mode or a second movement mode. The autonomous mobile cleaning apparatus is caused to climb over the first target object in a first process, the autonomous mobile cleaning apparatus is caused to clean a first area except for the first target object in a second process, the second process is performed prior to the first process in the first movement mode, and the first process is performed prior to the second process in the second movement mode.
According to the first aspect, movement modes that take into consideration the order of cleaning of a target object having a possibility of creating a stuck state and cleaning of a portion other than the target object can be generated and provided to a user.
A second aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to the first aspect, the autonomous mobile cleaning apparatus further including a memory that stores the information about the first target object. In the autonomous mobile cleaning apparatus, the controller obtains the information about the first target object from the memory.
A third aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to the first or second aspect, in which (c) after obtaining the information about the first target object and before accepting the information indicating that the first target object is to be set as a cleaning target object, the controller causes the display to display a second display screen that allows selection of whether to set the first target object as a cleaning target object, and in a state where the second display screen is displayed, the controller accepts information indicating that the first target object is to be set or is not to be set as a cleaning target object.
A fourth aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to the first or second aspect, in which (d1) in a case where the controller accepts selection of the first movement mode in a state where the first display screen is displayed, the controller causes the main body to move in accordance with the first movement mode, and (d2) in a case where the controller accepts selection of the second movement mode in a state where the first display screen is displayed, the controller causes the main body to move in accordance with the second movement mode.
A fifth aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to any one of the first to fourth aspects, the autonomous mobile cleaning apparatus further including a camera included in the main body. In the autonomous mobile cleaning apparatus, the camera obtains a camera image including information about the surroundings of the main body, and in (a), the controller obtains the information about the first target object on the basis of the camera image.
A sixth aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to any one of the first to fifth aspects, the autonomous mobile cleaning apparatus further including a first sensor included in the main body. In the autonomous mobile cleaning apparatus, in (a), the controller obtains the information about the first target object on the basis of information about objects detected by the first sensor.
A seventh aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to any one of the first to sixth aspects, the autonomous mobile cleaning apparatus further including a second sensor that detects the stuck state. In the autonomous mobile cleaning apparatus, in (a), the controller obtains the information about the first target object on the basis of information about the stuck state detected by the second sensor.
An eighth aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to any one of the first to sixth aspects, the autonomous mobile cleaning apparatus further including a second sensor that detects the stuck state. In the autonomous mobile cleaning apparatus, (e1) the controller controls the driver to drive the main body on the basis of selection of the second movement mode, and (e2) in a case where the second sensor detects the stuck state, and thereafter, the controller detects the main body exiting the stuck state, the controller changes the second movement mode to the first movement mode and controls the driver to drive the main body.
A ninth aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to any one of the first to sixth aspects, the autonomous mobile cleaning apparatus further including a second sensor that detects the stuck state. In the autonomous mobile cleaning apparatus, (e1) the controller controls the driver to drive the main body on the basis of selection of the second movement mode, and (e2) in a case where the second sensor detects the stuck state, and thereafter, the controller detects the main body exiting the stuck state, the controller causes the display to display the first display screen.
A tenth aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to any one of the first to sixth aspects, the autonomous mobile cleaning apparatus further including a second sensor that detects the stuck state. In the autonomous mobile cleaning apparatus, (f1) the controller accepts selection of the second movement mode in a state where the first display screen is displayed, (f2) the controller selects one operation control mode from among operation control modes as a first operation control mode and controls the driver in accordance with the first operation control mode and the second movement mode to drive the main body, and (f3) in a case where the second sensor detects the stuck state, and thereafter, the controller detects the main body exiting the stuck state, the controller selects an operation control mode different from the first operation control mode as a second operation control mode from among the operation control modes and controls the driver in accordance with the second operation control mode and the second movement mode to drive the main body, and the second operation control mode is different from the first operation control mode in a movement speed, a movement direction, or rotation or stop of a side brush of the main body.
An eleventh aspect of the present disclosure provides the autonomous mobile cleaning apparatus according to the tenth aspect, in which (f4) in a case where the controller drives the main body in accordance with the second operation control mode and the second movement mode, and thereafter, the second sensor detects the stuck state, the controller changes the second movement mode to the first movement mode and controls the driver to drive the main body.
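As a non-limiting illustration, the recovery flow of the eighth to eleventh aspects can be sketched as follows. The sketch assumes hypothetical names (OperationControlMode, drive_second_mode, and the stuck-event inputs); the aspects define only the behavior, not an implementation.

```python
from dataclasses import dataclass

FIRST_MODE = "first"    # clean the area first, then climb over the target object
SECOND_MODE = "second"  # climb over the target object first, then clean the area

@dataclass
class OperationControlMode:
    speed: float          # movement speed
    heading_deg: float    # movement direction
    side_brush_on: bool   # rotation or stop of the side brush

def drive_second_mode(stuck_events, control_modes):
    """Return the (movement mode, control mode) pairs attempted, in order.

    stuck_events[i] is True if attempt i ends in a stuck state that the
    apparatus later exits (for example, because the user lifts it free).
    """
    attempts = []
    for i, control in enumerate(control_modes):
        attempts.append((SECOND_MODE, control))   # (f2)/(f3): try this control mode
        if i >= len(stuck_events) or not stuck_events[i]:
            return attempts                       # this attempt did not get stuck
    # (f4)/(e2): every operation control mode led to a stuck state,
    # so change the second movement mode to the first movement mode.
    attempts.append((FIRST_MODE, control_modes[0]))
    return attempts

# Example: both control modes get stuck, so the fallback to the first mode is used.
modes = [OperationControlMode(0.3, 0.0, True),
         OperationControlMode(0.1, 15.0, False)]
print(drive_second_mode([True, True], modes))
```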
A twelfth aspect of the present disclosure provides a cleaning method for an autonomous mobile cleaning apparatus, the method including: (a) obtaining information about a first target object having a possibility of putting a main body included in the autonomous mobile cleaning apparatus into a stuck state, the main body including a suction unit; and (b) accepting information indicating that the first target object is to be set as a cleaning target object, and thereafter, causing a display included in the autonomous mobile cleaning apparatus to display a first display screen that allows selection of a first movement mode or a second movement mode. The autonomous mobile cleaning apparatus is caused to climb over the first target object in a first process, the autonomous mobile cleaning apparatus is caused to clean a first area except for the first target object in a second process, the second process is performed prior to the first process in the first movement mode, and the first process is performed prior to the second process in the second movement mode.
According to the twelfth aspect, movement modes that take into consideration the order of cleaning of a target object having a possibility of creating a stuck state and cleaning of a portion other than the target object can be generated and provided to a user.
A thirteenth aspect of the present disclosure provides the cleaning method according to the twelfth aspect, in which the information about the first target object is obtained from a memory included in the autonomous mobile cleaning apparatus.
A fourteenth aspect of the present disclosure provides the cleaning method according to the twelfth or thirteenth aspect, further including (c) after obtaining the information about the first target object and before accepting the information indicating that the first target object is to be set as a cleaning target object, causing the display to display a second display screen that allows selection of whether to set the first target object as a cleaning target object. In the method, in a state where the second display screen is displayed, information indicating that the first target object is to be set or is not to be set as a cleaning target object is accepted.
A fifteenth aspect of the present disclosure provides the cleaning method according to the twelfth or thirteenth aspect, further including: (d1) in a case of accepting selection of the first movement mode in a state where the first display screen is displayed, moving the main body in accordance with the first movement mode; and (d2) in a case of accepting selection of the second movement mode in a state where the first display screen is displayed, moving the main body in accordance with the second movement mode.
A sixteenth aspect of the present disclosure provides the cleaning method according to any one of the twelfth to fifteenth aspects, in which, in (a), the information about the first target object is obtained on the basis of a camera image obtained by a camera included in the main body.
A seventeenth aspect of the present disclosure provides the cleaning method according to any one of the twelfth to sixteenth aspects, in which, in (a), the information about the first target object is obtained on the basis of information about objects detected by a first sensor included in the main body.
An eighteenth aspect of the present disclosure provides the cleaning method according to any one of the twelfth to seventeenth aspects, in which, in (a), the information about the first target object is obtained on the basis of information about the stuck state detected by a second sensor that detects the stuck state.
A nineteenth aspect of the present disclosure provides the cleaning method according to any one of the twelfth to seventeenth aspects, further including: (e1) controlling a driver to drive the main body on the basis of selection of the second movement mode, the driver being included in the main body and driving movement of the main body; and (e2) in a case of detecting the main body exiting the stuck state after detection of the stuck state by a second sensor, changing the second movement mode to the first movement mode and controlling the driver to drive the main body, the second sensor being included in the autonomous mobile cleaning apparatus and detecting the stuck state.
A twentieth aspect of the present disclosure provides the cleaning method according to any one of the twelfth to seventeenth aspects, further including: (e1) controlling a driver to drive the main body on the basis of selection of the second movement mode, the driver being included in the main body and driving movement of the main body; and (e2) in a case of detecting the main body exiting the stuck state after detection of the stuck state by a second sensor, causing the display to display the first display screen, the second sensor being included in the autonomous mobile cleaning apparatus and detecting the stuck state.
A twenty-first aspect of the present disclosure provides the cleaning method according to any one of the twelfth to seventeenth aspects, further including: (f1) accepting selection of the second movement mode in a state where the first display screen is displayed; (f2) selecting one operation control mode from among operation control modes as a first operation control mode and controlling a driver on the basis of the first operation control mode and the second movement mode to drive the main body, the driver being included in the main body and driving movement of the main body; and (f3) in a case of detecting the main body exiting the stuck state after detection of the stuck state by a second sensor, selecting an operation control mode different from the first operation control mode as a second operation control mode from among the operation control modes and controlling the driver in accordance with the second operation control mode and the second movement mode to drive the main body, the second sensor being included in the autonomous mobile cleaning apparatus and detecting the stuck state. In the method, the second operation control mode is different from the first operation control mode in a movement speed, a movement direction, or rotation or stop of a side brush of the main body.
A twenty-second aspect of the present disclosure provides the cleaning method according to the twenty-first aspect, further including (f4) in a case of detection of the stuck state by the second sensor after driving the main body in accordance with the second operation control mode and the second movement mode, changing the second movement mode to the first movement mode and controlling the driver to drive the main body.
A twenty-third aspect of the present disclosure provides a non-transitory computer-readable recording medium storing a program for causing a device including a processor to execute processing, the processing being a cleaning method for an autonomous mobile cleaning apparatus, the method including: (a) obtaining information about a first target object having a possibility of putting a main body included in the autonomous mobile cleaning apparatus into a stuck state, the main body including a suction unit; and (b) accepting information indicating that the first target object is to be set as a cleaning target object, and thereafter, causing a display included in the autonomous mobile cleaning apparatus to display a first display screen that allows selection of a first movement mode or a second movement mode. The autonomous mobile cleaning apparatus is caused to climb over the first target object in a first process, the autonomous mobile cleaning apparatus is caused to clean a first area except for the first target object in a second process, the second process is performed prior to the first process in the first movement mode, and the first process is performed prior to the second process in the second movement mode.
According to the twenty-third aspect, movement modes that take into consideration the order of cleaning of a target object having a possibility of creating a stuck state and cleaning of a portion other than the target object can be generated and provided to a user.
The movement path creation apparatus disclosed by Japanese Unexamined Patent Application Publication No. 2006-277121 obtains information about an area across which the mobile robot is unable to move and creates a movement path that does not include the area across which the mobile robot is unable to move. An area across which the mobile robot is unable to move is determined on the basis of information about whether or not the area includes a height level difference that the mobile robot is unable to climb over. Determination as to whether or not the area includes a height level difference that the mobile robot is unable to climb over is performed on the basis of a predetermined possible-height-level-difference attribute or a question asked to a user (see paragraphs [0040] and [0088] and FIG. 5 of Japanese Unexamined Patent Application Publication No. 2006-277121).
The present inventors think that, in a case where the mobile robot is an autonomous mobile cleaning apparatus, a movement path needs to be created not on the basis of whether the mobile robot is able to move across an area of interest but on the basis of whether the user wants to clean that area.
In a case where an area that the user wants to clean is an area in which a cleaning apparatus has difficulty in moving, the cleaning apparatus has a possibility of failing to move in the area. Here, the state where “a cleaning apparatus fails to move” means that the cleaning apparatus becomes unable to move (is stuck). More specifically, this state means that the cleaning apparatus attempts to move onto an area in which it has difficulty in moving, and as a result, catches on, for example, a target object located in the area and is stuck. Here, a “target object” is an object that is located within an area that the user wants to clean, and the upper surface of the target object is also an area that the user wants to clean. It is assumed that the cleaning apparatus usually climbs onto the target object, cleans the upper surface, and descends from the upper surface to the floor.
The stuck state is described below with reference to
Three examples of the stuck state are described below.
Another example case is assumed in which the rug 131b having the long threads 131a is placed as the target object 131, as in the example illustrated in
In a case where the condition in at least one of stuck state 1 to stuck state 3 as described above is satisfied, the cleaning apparatus 10 enters the “stuck state”.
As described above, an area in which the cleaning apparatus has difficulty in moving is also referred to as an “area with movement difficulties”.
In a case where the cleaning apparatus 10 attempts to move onto an area with movement difficulties, the user selects one movement method from among plural movement methods, the selection instruction is accepted, and the cleaning apparatus 10 attempts to move onto the area with the selected movement method. By repeating this operation, the cleaning apparatus 10 can find a movement method with which it successfully moves onto the area with movement difficulties.
In the stage where the cleaning apparatus 10 attempts to move onto an area with movement difficulties, there is a possibility that the cleaning apparatus 10 fails to move and enters a stuck state. In this case, for example, the user or a robot can pick the cleaning apparatus 10 up off the area with movement difficulties to assist the cleaning apparatus 10 in exiting the stuck state, and thereafter, the cleaning apparatus 10 can attempt to move onto the area with movement difficulties with another movement method.
However, a situation where the user or a robot always needs to stay in proximity to the cleaning apparatus 10 in order to assist the cleaning apparatus 10 in exiting a stuck state is not practical from the viewpoint of convenience.
Therefore, the present inventors have conceived a cleaning apparatus that is able to set a movement path suited to the situation of the user or robot by allowing the user to select from movement modes that take into consideration the point on the movement path at which the cleaning apparatus attempts to move onto an area with movement difficulties.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
The cleaning apparatus 10 illustrated in
The cleaning apparatus 10 further includes a dust box 60 and a controller 70 in the cleaning apparatus main body 20. The dust box 60 stores dust sucked by the suction unit 50. The controller 70 controls at least the drivers 30 and the suction unit 50. The controller 70 can control the cleaning unit 40.
The cleaning apparatus 10 further includes the wheels 33 and a power supply unit 80. The wheels 33 rotate by following rotational driving by the drivers 30. The power supply unit 80 supplies power to the drivers 30, the cleaning unit 40, the suction unit 50, and so on.
In
The drivers 30 are provided in a pair in the first embodiment. One driver 30 is disposed on each side, to the right and to the left, of the center in the width direction of the cleaning apparatus main body 20 in plan view. Hereinafter, the left driver 30 is sometimes referred to as a first driver, and the right driver 30 is sometimes referred to as a second driver. Note that the number of the drivers 30 is not limited to two and may be one or three or more. The details of the drivers 30 will be described below.
The cleaning apparatus main body 20 includes a lower casing 100 (see
Preferably, the planar shape of the cleaning apparatus main body 20 is a Reuleaux triangle, a Reuleaux polygon having a shape substantially the same as the shape of a Reuleaux triangle, or a Reuleaux triangle or a Reuleaux polygon having rounded vertices. Such a shape gives the cleaning apparatus main body 20 geometric features the same as or similar to those of a Reuleaux triangle. That is, a Reuleaux triangle is a shape of constant width, and therefore, can rotate within a rectangle having a constant width (that is, the length of the sides of an equilateral triangle inscribed in the Reuleaux triangle) in any direction while touching the rectangle. Accordingly, the cleaning apparatus main body 20 can draw a rectangular locus (that is, a substantially square locus). In the first embodiment, the cleaning apparatus main body 20 has a planar shape substantially the same as the shape of a Reuleaux triangle, as illustrated in
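The constant-width property relied on above is a standard geometric fact and can be stated concisely: for a Reuleaux triangle constructed on an equilateral triangle with side length $a$, each boundary arc is a circular arc of radius $a$ centered at the opposite vertex, so the width is

$$w(\theta) = a \quad \text{for every direction } \theta,$$

which is why the shape can rotate inside a square of side $a$ while remaining in contact with its sides.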
The cleaning apparatus main body 20 includes perimeter surfaces and vertex portions. In the first embodiment, the perimeter surfaces include a front surface 21 that is present on the forward side of the cleaning apparatus 10 (for example, the upper side in
In the first embodiment, the vertex portions include a right front vertex portion 23 that is defined by the front surface 21 and the right side surface 22, and a left front vertex portion 23 that is defined by the front surface 21 and the left side surface 22. The vertex portions may further include a back vertex portion 24 that is defined by the right side surface 22 and the left side surface 22. As illustrated in
The maximum width of the cleaning apparatus main body 20 is defined by the distance between the vertexes of the vertex portions of the cleaning apparatus main body 20. In the first embodiment, the maximum width of the cleaning apparatus main body 20 is defined by the right front vertex portion 23 and the left front vertex portion 23. In the example illustrated in, for example,
In the cleaning apparatus main body 20, a line segment W (hereinafter referred to as “maximum-width line segment W of the cleaning apparatus main body 20”) that connects the vertex of the right front vertex portion 23 with the vertex of the left front vertex portion 23 and the vicinity of the line segment W are referred to as “portion having the maximum width of the cleaning apparatus main body 20” or “maximum-width portion of the cleaning apparatus main body 20”. The terms “vicinity of the maximum-width line segment W of the cleaning apparatus main body 20” and “portion close to the maximum-width line segment W of the cleaning apparatus main body 20” indicate a portion close to the maximum-width line segment W of the cleaning apparatus main body 20, that is, a portion between the maximum-width line segment W of the cleaning apparatus main body 20 and the center of gravity G (see
Preferably, the maximum-width portion of the cleaning apparatus main body 20 is positioned close to the front surface 21 of the cleaning apparatus main body 20. Preferably, the direction in which the maximum-width line segment W of the cleaning apparatus main body 20 extends is set so as to be substantially orthogonal to the forward direction of the cleaning apparatus main body 20.
As illustrated in
The suction inlet 101 is formed in a portion close to the portion having the maximum width of the cleaning apparatus main body 20, and more preferably, in the portion close to the maximum-width line segment W of the cleaning apparatus main body 20 on the bottom surface of the lower casing 100 of the cleaning apparatus main body 20. This positional relationship is more specifically defined by the positional relationship between the suction inlet 101 and other constituent elements and so on of the cleaning apparatus 10 and, for example, is defined by one or both of the following two types of positional relationships.
The first positional relationship is such that the suction inlet 101 is located closer to the front side of the cleaning apparatus main body 20 than the center of gravity G (see
The second positional relationship is such that the suction inlet 101 is located in a portion closer to the maximum-width line segment W of the cleaning apparatus main body 20 than the drivers 30, preferably, on the maximum-width line segment W of the cleaning apparatus main body 20 or in the vicinity of the maximum-width line segment W, and more preferably, in a portion closer to the front surface 21 than the maximum-width line segment W of the cleaning apparatus main body 20.
In the first embodiment, the width of the suction inlet 101 in the long-side direction is made wider than the distance between the inner side of the right driver 30 and the inner side of the left driver 30. Such a configuration can be implemented by, for example, employing the second positional relationship of the suction inlet 101 described above. With such a configuration, the suction inlet 101 having a larger width can be provided, dust can be directly sucked through the suction inlet 101 with more certainty, and the amount of dust sucked into the suction unit 50 described below can be increased.
The drivers 30 are located in the cleaning apparatus main body 20.
As illustrated in
Each wheel 33 is disposed closer to the outer side of the cleaning apparatus main body 20 in the width direction than the movement motor 31 that applies a torque to the wheel 33. With such a configuration, the gap between the right wheel 33 and the left wheel 33 becomes wider than that in a case where each wheel 33 is disposed closer to the inner side in the width direction than the movement motor 31, which contributes to more stable movement of the cleaning apparatus main body 20.
The driving system of the cleaning apparatus 10 of the first embodiment is a system of parallel two-wheel type. That is, the right driver 30 and the left driver 30 are disposed so as to face each other in the width direction of the cleaning apparatus main body 20. In the first embodiment, as illustrated in
The distance between the rotation axis H and the center of gravity G of the cleaning apparatus 10 is set so as to bring, for example, a predetermined turning ability to the cleaning apparatus 10. The predetermined turning ability is a turning ability that enables the cleaning apparatus main body 20 to draw a locus the same as or similar to a rectangular locus drawn by the outline of a Reuleaux triangle described above. In the first embodiment, the rotation axis H is positioned closer to the back side of the cleaning apparatus main body 20 than the center of gravity G, and the rotation axis H and the center of gravity G are positioned at a predetermined distance. When the cleaning apparatus 10 of parallel two-wheel type employs such a configuration, the cleaning apparatus 10 can draw the locus as described above as the cleaning apparatus main body 20 comes into contact with objects around the cleaning apparatus main body 20.
As illustrated in
The brush driving motor 41 and the gearbox 42 are fixed to the lower casing 100. The gearbox 42 is connected to the output shaft of the brush driving motor 41 and to the main brush 43 and transmits a torque of the brush driving motor 41 to the main brush 43.
The main brush 43 has a length substantially the same as the length of the suction inlet 101 in the long-side direction and is supported by a bearing so as to be rotatable relative to the lower casing 100. The bearing is formed in, for example, one or both of the gearbox 42 and the lower casing 100. In the first embodiment, the rotation direction of the main brush 43 is set to a direction in which the rotation path extends from the front to the back of the cleaning apparatus main body 20 on the floor side, as indicated by the arrow AM in
As illustrated in
The electric fan 51 is used to suck air inside the dust box 60 and discharge the air to the outside of the electric fan 51. The air discharged by the electric fan 51 passes through the space inside the fan case 52 and a space around the fan case 52 within the cleaning apparatus main body 20 and is exhausted to the outside of the cleaning apparatus main body 20.
As illustrated in
As illustrated in
The sensor unit 426 includes an obstacle detecting sensor 71, distance measuring sensors 72, a collision detecting sensor 73, and floor surface detecting sensors 74.
The obstacle detecting sensor 71 detects an obstacle present in front of the cleaning apparatus main body 20 (see
The distance measuring sensors 72 each detect the distance between an object present around the cleaning apparatus main body 20 and the cleaning apparatus main body 20 (see
The collision detecting sensor 73 detects a collision of the cleaning apparatus main body 20 with an object around the cleaning apparatus main body 20 (see
The floor surface detecting sensors 74 each detect a floor surface on which the cleaning apparatus main body 20 is located (see
The obstacle detecting sensor 71, the distance measuring sensors 72, the collision detecting sensor 73, and the floor surface detecting sensors 74 each input a detection signal to the controller 70.
As the obstacle detecting sensor 71, for example, a laser distance-measuring device (laser range finder) that emits a laser beam in a range of 180 degrees at predetermined intervals of, for example, 1 second and measures a distance is used. The obstacle detecting sensor 71 can detect an object, such as a table or a chair, and also can detect whether the target object 131, such as a rug or a mat, is present on the floor on the basis of the distance between the object or the target object 131 and the cleaning apparatus main body 20. In a case where the target object 131 is present, the obstacle detecting sensor 71 can detect the form of the object or the target object 131 and the distance to the object or the target object 131.
As the distance measuring sensors 72 and the floor surface detecting sensors 74, for example, infrared sensors or laser distance-measuring devices (laser range finders) are used. The distance measuring sensors 72 and the floor surface detecting sensors 74 each include a light emitting unit and a light receiving unit. As the collision detecting sensor 73, for example, a contact-type displacement sensor is used. The collision detecting sensor 73 includes, for example, a switch that is disposed in the cleaning apparatus main body 20 and is turned on in response to the bumper 230 being pressed against the cover 210.
As illustrated in
As illustrated in
The sensor unit 426 further includes, for example, a number-of-revolutions sensor 455, which is, for example, an optical encoder, that detects the number of revolutions of each wheel 33 (in other words, each movement motor 31). The number-of-revolutions sensor 455 detects and inputs, to the controller 70, the turning angle, the movement distance, or the amount of movement of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) on the basis of the measured number of revolutions of each wheel 33 (in other words, each movement motor 31). Accordingly, the number-of-revolutions sensor 455 is a position detecting sensor that detects the relative position of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) relative to a reference position that is the position of, for example, a charging device for a storage battery 82.
On the basis of the position of the cleaning apparatus 10 detected by the number-of-revolutions sensor 455, the cleaning area CA and positional relationships of objects and so on located within the cleaning area CA are calculated to generate a map MP (see
The relative position can be used as the “present position” of the cleaning apparatus 10 described below.
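As a non-limiting sketch of how such a relative position can be computed, standard differential-drive odometry integrates the wheel revolutions as below. The wheel radius, track width, and encoder resolution are assumed example values, not values given in the present disclosure.

```python
import math

WHEEL_RADIUS = 0.035   # m (assumed)
TRACK_WIDTH = 0.22     # m, spacing between the paired wheels 33 (assumed)
TICKS_PER_REV = 360    # resolution of the number-of-revolutions sensor (assumed)

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Integrate one encoder sample into the pose (x, y in m, theta in rad)."""
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_center = (d_left + d_right) / 2           # movement distance
    d_theta = (d_right - d_left) / TRACK_WIDTH  # turning angle
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

# Example: both wheels advance one revolution -> straight-line motion.
print(update_pose(0.0, 0.0, 0.0, 360, 360))
```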
On the respective sides of the obstacle detecting sensor 71 on the front surface of the cleaning apparatus main body 20, paired cameras 92 are disposed to obtain camera images including information about the surroundings of the cleaning apparatus main body 20. The details of the paired cameras 92 will be described below.
In the example illustrated in
For example, specifically, the hardware of the controller 70 is a microcontroller that includes a central processing unit (CPU), a read-only memory (ROM) that is a storage unit storing predetermined data, such as a program read by the CPU, and a random access memory (RAM) that is a storage unit in which various memory areas, such as a work area used in data processing by the program, are dynamically formed. As illustrated in
The memory 461 functions as a recording unit for recording, for example, data of images captured by the paired cameras 92, the presence and forms of objects and the distances to the objects obtained by the obstacle detecting sensor 71, the initial position of the cleaning apparatus main body 20, and the amount of movement from the initial position or the present position. To the memory 461, patterns (for example, images) for matching and object information that includes the presence and forms of objects and name information thereof used by the image processing unit 463 can be recorded.
The image processing unit 463 functions as a map generation unit that generates the map MP of the cleaning area CA on the basis of data of images captured by the paired cameras 92 and the presence and forms of objects and the distances to the objects obtained by the obstacle detecting sensor 71.
The image generation unit 464 functions as a distance-image generation unit that generates a distance image on the basis of data of images captured by the paired cameras 92 and the presence and forms of objects and the distances to the objects obtained by the obstacle detecting sensor 71.
The determination unit 465 functions as an obstacle determination unit that determines whether objects are obstacles on the basis of data of images captured by the paired cameras 92 and the presence and forms of the objects and the distances to the objects obtained by the obstacle detecting sensor 71.
The controller 70 further includes, for example, a travel control unit 466, a cleaning control unit 467, an image-capture control unit 468, and a computation unit 469.
The travel control unit 466 controls the operations of the right and left movement motors 31 (that is, the paired wheels 33) in the drivers 30.
The cleaning control unit 467 controls the operations of the brush driving motor 41 in the cleaning unit 40 and the operations of the electric fan 51 in the suction unit 50.
The image-capture control unit 468 controls the paired cameras 92, which are included in an image capturing unit 425.
The computation unit 469 performs computation on the basis of the numbers of revolutions detected by the number-of-revolutions sensor 455 and obtains information about the amount of movement made by the drivers 30 of the cleaning apparatus main body 20 as positional information about the cleaning apparatus main body 20.
The controller 70 has, for example, a movement mode for driving the paired wheels 33 (that is, the paired movement motors 31) to autonomously move the cleaning apparatus 10 (that is, the cleaning apparatus main body 20), a charge mode for charging the storage battery 82 described below via a charging device, and a standby mode for suspending operations. These modes are recorded to the memory 461.
The movement mode includes at least:
(i) a first movement mode in which the cleaning apparatus 10 cleans the cleaning area CA except for a target object, and thereafter, climbs over the target object; and
(ii) a second movement mode in which the cleaning apparatus 10 first climbs over a target object, and thereafter, cleans the cleaning area CA except for the target object,
where the cleaning area CA includes an area in which the target object is present.
Here, “climbing over” indicates an operation in which, for example, the cleaning apparatus 10 climbs onto a target object, cleans the upper surface of the target object, and thereafter, descends from the target object. The position at which the cleaning apparatus 10 climbs onto a target object and the position at which the cleaning apparatus 10 descends from the target object may be different or may be the same. After the cleaning apparatus 10 has climbed onto a target object, the cleaning apparatus 10 may clean the upper surface of the target object while moving in various directions. Alternatively, after the cleaning apparatus 10 has climbed onto a target object, the cleaning apparatus 10 may clean the upper surface of the target object while moving straight without turning, and thereafter, may descend from the target object.
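As a non-limiting illustration, the only difference between the two movement modes is the order of the two processes, which can be sketched as follows; clean_area and climb_over are hypothetical stand-ins for the second and first processes.

```python
def run(mode, clean_area, climb_over):
    """Execute the two processes in the order defined by the selected movement mode."""
    if mode == "first":
        clean_area()   # second process first: clean the cleaning area CA except the target object
        climb_over()   # then the first process: climb over the target object
    elif mode == "second":
        climb_over()   # first process first: climb over the target object
        clean_area()   # then the second process
    else:
        raise ValueError(f"unknown movement mode: {mode}")

run("first",
    clean_area=lambda: print("cleaning area CA except target object 131"),
    climb_over=lambda: print("climbing over target object 131"))
```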
When the image processing unit 463 functions as a map generation unit that generates the map MP of the cleaning area CA, various specific methods for map generation processing are available. For example, as a generation method to be used in a case where the cleaning apparatus 10 generates the map MP and as a method for estimating the own position of the cleaning apparatus 10, a technique called Simultaneous Localization and Mapping (SLAM) can be used. SLAM is a technique for estimating the own position of the cleaning apparatus 10 and creating an environment map simultaneously on the basis of information about the distances to objects detected by the sensor unit 426.
The idea of SLAM is briefly described below.
(1) The position of the observation point on a map is estimated on the basis of the position of the cleaning apparatus 10.
(2) The position of the cleaning apparatus 10 is consecutively estimated by using a technique of, for example, odometry in which the amount of movement is obtained from the number of revolutions of each wheel 33 of the cleaning apparatus 10.
(3) The point registered on the map MP is observed again, and the position of the cleaning apparatus 10 is corrected.
The image processing unit 463 creates simultaneous equations by combining equations used in (1) to (3) described above. When the image processing unit 463 solves the simultaneous equations by using the least-squares method, the position of the cleaning apparatus 10 and the map MP can be estimated, and a cumulative error can be reduced.
For details, see “Mobile Robot Perception: Mapping and Localization”, written by Masahiro Tomono, “Systems, Control and Information” vol. 60, No. 12, pp. 509 to 514, 2016 issued by The Institute of Systems, Control and Information Engineers.
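A common way to express the simultaneous estimation of (1) to (3) as a least-squares problem is the standard graph-based SLAM formulation below; it is given only as a sketch, since the present disclosure does not state the exact equations:

$$\min_{x_{1:T},\, m}\; \sum_{t}\bigl\|x_{t}-f(x_{t-1},u_{t})\bigr\|_{R_{t}^{-1}}^{2}+\sum_{t,i}\bigl\|z_{t,i}-h(x_{t},m_{i})\bigr\|_{Q_{t,i}^{-1}}^{2},$$

where $x_t$ is the pose of the cleaning apparatus 10 at time $t$, $u_t$ is the odometry input obtained from the wheel revolutions, $z_{t,i}$ is an observation of map point $m_i$, $f$ and $h$ are the motion and observation models, and $R_t$ and $Q_{t,i}$ are noise covariances.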
The generated map MP is recorded to a map database 99 in a database 110 described below, and the estimated own position of the cleaning apparatus 10 is recorded to the memory 461 together with the time of estimation.
The memory 461 retains various types of data recorded thereto regardless of the power on/off of the cleaning apparatus 10. The memory 461 is, for example, a nonvolatile memory, such as a flash memory.
The image processing unit 463 calculates the distances between the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) and objects around the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) by using data of images captured by the paired cameras 92 and the presence and forms of the objects and the distances to the objects obtained by the obstacle detecting sensor 71. The image processing unit 463 uses the calculated distances and the position of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) detected by the number-of-revolutions sensor 455 of the sensor unit 426 to calculate the cleaning area CA in which the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) is placed and positional relationships of the objects and so on located within the cleaning area CA, thereby generating the map MP (see
The image generation unit 464 generates a distance image on the basis of data of images captured by the paired cameras 92 and the presence and forms of objects and the distances to the objects obtained by the obstacle detecting sensor 71.
The image generation unit 464 generates the distance image by converting the data of images captured by the paired cameras 92 and the forms of objects and the distances to the objects obtained by the obstacle detecting sensor 71 into a gray scale that is represented by the luminance or the hue and that can be identified visually, for example, dot by dot within an area of a predetermined number of dots by a predetermined number of dots, and displaying the resulting image. In the first embodiment, the image generation unit 464 generates the distance image as a monochrome image in which the luminance decreases as the distance increases, that is, a gray-scale image having, for example, 256 levels (that is, 2^8, represented by 8 bits) in which the color is closer to black as the forward distance from the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) increases, and the color is closer to white as the forward distance decreases. That is, the distance image is an image in which the aggregate of distance data of objects located in front of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) in the movement direction within an area for which images are captured by the paired cameras 92 is visualized.
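A minimal sketch of this distance-to-luminance mapping is given below; the maximum range and the example inputs are assumed values chosen to match the stated convention (nearer is whiter, farther is blacker).

```python
import numpy as np

MAX_RANGE_M = 4.0  # assumed maximum forward distance represented in the image

def distance_to_gray(distances_m):
    """Map per-dot forward distances (m) to 256-level gray (0 = black/far, 255 = white/near)."""
    d = np.clip(distances_m, 0.0, MAX_RANGE_M)
    return np.uint8(255 * (1.0 - d / MAX_RANGE_M))

# Example: a 2x3 block of forward distances in metres.
print(distance_to_gray(np.array([[0.0, 2.0, 4.0],
                                 [1.0, 3.0, 0.5]])))
```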
The determination unit 465 determines whether objects detected by the obstacle detecting sensor 71 are obstacles on the basis of data of images captured by the paired cameras 92 and the forms of the objects and the distances to the objects obtained by the obstacle detecting sensor 71. Specifically, the determination unit 465 extracts a portion in a predetermined area, for example, in a predetermined rectangular image area in the distance image on the basis of data of images captured by the paired cameras 92 and the forms of the objects and the distances to the objects obtained by the obstacle detecting sensor 71. Subsequently, the determination unit 465 compares the distances to objects within the extracted image area with a set distance that is a predetermined threshold or a variably set threshold. Subsequently, the determination unit 465 determines an object that is located at a distance equal to or shorter than the set distance as an obstacle. In other words, the determination unit 465 determines an object for which the distance from the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) is equal to or shorter than the set distance as an obstacle. The image area is set in accordance with the size of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) in the up-down direction and in the right-left direction. That is, the size of the image area in the up-down direction and that in the right-left direction are set so as to correspond to an area with which the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) comes into contact when the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) moves straight forward.
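The thresholding step of the determination unit 465 can likewise be sketched as follows; the set distance and the rectangular image area are hypothetical example values.

```python
import numpy as np

SET_DISTANCE_M = 0.5  # assumed set distance (threshold)

def find_obstacles(distance_image_m, area):
    """Return a boolean mask of obstacle dots inside a (top, bottom, left, right) image area."""
    top, bottom, left, right = area
    window = distance_image_m[top:bottom, left:right]
    # A dot at a distance equal to or shorter than the set distance is an obstacle.
    return window <= SET_DISTANCE_M

depths = np.array([[0.4, 1.2, 0.3],
                   [2.0, 0.5, 1.8]])
print(find_obstacles(depths, (0, 2, 0, 3)))
```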
The travel control unit 466 controls the magnitudes and orientations of the currents that flow through the paired movement motors 31 to thereby rotate each of the paired movement motors 31 in the forward direction or in the backward direction. The travel control unit 466 thus controls driving of each of the paired movement motors 31 to control driving of each of the paired wheels 33.
The cleaning control unit 467 controls the conduction angle for each of the electric fan 51 and the brush driving motor 41 to thereby control driving of the electric fan 51 and the brush driving motor 41. The cleaning control unit 467 may be provided for each of the electric fan 51 and the brush driving motor 41 separately.
The image-capture control unit 468 includes a control circuit that controls the operations of the shutter of each of the paired cameras 92. The image-capture control unit 468 controls the respective shutters so as to be operated at predetermined time intervals, thereby controlling the paired cameras 92 to capture images at the predetermined time intervals.
As illustrated in
Instead of or in addition to the display 417c disposed on the cleaning apparatus main body 20, display content may be displayed on a display 417d of an external terminal, such as a smartphone, as illustrated in
The displays 417c and 417d can function as an example of an input/output device. Specifically, for example, the displays 417c and 417d are liquid crystal displays or organic electroluminescence (EL) displays, and the surface thereof is constituted by a touch panel. Therefore, various display screens can be displayed on the displays 417c and 417d, and user input can be accepted on the displays 417c and 417d.
The power supply unit 80 is located in the cleaning apparatus main body 20 and supplies power to, for example, a communication unit 423, the image capturing unit 425, the drivers 30, the cleaning unit 40, the suction unit 50, and the sensor unit 426. The power supply unit 80 is disposed closer to the back side of the cleaning apparatus main body 20 than the center of gravity G of the cleaning apparatus 10 and closer to the back side of the cleaning apparatus main body 20 than the suction unit 50, and includes elements, such as a power supply case 81. In the first embodiment, specifically, the hardware of the power supply unit 80 includes the power supply case 81, which is fixed to the lower casing 100, the storage battery 82 accommodated in the power supply case 81, and a main switch 83 for switching between supply and stop of power from the power supply unit 80 to each element described above.
As the storage battery 82, for example, a secondary battery is used. The storage battery 82 is accommodated in the cleaning apparatus main body 20 and is electrically connected to charging terminals (not illustrated) that function as connection portions on the respective sides of the back portion on the lower surface of the cleaning apparatus main body 20 so as to be exposed. When the charging terminals are electrically and mechanically connected to a charging device, the storage battery 82 is charged via the charging device.
The cleaning apparatus 10 further includes the paired cameras 92 that obtain camera images including information about the surroundings of the cleaning apparatus main body 20 in accordance with image-capture control by the image-capture control unit 468.
The paired cameras 92 constitute the image capturing unit 425 capturing images and are disposed to the right and to the left of the obstacle detecting sensor 71 on the front surface 21 of the cleaning apparatus main body 20. That is, in the first embodiment, the paired cameras 92 are disposed on the front surface 21 of the cleaning apparatus main body 20 to the right and to the left of the center line L in the width direction of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20) at substantially the same predetermined angles (for example, an acute angle) relative to the center line L. In other words, the paired cameras 92 are disposed substantially symmetrically in the width direction of the cleaning apparatus main body 20, and the center position between the paired cameras 92 substantially matches the center position in the width direction that crosses (for example, is orthogonal to) the front-back direction that is the movement direction of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20). Further, the paired cameras 92 are disposed at substantially the same positions in the up-down direction, that is, at substantially the same heights. Accordingly, in a state where the cleaning apparatus 10 is placed on a floor, the altitudes of the paired cameras 92 from the floor are substantially the same. The paired cameras 92 are disposed away from each other at positions shifted from each other (for example, at positions shifted from each other in the right-left direction). The paired cameras 92 are digital cameras that capture digital images of a front scene in the movement direction of the cleaning apparatus main body 20 at a predetermined horizontal angle of view (for example, 105°) at predetermined time intervals, namely, at very short time intervals of, for example, several tens of milliseconds or several seconds. Further, the fields of view of the paired cameras 92 overlap, and the image-capture areas of a pair of images captured by the paired cameras 92 overlap in the right-left direction in an area that includes a front position on the extending center line L, the center line L being the center line in the width direction of the cleaning apparatus 10 (that is, the cleaning apparatus main body 20). In the first embodiment, it is assumed that the paired cameras 92 capture images of, for example, a visible-light area. Images captured by the paired cameras 92 can be compressed in a predetermined data format by, for example, an image processing circuit not illustrated.
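The present disclosure does not state how depth is computed from the paired cameras 92, but with overlapping fields of view the standard stereo relation applies: for parallel cameras separated by a baseline $B$ with focal length $f$ (in pixels), a point observed with disparity $d$ between the two images lies at depth

$$Z = \frac{f\,B}{d},$$

which is one way the distance data underlying the distance image could be obtained.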
Images captured by the paired cameras 92 are input to the image processing unit 463 of the controller 70, and the controller 70 obtains information about objects including the target object 131, such as the presence and forms of the objects.
For example, in the image processing unit 463, when images captured by the cameras 92 are input to a learner in the image processing unit 463 that has performed learning in advance, object information that includes the presence and forms of objects and name information thereof can be obtained from the camera images. Alternatively, in the image processing unit 463, when matching of patterns (for example, images) retained in advance in the image processing unit 463 with camera images is performed, object information that includes the presence and forms of objects and name information thereof can be obtained.
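As one concrete, non-authoritative realization of the pattern-matching path (the present disclosure names no library), the sketch below uses OpenCV template matching; the file names and the confidence threshold are hypothetical.

```python
import cv2

camera_image = cv2.imread("camera_image.png", cv2.IMREAD_GRAYSCALE)
pattern = cv2.imread("target_object_pattern.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation; a score of 1.0 means a perfect match.
scores = cv2.matchTemplate(camera_image, pattern, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_top_left = cv2.minMaxLoc(scores)

if best_score > 0.8:  # assumed confidence threshold
    h, w = pattern.shape
    print(f"target object pattern found at {best_top_left}, size {w}x{h}")
```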
As described above, in a case where object information is obtained from camera images, the position located at a predetermined distance from the own position of the cleaning apparatus 10, in the orientation of the cleaning apparatus 10 (or the cameras 92) at the time of image capture (that is, at the distance between the object and the cleaning apparatus 10), is obtained as the “position of the object”.
As examples of object information obtained on the basis of camera images, an example of a camera image that includes information about the surroundings of the cleaning apparatus main body, from which information about legs 131e of a chair 131c is obtained, is illustrated in
The database 110 is connected to, for example, the communication unit 423, the controller 70, the image capturing unit 425, and the sensor unit 426 and includes the map database 99 and a path database 102.
To the map database 99, map information about the cleaning area CA is recorded. As the map information about the cleaning area CA, map information about the cleaning area CA created in advance may be recorded or map information about the cleaning area CA created by the cleaning apparatus 10 may be recorded.
To the path database 102, movement paths P of the cleaning apparatus 10 in the map information about the cleaning area CA are recorded, and information about path generation rules described below is also recorded. Plural movement paths P generated on the basis of path generation rules are recorded in advance to the path database 102, as described below, and a user is allowed to select at least one movement path P from among the movement paths P, and the selection instruction is accepted. Here, the movement paths P are paths along which the cleaning apparatus main body 20 moves and performs cleaning.
The cleaning apparatus 10 may further include the communication unit 423 that communicates with an external device 417 constituted by an external terminal device, such as a personal computer (PC) or a smartphone.
The communication unit 423 includes, for example, a wireless local area network (LAN) device 447, a transmission unit not illustrated, and a reception unit not illustrated. The wireless LAN device 447 functions as a wireless communication unit for wireless communication with the external device 417 via a home gateway 414 and a network 415 and as a cleaning-apparatus-signal reception unit. The transmission unit is formed of, for example, an infrared light emitting device and transmits a radio signal (for example, an infrared signal) to, for example, a charging device. The reception unit is formed of, for example, a phototransistor and receives a radio signal (for example, an infrared signal) from, for example, a charging device not illustrated or a remote controller not illustrated.
The wireless LAN device 447 is used to transmit various types of information from the cleaning apparatus 10 to the network 415 via the home gateway 414 and to receive various types of information from the network 415 via the home gateway 414 and, for example, is built in the cleaning apparatus main body 20.
The home gateway 414 is also called, for example, an access point, is installed within a building, and is connected to the network 415 via, for example, a wired line.
A server 416 is a computer (for example, a cloud server) connected to the network 415 and is able to save various types of data.
The external device 417 is a general-purpose device, such as a PC (for example, a tablet PC) 417a or a smartphone (or mobile phone) 417b, that is capable of wired or wireless communication with the network 415 via, for example, the home gateway 414 within a building, and is also capable of wired or wireless communication with the network 415 outside a building.
The external device 417 includes the display 417d (see
In a cleaning apparatus 10B according to a first modification of the first embodiment illustrated in
One of the gearboxes 42 (for example, the right one in plan view of the cleaning apparatus main body 20) is connected to the output shaft of the brush driving motor 41, the main brush 43, and a corresponding one of the side brushes 44 and transmits a torque of the brush driving motor 41 to the main brush 43 and the corresponding one of the side brushes 44. The other gearbox 42 (for example, the left one in plan view of the cleaning apparatus main body 20) is connected to the main brush 43 and the other side brush 44 and transmits a torque of the main brush 43 to the other side brush 44.
In the first modification of the first embodiment, the paired side brushes 44 each include a brush shaft 44A fixed to a corresponding one of the two front vertex portions 23 of the cleaning apparatus main body 20, and bundles of bristles 44B fixed to the brush shaft 44A. The side brushes 44 are positioned in the cleaning apparatus main body 20 such that the part of the locus drawn by each side brush 44 rotating one revolution (hereinafter referred to as the rotational locus of the side brush 44) that is capable of collecting dust into the suction inlet 101 is within the maximum-width portion of the cleaning apparatus main body 20. In the first modification of the first embodiment, the number of the bundles of bristles 44B fixed to each brush shaft 44A is three, and the bundles of bristles 44B are fixed to the brush shaft 44A at predetermined angular intervals.
Each brush shaft 44A has a rotation axis that extends in a direction the same as or substantially the same as the height direction of the cleaning apparatus main body 20, is supported by the cleaning apparatus main body 20 so as to be rotatable relative to the cleaning apparatus main body 20, and is disposed closer to the front side of the cleaning apparatus main body 20 than the center line of the suction inlet 101 in the long-side direction.
Each bundle of bristles 44B is formed of plural bristles and is fixed to a corresponding one of the brush shafts 44A so as to extend in a direction the same as or substantially the same as the radial direction of the brush shaft 44A. In the first modification of the first embodiment, the length of each bundle of bristles 44B is set such that the distal end of the bundle of bristles 44B extends outward beyond the periphery of the cleaning apparatus main body 20.
The rotation direction of each side brush 44 is set to a direction in which the rotational locus of the side brush 44 extends from the front to the back of the cleaning apparatus main body 20 on a side closer to the center of the cleaning apparatus main body 20 in the width direction, as indicated by the arrows AM in
Now, a method for controlling the cleaning apparatus 10 by the controller 70 is described.
The controller 70 is disposed on the power supply unit 80 (see
In the controller 70, the determination unit 465 determines whether an object that can hinder the cleaning apparatus 10 from moving is present within a predetermined area in front of the cleaning apparatus main body 20 on the basis of a detection signal that is input from the obstacle detecting sensor 71 of the sensor unit 426 and that includes the presence and forms of objects and the distances to the objects.
The controller 70 calculates the distances between the periphery of the cleaning apparatus main body 20 and objects present in an area around each of the right and left front vertex portions 23 of the cleaning apparatus main body 20 on the basis of detection signals input from the respective right and left distance measuring sensors 72.
The controller 70 determines whether the cleaning apparatus main body 20 has collided with an object around the cleaning apparatus main body 20 on the basis of a detection signal input from the collision detecting sensor 73.
The controller 70 determines whether a floor surface that is the cleaning area CA is present under the cleaning apparatus main body 20 on the basis of detection signals input from the floor surface detecting sensors 74.
The controller 70 uses one or more of the results of determination and calculation described above to control the movement motors 31, the brush driving motor 41, and the electric fan 51 so that the floor surface that is the cleaning area CA is cleaned by the cleaning apparatus 10.
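A hedged sketch of one control iteration combining these determinations follows; the controller attributes and methods below are illustrative assumptions, not names from the disclosure:

```python
def control_step(ctrl):
    # Use the determination and calculation results to control the movement
    # motors 31, the brush driving motor 41, and the electric fan 51.
    if ctrl.obstacle_ahead() or ctrl.collision_detected():
        ctrl.movement_motors.stop()      # avoid the object before continuing
        ctrl.plan_avoidance()
    elif not ctrl.floor_present():       # no floor surface under the body
        ctrl.movement_motors.back_off()
    else:
        ctrl.movement_motors.follow_path()
        ctrl.brush_motor.run()
        ctrl.electric_fan.run()
```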
Now, a method for controlling movement of the cleaning apparatus 10 by the controller 70 is described with reference to
The movement control method for the cleaning apparatus 10 includes obtaining map information by the controller 70 in step S100, obtaining object information about objects in the environment by the controller 70 in step S200, setting the movement path P by the controller 70 in step S300, and movement control of the cleaning apparatus main body 20 by the controller 70 and driving control of the cleaning unit 40 by the controller 70 in step S400.
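The four steps can be summarized by the following sketch; the method names on the controller are hypothetical:

```python
def run_cleaning(controller):
    map_info = controller.obtain_map_information()            # step S100
    objects = controller.obtain_object_information()          # step S200
    path = controller.set_movement_path(map_info, objects)    # step S300
    controller.move_and_clean(path)                           # step S400
```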
Here, the map information is, for example, the two-dimensional map MP illustrated in
The object information includes at least the positions of objects including a target object on the two-dimensional map MP, camera images or the forms of the objects, and name information about the objects. The object information is recorded to the memory 461. The positions of objects including a target object are recorded to the memory 461 in association with the time of recording. The controller 70 may obtain the map MP that includes the object information and that is associated with the positions of objects from the map database 99 and from the memory 461 as the map information.
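As a sketch, one object-information entry recorded to the memory 461 might look like the following; the field names are illustrative:

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple
import time

@dataclass
class ObjectRecord:
    position: Tuple[float, float]   # position on the two-dimensional map MP
    name: str                       # name information about the object
    form: Optional[bytes] = None    # camera image data or form of the object
    recorded_at: float = field(default_factory=time.time)  # time of recording
```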
The controller 70 obtains map information about the cleaning area CA from the map database 99.
In the controller 70, the image processing unit 463 obtains object information about objects within the cleaning area CA from camera images. For example, the rectangle indicated by the reference numeral 131 in
Specifically, for example, in the controller 70, the image processing unit 463 obtains object information that includes, for example, the presence and forms of objects within the cleaning area CA and name information thereof from camera images, as illustrated in
The controller 70 sets a movement path.
Specifically, the controller 70 obtains information about path generation rules from the path database 102.
Subsequently, the controller 70 generates movement paths in the cleaning area CA on the basis of the map information about the cleaning area CA obtained in step S100 and the path generation rules.
Specific examples of generated movement paths P are illustrated in
In an example path generation rule, the controller 70 controls movement of the cleaning apparatus 10 so that the distance between the cleaning apparatus 10 and a wall of the room detected by the distance measuring sensors 72 is within a predetermined range in the cleaning area CA. With such a path generation rule, a frame-shaped movement path P that goes around along the walls of the room can be generated, as illustrated in
In another example path generation rule, the controller 70 controls the cleaning apparatus 10 to randomly move within the cleaning area CA. With such a path generation rule, a random-shaped movement path P can be generated, as illustrated in
In another example path generation rule, the controller 70 controls movement of the cleaning apparatus 10 so that the cleaning apparatus 10 moves within the cleaning area CA along a spiral centered on a specified position. With such a path generation rule, a spiral movement path P can be generated, as illustrated in
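The three path generation rules can be sketched as follows, assuming an axis-aligned rectangular cleaning area; the geometry is simplified for illustration and the function names are not from the disclosure:

```python
import math
import random

def frame_path(corners, margin):
    # Frame-shaped path: go around along the walls at a fixed offset.
    (x0, y0), (x1, y1) = corners
    return [(x0 + margin, y0 + margin), (x1 - margin, y0 + margin),
            (x1 - margin, y1 - margin), (x0 + margin, y1 - margin),
            (x0 + margin, y0 + margin)]

def random_path(corners, n_points, rng=random):
    # Random-shaped path: waypoints drawn uniformly inside the area.
    (x0, y0), (x1, y1) = corners
    return [(rng.uniform(x0, x1), rng.uniform(y0, y1)) for _ in range(n_points)]

def spiral_path(center, pitch, turns, pts_per_turn=36):
    # Spiral path centered on a specified position (Archimedean spiral:
    # the radius grows by `pitch` per full turn).
    cx, cy = center
    points = []
    for i in range(turns * pts_per_turn + 1):
        theta = 2 * math.pi * i / pts_per_turn
        r = pitch * theta / (2 * math.pi)
        points.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return points
```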
The controller 70 records the generated movement paths P to the path database 102.
Instead of generating the movement paths P as described above, the controller 70 may obtain information about the initial position of the cleaning apparatus main body 20 from the memory 461 and generate the movement paths P in the cleaning area CA on the basis of the information about the initial position, the map information about the cleaning area CA, and the path generation rules.
Here, an example of the information about the initial position is the present position of the cleaning apparatus main body 20 recorded to the memory 461. Here, the present position is the position at which the cleaning apparatus 10 stops when an instruction for cleaning is input to the cleaning apparatus 10.
The controller 70 obtains the present position of the cleaning apparatus main body 20 from the sensor unit 426. Here, as the present position, information about the position of the cleaning apparatus 10 that is moving is obtained. Alternatively, the controller 70 obtains information about the present position of the cleaning apparatus main body 20 from the memory 461 to which information obtained by the sensor unit 426 is recorded. For example, the present position of the cleaning apparatus main body 20 is recorded to the memory 461 in association with the time. The controller 70 obtains from the memory 461 the present position of the cleaning apparatus main body 20 at the latest time recorded to the memory 461 as the initial position.
Another example of the information about the initial position is a position that is determined in advance as the initial position of the cleaning apparatus main body 20 (for example, a position at which charging by a charging device is performed). The controller 70 obtains the information about the initial position of the cleaning apparatus main body 20 from the memory 461.
In a case where the controller 70 uses the information about the initial position to generate the movement paths P in the cleaning area CA for setting, the controller 70 can generate the frame-shaped movement path P, the random-shaped movement path P, and the spiral movement path P described above having the initial position as the starting point and set the movement path P.
The movement path setting by the controller 70 in step S300 includes accepting setting of a cleaning target object in step S301, accepting a movement mode in step S302, and accepting a movement path in step S303.
The order of steps S301 to S303 is not limited to this. For example, steps S302 and S303 may be swapped, or the steps may be performed in the order of steps S303, S301, and S302.
The controller 70 accepts setting of a cleaning target object. Here, the image processing unit 463 of the controller 70 detects objects 133 and 134 from data of images captured by the paired cameras 92. Subsequently, the controller 70 accepts an instruction from a user for selecting whether to set the detected objects 133 and 134 as cleaning target objects. To accept a selection instruction from a user, a display screen illustrated in
If the user wants to set the object 133 as a cleaning target object, the user presses the YES button. If the user does not want to set the object 133 as a cleaning target object, the user presses the NO button. It is assumed here that the controller 70 accepts a user's selection instruction indicating that the object 133 is to be set as a cleaning target object. That is, the controller 70 accepts a setting indicating that the object 133 is to be set as a cleaning target object.
Similarly, the controller 70 causes the display 417c to display an image of the object 134 and display for prompting the user to select whether to set the object 134 as a cleaning target object (not illustrated). The controller 70 accepts a selection instruction indicating that the object 134 is to be set or is not to be set as a cleaning target object. It is assumed here that the controller 70 accepts a user's selection instruction indicating that the object 134 is not to be set as a cleaning target object. That is, the controller 70 accepts a setting indicating that the object 134 is not to be set as a cleaning target object.
In a case where no objects are set as cleaning target objects, step S302 is skipped (not illustrated), and the flow proceeds to step S303.
In a case where the object 133 is set in advance as a cleaning target object, that is, set in advance so as to be cleaned, it is not necessary to accept the user's selection indicating whether the object 133 is to be cleaned. For example, a setting indicating that the object 133 is to be set as a cleaning target object may be made on the basis of the item type of the object 133.
Subsequently, the controller 70 accepts a movement mode. An example display screen on the display 417c illustrated in
If the user wants to clean the object 133 that is a cleaning target object first, the user presses the YES button. If the user does not want to clean the object 133 that is a cleaning target object first, the user presses the NO button. At this time, the example display screen on the display 417c illustrated in
If the user wants to clean the object 133 that is a cleaning target object first, the user presses the A button. If the user wants to clean the object 133 that is a cleaning target object last, the user presses the B button. It is assumed here that the controller 70 accepts a user's selection instruction indicating that the object 133 that is a cleaning target object is to be cleaned first.
Selecting the option of cleaning the object 133 first in
Subsequently, the controller 70 accepts a movement path. For example, the user selects one movement path P from among the frame-shaped movement path P, the random-shaped movement path P, and the spiral movement path P described above having the initial position as the starting point. The controller 70 accepts the user's selection. Examples of the accepted movement path are illustrated in
Subsequently, in the controller 70, the travel control unit 466 controls the drivers 30 to move the cleaning apparatus main body 20 along the selected movement path P from the initial position as the starting point. While the cleaning apparatus main body 20 is moving along the movement path P, in the controller 70, the cleaning control unit 467 drives the cleaning unit 40 to clean the cleaning area CA.
At this time, the controller 70 obtains positional information about the cleaning apparatus main body 20 from data of images captured by the paired cameras 92. The controller 70 obtains the present position of the cleaning apparatus main body 20 on the map MP of the cleaning area CA on the basis of the obtained positional information about the cleaning apparatus main body 20 and the map MP of the cleaning area CA in the map database 99.
That is, the controller 70 obtains, as positional information about the cleaning apparatus main body 20, the initial position of the cleaning apparatus main body 20 from data of images captured by the paired cameras 92. In the controller 70, the computation unit 469 performs calculation from the numbers of revolutions detected by the number-of-revolutions sensor 455 by using odometry to obtain information about the amount of movement by the drivers 30 of the cleaning apparatus main body 20 from the initial position. The obtained initial position and amount of movement can be recorded to the memory 461 or to the map database 99.
Accordingly, the computation unit 469 adds the amount of movement of the cleaning apparatus main body 20 to the initial position of the cleaning apparatus main body 20 to thereby obtain the present position of the cleaning apparatus main body 20 that is moving.
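The odometry computation can be sketched as the standard differential-drive update below, assuming the wheel travel distances are derived from the detected numbers of revolutions times the wheel circumference; the parameter names are illustrative:

```python
import math

def odometry_update(x, y, heading, d_left, d_right, wheel_base):
    # Add the amount of movement computed from the wheel revolutions to the
    # previous position to obtain the present position while moving.
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * math.cos(heading + d_theta / 2.0)
    y += d_center * math.sin(heading + d_theta / 2.0)
    return x, y, heading + d_theta
```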
For example, the controller 70 may record the present position of the cleaning apparatus main body 20 to the map MP of the cleaning area CA in the map database 99. The controller 70 can record the present position of the cleaning apparatus main body 20 to the map MP in the map database 99 at predetermined time intervals.
In the controller 70, the travel control unit 466 controls driving of the drivers 30 to move the cleaning apparatus main body 20 so that the movement locus of the present position of the cleaning apparatus main body 20 matches the selected movement path P. Accordingly, the controller 70 can move the cleaning apparatus main body 20 along the selected movement path P from the initial position as the starting point by the travel control unit 466 controlling the drivers 30.
Although the overall operations of the cleaning apparatus 10 have been described above, not all of the steps are necessary in the first embodiment. Now, the minimum set of steps in the first embodiment is described with reference to
The cleaning method that is performed by the cleaning apparatus 10 can be constituted by at least the following operations, namely, obtaining object information about objects in the environment by the controller 70 in step S200, accepting a cleaning target object by the controller 70 in step S500, and accepting a movement mode by the controller 70 in step S302.
First, the image processing unit 463 (a) obtains information about the object 133 within the cleaning area CA. That is, as described above, in the controller 70, the image processing unit 463 obtains object information about objects within the cleaning area CA from camera images. Here, the object 133 is a target object having a possibility of putting the cleaning apparatus main body 20 into a stuck state. Specifically, information indicating that the object 133 is an object that put the cleaning apparatus main body 20 into a stuck state in the past is recorded to the memory 461 or to the map database 99. Information indicating that the object 133 is an object having a predetermined height or more may be recorded to the memory 461 or to the map database 99. Alternatively, information indicating that the object 133 is a rug that is laid on a floor, which may be a carpet, may be recorded to the memory 461 or to the map database 99. Information indicating that the object 133 is an object that is set in advance by a user as a target object having a possibility of creating a stuck state may be recorded to the memory 461.
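The criteria listed above for flagging the object 133 can be sketched as one predicate; the record fields and memory interface are illustrative assumptions:

```python
def is_possible_stuck_target(obj, memory):
    return (memory.caused_stuck_before(obj)           # stuck state in the past
            or obj.height >= memory.height_threshold  # predetermined height or more
            or obj.kind in ("rug", "carpet")          # rug laid on a floor
            or memory.user_flagged(obj))              # set in advance by a user
```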
Next, the controller 70 (b) accepts information about setting of a cleaning target object indicating that the object 133 is to be set or is not to be set as a cleaning target object. In a specific example of accepting, the operation in step S301 described above and illustrated in
Next, the controller 70 (c) accepts information indicating the time when the object 133 that is a cleaning target object is to be cleaned, as illustrated in
The controller 70 causes the display 417c or 417d to display a first display screen that allows selection of (i) the first movement mode in which the cleaning apparatus 10 cleans the cleaning area CA except for the object 133 that is a cleaning target object, and thereafter, cleans the object 133 that is a cleaning target object while climbing over the object 133 or (ii) the second movement mode in which the cleaning apparatus 10 cleans the object 133 that is a cleaning target object first while climbing over the object 133, and thereafter, cleans the cleaning area CA except for the object 133 that is a cleaning target object.
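The ordering of the two processes under the two movement modes can be sketched as follows (the mode labels are illustrative):

```python
def build_cleaning_plan(mode, clean_area_except_target, climb_over_target):
    # First movement mode: the second process (area except the target) first.
    # Second movement mode: the first process (climb over the target) first.
    if mode == "first":
        return [clean_area_except_target, climb_over_target]
    if mode == "second":
        return [climb_over_target, clean_area_except_target]
    raise ValueError(f"unknown movement mode: {mode}")
```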
In a case where the controller 70 accepts information indicating that the object 133 is not to be set as a cleaning target object, cleaning ends when the cleaning apparatus 10 finishes cleaning the cleaning area CA except for the object 133 and, for example, the cleaning apparatus 10 returns to the reference position.
With the cleaning apparatus and the cleaning method thus configured according to the first embodiment, movement modes that take into consideration the order of cleaning of the object 133 having a possibility of creating a stuck state and cleaning of a portion other than the object 133 can be generated and provided to a user.
As the second modification of the first embodiment, the following operations are performed.
Specifically, as illustrated in
Subsequently, in a state where the second display screen is displayed on the display 417c or 417d, as illustrated in
As a third modification of the first embodiment, as illustrated in
Specifically, in a case where the controller 70 (d1) accepts selection of the first movement mode in a state where the first display screen as illustrated in
On the other hand, in a case where the controller 70 (d2) accepts selection of the second movement mode in the state where the first display screen as illustrated in
While the cleaning apparatus 10 is cleaning the object 133 that is a cleaning target object, depending on the object 133, the cleaning apparatus 10 has a possibility of entering a stuck state as illustrated in
As illustrated in
First, detection of a stuck state by the controller 70 in step S601 is described.
The controller 70 can detect the movement motors 31 stopping rotating or substantially stopping rotating when the numbers of revolutions detected by the number-of-revolutions sensor 455 become 0 or close to 0. The controller 70 detects the state where the movement motors 31 stop rotating or substantially stop rotating to thereby detect the cleaning apparatus 10 entering a stuck state. Here, the number-of-revolutions sensor 455 functions as an example of a second sensor.
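A minimal sketch of this detection, assuming the sensor reports one revolution count per movement motor and using an assumed tolerance:

```python
def stuck_detected(revolution_counts, eps=0.05):
    # The apparatus is treated as stuck when the numbers of revolutions
    # detected by the number-of-revolutions sensor 455 become 0 or close
    # to 0 for all movement motors; eps is an assumed tolerance.
    return all(abs(n) <= eps for n in revolution_counts)
```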
In this case, the following is assumed. As illustrated in
In the controller 70, the image processing unit 463 obtains a camera image at a time (which is assumed to be time t1 here) a predetermined period (for example, several seconds) before time t3 at which the stuck state is detected, the predetermined period being indicated by the reference numeral 502. In the camera image at time t1, the cleaning target object that has created the stuck state should be present. That is, in the camera image at time t1, the rug 131b appears in front of the cleaning apparatus 10, away from the cleaning apparatus 10 in the movement direction, as illustrated in, for example,
In the controller 70, the image processing unit 463 compares the camera image at time t3 with the camera image at time t1, and the controller 70 assumes that the rug 131b is a cleaning target that has created the stuck state and identifies the position and form of the rug 131b. That is, the controller 70 obtains information about the object 133 on the basis of information about the stuck state of the cleaning apparatus main body 20 detected by the number-of-revolutions sensor 455.
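One way to sketch this comparison in Python, under the assumption that an object visible ahead at time t1 but absent at time t3 is the one now under the apparatus; the frame store and detector interface are hypothetical:

```python
def identify_stuck_object(frames, t_stuck, lookback_s, detect_fn):
    # frames: mapping of capture time -> camera image;
    # detect_fn: object detector returning objects with a .name attribute.
    earlier_times = [t for t in frames if t <= t_stuck - lookback_s]
    if not earlier_times:
        return None
    t1 = max(earlier_times)
    names_at_stuck = {o.name for o in detect_fn(frames[t_stuck])}
    # Objects present ahead at t1 but missing at t3 are assumed to be the
    # cleaning target that created the stuck state.
    candidates = [o for o in detect_fn(frames[t1])
                  if o.name not in names_at_stuck]
    return candidates[0] if candidates else None
```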
Next, detection of an exit from the stuck state by the controller 70 in step S602 is performed as follows. In a case where the cleaning apparatus 10 enters a stuck state due to the rug 131b, for example, the user needs to pick up the cleaning apparatus 10 off the rug 131b and place the cleaning apparatus 10 at a separate position. At this time, when the cleaning apparatus 10 is picked up off the rug 131b or the floor 132, coming-off detecting switches 75 of the cleaning apparatus 10 operate, and the movement of the cleaning apparatus 10 is stopped.
Specifically, as illustrated in
Next, in re-determination of a movement mode by the controller 70 in step S603, the user is prompted to select the time when the object 133 that is a cleaning target object and that has created the stuck state is to be cleaned on the display screen on the display 417c as illustrated in
Selection of A “re-clean the object 133 that is a cleaning target object first” on the screen illustrated in
In the re-determination of a movement mode in step S603, the controller 70 may automatically select the second movement mode to re-clean the object 133 first without prompting the user to make selection and accepting the selection instruction (not illustrated).
As described below with reference to
In the re-determination of a movement mode in step S603, the following operations may be further performed.
In a case where the object 133 that has created a stuck state is to be re-cleaned, cleaning at the same speed in the same direction is highly likely to create a stuck state again. Therefore, the controller 70 uses an operation control mode that changes the operation by at least one of increasing the movement speed, changing the angle of approach relative to an edge of the object 133 (for example, to 45 degrees), and stopping rotation of the side brushes 44.
That is, for example, when the movement speed is set to the movement speed “high” that is higher than the movement speed “medium”, which is a speed in usual cleaning, the cleaning apparatus 10 might not enter a stuck state at an edge of the object 133 and may be able to swiftly run across the edge of the object 133. When the cleaning apparatus 10 moves and approaches an edge of the object 133 in a diagonal direction at the angle of approach set to 30 degrees or 60 degrees instead of 90 degrees, which is an angle in a usual case, the cleaning apparatus 10 may be able to climb over the edge of the object 133, as illustrated in
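As a sketch, an operation control mode can be represented as a small record of the three adjustable operations; the concrete values below are illustrative examples, not prescribed settings:

```python
from dataclasses import dataclass

@dataclass
class OperationControlMode:
    speed: str = "medium"         # "medium" in usual cleaning; "high" to run across
    approach_deg: float = 90.0    # 90 degrees in a usual case; 30/45/60 = diagonal
    side_brushes_on: bool = True  # stopping rotation may avoid snagging the edge

# Example alternatives for re-cleaning an object that created a stuck state.
FASTER = OperationControlMode(speed="high")
DIAGONAL = OperationControlMode(approach_deg=45.0)
BRUSHES_OFF = OperationControlMode(side_brushes_on=False)
```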
The controller 70 may automatically select one of the operation control modes illustrated in
After step S302 in the operation described above, the following operations can be performed, as illustrated in
(f1) In step S302, the controller 70 accepts selection of the second movement mode in a state where the first display screen is displayed.
(f2) Subsequently, in step S302c, the controller 70 selects one operation control mode from among the operation control modes as a first operation control mode.
Subsequently, in step S302d, in the controller 70, the travel control unit 466 controls the drivers 30 to drive the cleaning apparatus main body 20 in accordance with the first operation control mode and the second movement mode.
(f3) Subsequently, in step S302e, in a case where the number-of-revolutions sensor 455 detects the cleaning apparatus main body 20 entering a stuck state, and thereafter, the coming-off detecting switches 75 detect the cleaning apparatus main body 20 exiting the stuck state, the controller 70 selects an operation control mode different from the first operation control mode from among the operation control modes as a second operation control mode.
Subsequently, in step S302f, in the controller 70, the travel control unit 466 controls the drivers 30 to drive the cleaning apparatus main body 20 in accordance with the second operation control mode and the second movement mode.
Here, the second operation control mode is different from the first operation control mode in the movement speed, the movement direction, or rotation or stop of the side brushes 44 of the cleaning apparatus main body 20.
After step S302f in the operation described above, the following operations can be performed, as illustrated in
(f4) In step S302f, the controller 70 drives the cleaning apparatus main body 20 in accordance with the second operation control mode and the second movement mode.
Subsequently, in step S302g, in a case where the controller 70 detects the cleaning apparatus main body 20 entering a stuck state on the basis of a detection value obtained by the number-of-revolutions sensor 455, the controller 70 changes the second movement mode to the first movement mode.
Subsequently, in step S302h, the controller 70 may control the drivers 30 to drive the cleaning apparatus main body 20.
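Steps (f1) to (f4) amount to a small retry strategy, sketched below with hypothetical controller methods; the disclosure specifies the ordering, not this interface:

```python
def drive_with_fallback(controller, modes):
    # Assumes at least two operation control modes are available.
    first = modes[0]                            # step S302c: pick a first mode
    controller.drive("second", first)           # step S302d: second movement mode
    if controller.stuck_then_exited():          # step S302e: stuck, then exit
        second = next(m for m in modes if m is not first)
        controller.drive("second", second)      # step S302f: retry, new mode
        if controller.stuck_detected():         # step S302g: stuck again
            controller.drive("first", second)   # step S302h: fall back to the
                                                # first movement mode
```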
Subsequently, determination of a movement path by the controller 70 in step S604 is performed similarly to step S300.
Subsequently, control by the controller 70 in step S605 is performed similarly to step S400.
According to the embodiment described above, movement modes that take into consideration an order of cleaning of the object 133 having a possibility of creating a stuck state and cleaning of a portion other than the object 133 can be generated and provided to a user.
The present disclosure has been described with reference to the embodiment and modifications described above; however, the present disclosure is not limited to the embodiment and modifications described above, as a matter of course. The following cases are included in the present disclosure.
In the embodiment or modifications of the present disclosure, the planar shape of the cleaning apparatus 10 is not limited to a Reuleaux triangle or a Reuleaux polygon, and may be a round shape as indicated by a cleaning apparatus 100 illustrated in
Specifically, the controller 70 is, in part or in whole, a computer system constituted by, for example, a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, and a mouse. To the RAM or the hard disk unit, a computer program is recorded. When the microprocessor operates in accordance with the computer program, the function of each unit is implemented. The computer program is composed of instruction codes each indicating an instruction to be given to the computer for implementing a predetermined function.
For example, a program executing unit, such as a CPU, can read and execute a software program recorded to a recording medium, such as a hard disk or a semiconductor memory, to thereby implement each constituent element. Software for implementing some or all of the elements constituting the controller 70 in the embodiment or modifications described above is a program for causing a computer to perform a cleaning method for an autonomous mobile cleaning apparatus, the method including: (a) obtaining information about a first target object having a possibility of putting a main body included in the autonomous mobile cleaning apparatus into a stuck state, the main body including a suction unit; and (b) accepting information indicating that the first target object is to be set as a cleaning target object, and thereafter, causing a display included in the autonomous mobile cleaning apparatus to display a first display screen that allows selection of a first movement mode or a second movement mode. The autonomous mobile cleaning apparatus is caused to climb over the first target object in a first process, the autonomous mobile cleaning apparatus is caused to clean a first area except for the first target object in a second process, the second process is performed prior to the first process in the first movement mode, and the first process is performed prior to the second process in the second movement mode.
This program may be downloaded from, for example, a server and executed, or may be recorded to a predetermined recording medium (for example, an optical disk or a magnetic disk, such as a CD-ROM, or a semiconductor memory), read from the recording medium, and executed.
The program may be executed by a single computer or plural computers. That is, central processing or distributed processing may be performed.
Any of the embodiments or modifications described above can be combined as appropriate to produce the effects of each of the combined embodiments or modifications. Embodiments can be combined, examples can be combined, or any embodiment and any example can be combined. Further, features in different embodiments or different examples can be combined.
The autonomous mobile cleaning apparatus, the cleaning method that is performed by the autonomous mobile cleaning apparatus, and the program for the autonomous mobile cleaning apparatus according to the present disclosure are applicable to autonomous mobile cleaning apparatuses, cleaning methods that are performed by autonomous mobile cleaning apparatuses, and programs for autonomous mobile cleaning apparatuses that are used in various environments in addition to autonomous mobile cleaning apparatuses for home use and autonomous mobile cleaning apparatuses for professional use.
Number | Date | Country | Kind
---|---|---|---
2017-191438 | Sep 2017 | JP | national
2018-065568 | Mar 2018 | JP | national