Embodiments described herein relate generally to a vacuum cleaner including a plurality of image pickup means for picking up images on a traveling-direction side of a main casing.
Conventionally, a so-called autonomous-traveling type vacuum cleaner (cleaning robot) which cleans a floor surface as a cleaning-object surface while autonomously traveling on the floor surface has been known.
Such a vacuum cleaner is required to avoid obstacles during its traveling. For this reason, the vacuum cleaner uses sensors, such as ultrasonic sensors and infrared sensors, to detect obstacles which obstruct traveling. However, while such sensors can detect the presence of an obstacle, it is not easy for them to detect its size and shape.
In such an autonomous-traveling type vacuum cleaner, when the size and shape of an obstacle cannot be detected, the vacuum cleaner is forced to travel so as to avoid even areas in which it could essentially travel. As a result, the area to be cleaned is limited and cleaning becomes inefficient.
Therefore, in autonomous-traveling type vacuum cleaners, detection of the shape of obstacles is desired in order to enable smoother cleaning.
PTL 1: Japanese Laid-open Patent Publication No. 2007-163223
PTL 2: Japanese Laid-open Patent Publication No. 2013-235351
An object of the present invention is to provide a vacuum cleaner capable of detecting the shape of an object with high precision.
The vacuum cleaner according to the embodiment has a main casing, driving wheels, a control unit, a plurality of cameras, a distance image generation part, and a shape acquisition part. The driving wheels enable the main casing to travel. The control unit controls drive of the driving wheels to thereby make the main casing autonomously travel. The cameras are disposed apart from each other in the main casing to pick up images on a traveling-direction side of the main casing. The distance image generation part generates a distance image of an object positioned on the traveling-direction side based on the images picked up by the cameras. The shape acquisition part acquires shape information of the picked-up object from the distance image generated by the distance image generation part.
Hereinbelow, the constitution of an embodiment will be described with reference to the accompanying drawings.
The vacuum cleaner 11 of this embodiment is a so-called autonomous-traveling type vacuum cleaner (cleaning robot) which cleans a floor surface as a cleaning-object surface while autonomously traveling on the floor surface, and constitutes a vacuum cleaning apparatus in combination with a charging device 12 serving as a base device for charging the vacuum cleaner 11. The vacuum cleaner 11 is also enabled to communicate with a server 16 and an external device 17 via a network 15 such as the Internet, by performing communication with a home gateway 14 serving as a relay disposed in a cleaning area or the like.
Further, the vacuum cleaner 11 includes a hollow main casing 20, a traveling part 21 to make the main casing 20 travel on a floor surface, a cleaning unit 22 for cleaning dust and dirt on the floor surface or the like, a communication part 23 for performing communication with an external device including the charging device 12, an image pickup part 25 for picking up images, a sensor part 26, control means (a control unit) 27 which is a controller for controlling the traveling part 21, the cleaning unit 22, the communication part 23, the image pickup part 25 or the like, and a secondary battery 28 for supplying electric power to the traveling part 21, the cleaning unit 22, the communication part 23, the image pickup part 25, the sensor part 26, the control means 27 or the like. In addition, the following description will be given on the assumption that a direction extending along the traveling direction of the vacuum cleaner 11 (main casing 20) is assumed as a back-and-forth direction (directions of arrows FR and RR), and that a direction intersecting the back-and-forth direction is assumed as a left-and-right direction (widthwise direction).
The main casing 20 is formed into a flat columnar shape (disc shape) or the like from a synthetic resin, for example. That is, the main casing 20 includes a side surface portion 20a, an upper surface portion 20b, and a lower surface portion 20c which faces the floor surface and in which a suction port 31 and an exhaust port 32 are opened.
The traveling part 21 includes driving wheels 34, 34 as a plurality (pair) of driving parts, motors 35, 35 serving as driving means for driving the driving wheels 34, 34, a swing wheel 36 for swinging use, and the like.
Each of the driving wheels 34 makes the vacuum cleaner 11 (main casing 20) travel (autonomously travel) in an advancing direction and a retreating direction on the floor surface, that is, serves for traveling use, and the driving wheels 34, having an unshown rotational axis extending along a left-and-right widthwise direction, are disposed symmetrical to each other in the widthwise direction.
Each of the motors 35 is disposed in correspondence with each of the driving wheels 34, and is enabled to drive the corresponding driving wheel 34 independently of the other.
The swing wheel 36, which is positioned at a generally central and front portion of the lower surface portion 20c of the main casing 20 in the widthwise direction, is a driven wheel swingable along the floor surface.
The cleaning unit 22 includes an electric blower 41 which is positioned, for example, within the main casing 20 to suck dust and dirt along with air through the suction port 31 and discharge exhaust air through the exhaust port 32, a rotary brush 42 as a rotary cleaner which is rotatably attached to the suction port 31 to scrape up dust and dirt, a brush motor 43 for rotationally driving the rotary brush 42, side brushes 44, 44 as auxiliary cleaning parts rotatably attached to the main casing 20, side brush motors 45, 45 for driving the side brushes 44, 44, and a dust collecting unit 46 which communicates with the suction port 31 to accumulate dust and dirt.
The communication part 23 includes a wireless LAN device 47 as wireless communication means (a wireless communication part) for performing wireless communication with the external device 17 via the home gateway 14 and the network 15.
The wireless LAN device 47 performs transmission and reception of various types of information with the network 15 from the vacuum cleaner 11 via the home gateway 14.
The image pickup part 25 includes a plurality of cameras 51a, 51b as one and the other image pickup means (image pickup part bodies), for example, and a lamp 53, such as an LED, as illumination means (an illumination part) for providing illumination for these cameras 51a, 51b.
The cameras 51a, 51b are disposed in a front portion of the side surface portion 20a of the main casing 20, apart from each other on both sides of a center line L in the widthwise direction of the main casing 20, so as to pick up images on the traveling-direction side of the main casing 20.
The lamp 53 serves to emit illuminating light for image pickup by the cameras 51a, 51b, and is disposed at an intermediate position between the cameras 51a, 51b, that is, at a position on the center line L in the side surface portion 20a of the main casing 20. That is, the lamp 53 is distanced generally equally from the cameras 51a, 51b. Further, the lamp 53 is disposed at a generally equal position in the up-and-down direction, that is, a generally equal height position, to the cameras 51a, 51b. Accordingly, the lamp 53 is disposed at a generally center portion in the widthwise direction between the cameras 51a, 51b. In this embodiment, the lamp 53 is designed to emit light containing the visible light region.
The sensor part 26 includes, for example, a rotational speed sensor 55 for detecting a rotational speed of each of the driving wheels 34 (each of the motors 35); based on the detected rotational speeds, a position of the vacuum cleaner 11 (main casing 20) is detected.
The control means 27 is a microcomputer including, for example, a CPU which is a control means main body (control unit main body), a ROM which is a storage part in which fixed data such as programs to be read by the CPU are stored, and a RAM which is an area storage part for dynamically forming various memory areas such as a work area serving as a working region for data processing by programs (these component members are not shown). The control means 27 further includes, for example, a memory 61 which is storage means (a storage section) for storing image data picked up by the cameras 51a, 51b or the like, an image generation part 62 as distance image generation means (a distance image generation part) for calculating a distance (depth) to an object (feature point) from the cameras 51a, 51b based on images picked up by the cameras 51a, 51b and then generating a distance image based on the calculated distance, a shape acquisition part 63 as shape acquisition means for acquiring a shape of the object picked up in the distance image generated by the image generation part 62, a discrimination part 64 as discrimination means for discriminating control in accordance with the shape of the object acquired by the shape acquisition part 63, and an image processing part 65 as map generation means (a map generation part) for generating a map of a cleaning area based on obstacle discrimination implemented by the discrimination part 64 or the like. The control means 27 further includes a travel control part 66 for controlling operation of the motors 35, 35 (driving wheels 34, 34) of the traveling part 21, a cleaning control part 67 for controlling operation of the cleaning unit 22, an image pickup control part 68 for controlling the cameras 51a, 51b of the image pickup part 25, and an illumination control part 69 for controlling the lamp 53. The control means 27 has, for example, a traveling mode for driving the driving wheels 34, 34 (motors 35, 35), a charging mode for charging the secondary battery 28 via the charging device 12, and a standby mode applied during a standby state.
The memory 61 is, for example, a nonvolatile memory such as a flash memory for holding various types of stored data regardless of whether the vacuum cleaner 11 is powered on or off.
The image generation part 62 uses a known method to calculate a distance to an object (feature point) based on images picked up by the cameras 51a, 51b and the distance between the cameras 51a, 51b, and generates a distance image showing the calculated distance to the object (feature point). That is, the image generation part 62 applies triangulation, for example, based on the distance from the cameras 51a, 51b to an object (feature point) O and the distance between the cameras 51a, 51b.
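Purely as an illustration (not the embodiment's own implementation), the sketch below shows how such a stereo triangulation and distance image could be coded, assuming rectified images from two horizontally separated cameras, a pinhole camera model, and hypothetical names such as depth_from_disparity and distance_image:

```python
import numpy as np

def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Triangulate the depth (m) of a feature point seen at pixel columns
    x_left / x_right in two horizontally separated, rectified cameras.
    focal_px: focal length in pixels; baseline_m: camera separation in metres."""
    disparity = float(x_left - x_right)          # pixel shift between the two views
    if disparity <= 0:
        return float("inf")                      # no measurable shift -> effectively at infinity
    return focal_px * baseline_m / disparity     # classic stereo relation Z = f * B / d

def distance_image(disparity_map, focal_px, baseline_m, max_depth_m=5.0):
    """Convert a dense disparity map into an 8-bit 'distance image' whose
    gray level encodes the calculated distance (nearer = brighter)."""
    d = np.asarray(disparity_map, dtype=np.float32)
    depth = np.full(d.shape, np.inf, dtype=np.float32)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    depth = np.clip(depth, 0.0, max_depth_m)
    # map 0..max_depth_m to 255..0 so that close objects appear bright
    return (255 * (1.0 - depth / max_depth_m)).astype(np.uint8)
```

For instance, with an assumed focal length of 525 pixels and a 6 cm baseline, a 10-pixel disparity corresponds to a depth of about 3.15 m.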
The shape acquisition part 63 sets a specified distance D or a specified distance range DR with respect to the distance image generated by the image generation part 62, so as to acquire shape information of an object O positioned at the distance D or within the specified distance range DR, such as a width dimension and a height dimension of the object O and horizontal and up-and-down gaps between objects O.
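As a hedged sketch of this idea (the function name, the bounding-box simplification and the use of a dense metric depth map are assumptions, not details of the embodiment), shape information at a distance range could be extracted as follows:

```python
import numpy as np

def shape_at_range(depth_map_m, focal_px, d_min, d_max):
    """Return approximate metric width/height of whatever lies within the
    distance range [d_min, d_max] in a dense depth map (metres per pixel).
    A bounding box over the in-range pixels stands in for the object outline."""
    depth = np.asarray(depth_map_m, dtype=np.float32)
    mask = (depth >= d_min) & (depth <= d_max)
    if not mask.any():
        return None                                   # nothing at that distance
    rows, cols = np.nonzero(mask)
    px_w = cols.max() - cols.min() + 1                # object extent in pixels
    px_h = rows.max() - rows.min() + 1
    ref_depth = float(np.median(depth[mask]))         # representative distance
    metres_per_px = ref_depth / focal_px              # pinhole projection scale
    return {"distance_m": ref_depth,
            "width_m": px_w * metres_per_px,
            "height_m": px_h * metres_per_px}
```

A real implementation would usually separate individual objects (for example by connected-component labeling) before measuring them and before computing gaps between them; the single bounding box here is only a stand-in.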
The discrimination part 64 discriminates, based on shape information of an object at the specified distance or within the specified distance range and shape information such as that of a narrow space positioned between objects acquired by the shape acquisition part 63, whether or not the object is an obstacle for traveling or whether or not the object is one onto which the vacuum cleaner 11 (main casing 20) can run, and also discriminates whether there is a need to change travel control of the vacuum cleaner 11 (main casing 20) or not. Further, the discrimination part 64 may discriminate whether to change cleaning control, based on shape information of the object at the specified distance or within the specified distance range, and the shape information of the narrow space positioned between the objects or the like acquired by the shape acquisition part 63. Such discrimination will be detailed later.
The image processing part 65 calculates a cleaning area in which the vacuum cleaner 11 (main casing 20) is disposed and positional relations of objects or the like positioned within this cleaning area based on the shape information of the objects acquired by the shape acquisition part 63 and a position of the vacuum cleaner 11 (main casing 20) detected by the rotational speed sensor 55 of the sensor part 26, and generates a map. In addition, the image processing part 65 is not an essential element.
The travel control part 66 controls the magnitude and direction of the currents flowing through the motors 35, 35 to make each of the motors 35, 35 rotate forward or in reverse, and by controlling the drive of the motors 35, 35 in this way, it controls the drive of the driving wheels 34, 34.
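By way of an illustrative example only (the function and command names are hypothetical, not taken from the embodiment), driving the two wheels independently in forward or reverse can be summarized as a simple differential-drive command table:

```python
def wheel_commands(action, speed=1.0):
    """Map a high-level travel command to signed speed commands for the
    left/right driving wheels (positive = forward rotation).
    Turning in place ("swinging") is done by rotating the wheels in
    opposite directions; a curve would use different magnitudes."""
    table = {
        "forward":     ( speed,  speed),
        "reverse":     (-speed, -speed),
        "swing_left":  (-speed,  speed),   # rotate on the spot to the left
        "swing_right": ( speed, -speed),   # rotate on the spot to the right
        "stop":        (0.0, 0.0),
    }
    return table[action]
```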
The cleaning control part 67 controls conduction angles of the electric blower 41, the brush motor 43 and the side brush motors 45, independently of one another, to control the drive of the electric blower 41, the brush motor 43 (rotary brush 42) and the side brush motors 45 (side brushes 44).
The image pickup control part 68 includes a control circuit for controlling operation of shutters of the cameras 51a, 51b, and operates the shutters at specified time intervals, thus exerting control to pick up images by the cameras 51a, 51b at specified time intervals.
The illumination control part 69 controls turn-on and turn-off of the lamp 53 via a switch or the like. The illumination control part 69 in this embodiment includes a sensor for detecting brightness around the vacuum cleaner 11; it lights the lamp 53 when the brightness detected by the sensor is at or below a specified level, and otherwise keeps the lamp 53 unlit.
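A minimal sketch of this threshold rule (the function name and the 50 lux value are arbitrary placeholders, not figures from the embodiment):

```python
def lamp_should_be_lit(brightness_lux, threshold_lux=50.0):
    """Lamp on when the measured ambient brightness is at or below the
    threshold, off otherwise."""
    return brightness_lux <= threshold_lux
```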
Also, the secondary battery 28 is electrically connected to charging terminals 71, 71 as connecting parts exposed on both sides of a rear portion in the lower surface portion 20c of the main casing 20; with the charging terminals 71, 71 connected to the terminals for charging of the charging device 12, the secondary battery 28 is charged via the charging device 12.
The home gateway 14 is installed inside a building, for example, and is connected to the network 15.
The server 16 is a computer (cloud server) connected to the network 15 and is capable of storing therein various types of data.
The external device 17 is a general-purpose device, for example a PC (tablet terminal (tablet PC)) 17a or a smartphone (mobile phone) 17b, which is enabled to make wired or wireless communication with the network 15 via the home gateway 14, for example, inside a building, and enabled to make wired or wireless communication with the network 15 outside the building. This external device 17 has at least a display function of displaying images.
Next, operation of the above-described embodiment will be described.
In general, work of a vacuum cleaner device is roughly divided into cleaning work for carrying out cleaning by the vacuum cleaner 11, and charging work for charging the secondary battery 28 with the charging device 12. The charging work is implemented by a known method using a charging circuit, such as a constant current circuit contained in the charging device 12, for example. Accordingly, only the cleaning work will be described below. Also, image pickup work for picking up an image of a specified object by at least one of the cameras 51a, 51b in response to an instruction from the external device 17 or the like may be included.
In the vacuum cleaner 11, at a timing such as the arrival of a preset cleaning start time or the reception of a cleaning-start instruction signal transmitted by a remote control or the external device 17, the control means 27 is switched over from the standby mode to the traveling mode, and the control means 27 (travel control part 66) drives the motors 35, 35 (driving wheels 34, 34) to make the vacuum cleaner 11 move away from the charging device 12 by a specified distance.
Then, in the vacuum cleaner 11, the image processing part 65 generates a map of the cleaning area. When generating the map, in overview, the vacuum cleaner 11 calculates distances to objects present in the images picked up by the cameras 51a, 51b while traveling along an outer wall of the cleaning area or the like and turning at that position, and discriminates walls and obstacles based on the calculated distances to generate a map based on the current position of the vacuum cleaner 11 (map generation mode). When the control means 27 has discriminated that the whole cleaning area has been mapped, it switches over from the map generation mode to a cleaning mode which will be described later. In addition, once the map has been generated and stored in the memory 61 or the like, it may simply be read from the memory 61 for the next and subsequent cleaning, eliminating the need to generate a map for each cleaning run. However, in view of cases where a cleaning area different from the map stored in the memory 61 is to be cleaned, or where the layout of objects in the same cleaning area has changed, the map may be regenerated as required, for example in response to a user's instruction or at specified intervals, or the once-generated map may be updated from time to time based on distance measurement of objects during the cleaning work.
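Purely as an illustrative sketch (the class name, grid size and use of an occupancy grid are assumptions, not details of the embodiment), such a map could be represented as a grid into which detected objects are written using the robot pose estimated from the wheel rotational-speed sensor:

```python
import numpy as np

class CleaningMap:
    """Tiny occupancy-grid sketch: a cell is marked as obstacle when an object
    is observed at a given range/bearing from the robot's current pose
    (the pose itself assumed to come from wheel rotational-speed odometry)."""
    def __init__(self, size_m=10.0, cell_m=0.05):
        n = int(size_m / cell_m)
        self.cell_m = cell_m
        self.grid = np.zeros((n, n), dtype=np.uint8)   # 0 = free/unknown, 1 = obstacle
        self.origin = n // 2                           # robot starts at the grid centre

    def mark_obstacle(self, robot_x_m, robot_y_m, robot_yaw_rad, range_m, bearing_rad):
        # project the observation from robot coordinates into world coordinates
        ox = robot_x_m + range_m * np.cos(robot_yaw_rad + bearing_rad)
        oy = robot_y_m + range_m * np.sin(robot_yaw_rad + bearing_rad)
        i = self.origin + int(round(oy / self.cell_m))
        j = self.origin + int(round(ox / self.cell_m))
        if 0 <= i < self.grid.shape[0] and 0 <= j < self.grid.shape[1]:
            self.grid[i, j] = 1
```

For example, CleaningMap().mark_obstacle(0.0, 0.0, 0.0, 1.2, 0.1) would record an object about 1.2 m ahead of the starting pose.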
Next, based on the generated map, the vacuum cleaner 11 performs cleaning while autonomously traveling in the cleaning area (cleaning mode). During this autonomous traveling, in overview, the vacuum cleaner 11 calculates distances to objects in the images picked up by the cameras 51a, 51b while traveling forward, discriminates walls and obstacles based on the calculated distances and the generated map, and performs cleaning by the cleaning unit 22 while traveling so as to avoid those walls and obstacles.
To explain in detail referring to the flowchart, first, the control means 27 (image pickup control part 68) causes the cameras 51a, 51b to pick up images of the traveling-direction side of the vacuum cleaner 11 (main casing 20) (step 1); the image generation part 62 then calculates the distance to an object (feature point) based on the picked-up images and the distance between the cameras 51a, 51b and generates a distance image, and the shape acquisition part 63 acquires the shape information of the object from the generated distance image (steps 2 to 5).
Then, the discrimination part 64 discriminates whether or not an object is present ahead of the vacuum cleaner 11 (main casing 20) at the specified distance or within the specified distance range, based on the shape information of the object acquired by the shape acquisition part 63 (step 6). Specifically, the discrimination part 64 discriminates whether or not at least a part of the object is positioned within a specified image range in the distance image, based on the width dimension and height dimension of the object and the horizontal or up-and-down gaps between objects acquired by the shape acquisition part 63. The image range corresponds to the external shape (up-and-down and left-and-right magnitudes) that the vacuum cleaner 11 (main casing 20) would occupy if it were positioned at the specified distance D from the cameras 51a, 51b, or at a specified position within the specified distance range DR (for example, at the position closest to the cameras 51a, 51b within the specified distance range DR). Accordingly, the presence of an object within this image range at the specified distance or within the specified distance range means that an object would be positioned on the path of the vacuum cleaner 11 (main casing 20) if the vacuum cleaner 11 (main casing 20) traveled forward as is.
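As a rough sketch of this step-6 test (assuming a dense metric depth map and a principal point at the image centre; the names and the rectangular silhouette approximation are illustrative, not the embodiment's actual processing):

```python
import numpy as np

def object_in_path(depth_map_m, focal_px, cx, cy, d_check, robot_w_m, robot_h_m):
    """Does anything at distance <= d_check fall inside the image rectangle
    that the robot's own silhouette would occupy if it stood d_check metres
    in front of the cameras?  cx, cy: principal point (pixels)."""
    depth = np.asarray(depth_map_m, dtype=np.float32)
    half_w_px = int(0.5 * robot_w_m * focal_px / d_check)   # silhouette half-width at d_check
    half_h_px = int(0.5 * robot_h_m * focal_px / d_check)   # silhouette half-height at d_check
    roi = depth[max(cy - half_h_px, 0): cy + half_h_px,
                max(cx - half_w_px, 0): cx + half_w_px]
    return bool(np.any((roi > 0) & (roi <= d_check)))       # True -> something is in the path
```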
In this step 6, upon discriminating that no object is present, it is assumed that there is space ahead where the vacuum cleaner 11 (main casing 20) can travel. So, the discrimination part 64 discriminates whether or not the space is large, that is, whether or not there is a narrow space ahead of the vacuum cleaner 11 (main casing 20) based on the shape information of the object acquired by the shape acquisition part 63 (step 7). Specifically, the discrimination part 64 discriminates that a narrow space is positioned when the horizontal gap between the objects and the up-and-down gap between the objects acquired by the shape acquisition part 63 are respectively within their corresponding specified ranges. In more detail, the discrimination part 64 discriminates that a narrow space is positioned when at least one of the difference between the horizontal gap between the objects acquired by the shape acquisition part 63 and the width dimension of the vacuum cleaner 11 (main casing 20), and the difference between the up-and-down gap between the objects acquired by the shape acquisition part 63 and the height dimension of the vacuum cleaner 11 (main casing 20) is less than its corresponding specified value. Then, the control means 27 (travel control part 66) controls the traveling speed of the vacuum cleaner 11 (main casing 20) in accordance with the discrimination.
Specifically in step 7, upon discriminating that a narrow space is not present ahead of the vacuum cleaner 11 (main casing 20) (neither of the differences described above is less than its corresponding specified value (both are equal to or above their corresponding specified values)), since it is assumed that there is wide enough space ahead of the vacuum cleaner 11 (main casing 20), or that no object is present ahead of the vacuum cleaner 11 (main casing 20), the traveling speed of the vacuum cleaner 11 (main casing 20) is set to a first speed which is a relatively high speed (step 8), and processing is returned to step 1. When performing the travel control in step 8, the control means 27 (travel control part 66) controls the drive of the motors 35, 35 (driving wheels 34, 34) so that the vacuum cleaner 11 (main casing 20) travels at the first speed. Accordingly, in the case where the vacuum cleaner 11 (main casing 20) is already traveling at the first speed, processing is returned to step 1 without any change in traveling speed.
Also, in step 7, upon discriminating that there is a narrow space ahead of the vacuum cleaner 11 (main casing 20) (at least one of the differences described above is less than its corresponding specified value), since it is assumed that the vacuum cleaner 11 (main casing 20) can enter the narrow space positioned ahead of it, the traveling speed of the vacuum cleaner 11 (main casing 20) is set to a second speed which is a relatively low speed (step 9), and processing is returned to step 1. When performing the travel control in step 9, the control means 27 (travel control part 66) controls the drive of the motors 35, 35 (driving wheels 34, 34) so that the vacuum cleaner 11 (main casing 20) travels at the second speed, so that it can enter the narrow space, which leaves little extra clearance relative to the external shape of the vacuum cleaner 11 (main casing 20), while traveling carefully so as not to collide with the objects surrounding the narrow space. Accordingly, in the case where the vacuum cleaner 11 (main casing 20) is already traveling at the second speed, processing is returned to step 1 without any change in traveling speed.
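The decision of steps 7 to 9 can be sketched as follows (the 10 cm margin and the two speed values are placeholders, not values from the embodiment; treating an impassable gap separately is also an assumption made only for this sketch):

```python
def select_speed(gap_w_m, gap_h_m, robot_w_m, robot_h_m,
                 margin_m=0.10, high_speed=0.3, low_speed=0.1):
    """If the horizontal or up-and-down clearance of the space ahead exceeds
    the robot's dimensions by less than `margin_m`, treat it as a narrow space
    and pick the lower speed; otherwise pick the higher speed."""
    if gap_w_m < robot_w_m or gap_h_m < robot_h_m:
        return None                      # cannot enter at all -> handled as an obstacle elsewhere
    narrow = (gap_w_m - robot_w_m < margin_m) or (gap_h_m - robot_h_m < margin_m)
    return low_speed if narrow else high_speed
```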
On the other hand, in step 6, upon discriminating that an object is present, two cases are assumed: one is the case where the object positioned ahead of the vacuum cleaner 11 (main casing 20) is an obstacle for traveling, and the other is the case where it is not. Accordingly, the discrimination part 64 discriminates whether or not the object is an obstacle based on the shape information of the object acquired by the shape acquisition part 63 (step 10). Specifically, the discrimination part 64 discriminates whether or not the object is an obstacle based on the height dimension of the object acquired by the shape acquisition part 63. In more detail, when the height dimension of the object acquired by the shape acquisition part 63 is equal to or higher than a specified height, the discrimination part 64 discriminates that the object is an obstacle. This specified height is set to, for example, a height onto which the vacuum cleaner 11 (main casing 20) can run without becoming stranded.
Then, in step 10, upon discriminating that the object is not an obstacle (the height dimension of the object is less than the specified height), since the object is assumed to be a relatively low object onto which the vacuum cleaner 11 (main casing 20) can run, for example an entrance mat, or a rug or carpet partially laid in the cleaning area, the discrimination part 64 causes the control means 27 (cleaning control part 67) to change the cleaning control (step 11), processing is returned to step 1, and the vacuum cleaner 11 (main casing 20) runs onto the object without avoiding it. As for the change of the cleaning control in step 11, the control means 27 (cleaning control part 67) changes, for example, the input to the electric blower 41 of the cleaning unit 22 (increasing or decreasing it), and the on/off state or drive speed of the rotary brush 42 (brush motor 43) and/or the side brushes 44, 44 (side brush motors 45, 45). For example, in this step 11, the control means 27 increases the input to the electric blower 41 and increases the drive speed of the rotary brush 42 (brush motor 43), thereby improving cleaning performance. In addition, the setting of the cleaning control may be varied by providing a plurality of threshold values for the height dimension of objects, that is, depending on whether the height dimension of an object is large (thick) or small (thin).
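A hedged sketch of this step-11 adjustment (the 2 cm threshold and the setting names are illustrative placeholders, not values from the embodiment):

```python
def cleaning_settings(object_height_m, climbable_height_m=0.02):
    """When the object ahead is low enough to run onto (e.g. a rug), boost
    suction and brush speed; otherwise keep the normal settings."""
    if object_height_m < climbable_height_m:
        return {"blower_input": "high", "brush_speed": "high", "side_brushes": "on"}
    return {"blower_input": "normal", "brush_speed": "normal", "side_brushes": "on"}
```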
On the other hand, in step 10, upon discriminating that the object is an obstacle (the height dimension of the object is equal to or higher than the specified height), since it is assumed that there is an object to be avoided, or a narrow space which the vacuum cleaner 11 (main casing 20) cannot enter, ahead of the vacuum cleaner 11 (main casing 20), the discrimination part 64 causes the control means 27 (travel control part 66) to change the traveling direction of the vacuum cleaner 11 (main casing 20) (step 12), and processing is returned to step 1. The change of the traveling direction includes appropriately changing the traveling direction for the avoidance operation in accordance with the shape information of the object acquired by the shape acquisition part 63, and selecting one or more specified avoidance operation routines set in advance. As for the avoidance operation, for example, the control means 27 (travel control part 66) controls the drive of the motors 35, 35 (driving wheels 34, 34) to temporarily stop the vacuum cleaner 11 (main casing 20) in front of the object and make the vacuum cleaner 11 (main casing 20) swing at the stopped position, or at a position to which it retreats by a specified distance, so that the vacuum cleaner 11 (main casing 20) avoids the object O while traveling in a crank shape.
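One such pre-set routine could, as a sketch only (all distances, angles and primitive names are hypothetical, not the embodiment's actual routine), be expressed as a list of motion primitives tracing a crank-shaped detour:

```python
def crank_avoidance(side="left", sidestep_m=0.3, clear_m=0.5):
    """Stop, swing to one side, pass the obstacle, then swing back onto the
    original heading so the path traces a crank shape.
    Returns (primitive, amount) pairs; angles are degrees to rotate in place,
    distances are metres to travel forward."""
    turn = "swing_left" if side == "left" else "swing_right"
    back = "swing_right" if side == "left" else "swing_left"
    return [("stop", 0.0),
            (turn, 90.0),
            ("forward", sidestep_m),   # step sideways past the obstacle
            (back, 90.0),
            ("forward", clear_m),      # travel past the obstacle
            (back, 90.0),
            ("forward", sidestep_m),   # return to the original line of travel
            (turn, 90.0)]
```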
As a result, while autonomously traveling all over the floor surface of the cleaning area, including the narrow space NS, and avoiding obstacles, the vacuum cleaner 11 (main casing 20) performs cleaning of the floor surface by the cleaning unit 22.
As for the cleaning unit 22, dust and dirt on the floor surface are collected into the dust collecting unit 46 via the suction port 31 by the electric blower 41, the rotary brush 42 (brush motor 43) or the side brushes 44 (side brush motors 45) driven by the control means 27 (cleaning control part 67). Then, in the case where the cleaning of the mapped cleaning area ends, or in a specified condition such as when the capacity of the secondary battery 28 has decreased during the cleaning work to a specified level that is insufficient for completing cleaning or image pickup (for example, the voltage of the secondary battery 28 has decreased to around a discharge termination voltage), the control means 27 (travel control part 66) controls the operation of the motors 35, 35 (driving wheels 34, 34) to make the vacuum cleaner 11 return to the charging device 12. Thereafter, when the charging terminals 71, 71 and the terminals for charging of the charging device 12 are docked together, the cleaning work is ended and the control means 27 is switched over to the standby mode or the charging mode.
In addition, the data of images stored in the memory 61 are transmitted to the server 16 via the home gateway 14 and the network 15 by the wireless LAN device 47, for example upon a return of the vacuum cleaner 11 to the charging device 12, from time to time during the cleaning work, at specified time intervals, or in the event of a request from the external device 17. If data that have been completely transmitted are deleted from the memory 61 or overwritten when new data are stored, the capacity of the memory 61 can be used efficiently.
The server 16 is enabled to store image data transmitted from the vacuum cleaner 11 and the image data may be downloaded in response to a request (access) from the external device 17.
Then, on the external device 17, an image downloaded from the server 16 is displayed.
In accordance with the above-described embodiment, the distance image of objects positioned on the traveling-direction side of the vacuum cleaner 11 (main casing 20) is generated by the image generation part 62 based on the images picked up by the cameras 51a, 51b, and the shape information of the picked-up objects is acquired by the shape acquisition part 63 from the generated distance image, which makes it possible to detect not only the presence of objects but also their shapes with high precision.
Also, the shape acquisition part 63 acquires the horizontal gaps between objects positioned on the traveling-direction side of the vacuum cleaner 11 (main casing 20), which makes it possible to easily discriminate whether or not there is a space in the horizontal direction which the vacuum cleaner 11 (main casing 20) can enter.
Similarly, the shape acquisition part 63 acquires the up-and-down gaps between objects positioned on the traveling-direction side of the vacuum cleaner 11 (main casing 20), which makes it possible to easily discriminate whether or not there is a space in the up-and-down direction which the vacuum cleaner 11 (main casing 20) can enter.
Therefore, based on the acquired shape information, such as the horizontal and up-and-down gaps, the shape acquisition part 63 acquires whether or not a narrow space NS is positioned on the traveling-direction side of the vacuum cleaner 11 (main casing 20).
Then, the control means 27 controls the operation of the motors 35, 35 (driving wheels 34, 34) so as to control the traveling of the vacuum cleaner 11 (main casing 20) in accordance with the shape information acquired by the shape acquisition part 63. This makes it possible to avoid, before a collision occurs, any obstacle which obstructs the traveling, and to make the vacuum cleaner 11 (main casing 20) travel along the periphery of the obstacle without taking an unnecessarily long detour to avoid it, resulting in improved traveling performance. This allows thorough cleaning of the cleaning area, resulting in improved cleaning performance.
Specifically, the control means 27 controls the operation of the motors 35, 35 (driving wheels 34, 34) so as to control the traveling speed of the vacuum cleaner 11 (main casing 20) in accordance with the shape information of the object acquired by the shape acquisition part 63. For example, in the case where there is relatively large space ahead of the vacuum cleaner 11 (main casing 20), that is, where no obstacle which obstructs traveling is present, such control can make the vacuum cleaner 11 (main casing 20) travel at a relatively high speed for efficient cleaning, while in the case where the vacuum cleaner 11 (main casing 20) enters the narrow space NS or the like, it can make the vacuum cleaner 11 (main casing 20) travel carefully at a relatively low speed so as not to collide with the objects surrounding the narrow space NS.
Similarly, the control means 27 controls the operation of the motors 35, 35 (driving wheels 34, 34) so as to control the traveling direction of the vacuum cleaner 11 (main casing 20) in accordance with the shape information of the object acquired by the shape acquisition part 63. Such control can provide, for example, different avoidance operation or the like in accordance with the shape of the obstacle positioned ahead of the vacuum cleaner 11 (main casing 20).
Accordingly, the travel control of the vacuum cleaner 11 (main casing 20), which is suitable for the shape of an object, can be performed.
Also, in the case where the shape acquisition part 63 acquires, from the distance image, the shape information of an object at the specified distance, the travel control can be performed so that the vacuum cleaner 11 (main casing 20) does not collide with the object positioned at that distance when the vacuum cleaner 11 (main casing 20) travels from the current position to a position apart by the specified distance.
Further, in the case where the shape acquisition part 63 acquires, from the distance image, the shape of an object within the specified distance range, the travel control can be performed so that the vacuum cleaner 11 (main casing 20) does not collide with the object positioned within that distance range when the vacuum cleaner 11 (main casing 20) travels from the current position to a position where its front part reaches the specified distance range.
Also, the control means 27 controls the operation of the cleaning unit 22 in accordance with the shape information of the object acquired by the shape acquisition part 63. Such control can improve cleaning efficiency by providing cleaning conditions suitable for a floor surface upon discriminating that the type of the floor surface is changed, such as when discriminating that the vacuum cleaner 11 (main casing 20) runs onto, for example, a rug, a carpet or the like, based on the shape information of the object acquired by the shape acquisition part 63.
In addition, in the above-described embodiment, the travel control of the vacuum cleaner 11 (main casing 20) by the control means 27 (travel control part 66) based on the shape information of the object acquired by the shape acquisition part 63 is not limited to the form described above, and may be set arbitrarily in accordance with a shape of an object.
Also, the cameras 51a, 51b may be disposed at generally equal positions in the left-and-right direction on the main casing 20 (side surface portion 20a), that is, disposed one above the other.
Further, at least one of the cameras 51a, 51b may also be an infrared camera for picking up images of infrared regions.
Also, as for image display, in addition to a constitution in which the control means 27 performs processing for enabling image display on the external device 17, image display on the external device 17 may be enabled by an exclusive-use program (application) installed in the external device 17, or preprocessing may be performed in advance by the control means 27 or the server 16 so that image display is enabled by a general-purpose program such as a browser of the external device 17. That is, a constitution may be adopted in which display control means (a display control part) implements the image display through a program stored in the server 16, a program installed in the external device 17, or the like.
Further, although the data of images or the like temporarily stored in the memory 61 are transmitted to and stored in the server 16, the data may instead be kept in the memory 61 as they are or stored in the external device 17.
Also, images picked up by the cameras 51a, 51b or distance images generated by the image generation part 62 may also be displayed, for example, on a display part provided in the vacuum cleaner 11 itself without being limited to the external device 17. In this case, there is no need to transmit data from the memory 61 to the server 16 via the home gateway 14 and the network 15, allowing the constitution and control of the vacuum cleaner 11 to be further simplified.
Further, although the image generation part 62, the shape acquisition part 63, the discrimination part 64, the cleaning control part 67, the image pickup control part 68 and the illumination control part 69 are each provided in the control means 27, these members may also be provided as independent members, or two or more of them may be combined arbitrarily with one another.
Further, the distance calculation by the image generation part 62 may be applied not only during cleaning work but also to any arbitrary use during traveling of the vacuum cleaner 11 (main casing 20).
Three or more image pickup means may also be provided. That is, it is sufficient that a plurality of image pickup means are provided, regardless of the number of units.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
A control method for a vacuum cleaner in which a main casing having a plurality of image pickup means is enabled to autonomously travel, the plurality of image pickup means being disposed apart from each other for picking up images on a traveling-direction side, the method comprising the steps of: generating a distance image of an object positioned on the traveling-direction side based on images picked up by the plurality of image pickup means; and acquiring shape information of the picked-up object from the generated distance image.
The control method for a vacuum cleaner as described above, comprising the step of acquiring shape information of an object at a specified distance acquired from the distance image.
The control method for a vacuum cleaner as described above, comprising the step of acquiring shape information of an object within a specified distance range acquired from the distance image.
The control method for a vacuum cleaner as described above, comprising the step of acquiring whether or not a narrow space is positioned on the traveling-direction side of the main casing based on the shape information of the object.
The control method for a vacuum cleaner as described above, comprising the step of acquiring a horizontal gap to objects positioned on the traveling-direction side of the main casing.
The control method for a vacuum cleaner as described above, comprising the step of acquiring an up-and-down gap to an object positioned on the traveling-direction side of the main casing.
The control method for a vacuum cleaner as described above, comprising the step of controlling traveling of the main casing in accordance with the shape information of the object acquired by shape acquisition means.
The control method for a vacuum cleaner as described above, comprising the step of controlling traveling speed of the main casing in accordance with the shape information of the object acquired by the shape acquisition means.
The control method for a vacuum cleaner as described above, comprising the step of controlling a traveling direction of the main casing in accordance with the shape information of the object acquired by the shape acquisition means.
The control method for a vacuum cleaner as described above, comprising the step of controlling operation of a cleaning unit in accordance with the acquired shape information of the object.
Number | Date | Country | Kind
---|---|---|---
2015-200186 | Oct 2015 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2016/079290 | 10/3/2016 | WO | 00