The present invention relates generally to robotic systems for use in human environments, and more particularly to automated and semi-automated systems for assisting in the organization of human-scale objects, which may be contained in structures such as movable bins.
Personal robots, such as those available under the trade names Roomba (RTM) and PR2 (RTM) from suppliers such as iRobot (RTM) and Willow Garage (RTM), respectively, have been utilized in human environments to assist with human-scale tasks such as vacuuming and grasping various items. Neither of these personal robotic systems, however, nor others that are available, is well suited for operating in human environments such as elderly care facilities, hotels, or hospitals in a manner wherein the system may be utilized to move objects around in a highly efficient manner via the incorporation and use of containers, such as plastic or metal bins, to isolate and carry groups of objects. In particular, there is a need for reliable and controllable systems that are capable of autonomous, semi-autonomous, and/or teleoperational activity in such environments, wherein an objective is the movement of other human-scale objects, such as almost any object or objects of reasonable mass and/or size that may be placed in a bin and that may otherwise be manipulated and carried manually by a human. The embodiments described herein are intended to meet these and other objectives.
One embodiment is directed to a personal robotic system, comprising: an electromechanical mobile base defining a cross-sectional envelope when viewed in a plane substantially parallel to a plane of a floor upon which the mobile base is operated; a torso assembly movably coupled to the mobile base; a head assembly movably coupled to the torso assembly; a releasable bin-capturing assembly movably coupled to the torso assembly; and a controller operatively coupled to the mobile base, torso assembly, head assembly, and bin-capturing assembly, and configured to capture a bin with the bin-capturing assembly and move the torso assembly relative to the mobile base so that the captured bin fits as closely as possible within the cross-sectional envelope of the mobile base. The system further may comprise a sensor operatively coupled to the controller and configured to sense one or more factors regarding an environment in which the mobile base is navigated. The sensor may comprise a sonar sensor. The sonar sensor may be coupled to the mobile base. The sensor may comprise a laser rangefinder. The laser rangefinder may be configured to scan a forward field of view that is greater than about 90 degrees. The laser rangefinder may be configured to scan a forward field of view that is about 180 degrees. The sensor may comprise an image capture device. The image capture device may comprise a 3-D camera. The image capture device may be coupled to the head assembly. The image capture device may be coupled to the mobile base. The image capture device may be coupled to the releasable bin-capturing assembly. The image capture device may be coupled to the torso assembly. The mobile base may comprise a differential drive configuration having two driven wheels. Each of the driven wheels may be operatively coupled to an encoder that is operatively coupled to the controller and configured to provide the controller with input information regarding a driven wheel position. The controller may be configured to operate the driven wheels to navigate the mobile base based at least in part upon the input information from the driven wheel encoders. The controller may be configured to operate the mobile base based at least in part upon signals from the sensor. The torso assembly may be movably coupled to the mobile base such that the torso assembly may be controllably elevated and lowered along an axis substantially perpendicular to the plane of the floor. The torso assembly may be movably coupled to the mobile base such that the torso assembly may be controllably moved along an axis substantially parallel to the plane of the floor. The head assembly may comprise an image capture device. The image capture device may comprise a 3-D camera. The image capture device may be movably coupled to the head assembly such that it may be controllably panned or tilted relative to the head assembly. The bin-capturing assembly may comprise an under-ledge capturing surface configured to be interfaced with a ledge geometry feature of the bin. The capturing surface may comprise a rail. The rail and the ledge geometry feature of the bin may be substantially straight. The system further may comprise a wireless transceiver configured to enable a teleoperating operator to remotely connect with the controller from a remote workstation, and to operate at least the mobile base.
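By way of non-limiting illustration, the differential-drive navigation described above may be sketched in code. The following Python sketch shows one way a controller could integrate driven-wheel encoder input into a pose estimate on the floor plane; the class name, wheel radius, encoder resolution, and track width are hypothetical illustration values that do not come from this specification.

```python
import math

class DifferentialDriveOdometry:
    """Minimal dead-reckoning sketch for a two-wheel differential-drive base.

    Assumes each driven wheel reports incremental encoder ticks; the wheel
    radius, tick resolution, and track width below are illustrative only.
    """

    def __init__(self, wheel_radius_m=0.1, ticks_per_rev=4096, track_width_m=0.5):
        self.wheel_radius = wheel_radius_m
        self.ticks_per_rev = ticks_per_rev
        self.track_width = track_width_m
        self.x = 0.0      # metres, in the floor plane
        self.y = 0.0      # metres, in the floor plane
        self.theta = 0.0  # radians, heading

    def update(self, d_ticks_left, d_ticks_right):
        """Integrate one pair of encoder deltas into the pose estimate."""
        metres_per_tick = 2.0 * math.pi * self.wheel_radius / self.ticks_per_rev
        d_left = d_ticks_left * metres_per_tick
        d_right = d_ticks_right * metres_per_tick
        d_center = (d_left + d_right) / 2.0              # forward travel
        d_theta = (d_right - d_left) / self.track_width  # heading change
        # Integrate along the arc using the midpoint heading for accuracy.
        self.theta += d_theta / 2.0
        self.x += d_center * math.cos(self.theta)
        self.y += d_center * math.sin(self.theta)
        self.theta += d_theta / 2.0
        return self.x, self.y, self.theta
```

In practice, a controller of the kind described would not rely on dead reckoning alone; the encoder-derived estimate would be combined with signals from the sonar sensor, laser rangefinder, or image capture device when navigating the mobile base.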
The controller may be configured to navigate, observe the environment, and engage with one or more bins based at least in part upon teleoperation signals received through the wireless transceiver from the teleoperating operator. The controller may be configured to use the image capture device to automatically recognize the bin. One or more tags may be coupled to the bin, the tags being configured to be recognizable and readable by the controller using the image capture device. At least one of the one or more tags may be configured to assist the controller in determining the identification of the bin. At least one of the one or more tags may be configured to assist the controller in determining the geometric pose of the bin. The one or more tags may be selected from the group consisting of a QR code, an AR tag, a 2-D barcode, and a 3-D barcode. The one or more tags may be passive. The one or more tags may be actively powered. The controller may be configured to use the image capture device to automatically recognize one or more tags associated with a location in the nearby environment. At least one of the one or more tags may be configured to assist the controller in determining the identification of the location. At least one of the one or more tags may be configured to assist the controller in determining the geometric pose of the location. The one or more tags may be selected from the group consisting of a QR code, an AR tag, a 2-D barcode, and a 3-D barcode. The one or more tags may be passive. The one or more tags may be actively powered. The controller may be configured to use the image capture device to automatically recognize one or more tags associated with an object in the nearby environment. At least one of the one or more tags may be configured to assist the controller in determining the identification of the object. At least one of the one or more tags may be configured to assist the controller in determining the geometric pose of the object. The one or more tags may be selected from the group consisting of a QR code, an AR tag, a 2-D barcode, and a 3-D barcode. The one or more tags may be passive. The one or more tags may be actively powered.
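By way of non-limiting illustration, the tag recognition described above may be sketched as a fiducial-marker pipeline. The sketch below assumes OpenCV 4.7 or later with its ArUco module and uses an AprilTag-style dictionary; the camera intrinsics, tag dictionary, and tag side length are hypothetical illustration values. It shows how a single detection pass can yield both the identification (the tag ID) and the geometric pose (rotation and translation relative to the camera) of a tagged bin, location, or object.

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; a real system would use calibrated values.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)   # assume negligible lens distortion for this sketch
TAG_SIDE_M = 0.05    # printed tag side length in metres (illustrative)

# 3-D tag corners in the tag's own frame, matching the detector's corner
# order: top-left, top-right, bottom-right, bottom-left.
_OBJ = np.array([[-TAG_SIDE_M / 2,  TAG_SIDE_M / 2, 0.0],
                 [ TAG_SIDE_M / 2,  TAG_SIDE_M / 2, 0.0],
                 [ TAG_SIDE_M / 2, -TAG_SIDE_M / 2, 0.0],
                 [-TAG_SIDE_M / 2, -TAG_SIDE_M / 2, 0.0]], dtype=np.float32)

def detect_tags(frame_bgr):
    """Return {tag_id: (rvec, tvec)} for each fiducial visible in the frame."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    poses = {}
    if ids is not None:
        for tag_id, quad in zip(ids.flatten(), corners):
            # Solve for the tag pose in the camera frame from its four corners.
            ok, rvec, tvec = cv2.solvePnP(_OBJ, quad.reshape(4, 2), K, DIST)
            if ok:
                poses[int(tag_id)] = (rvec, tvec)
    return poses
```

A lookup table mapping tag IDs to bins, locations, or objects would then let the controller act on each detection; passive printed tags suffice for this pipeline, while actively powered tags would call for a different sensing front end.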
Another embodiment is directed to a method for managing bins of physical objects in a human environment, comprising: providing a personal robotic system comprising an electromechanical mobile base defining a cross-sectional envelope when viewed in a plane substantially parallel to a plane of a floor upon which the mobile base is operated; a torso assembly movably coupled to the mobile base; a head assembly movably coupled to the torso assembly; a releasable bin-capturing assembly movably coupled to the torso assembly; and a controller operatively coupled to the mobile base, torso assembly, head assembly, and bin-capturing assembly; and operating the personal robotic system to capture a bin with the bin-capturing assembly and move the torso assembly relative to the mobile base so that the captured bin fits as closely as possible within the cross-sectional envelope of the mobile base. The method further may comprise providing a sensor operatively coupled to the controller and configured to sense one or more factors regarding an environment in which the mobile base is navigated. The sensor may comprise a sonar sensor. The sonar sensor may be coupled to the mobile base. The sensor may comprise a laser rangefinder. The laser rangefinder may be configured to scan a forward field of view that is greater than about 90 degrees. The laser rangefinder may be configured to scan a forward field of view that is about 180 degrees. The sensor may comprise an image capture device. The image capture device may comprise a 3-D camera. The image capture device may be coupled to the head assembly. The image capture device may be coupled to the mobile base. The image capture device may be coupled to the releasable bin-capturing assembly. The image capture device may be coupled to the torso assembly. The mobile base may comprise a differential drive configuration having two driven wheels. Each of the driven wheels may be operatively coupled to an encoder that is operatively coupled to the controller and configured to provide the controller with input information regarding a driven wheel position. The controller may be configured to operate the driven wheels to navigate the mobile base based at least in part upon the input information from the driven wheel encoders. The controller may be configured to operate the mobile base based at least in part upon signals from the sensor. The torso assembly may be movably coupled to the mobile base such that the torso assembly may be controllably elevated and lowered along an axis substantially perpendicular to the plane of the floor. The torso assembly may be movably coupled to the mobile base such that the torso assembly may be controllably moved along an axis substantially parallel to the plane of the floor. The head assembly may comprise an image capture device. The image capture device may comprise a 3-D camera. The image capture device may be movably coupled to the head assembly such that it may be controllably panned or tilted relative to the head assembly. The bin-capturing assembly may comprise an under-ledge capturing surface configured to be interfaced with a ledge geometry feature of the bin. The capturing surface may comprise a rail. The rail and the ledge geometry feature of the bin may be substantially straight. The method further may comprise providing a wireless transceiver configured to enable a teleoperating operator to remotely connect with the controller from a remote workstation, and to operate at least the mobile base.
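By way of non-limiting illustration, the capture-and-stow acts recited in the method above may be sketched as a simple state sequence. In the following sketch, the robot interface and every method name on it (navigate_to, align_rail_to_ledge, engage_rail, set_torso), as well as bin_pose.standoff(), are hypothetical and do not come from this specification; the sketch only illustrates the ordering of acts: approach the bin, align the under-ledge rail with the bin's ledge feature, engage it, and retract the torso assembly so the captured bin fits as closely as possible within the cross-sectional envelope of the mobile base.

```python
from enum import Enum, auto

class CaptureState(Enum):
    APPROACH = auto()
    ALIGN = auto()
    ENGAGE = auto()
    RETRACT = auto()
    DONE = auto()

def capture_and_stow(robot, bin_pose):
    """Hypothetical capture sequence; all robot methods are illustrative."""
    state = CaptureState.APPROACH
    while state is not CaptureState.DONE:
        if state is CaptureState.APPROACH:
            robot.navigate_to(bin_pose.standoff())  # stop short of the bin
            state = CaptureState.ALIGN
        elif state is CaptureState.ALIGN:
            robot.align_rail_to_ledge(bin_pose)     # square the rail to the ledge
            state = CaptureState.ENGAGE
        elif state is CaptureState.ENGAGE:
            robot.engage_rail()                     # under-ledge surface takes the bin
            state = CaptureState.RETRACT
        elif state is CaptureState.RETRACT:
            # Draw the torso assembly back over the base so the captured bin
            # sits as closely as possible within the base's cross-sectional
            # envelope before the base is driven.
            robot.set_torso(horizontal_offset_m=0.0)
            state = CaptureState.DONE
```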
The controller may be configured to navigate, observe the environment, and engage with one or more bins based at least in part upon teleoperation signals received through the wireless transceiver from the teleoperating operator. The controller may be configured to use the image capture device to automatically recognize the bin. One or more tags may be coupled to the bin, the tags being configured to be recognizable and readable by the controller using the image capture device. At least one of the one or more tags may be configured to assist the controller in determining the identification of the bin. At least one of the one or more tags may be configured to assist the controller in determining the geometric pose of the bin. The one or more tags may be selected from the group consisting of a QR code, an AR tag, a 2-D barcode, and a 3-D barcode. The one or more tags may be passive. The one or more tags may be actively powered. The controller may be configured to use the image capture device to automatically recognize one or more tags associated with a location in the nearby environment. At least one of the one or more tags may be configured to assist the controller in determining the identification of the location. At least one of the one or more tags may be configured to assist the controller in determining the geometric pose of the location. The one or more tags may be selected from the group consisting of a QR code, an AR tag, a 2-D barcode, and a 3-D barcode. The one or more tags may be passive. The one or more tags may be actively powered. The controller may be configured to use the image capture device to automatically recognize one or more tags associated with an object in the nearby environment. At least one of the one or more tags may be configured to assist the controller in determining the identification of the object. At least one of the one or more tags may be configured to assist the controller in determining the geometric pose of the object. The one or more tags may be selected from the group consisting of a QR code, an AR tag, a 2-D barcode, and a 3-D barcode. The one or more tags may be passive. The one or more tags may be actively powered.
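By way of non-limiting illustration, the teleoperation pathway described above may be sketched as a command loop. The JSON-over-UDP packet format, the port, and the 0.5 m track width in the following sketch are purely illustrative assumptions; the sketch shows how a controller could receive linear and angular velocity commands from a remote workstation over a wireless link and convert each one into left and right wheel speeds for the differential drive.

```python
import json
import socket

def teleop_listener(apply_wheel_speeds, host="0.0.0.0", port=9000):
    """Receive {"v": m/s, "w": rad/s} packets and drive the wheels.

    `apply_wheel_speeds(left, right)` is a hypothetical hand-off to the
    motor controllers; the packet format is an illustrative assumption.
    """
    track_width = 0.5  # metres between the driven wheels (illustrative)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        payload, _addr = sock.recvfrom(1024)
        cmd = json.loads(payload)
        v = float(cmd.get("v", 0.0))  # commanded linear velocity
        w = float(cmd.get("w", 0.0))  # commanded angular velocity
        # Standard differential-drive kinematics: split the commanded twist
        # into per-wheel linear speeds.
        left = v - w * track_width / 2.0
        right = v + w * track_width / 2.0
        apply_wheel_speeds(left, right)
```

A deployed system would additionally authenticate the remote workstation and command a safe stop if the wireless link were to drop.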
Various exemplary embodiments of the invention are described herein. Reference is made to these examples in a non-limiting sense; they are provided to illustrate more broadly applicable aspects of the invention. Various changes may be made to the invention described, and equivalents may be substituted, without departing from the true spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s), or step(s) to the objective(s), spirit, or scope of the present invention. Further, as will be appreciated by those with skill in the art, each of the individual variations described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present invention. All such modifications are intended to be within the scope of claims associated with this disclosure.
Any of the devices described for carrying out the subject methods may be provided in packaged combination for use in executing such methods. These supply “kits” may further include instructions for use and be packaged in trays or containers as commonly employed for such purposes.
The invention includes methods that may be performed using the subject devices. The methods may comprise the act of providing such a suitable device. Such provision may be performed by the end user. In other words, the “providing” act merely requires that the end user obtain, access, approach, position, set up, activate, power up, or otherwise act to provide the requisite device in the subject method. Methods recited herein may be carried out in any order of the recited events which is logically possible, as well as in the recited order of events.
Exemplary aspects of the invention, together with details regarding material selection and manufacture, have been set forth above. As for other details of the present invention, these may be appreciated in connection with the above-referenced patents and publications, as well as in connection with what is generally known or appreciated by those with skill in the art. The same may hold true with respect to method-based aspects of the invention in terms of additional acts as commonly or logically employed.
In addition, though the invention has been described in reference to several examples optionally incorporating various features, the invention is not to be limited to that which is described or indicated as contemplated with respect to each variation of the invention. Various changes may be made to the invention described, and equivalents (whether recited herein or omitted for the sake of brevity) may be substituted, without departing from the true spirit and scope of the invention. In addition, where a range of values is provided, it is understood that every intervening value, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is encompassed within the invention.
Also, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. Reference to a singular item includes the possibility that there are plural of the same items present. More specifically, as used herein and in claims associated hereto, the singular forms “a,” “an,” “said,” and “the” include plural referents unless specifically stated otherwise. In other words, use of such articles allows for “at least one” of the subject item in the description above as well as in claims associated with this disclosure. It is further noted that such claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only,” and the like in connection with the recitation of claim elements, or use of a “negative” limitation.
Without the use of such exclusive terminology, the term “comprising” in claims associated with this disclosure shall allow for the inclusion of any additional element—irrespective of whether a given number of elements are enumerated in such claims, or the addition of a feature could be regarded as transforming the nature of an element set forth in such claims. Except as specifically defined herein, all technical and scientific terms used herein are to be given as broad a commonly understood meaning as possible while maintaining claim validity.
The breadth of the present invention is not to be limited to the examples provided and/or the subject specification, but rather only by the scope of claim language associated with this disclosure.
The present application is a continuation application of U.S. patent application Ser. No. 16/269,493, filed on Feb. 6, 2019, which is a continuation application of U.S. patent application Ser. No. 15/966,383, filed on Apr. 30, 2018, now abandoned, which is a continuation application of U.S. patent application Ser. No. 15/652,931, filed on Jul. 18, 2017, now abandoned, which is a continuation application of U.S. patent application Ser. No. 15/272,334, filed on Sep. 21, 2016, now abandoned, which is a continuation application of U.S. patent application Ser. No. 14/316,718, filed on Jun. 26, 2014, now abandoned, which claims the benefit under 35 U.S.C. § 119 of U.S. Provisional Application Ser. No. 61/957,254, filed Jun. 26, 2013. The foregoing applications are hereby incorporated by reference into the present application in their entirety.
Provisional application data:

Number | Date | Country
---|---|---
61/957,254 | Jun. 26, 2013 | US
Parent/child (continuation) application data:

Parent Number | Parent Filing Date | Child Number | Country
---|---|---|---
16/269,493 | Feb. 6, 2019 | 16/573,355 | US
15/966,383 | Apr. 30, 2018 | 16/269,493 | US
15/652,931 | Jul. 18, 2017 | 15/966,383 | US
15/272,334 | Sep. 21, 2016 | 15/652,931 | US
14/316,718 | Jun. 26, 2014 | 15/272,334 | US