The presently disclosed subject matter relates to waste management, and more particularly, to a waste management system for auditing a fill status of a customer waste container by a waste services provider during performance of a waste service activity.
Waste service vehicles and waste container delivery vehicles used in the waste collection, disposal and recycling industry often have on-board computers, location devices, and safety-related and non-safety-related cameras installed on the exterior and interior thereof. These systems can provide waste services providers and their field managers with information related to the waste service vehicle, the location of the waste service vehicle, waste service confirmation, customer service issues, service routing issues, customer site information, and safety issues and concerns.
A common concern for waste services providers is overloaded customer containers. Containers with trash or recyclables protruding from the top or sides can result in scattered contents and possible injury or accidents. Waste services providers have sought improved means for identifying and addressing and/or preventing overloaded containers. Waste services providers have also sought improved means for auditing the status of customer waste containers, including but not limited to the container fill status, during performance of a waste service activity.
Prior auditing means were typically performed visually by the driver or other employees at the site of the customer waste container, and were often inefficient and/or provided inaccurate results.
Improvements in this field of technology are desired.
Various illustrative embodiments of a system for auditing the fill status of a customer waste container by a waste services provider during performance of a waste service activity are disclosed herein.
In certain aspects, the system can include an optical sensor disposed on a waste collection vehicle and configured to capture image data of the customer waste container that is indicative of the fill status of the container. The system can also include a memory storage area and a processor in communication with the memory storage area. The processor can be configured to, in real time during performance of the waste service activity: receive the image data from the optical sensor; compare the fill status from the image data to a predetermined overload threshold condition for the customer waste container stored in the memory storage area; determine, based on the comparison, whether the fill status has met or exceeded the predetermined overload threshold condition; if the fill status has met or exceeded the predetermined overload threshold condition, generate an action proposal; and execute an action from the action proposal, wherein the action can comprise one or more of: (i) a customer communication, (ii) a customer billing adjustment, and (iii) a container recovery instruction for the customer waste container for delivery to the waste collection vehicle. The comparing and determining can be performed using machine learning based on programmed data associated with recognition of the fill status. The container recovery instruction can include an instruction to collect the customer waste container and deliver it to the waste collection vehicle for removal from the customer location. The processor can be configured to identify the location of the waste collection vehicle and associate that location with the identity of the specific customer at that location. The steps of identifying and associating can be performed either prior to or subsequent to receiving the image data from the optical sensor.
Various illustrative embodiments of a method for auditing the fill status of a customer waste container by a waste service provider during performance of a waste service activity are also disclosed herein.
In certain aspects, the method can include certain of the following steps: capturing image data of the customer waste container that is indicative of the fill status of the container; comparing the fill status from the image data to a predetermined overload threshold condition for the customer waste container; determining, based on the comparison, whether the fill status has met or exceeded the predetermined overload threshold condition; if the fill status has met or exceeded the predetermined overload threshold condition, generating an action proposal; and executing an action from the action proposal, wherein the action comprises one or more of: a customer communication, a customer billing adjustment, and a container recovery instruction for the customer waste container for delivery to the waste collection vehicle.
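The sequence of method steps above can be sketched in code. This is a minimal illustration only; the function name, the numeric fill-status scale, and the action labels are assumptions for the sketch, not terms from the disclosure:

```python
# Hypothetical sketch of the disclosed audit steps: compare a measured fill
# status to the predetermined overload threshold condition (POTC) and, if the
# condition is met or exceeded, generate an action proposal.

def audit_container(fill_status: float, overload_threshold: float) -> list[str]:
    """Return an action proposal (possibly empty) for one container audit.

    `fill_status` and `overload_threshold` are on the same assumed scale,
    e.g. 1.0 = container filled exactly to capacity.
    """
    if fill_status >= overload_threshold:
        # Overload detected: propose one or more of the disclosed actions.
        return [
            "customer_communication",
            "customer_billing_adjustment",
            "container_recovery_instruction",
        ]
    return []  # Fill status below threshold; no action proposed.
```

For example, `audit_container(1.2, 1.0)` proposes all three actions, while `audit_container(0.5, 1.0)` proposes none; a fill status exactly at the threshold counts as "met."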
A better understanding of the presently disclosed subject matter can be obtained when the following detailed description is considered in conjunction with the drawings and figures herein, wherein:
While the presently disclosed subject matter will be described in connection with the preferred embodiment, it will be understood that it is not intended to limit the presently disclosed subject matter to that embodiment. On the contrary, it is intended to cover all alternatives, modifications, and equivalents, as may be included within the spirit and the scope of the presently disclosed subject matter as defined by the appended claims.
The presently disclosed subject matter relates to systems and methods for auditing the status of a customer waste container by a waste services provider using video/still images captured by one or more optical sensors mounted on a waste collection vehicle used in the waste collection, disposal and recycling industry. The presently disclosed systems and methods are directed to overcoming the issues and problems of the prior art.
In the illustrative embodiment shown in
In certain illustrative embodiments, the communication between communications device 50 provided on-board waste service vehicle 15 and central server 35 may be provided on a real time basis such that during the collection route, data is transmitted from each waste service vehicle 15 to central server 35. Alternatively, communication device 50 may be configured to temporarily store or cache data during the collection route and transfer the data to the central server 35 on return of waste service vehicle 15 to the location of the waste collection company.
In certain illustrative embodiments, as illustrated in
Location device 65 can be configured to determine the location of waste service vehicle 15 at all times, whether waste service vehicle 15 is inactive, in motion, or performing service-related and non-service-related activities. For example, location device 65 can be a GPS device that can communicate with the waste collection company. A satellite 75 or other communications device can be utilized to facilitate communications. For example, location device 65 can transmit location information, such as digital latitude and longitude, to onboard computer 60 via satellite 75. Thus, location device 65 can identify the location of waste service vehicle 15, and therefore the location of the customer site where container 20 is located, after vehicle 15 has arrived at the customer site.
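One way the reported latitude/longitude could be associated with a specific customer site is a nearest-site lookup. The sketch below is illustrative only: the customer records, identifiers, and the 50-meter matching radius are assumptions, not part of the disclosure:

```python
import math

# Assumed customer-site records: customer ID -> (latitude, longitude).
CUSTOMER_SITES = {
    "CUST-001": (29.7604, -95.3698),
    "CUST-002": (29.7500, -95.3600),
}

def _haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_customer(lat, lon, max_distance_m=50.0):
    """Return the ID of the customer site nearest the vehicle, or None
    if no site lies within the assumed matching radius."""
    best_id, best_d = None, float("inf")
    for cust_id, (clat, clon) in CUSTOMER_SITES.items():
        d = _haversine_m(lat, lon, clat, clon)
        if d < best_d:
            best_id, best_d = cust_id, d
    return best_id if best_d <= max_distance_m else None
```

A vehicle reporting coordinates at a known site resolves to that customer; coordinates far from every record resolve to no customer, which could trigger a manual review.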
In certain illustrative embodiments, optical sensor 70 can be configured to capture still or video images of containers 20 as well as other service related and non-service related activity outside of the waste service vehicle 15. Optical sensor 70 can be, for example, a video camera. Optical sensor 70 can be disposed on waste collection vehicle 15 and configured to capture image data of customer waste container 20 that is indicative of the fill status of container 20. The images collected by optical sensor 70 may be transmitted to and stored by onboard computer 60, and/or delivered to central server 35.
For example, in certain illustrative embodiments, one or more optical sensors 70 can be installed throughout the waste collection vehicle 15 including, but not limited to, high definition cameras, monitors, and other such sensors mounted to the front (interior and exterior of the cab), exterior right side, exterior left side, exterior rear, and exterior/interior hopper area of the service vehicle. Optical sensor 70 can record desired activities outside the vehicle 15 periodically, continuously, or upon demand. The recorded images and data can be stored on onboard computer 60 using a recording device (such as a digital video recorder) and also be transmitted and stored remotely away from waste service vehicle 15 or in the “cloud” via cellular and/or other wireless transmissions and/or communicated via network 45. The images can be available for immediate real-time review or later passive review by an end-user.
In the illustrative embodiment of
In certain illustrative embodiments, central server 35 can include standard components such as processor 75 and user interface 80 for inputting and displaying data, such as a keyboard and mouse or a touch screen, associated with a standard laptop or desktop computer. Central server 35 also includes a communication device 85 for wireless communication with onboard computer 60.
Central server 35 may include software 90 that communicates with one or more memory storage areas 95. Memory storage areas 95 can be, for example, multiple data repositories that store pre-recorded data pertaining to a plurality of customer accounts. Such information may include customer location, route data, items expected to be removed from the customer site, and/or billing data. For example, using the location (e.g., street address, city, state, and zip code) of a customer site, software 90 may find the corresponding customer account in memory storage areas 95. Database 96 for data storage can reside in memory storage area 95 and/or in supplementary external storage devices as are well known in the art.
While a “central server” is described herein, a person of ordinary skill in the art will recognize that embodiments of the present invention are not limited to a client-server architecture and that the server need not be centralized or limited to a single server, or similar network entity or mainframe computer system. Rather, the server and computing system described herein may refer to any combination of devices or entities adapted to perform the computing and networking functions, operations, and/or processes described herein without departing from the spirit and scope of embodiments of the present invention.
In certain illustrative embodiments, a system is provided for auditing the fill status of a customer waste container 20 by a waste services provider during performance of a waste service activity. Optical sensor 70 is disposed on waste collection vehicle 15 and configured to capture image data of customer container 20 that is indicative of the fill status of container 20. Central server 35 may utilize memory storage area 95, and processor 75 in communication with memory storage area 95 to, in real time during performance of the waste service activity, receive the image data from optical sensor 70, compare the fill status from the image data to a predetermined overload threshold condition (or “POTC”) for customer container 20 stored in memory storage area 95, and determine, based on the comparison, whether the fill status has met or exceeded the predetermined overload threshold condition. If the fill status has met or surpassed the predetermined overload threshold condition, an action proposal can be generated, from which one or more actions can be executed. The actions can comprise, for example, one or more of: (i) a customer communication, (ii) a customer billing adjustment, and (iii) a container recovery instruction for customer container 20 for delivery to waste collection vehicle 15.
The presently disclosed waste management system can allow a waste service provider to audit the status of a customer waste container 20 during performance of a waste service activity. In certain illustrative embodiments, the system and method disclosed herein can also be utilized to perform “audits” in industries other than the waste industry, where auditing of containers using optical sensors and associated computer functionality are utilized.
In certain illustrative embodiments, software 90 can execute the flow of the method steps of
In the illustrative embodiment shown in
In the illustrative embodiment shown in
In the illustrative embodiment shown in
In certain illustrative embodiments, and as illustrated in
In certain illustrative embodiments, the POTC can be customer specific. Alternatively, the POTC does not need to be associated with any particular customer, and can instead be a standard condition established by the waste service provider based on any number of conditions and requirements. If desired, a user of the system can double check or confirm the determination of “overloaded” status made by the processor by soliciting a visual confirmation from the driver onsite.
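The customer-specific versus standard-threshold behavior described above amounts to a lookup with a fallback. A minimal sketch, in which the 0.9 default and the per-customer override values are assumptions for illustration:

```python
# Assumed provider-wide standard POTC (e.g., 90% of container capacity).
DEFAULT_POTC = 0.9

# Assumed customer-specific overrides of the standard condition.
CUSTOMER_POTC = {
    "CUST-001": 0.8,  # stricter threshold for this account
}

def potc_for(customer_id: str) -> float:
    """Return the customer-specific POTC if one exists for this account,
    otherwise the standard condition established by the provider."""
    return CUSTOMER_POTC.get(customer_id, DEFAULT_POTC)
```

An account with no override simply audits against the provider's standard condition, which matches the alternative described above where the POTC is not associated with any particular customer.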
In certain illustrative embodiments, processor 75 may automatically review the accumulated images and determine whether predetermined overload threshold condition (“POTC”) is met or exceeded based on machine learning and in association with programmed recognition patterns. In particular, processor 75 may be taught to recognize, for example, patterns of shapes, or sizes, that indicate trash protruding from the top or sides of container 20, based on existing images in the database. Object recognition software may also be used for this purpose. In the flowchart of
In certain illustrative embodiments, the comparing and determining are performed using machine learning based on a set of programmed data associated with the predetermined overload threshold condition for the exemplary customer waste container. The set of programmed data can include a plurality of images of the exemplary customer waste container. Each image of the exemplary customer waste container can display a different exemplary fill status condition, where a first subsection of the plurality of images is pre-identified, based on the exemplary fill status condition, as meeting or exceeding the predetermined overload threshold condition, and where a second subsection of the plurality of images is pre-identified, based on the exemplary fill status condition, as not meeting or exceeding the predetermined overload threshold condition. The pre-identification of an image in the set of programmed data as meeting or exceeding, or not meeting or exceeding, the predetermined overload threshold condition can be based upon one or more features in the image such as an open or closed status of the lid of the exemplary customer waste container, a presence or absence of waste on the ground adjacent the exemplary customer waste container, or an identification of excess waste in a defined region above the rim of the exemplary customer waste container when the lid of the exemplary customer waste container is at least partially open. The processor can be trained, using machine learning or via programming, to recognize and identify the fill status of the customer waste container based on the image data received from the optical sensor. 
The recognition and identification of the fill status of the customer waste container can also be based upon one or more features in the image data, such as an open or closed status of the lid of the customer waste container, a presence or absence of waste on the ground adjacent the customer waste container, and an identification of excess waste in a defined region above the rim of the customer waste container when the lid of the customer waste container is at least partially open. The processor can be trained, using machine learning, to match the recognized fill status of the customer waste container with the image of the exemplary customer waste container in the set of programmed data that has a corresponding fill status, and then to categorize the fill status of the customer waste container as either (i) meeting or exceeding, or (ii) not meeting or exceeding, the predetermined overload threshold condition.
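The three image features named above (lid status, waste on the ground, waste above the rim) can be sketched as a simple rule-based categorizer. This is a hypothetical stand-in: in the disclosed system these features would be recognized by a trained vision model from image data, whereas here they are supplied directly as booleans:

```python
from dataclasses import dataclass

@dataclass
class ContainerFeatures:
    """Assumed per-image features extracted by a vision model (illustrative)."""
    lid_open: bool          # lid of the container is at least partially open
    waste_on_ground: bool   # waste present on the ground adjacent the container
    waste_above_rim: bool   # excess waste in the defined region above the rim

def meets_overload_condition(f: ContainerFeatures) -> bool:
    """Categorize the fill status against the POTC using the named features."""
    if f.waste_on_ground:
        return True  # scattered contents indicate overload regardless of lid
    if f.lid_open and f.waste_above_rim:
        return True  # lid at least partially open with waste above the rim
    return False     # no overload indicators recognized
```

A closed container with no adjacent waste is categorized as not meeting the condition; either indicator above flags it for an action proposal.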
In certain illustrative embodiments, one or more action proposals can be generated based on the identifications above. The action proposals can include, for example, recommendations to (i) remove excess waste from the customer container, (ii) remove and replace the container, (iii) provide additional containers, (iv) provide reporting, education and/or instructions to the customer, or (v) adjust customer billing. In addition, historical account information and attributes of the target customer and “like” customers can be collected, and the action proposals for target customers can be determined and ranked based on lifetime value impact scoring. Additional information can also be collected from the Internet or other outside sources. Scoring of the target customer can be impacted by prior proposals or interactions as well as by preferences/acceptances of “like” customers to similar action proposals, and restrictions or constraints from the target customer's attributes can be applied. Action proposals can be delivered to the appropriate user/system for acceptance, whereupon the action proposal can be executed/applied, which can include charging the customer for the overage, notifying the customer of the overage through a proactive warning and notification process (including still images and/or video), and noting the overage incident on the customer's account.
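The ranking-with-restrictions step above can be sketched as follows. The proposal names, impact scores, and restriction set are illustrative assumptions; in the disclosed system the scores would come from lifetime value impact scoring over historical and “like”-customer data:

```python
def rank_proposals(proposals: dict[str, float], restricted: set[str]) -> list[str]:
    """Return the action proposals allowed for a target customer, ranked by
    an assumed lifetime-value impact score, highest first.

    `proposals` maps a proposal name to its impact score; `restricted` holds
    proposals barred by the target customer's attributes or constraints.
    """
    allowed = {p: s for p, s in proposals.items() if p not in restricted}
    return sorted(allowed, key=allowed.get, reverse=True)
```

For instance, if billing adjustments are restricted for an account, the remaining proposals are delivered for acceptance in descending score order.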
In certain illustrative embodiments, a method is provided for collecting, processing, and applying data from a waste service vehicle to increase customer lifetime value through targeted action proposals. The method can include the steps of: collecting information (such as image, video, collection vehicle, driver inputs) at a target service location; matching customer account to a target service location; processing information from the target service location and historical customer account record to create an action proposal; and executing an action from the action proposal. The information that can be processed can include a variety of gathered information, for example, information regarding safety, receptacle condition, receptacle contents, fill status, site conditions, obstructions (temporary or permanent), service, service quality (verification, receptacle identification, receptacle contents), service audit (size, frequency, location, and quantity), service exceptions (unable to service, site obstructions), site damage, theft/poaching/no customer, sustainability, material diversion/audits, dangerous/hazardous materials, savings, site service times, bin locations and ancillary services (locks, gates, etc).
The presently disclosed subject matter has a variety of practical applications and provides solutions to a number of technological and business problems of the prior art. For example, accuracy in customer billing is improved. A critical component of providing waste services to residential and commercial customers is accuracy in the customer's statement. The presently disclosed system and method allow the waste services provider to determine if the waste container or bin is overloaded, resulting in the customer requiring additional service beyond the capacity of the container or bin. Improved management and education of customers regarding service requirements also occurs. An obligation of the waste service provider is to manage and educate the residential and commercial customer about the waste collection company's service requirements. The system and method of recording and auditing service-related and non-service-related activities outside of the waste collection vehicle allow the end-user to educate the customer on closing the container or bin lid to reduce capture of precipitation, to reduce litter/blight, to reduce unauthorized dumping or use, and to reduce intrusion by animals and vermin, as well as on the dangers and hardships associated with overloading a container or bin.
Improvements in employee and public safety also occur. An obligation of the waste service provider is to provide a safe working environment for its employees and its customers. The presently disclosed system and method allow the end-user to: (i) improve safety and protect its employees and equipment by reducing overloaded containers, which damage equipment, cause collection vehicle fires, cause other property damage from falling debris, and otherwise put employees at risk; (ii) improve safety by identifying and abating dangerous stops and hard-to-service accounts, which result in vehicle accidents and employee injuries; (iii) improve safety and reduce vehicle accidents by having safe access to containers and bins; and (iv) improve safety by identifying and correcting overloaded containers and bins at the customer's service location.
Improved customer service can also be provided. The cornerstone of a successful waste collection provider is providing excellent customer service. The system and method disclosed herein allows the end-user to: (i) proactively notify the customer of waste collection service requirements to ensure safe and efficient waste collection; (ii) demonstrate a container is overloaded or unavailable for service and assist the customer in efforts to receive timely service; (iii) educate the customer on proper recycling and management of waste in each of the containers or bins; and (iv) proactively remove or repair damaged and/or leaking containers and bins.
Operational improvements can also occur. Operational improvements result in more efficient waste collection services and ultimately improved earnings, safety, and employee morale. The system and method disclosed herein allow the end-user to: (i) reduce overloaded containers, resulting in less equipment damage, fewer employee injuries, and less time off-route; (ii) improve route efficiencies by servicing readily accessible containers and bins; and (iii) support frontline employees by holding customers to the waste collector's service requirements.
Those skilled in the art will appreciate that certain portions of the subject matter disclosed herein may be embodied as a method, data processing system, or computer program product. Accordingly, these portions of the subject matter disclosed herein may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Furthermore, portions of the subject matter disclosed herein may be a computer program product on a computer-usable storage medium having computer readable program code on the medium. Any suitable computer readable medium may be utilized including hard disks, CD-ROMs, optical storage devices, or other storage devices. Further, the subject matter described herein may be embodied as systems, methods, devices, or components. Accordingly, embodiments may, for example, take the form of hardware, software or any combination thereof, and/or may exist as part of an overall system architecture within which the software will exist. The present detailed description is, therefore, not intended to be taken in a limiting sense.
It is to be understood that the present invention is not limited to the embodiment(s) described above and illustrated herein, but encompasses any and all variations falling within the scope of the appended claims.
This application is a continuation application and claims the benefit, and priority benefit, of U.S. patent application Ser. No. 16/549,531, filed Aug. 23, 2019, the disclosure and contents of which are incorporated by reference herein in their entirety.
20150294431 | Fiorucci et al. | Oct 2015 | A1 |
20150298903 | Luxford | Oct 2015 | A1 |
20150302364 | Calzada et al. | Oct 2015 | A1 |
20150307273 | Lyman | Oct 2015 | A1 |
20150324760 | Borowski | Nov 2015 | A1 |
20150326829 | Kurihara et al. | Nov 2015 | A1 |
20150348252 | Mask | Dec 2015 | A1 |
20150350610 | Loh | Dec 2015 | A1 |
20160021287 | Loh | Jan 2016 | A1 |
20160044285 | Gasca et al. | Feb 2016 | A1 |
20160179065 | Shahabdeen | Jun 2016 | A1 |
20160187188 | Curotto | Jun 2016 | A1 |
20160224846 | Cardno | Aug 2016 | A1 |
20160232498 | Tomlin, Jr. et al. | Aug 2016 | A1 |
20160239689 | Flood | Aug 2016 | A1 |
20160247058 | Kreiner et al. | Aug 2016 | A1 |
20160292653 | Gonen | Oct 2016 | A1 |
20160300297 | Kekalainen | Oct 2016 | A1 |
20160321619 | Inan | Nov 2016 | A1 |
20160334236 | Mason et al. | Nov 2016 | A1 |
20160335814 | Tamari et al. | Nov 2016 | A1 |
20160372225 | Lefkowitz et al. | Dec 2016 | A1 |
20160377445 | Rodoni | Dec 2016 | A1 |
20160379152 | Rodoni | Dec 2016 | A1 |
20160379154 | Rodoni | Dec 2016 | A1 |
20170008671 | Whitman et al. | Jan 2017 | A1 |
20170011363 | Whitman et al. | Jan 2017 | A1 |
20170029209 | Smith | Feb 2017 | A1 |
20170046528 | Lambert | Feb 2017 | A1 |
20170061222 | Hoye et al. | Mar 2017 | A1 |
20170076249 | Byron et al. | Mar 2017 | A1 |
20170081120 | Liu et al. | Mar 2017 | A1 |
20170086230 | Azevedo et al. | Mar 2017 | A1 |
20170109704 | Lettieri et al. | Apr 2017 | A1 |
20170116583 | Rodoni | Apr 2017 | A1 |
20170116668 | Rodoni | Apr 2017 | A1 |
20170118609 | Rodoni | Apr 2017 | A1 |
20170121107 | Flood et al. | May 2017 | A1 |
20170124533 | Rodoni | May 2017 | A1 |
20170154287 | Kalinowski et al. | Jun 2017 | A1 |
20170176986 | High et al. | Jun 2017 | A1 |
20170193798 | Call et al. | Jul 2017 | A1 |
20170200333 | Plante | Jul 2017 | A1 |
20170203706 | Reed | Jul 2017 | A1 |
20170221017 | Gonen | Aug 2017 | A1 |
20170243269 | Rodini et al. | Aug 2017 | A1 |
20170243363 | Rodini | Aug 2017 | A1 |
20170277726 | Huang et al. | Sep 2017 | A1 |
20170308871 | Tallis | Oct 2017 | A1 |
20170330134 | Botea et al. | Nov 2017 | A1 |
20170344959 | Bostick et al. | Nov 2017 | A1 |
20170345169 | Rodoni | Nov 2017 | A1 |
20170350716 | Rodoni | Dec 2017 | A1 |
20170355522 | Salinas et al. | Dec 2017 | A1 |
20170364872 | Rodoni | Dec 2017 | A1 |
20180012172 | Rodoni | Jan 2018 | A1 |
20180025329 | Podgorny et al. | Jan 2018 | A1 |
20180075417 | Gordon et al. | Mar 2018 | A1 |
20180158033 | Woods et al. | Jun 2018 | A1 |
20180194305 | Reed | Jul 2018 | A1 |
20180224287 | Rodini | Aug 2018 | A1 |
20180245940 | Dong et al. | Aug 2018 | A1 |
20180247351 | Rodoni | Aug 2018 | A1 |
20190005466 | Rodoni | Jan 2019 | A1 |
20190019167 | Candel et al. | Jan 2019 | A1 |
20190050879 | Zhang et al. | Feb 2019 | A1 |
20190056416 | Rodoni | Feb 2019 | A1 |
20190065901 | Amato et al. | Feb 2019 | A1 |
20190121368 | Bussetti et al. | Apr 2019 | A1 |
20190196965 | Zhang et al. | Jun 2019 | A1 |
20190197498 | Gates et al. | Jun 2019 | A1 |
20190210798 | Schultz | Jul 2019 | A1 |
20190217342 | Parr et al. | Jul 2019 | A1 |
20190244267 | Rattner et al. | Aug 2019 | A1 |
20190311333 | Kekalainen et al. | Oct 2019 | A1 |
20190360822 | Rodoni | Nov 2019 | A1 |
20190385384 | Romano et al. | Dec 2019 | A1 |
20200082167 | Shalom et al. | Mar 2020 | A1 |
20200082354 | Kurani | Mar 2020 | A1 |
20200109963 | Zass | Apr 2020 | A1 |
20200175556 | Podgorny | Jun 2020 | A1 |
20200189844 | Sridhar | Jun 2020 | A1 |
20200191580 | Christensen et al. | Jun 2020 | A1 |
20200401995 | Aggarwala et al. | Dec 2020 | A1 |
20210024068 | Lacaze et al. | Jan 2021 | A1 |
20210060786 | Ha | Mar 2021 | A1 |
20210188541 | Kurani et al. | Jun 2021 | A1 |
20210217156 | Balachandran et al. | Jul 2021 | A1 |
20210345062 | Koga et al. | Nov 2021 | A1 |
20210371196 | Krishnamurthy et al. | Dec 2021 | A1 |
20220118854 | Davis et al. | Apr 2022 | A1 |
20230117427 | Turner et al. | Apr 2023 | A1 |
Number | Date | Country |
---|---|---|
2632738 | May 2016 | CA |
2632689 | Oct 2016 | CA |
101482742 | Jul 2009 | CN |
101512720 | Aug 2009 | CN |
105787850 | Jul 2016 | CN |
105929778 | Sep 2016 | CN |
106296416 | Jan 2017 | CN |
209870019 | Dec 2019 | CN |
69305435 | Apr 1997 | DE |
69902531 | Apr 2003 | DE |
102012006536 | Oct 2013 | DE |
577540 | Oct 1996 | EP |
1084069 | Aug 2002 | EP |
2028138 | Feb 2009 | EP |
2447184 | Sep 2008 | GB |
2508209 | May 2014 | GB |
3662616 | Jun 2005 | JP |
2012-206817 | Oct 2012 | JP |
2013-142037 | Jul 2013 | JP |
9954237 | Oct 1999 | WO |
2007067772 | Jun 2007 | WO |
2007067775 | Jun 2007 | WO |
2012069839 | May 2012 | WO |
2012172395 | Dec 2012 | WO |
2016074608 | May 2016 | WO |
2016187677 | Dec 2016 | WO |
2017070228 | Apr 2017 | WO |
2017179038 | Oct 2017 | WO |
2018182858 | Oct 2018 | WO |
2018206766 | Nov 2018 | WO |
2018215682 | Nov 2018 | WO |
2019051340 | Mar 2019 | WO |
2019150813 | Aug 2019 | WO |
Entry |
---|
US 9,092,921 B2, 07/2015, Lambert et al. (withdrawn) |
Ali, Tariq et al.; IoT-Based Smart Waste Bin Monitoring and Municipal Solid Waste Management System for Smart Cities; Arabian Journal for Science and Engineering; Jun. 4, 2020; 14 pages. |
Alfeo, Antonio Luca et al.; Urban Swarms: A new approach for autonomous waste management; Mar. 1, 2019; 8 pages. |
Jwad, Zainab Adnan et al.; An Optimization Approach for Waste Collection Routes Based on GIS in Hillah-Iraq; 2018; 4 pages; Publisher: IEEE. |
Chaudhari, Sangita S. et al.; Solid Waste Collection as a Service using IoT-Solution for Smart Cities; 2018; 5 pages; Publisher: IEEE. |
Nilopherjan, N. et al.; Automatic Garbage Volume Estimation Using SIFT Features Through Deep Neural Networks and Poisson Surface Reconstruction; International Journal of Pure and Applied Mathematics; vol. 119, No. 14; 2015; pp. 1101-1107. |
Ghongane, Aishwarya et al.; Automatic Garbage Tracking and Collection System; International Journal of Advanced Technology in Engineering and Science; vol. 5, No. 4; Apr. 2017; pp. 166-173. |
Rajani et al.; Waste Management System Based on Location Intelligence; 4 pages; Poojya Doddappa Appa College of Engineering, Kalaburgi. |
Waste Management Review; A clear vision on waste collections; Dec. 8, 2015; 5 pages; http://wastemanagementreiew.com/au/a-clear-vison-on-waste-collections/. |
Waste Management Surveillance Solutions; Vehicle Video Camera; Aug. 23, 2017; 6 pages; http://vehiclevideocameras.com/mobile-video-applications/waste-management-camera.html. |
Rich, John I.; Truck Equipment: Creating a Safer Waste Truck Environment; Sep. 2013; pp. 18-20; WasteAdvantage Magazine. |
Town of Prosper; News Release: Solid Waste Collection Trucks Equipped with “Third Eye,” video system aboard trash and recycling trucks to improve service; Jan. 13, 2017; 1 page; U.S. |
Product News Network; Telematics/Live Video System Increases Driver Safety/Productivity; Mar. 30, 2015; 3 pages; Thomas Industrial Network, Inc. |
Karidis, Arlene; Waste Pro to Install High-Tech Camera Systems in all Trucks to Address Driver Safety; Mar. 10, 2016; 2 pages; Wastedive.com. |
Greenwalt, Megan; Finnish Company Uses IoT to Digitize Trash Bins; Sep. 14, 2016; 21 pages; www.waste360.com. |
Georgakopoulos, Chris; Cameras Cut Recycling Contamination; The Daily Telegraph; Apr. 7, 2014; 2 pages. |
Van Dongen, Matthew; Garbage ‘Gotcha’ Videos on Rise in City: Residents Irked Over Perceived Infractions; Nov. 18, 2015; 3 pages; The Spectator. |
The Advertiser; Waste Service Drives Innovation; Jan. 25, 2016; 2 pages; Fairfax Media Publications Pty Limited; Australia. |
rwp-wasteportal.com; Waste & Recycling Data Portal and Software; 16 pages; printed Oct. 3, 2019. |
Bhargava, Hermant K. et al.; A Web-Based Decision Support System for Waste Disposal and Recycling; pp. 47-65; 1997; Comput., Environ. and Urban Systems, vol. 21, No. 1; Pergamon. |
Kontokasta, Constantine E. et al.; Using Machine Learning and Small Area Estimation to Predict Building-Level Municipal Solid Waste Generation in Cities; pp. 151-162; 2018; Computers, Environment and Urban Systems; Elsevier. |
Ferrer, Javier et al.; BIN-CT: Urban Waste Collection Based on Predicting the Container Fill Level; Apr. 23, 2019; 11 pages; Elsevier. |
Vu, Hoang Lan et al.; Waste Management: Assessment of Waste Characteristics and Their Impact on GIS Vehicle Collection Route Optimization Using ANN Waste Forecasts; Environmental Systems Engineering; Mar. 22, 2019; 13 pages; Elsevier. |
Hina, Syeda Mahlaqa; Municipal Solid Waste Collection Route Optimization Using Geospatial Techniques: A Case Study of Two Metropolitan Cities of Pakistan; Feb. 2016; 205 pages; U.S. |
Kannangara, Miyuru et al.; Waste Management: Modeling and Prediction of Regional Municipal Solid Waste Generation and Diversion in Canada Using Machine Learning Approaches; Nov. 30, 2017; 3 pages; Elsevier. |
Tan, Kah Chun et al.; Smart Land: AI Waste Sorting System; University of Malaya; 2 pages; Keysight Technologies. |
Oliveira, Veronica et al.; Journal of Cleaner Production: Artificial Neural Network Modelling of the Amount of Separately-Collected Household Packaging Waste; Nov. 8, 2018; 9 pages; Elsevier. |
Zade, Jalili Ghazi et al.; Prediction of Municipal Solid Waste Generation by Use of Artificial Neural Network: A Case Study of Mashhad; Winter 2008; 10 pages; Int. J. Environ. Res., 2(1). |
Sein, Myint Myint et al.; Trip Planning Query Based on Partial Sequenced Route Algorithm; 2019 IEEE 8th Global Conference; pp. 778-779. |
A.F., Thompson et al.; Application of Geographic Information System to Solid Waste Management; Pan African International Conference on Information Science, Computing and Telecommunications; 2013; pp. 206-211. |
Malakahmad, Amirhossein et al.; Solid Waste Collection System in Ipoh City, A Review; 2011 International Conference on Business, Engineering and Industrial Applications; pp. 174-179. |
Ma, Enlin et al.; Review of Cutting-Edge Sensing Technologies for Urban Underground Construction; Measurement 167; Jan. 2021; pp. 1-16. |
Burnley, S.J. et al.; Assessing the composition of municipal solid waste in Wales; May 2, 2006; pp. 264-283; Elsevier B.V. |
Lokuliyana, Shashika et al.; Location based garbage management system with IoT for smart city; 13th ICCSE; Aug. 8-11, 2018; pp. 699-703. |
Number | Date | Country | |
---|---|---|---|
Parent | 16549531 | Aug 2019 | US |
Child | 17967170 | US |