The presently disclosed subject matter relates to managing service and non-service related activities associated with a waste collection, disposal or recycling vehicle.
Waste service vehicles (as well as waste container delivery vehicles) used in the waste collection, disposal and recycling industry often have on-board computers, location devices, and safety and non-safety related cameras installed on the exterior and interior thereof. These systems provide field managers with limited data related to the waste service vehicle, the location of the waste service vehicle, waste service confirmation, customer service issues, service routing issues, customer site information and safety issues and concerns.
Commercial, Residential, Industrial (roll-off services) and Container Delivery waste service providers typically have assigned waste service routes for management of municipal solid waste (MSW, waste, trash or traditional garbage), recycling (residential curbside source separated or single stream), organics (source separated residential green waste, source separated residential food waste and commercial food waste) and commercial recycling (source separated dry waste including cardboard, metals, polymers, paper, other fibers, glass, wood and other inerts, etc.) for each of the waste service provider's customers, whether the customers are in a competitive subscription open market pursuant to a service contract/agreement or within an exclusive or non-exclusive municipal franchise system. On occasion, waste services may be provided on an ad-hoc basis resulting from On-Call waste services, or in instances where services are required to be provided off-route because of a missed pickup or an emergency where the customer requires immediate service.
Improvements in this field of technology are desired.
A system for managing waste service activities and nonservice activities outside of the waste servicing collection vehicle is provided which can include one or more of: a waste vehicle equipped with an onboard computer (OBC) and digital video recorder (DVR); a waste vehicle equipped with one or more optical sensors; triangulation of customer location for commercial, residential and industrial collections using an OBC linked to GPS, an on-board DVR linked to GPS and other off-the-shelf mapping geo-coding applications to establish latitude/longitude for each customer container, bin, cart and box; one or more optical sensors configured to capture continuous video recordings from the time the waste collection truck driver commences the DOT pre-trip inspection at the facility location until the waste collection truck driver returns to the facility location and performs the DOT post-trip inspection; one or more optical sensors configured to the OBC, DVR and back office hardware and software system (“System”); one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to a signal or signals emanating from electronic or mechanical devices on the truck including, but not limited to, proximity switches, limit switches, mechanical sensors, electronic control modules (ECM), programmable logic computers (PLC), arms, hoppers and/or blades and such other vehicle devices (hereinafter referred to as “Devices”); one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to vehicle movement including acceleration, deceleration, g-force, stopping and starting (hereinafter referred to as “Vehicle Movement”); one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to a driver initiated triggering event (hereinafter referred to as “Driver Initiated Event”); one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to a triggering event initiated external to or outside of the vehicle (hereinafter referred to as “External Initiated Event”); and one or more of Devices, Vehicle Movement, Driver Initiated Events and External Initiated Events (hereinafter referred to as “Triggering Events”) that are configured to the OBC and DVR, with the Triggering Events causing one or more images or video segments to be captured on the continuous video recording (“Chapters”), resulting in a Still Image or up to 60-second Video Clip assigned to one or more Optical Sensors, and wherein such Chapters are configured to be displayed on a Portal for review and are available in sequential order based on Date, Time and Truck ID.
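By way of illustration only, the following Python sketch (with hypothetical names and fields; not a limitation of the claimed subject matter) models a Chapter record and the sequential Portal ordering by Date, Time and Truck ID described above:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Chapter:
    """One Still Image or Video Clip (60 seconds or less) cut from the
    continuous video recording by a Triggering Event."""
    truck_id: str
    sensor_id: str         # the Optical Sensor the Chapter is assigned to
    trigger: str           # "Device", "VehicleMovement", "DriverInitiated", "ExternalInitiated"
    captured_at: datetime
    clip_seconds: int = 0  # 0 denotes a Still Image; otherwise 1 to 60

    def __post_init__(self) -> None:
        if not 0 <= self.clip_seconds <= 60:
            raise ValueError("Video Clips are limited to 60 seconds")

def portal_order(chapters: List[Chapter]) -> List[Chapter]:
    """Return Chapters in the sequential order used by the Portal:
    Date, then Time, then Truck ID."""
    return sorted(
        chapters,
        key=lambda c: (c.captured_at.date(), c.captured_at.time(), c.truck_id),
    )
```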
A method of linking Company and Customer information, including Customer container/bin/cart/box location, to Chapters is also provided which can include one or more of: each item of the Company information, including, but not limited to, Business Unit/Site, Truck ID, Company Code and Route ID, being downloaded into the OBC; each of the Customer locations, including locations of the container/bin/cart/box, being assigned a latitude/longitude and downloaded into the OBC; the OBC containing Customer information including, but not limited to, Customer ID, Customer name, Customer address, Customer service levels and Franchise name/Open Market, with the Customer information likewise linked to the Customer location; the OBC likewise containing service requirements assigned to the Customer and waste collection municipal franchise system; and resulting Chapters created from the continuous video recording through Triggering Events being linked/connected to the Company and Customer, with each Chapter containing Customer Name, Customer Address, Service Level, Sub-Line of Business, Route ID, Customer Account Number, Franchise Name/Open Market, Service Time and Service Date.
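A minimal sketch, assuming hypothetical record layouts, of how the Company and Customer fields enumerated above might be attached to each Chapter:

```python
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    """Customer data downloaded into the OBC (hypothetical field names)."""
    customer_id: str
    name: str
    address: str
    service_level: str
    sub_line_of_business: str
    route_id: str
    account_number: str
    franchise_or_open_market: str
    container_lat: float   # latitude assigned to the container/bin/cart/box
    container_lon: float

def link_chapter(chapter_meta: dict, customer: CustomerRecord, company: dict) -> dict:
    """Attach the Company and Customer fields that each Chapter carries."""
    return {
        **chapter_meta,  # Service Date/Time, Truck ID, sensor ID, etc.
        "customer_name": customer.name,
        "customer_address": customer.address,
        "service_level": customer.service_level,
        "sub_line_of_business": customer.sub_line_of_business,
        "route_id": customer.route_id,
        "customer_account_number": customer.account_number,
        "franchise_or_open_market": customer.franchise_or_open_market,
        "business_unit": company["business_unit"],   # hypothetical keys
        "company_code": company["company_code"],
    }
```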
A method of viewing Company information, Customer information and Chapters developed from Triggering Events through the Optical Sensor(s) stored on the OBC and DVR within an Events to Review Portal or Display is also provided which can include one or more of: a visual display of the Still Image and/or up to 60-second Video Clip loaded into the Events to Review Portal, viewable by Use Case and then by Business Unit and Truck ID; the visual display of the Still Image and/or up to 60-second Video Clip with Customer Name, Customer Address, Service Level, Sub-Line of Business, Route ID, Customer Account Number, Franchise Name/Open Market, Service Time and Service Date below the Still Image and/or Video Clip; the ability for the end-user to select an Image aligned to a Use Case, resulting in a red box highlighting the Image; the ability for the end-user to send the Image via web-address, email and/or text to the User's various departments (e.g., Customer Service, Billing, Operations, etc.) and to the Customer; and the ability for the end-user to Submit the highlighted Image to an Advanced Reports module for use by the Company.
A method for reviewing Use Cases resulting from Chapters and such other waste service activities and nonservice activities outside of the waste servicing collection vehicle is also provided which can include one or more of: selecting Use Cases from the Events to Review Portal; determining if Images meet the criteria under each of the Use Cases; and submitting the selected Chapter for further review within Advanced Reports.
A method for viewing Advanced Reports and such other Reports resulting from Images from Triggering Events being Submitted from the Events to Review Portal is also provided which can include one or more of: a series of Images being selected by the end-user from the Events to Review Portal; selected Images being available for further review in the Advanced Reporting screen; selected Images in the Advanced Reporting screen being sent to Customer Service, Billing and Operations; and selected Images in the Advanced Reporting screen being sent to Customers via US Mail, web-address, email and/or text.
A system is also provided for managing service activities performed by a waste service vehicle, wherein the system can include: an optical sensor disposed on-board the waste service vehicle and configured to capture a continuous video recording of an area outside of the cab of the waste service vehicle, wherein the continuous video recording is captured during the entirety of a service operations period for the waste service vehicle; a recording device disposed on-board the waste service vehicle and configured to store the continuous video recording from the optical sensor; a computing device disposed on-board the waste service vehicle and configured to identify a physical location of a waste service customer; and a central computing device that is not on-board the waste service vehicle and is operatively linked to the optical sensor, recording device and computing device; wherein, upon the occurrence of a pre-defined triggering event, the central computing device is configured to capture a chapter from the continuous video recording, and the central computing device is configured to display the chapter on an electronic viewing portal in association with a visual indication of the physical location of the waste service customer and one or more of date, time, and an identification number for the waste service vehicle. In certain aspects, the pre-defined triggering event comprises one or more of: a signal or signals emanating from an electronic or mechanical device on the waste service vehicle, wherein the electronic or mechanical device is one or more of a proximity switch, a limit switch, a mechanical sensor, an electronic control module, a programmable logic computer, an arm, a hopper, a blade or a waste service vehicle device; a movement of the waste service vehicle, wherein the movement comprises acceleration, deceleration, g-force, stopping or starting; a driver initiated triggering event; and a triggering event that is initiated external to or outside of the vehicle. In certain aspects, the service operations period for the waste service vehicle begins when a driver commences an inspection of the vehicle at a facility location prior to performing one or more service activities, and ends when the driver performs an inspection of the waste service vehicle after performing the one or more service activities. In certain aspects, the step of identifying a physical location of a waste service customer during the service operations period comprises identifying a location for one or more of commercial, residential or industrial collections using a GPS linked to the onboard computer and to the on-board recording device to establish latitude and/or longitude for one or more customer waste containers. In certain aspects, the chapter that is captured from the continuous video recording comprises one or more of a still image or a video clip of 60 seconds or less. In certain aspects, the recording device on-board the waste service vehicle comprises a digital video recorder.
In certain aspects, the on-board computer is configured to: store downloaded waste service company information comprising one or more of a waste site, a business unit, a truck identification number, a company code and a route identification number; store downloaded customer location information comprising latitude/longitude for one or more customer locations including locations of the waste container; and store downloaded customer identifying information comprising one or more of a customer identification number, a customer name, a customer address, one or more customer service levels, a franchise name, and a service requirement assigned to the customer and to a waste collection municipal franchise system; and the central computing device is configured to: link the customer identifying information to the customer location information, and link the chapters from the continuous video recording to one or more of the waste service company information, the customer identifying information, and the customer location information.
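By way of illustration only, a sketch (with hypothetical signal names) of how the central computing device might resolve an incoming on-vehicle signal to one of the four categories of pre-defined triggering events described above before capturing a chapter:

```python
# Hypothetical mapping of on-vehicle signal sources to the four
# pre-defined triggering event categories described above.
TRIGGER_CATEGORY = {
    "proximity_switch": "device_signal",
    "limit_switch": "device_signal",
    "ecm": "device_signal",
    "arm_engaged": "device_signal",
    "hard_acceleration": "vehicle_movement",
    "hard_braking": "vehicle_movement",
    "driver_button": "driver_initiated",
    "external_request": "external_initiated",
}

def classify_trigger(signal_source: str) -> str:
    """Return the triggering-event category for a signal source, so the
    central computing device knows a chapter should be captured."""
    try:
        return TRIGGER_CATEGORY[signal_source]
    except KeyError:
        raise ValueError(f"unrecognized signal source: {signal_source!r}")
```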
A better understanding of the presently disclosed subject matter can be obtained when the following detailed description is considered in conjunction with the drawings and figures herein.
While the presently disclosed subject matter will be described in connection with the preferred embodiment, it will be understood that it is not intended to limit the presently disclosed subject matter to that embodiment. On the contrary, it is intended to cover all alternatives, modifications, and equivalents, as may be included within the spirit and the scope of the presently disclosed subject matter as defined by the appended claims.
The presently disclosed subject matter relates generally to systems and methods for using video/still images captured by one or more continuously recording optical sensors mounted on waste collection vehicles used in the waste collection, disposal and recycling industry for operational and customer service related purposes. Optical sensors can be integrated into the in-cab monitor as well as the onboard computer, digital video recorder and other external devices. In certain illustrative embodiments, it is desired to virtually connect (in real-time) the waste service provider to the waste service vehicle and ultimately to the waste service customer being serviced for management of waste collection, disposal and recycling in immediate real-time or at a date in the future. The disclosed system is directed to addressing the issues, problems and opportunities described herein and/or overcoming other issues and problems of the prior art.
In certain illustrative embodiments, a system is disclosed for managing some or all service related and nonservice related activities outside of the waste servicing vehicle. The system has a location device configured to determine the location of the service vehicle at all times, whether the service vehicle is inactive, in motion, or operating and performing service related and nonservice related activities. The service vehicle has optical sensors and such other sensors installed throughout the service vehicle including, but not limited to, high definition cameras, monitors and such other sensors mounted to the front (interior and exterior of the cab), exterior right side, exterior left side, exterior rear and exterior/interior hopper area of the service vehicle. The optical sensors and other sensors continuously record all activities, with the images and data being stored on an onboard computer and recording device (such as a digital video recorder) and being transmitted and stored remotely away from the waste service vehicle. The onboard computer and recording device are configured to detect motion, g-force, speed, vehicle deceleration, distance from assigned points within a service area and engagement of the service vehicle equipment, including service arms resulting in container and bin lifts, engagement of other vehicle mechanical devices and all such other services being performed by the service vehicle. Additionally, the onboard computer and recording device are configured to detect signals from external devices.
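A minimal sketch of the kind of movement detection described above, assuming hypothetical g-force thresholds (real values would be tuned per vehicle and fleet):

```python
from typing import Iterable, List, Tuple

HARD_ACCEL_G = 0.35   # hypothetical longitudinal g-force thresholds
HARD_BRAKE_G = -0.45

def movement_triggers(samples: Iterable[Tuple[float, float]]) -> List[Tuple[float, str]]:
    """Scan (timestamp, longitudinal g-force) samples reported by the
    onboard computer and flag movements that warrant creating a Chapter."""
    events: List[Tuple[float, str]] = []
    for ts, g in samples:
        if g >= HARD_ACCEL_G:
            events.append((ts, "hard_acceleration"))
        elif g <= HARD_BRAKE_G:
            events.append((ts, "hard_braking"))
    return events
```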
In certain illustrative embodiments, the onboard computer and recording device are configured to create notations, ciphers, codes, or chapters (hereinafter referred to as “Chapter(s)”) resulting from one or more optical sensors that, while configured to the Onboard Computer (OBC), Digital Video Recorder (DVR) & User's remote back-office hardware and software (System), are likewise configured to receive a signal or signals emanating from electronic or mechanical devices on the waste collection truck including, but not limited to, proximity switches, limit switches, mechanical sensors, electronic control modules (ECM), programmable logic computers (PLC), arms, hoppers and/or blades and such other vehicle devices.
In certain illustrative embodiments, the optical sensor captures the videos and sends them to the DVR and OBC. The OBC and DVR are configured in such a way that Triggering Events create Chapters (Videos and Still Images), which are stored on a back office system and made available for viewing through Advanced Reports and other Reports in a viewing portal.
In certain illustrative embodiments, the onboard computer and recording device are configured to create Chapters resulting from one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to detect vehicle movement including, but not limited to, acceleration, deceleration, g-force, stopping and starting.
In certain illustrative embodiments, the onboard computer and recording device are configured to create Chapters resulting from one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to detect a waste collection vehicle driver initiated event (hereinafter referred to as “Driver Initiated Event”).
In certain illustrative embodiments, the onboard computer and recording device are configured to create Chapters resulting from one or more optical sensors that, while configured to the OBC, DVR & System, are likewise configured to detect an event initiated external to or outside of the waste collection vehicle (hereinafter referred to as “External Initiated Event”).
In certain illustrative embodiments, one or more Devices, Vehicle Movements, Driver Initiated Events and External Initiated Events (hereinafter collectively referred to as “Triggering Events”) are configured to the OBC and DVR, with Triggering Events causing Chapters to be captured on the continuous video recording resulting in a Still Image or up to 60-second Video Clip assigned to one or more Optical Sensors.
Triggering Events resulting in Chapters on the continuous optical sensor recordings may be stored on an onboard vehicle computer and/or digital video recorder and transmitted to remote storage device(s).
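By way of illustration only, a sketch of how a Chapter's span might be cut from the continuous recording around a Triggering Event, honoring the 60-second Video Clip limit (the pre-roll/post-roll values are hypothetical):

```python
from typing import Tuple

def chapter_window(event_ts: float, rec_start: float, rec_end: float,
                   pre_roll: float = 10.0, post_roll: float = 50.0) -> Tuple[float, float]:
    """Compute the (start, end) span, in seconds on the recording timeline,
    of a Chapter surrounding a Triggering Event. The span is clamped to the
    bounds of the continuous recording and to the 60-second clip limit."""
    start = max(rec_start, event_ts - pre_roll)
    end = min(rec_end, event_ts + post_roll)
    if end - start > 60.0:
        end = start + 60.0
    return (start, end)
```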
The Chapters on the continuous optical sensor recordings and such other sensors are configured to provide immediate and/or passive Still Images and immediate and/or passive limited Video Clips for remote viewing and auditing of all service related and nonservice related activities of the waste service vehicles.
In certain illustrative embodiments, the most efficient and optimal method for providing waste services is through routed waste collection, optimizing route density and service productivity. Other efficiencies and productivity are gained, service costs are reduced, optimal waste services are provided and waste service company margins are improved when waste service customers are managed pursuant to service expectations designated by the waste service provider and/or the municipal franchise system. These service expectations include, but are not limited to:
Other efficiencies and productivity are gained, service costs are reduced, optimal waste services are provided and waste service company margins are improved when waste service companies effectively manage and service their service routes, service stops and Customers. Effectively servicing and managing routes, stops and Customers can include, but is not limited to:
In certain illustrative embodiments, truck mounted cameras can be used to capture video, still images and/or monitoring from a monitor display inside the waste service vehicle.
For example, in certain illustrative embodiments, OBCs and DVRs are installed in the waste service vehicle with the OBC and DVR configured to detect vehicle location at all times, motion, g-force, speed, vehicle deceleration, distance from assigned points within a service area and engagement of the service vehicle equipment including service arms resulting in container and bin lifts, engagement of other vehicle mechanical devices and all such other services being performed by the service vehicle.
For example, in certain illustrative embodiments, continuous video feeds from each of the optical sensors may be used to review certain service related and non-service related activities. Certain predefined Triggering Events can result in a Chapter (defined hereinabove) within the continuous optical sensor recordings.
In certain illustrative embodiments, waste service vehicle and/or optical sensor data can be linked (i) to a specific customer (and associated customer data such as account number, service address, service level, etc.) and/or (ii) to route/location/destination specific information. For example, with respect to customer linking, service confirmations can be collected at a point of service. With respect to route event linking, a service can be reviewed for the purposes of determining contamination within the recycling stream by capturing video/camera images from the hopper camera. The presently disclosed system and method can connect one, more than one or every service and non-service related event to an actual customer or event on the waste service route. This step can be performed manually by a human operator, or it can be an automated process.
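The automated variant of this linking step could be as simple as a nearest-container lookup against the latitude/longitude established for each customer. A minimal sketch, assuming a hypothetical 75-meter matching radius:

```python
import math
from typing import List, Optional, Tuple

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_customer(truck_lat: float, truck_lon: float,
                   customers: List[Tuple[str, float, float]],
                   max_m: float = 75.0) -> Optional[Tuple[str, float, float]]:
    """Return the (customer_id, lat, lon) tuple whose container location is
    closest to the truck, or None if nothing lies within max_m meters
    (suggesting an off-route or non-service event)."""
    best, best_d = None, max_m
    for cust in customers:
        d = haversine_m(truck_lat, truck_lon, cust[1], cust[2])
        if d < best_d:
            best, best_d = cust, d
    return best
```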
In certain illustrative embodiments, the presently disclosed systems and methods provide functionality for end-user and such other user designees to utilize video and still images from one or more vehicle cameras for operational purposes relating to servicing and managing waste collection customers as well as non-service related activities.
For example, multiple user vehicle mounted optical sensors can be used to capture video and still images that are prompted by specific vehicle movement and actions (e.g., movement of the vehicle or movement of the mechanical arm of the truck and engagement of the hopper) and/or prompted by manual manipulation by the driver, resulting in the optical sensor(s) capturing specific footage related to customer services and user defined “Use Cases” (as further described below).
Also, multiple user vehicle mounted optical sensors can be used to capture video and still images that link the vehicle's latitude/longitude with the latitude/longitude of the customer's container and bin location upon a user-defined triggering event, with the user's customer information being matched or verified against the videos and still images generated through the digital video recording.
For example, in certain illustrative embodiments, waste vehicle optical sensors and video recordings and still images are associated with specific user-defined scenarios or “Use Cases” (e.g., overloaded waste containers, contamination of waste containers, waste container maintenance, open lids, identifying waste collection infringement within a municipal franchise system, etc.). Use Cases are described in greater detail below. However, in certain illustrative embodiments, the recordings can be agnostic as to any particular Use Case and are configured to be used for any one or all user-defined Use Cases and future user-defined Use Cases.
By way of further explanation, each discrete Chapter or image can contain one or more of the following information items displayed in an Events to Review and Service Events Portal and Display, with one or more of the qualifying Naming Conventions (drop-down menus for end-users) allowing the end-user to select from a variety of options to review Chapters or images associated with service related and non-service related activities, including Customer Name, Customer Address, Service Level, Sub-Line of Business, Route ID, Customer Account Number, Franchise Name/Open Market, Service Time and Service Date.
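A minimal sketch of the drop-down selection behavior described above, assuming Chapters are represented as dictionaries keyed by hypothetical field names:

```python
from typing import List, Optional

def filter_events(chapters: List[dict], use_case: Optional[str] = None,
                  business_unit: Optional[str] = None,
                  truck_id: Optional[str] = None) -> List[dict]:
    """Apply the Events to Review drop-down selections in order:
    Use Case, then Business Unit, then Truck ID."""
    out = chapters
    if use_case is not None:
        out = [c for c in out if c.get("use_case") == use_case]
    if business_unit is not None:
        out = [c for c in out if c.get("business_unit") == business_unit]
    if truck_id is not None:
        out = [c for c in out if c.get("truck_id") == truck_id]
    return out
```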
The Source Data for the Events to Review is depicted in the drawings herein.
Also, multiple user vehicle mounted optical sensors can be used to capture video and still image Chapters that are associated with specific scenarios or “Use Cases” and related software functionality, user camera interface and user customers. The functionality of the process includes, but is not limited to, the following:
Also, multiple user vehicle mounted optical sensors can be used to capture video and still images that are available for view and auditing in the display screen (e.g., Events to Review Screen, Service Events Screen, Advanced Reports) and such other portals and screens within the system as defined by user.
Also, multiple user vehicle optical sensors can be used to capture video and still images that, with limited driver interface with the OBC, DVR and user optical sensors, can capture one or more of the following:
Also, multiple user vehicle mounted optical sensors can be used to capture video and still images that are available for Customer Service, Operations, Dispatchers and such other user designees to view a live video feed when there is a potential service issue and address the issue with the driver and customer, as needed.
In certain illustrative embodiments, the system and method provide for a variety of pre-defined Use Cases by Line of Business (Commercial, Residential and Industrial), resulting in the end-user being able to review videos and still images, and to identify, tag (with a red box or other tagging means) and submit to Advanced Reports or such other reporting methods instances relating to one or more of the occurrences described in the drawings herein.
Management and auditing of Use Cases are more fully explained hereinbelow (such explanations are not all inclusive).
The images in the drawings herein depict representative examples of the foregoing Use Cases.
In certain illustrative embodiments, the Events to Review process and the management of Use Cases facilitated by the end-user are depicted in the drawings herein.
In certain illustrative embodiments, the system and method provide an end-user and/or auditor with the ability to review videos and still images, and to identify, tag (with a red box or other tagging means) and submit them to Advanced Reports or such other reporting methods. A sample of the Advanced Reports is depicted in the drawings herein.
Below is a non-exhaustive listing tied to Use Cases that are submitted and available in Advanced Reports, with such instances relating to one or more of the following occurrences and Use Cases:
In certain illustrative embodiments, the system and method provide a user and/or auditor with the ability to operate OBCs, DVRs and user Cameras in connection with one or more of the following functions:
In certain illustrative embodiments, the system and method provide an end-user with the ability to capture video and still images of service events as described above and send videos and still images to Customer Services, Operations, Dispatch, such other User designees and to customers directly via US Mail, web-address, email and text.
In certain illustrative embodiments, a method is provided of managing service and non-service related activities associated with a waste collection, disposal and/or recycling vehicle, as depicted in the drawings herein.
In certain illustrative embodiments, a system is provided for managing service and nonservice activities in connection with waste collection, disposal and/or recycling. For example, the system can include a waste vehicle equipped with an onboard computer (OBC) and digital video recorder (DVR). The waste vehicle can also be equipped with one or more optical sensors. The vehicle can be configured to provide triangulation of customer location for commercial, residential and industrial collections using an OBC linked to GPS, an on-board DVR linked to GPS and other off-the-shelf mapping geo-coding applications to establish latitude/longitude for each customer container, bin, cart and box. One or more optical sensors can be configured to capture continuous video recordings from the time the waste collection truck driver commences the DOT pre-trip inspection at the facility location until the waste collection truck driver returns to the facility location and performs the DOT post-trip inspection. One or more optical sensors can also be configured to the user's OBC, DVR and back office hardware and software system (“System”). One or more optical sensors, while configured to the OBC, DVR & System, can also be configured to detect a signal or signals emanating from electronic or mechanical devices on the truck which can include, but is not limited to, proximity switches, limit switches, mechanical sensors, electronic control modules (ECM), programmable logic computers (PLC), arms, hoppers and/or blades and such other vehicle devices (hereinafter referred to as “Devices”). One or more optical sensors, while configured to the OBC, DVR & System, can also be configured to detect vehicle movement including acceleration, deceleration, g-force, stopping and starting (hereinafter referred to as “Vehicle Movement”). One or more optical sensors, while configured to the OBC, DVR & System, can also be configured to detect a driver initiated triggering event (hereinafter referred to as “Driver Initiated Event”). One or more optical sensors, while configured to the OBC, DVR & System, can also be configured to detect a triggering event that is initiated external to or outside of the vehicle (hereinafter referred to as “External Initiated Event”). One or more of the Devices, Vehicle Movement, Driver Initiated Event, and External Initiated Event (hereinafter referred to as “Triggering Events”) can be configured to the OBC and DVR and can cause a Chapter to be captured on the continuous video recording resulting in a Still Image or up to 60-second Video Clip that is assigned to one or more optical sensors. The Chapters can be configured to be displayed on a viewing portal for review and can be made available in sequential order based on Date, Time and Truck ID.
In certain illustrative embodiments, a method is provided that involves the use and analysis of one or more of Company information and Customer information. The Company information can include, but is not limited to, Business Unit/Site, Truck ID, Company Code and Route ID, and can be downloaded into the OBC. The Customer locations can include locations of the container/bin/cart/box, and can be assigned a latitude/longitude and downloaded into the OBC. The OBC can contain Customer information including, but not limited to, Customer ID, Customer name, Customer address, Customer service levels and Franchise name/Open Market, with the Customer information likewise linked to the Customer location. The OBC can also contain service requirements assigned to the Customer and waste collection municipal franchise system. Resulting Chapters can be created from the continuous video recording through Triggering Events, with the Chapters linked/connected to the Company and Customer and each Chapter containing Customer Name, Customer Address, Service Level, Sub-Line of Business, Route ID, Customer Account Number, Franchise Name/Open Market, Service Time and Service Date.
In certain illustrative embodiments, a method of viewing Company information, Customer information and Chapters developed from Triggering Events through the Optical Sensor(s) stored on the OBC and DVR within an Events to Review Portal or Display is provided. A visual display of the Still Image and/or up to 60-second Video Clip can be loaded into the Events to Review Portal and viewable by Use Case and then by Business Unit and Truck ID. A visual display can be provided of the Still Image and/or up to 60-second Video Clip with Customer Name, Customer Address, Service Level, Sub-Line of Business, Route ID, Customer Account Number, Franchise Name/Open Market, Service Time and Service Date below the Still Image and/or Video Clip. The end-user can select an Image aligned to a Use Case, resulting in a red box highlighting the Image. The end-user can also send the Image via web-address, email and/or text to the User's various departments (e.g., Customer Service, Billing, Operations, etc.) and to the Customer. The end-user can also Submit the highlighted Image to an Advanced Reports module for use by the Company.
In certain illustrative embodiments, a method of reviewing Use Cases resulting from Chapters and such other waste service activities and nonservice activities outside of the waste servicing collection vehicle is provided. One or more Use-Cases can be selected from the Events to Review portal. A user can determine if the Images meet the criteria under each of the Use-Cases. The selected Chapter can then be submitted for further review within Advanced Reports.
In certain illustrative embodiments, a method for viewing Advanced Reports and such other Reports resulting from Images from Triggering Events being Submitted from the Events to Review Portal is provided. The end-user can select a series of Images from the Events to Review Portal. The selected Images can be made available for further review in the Advanced Reporting screen. The selected Images in the Advanced Reporting screen can be sent to Customer Service, Billing and Operations. The selected Images in the Advanced Reporting screen can also be sent to Customers via US Mail, web-address, email and/or text.
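By way of illustration only, a sketch of the email delivery path using Python's standard library (the sender address and SMTP host are hypothetical; US Mail, web-address and text delivery would follow analogous paths):

```python
import smtplib
from email.message import EmailMessage
from typing import List

def email_image(image_path: str, recipients: List[str],
                smtp_host: str = "localhost") -> None:
    """Send a selected Still Image from the Advanced Reporting screen to
    Customer Service, Billing, Operations and/or the Customer."""
    msg = EmailMessage()
    msg["Subject"] = "Service event image for your review"
    msg["From"] = "reports@example.com"          # hypothetical sender
    msg["To"] = ", ".join(recipients)
    msg.set_content("Please see the attached service event image.")
    with open(image_path, "rb") as fh:
        msg.add_attachment(fh.read(), maintype="image",
                           subtype="jpeg", filename="service_event.jpg")
    with smtplib.SMTP(smtp_host) as smtp:        # back office SMTP relay
        smtp.send_message(msg)
```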
Consideration of Use Cases in the management of waste collection services may be categorized or bundled as depicted in the drawings herein.
Those skilled in the art will appreciate that portions of the subject matter disclosed herein may be embodied as a method, data processing system, or computer program product. Accordingly, these portions of the subject matter disclosed herein may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Furthermore, portions of the subject matter disclosed herein may be a computer program product on a computer-usable storage medium having computer readable program code on the medium. Any suitable computer readable medium may be utilized including hard disks, CD-ROMs, optical storage devices, or other storage devices. Further, the subject matter described herein may be embodied as systems, methods, devices, or components. Accordingly, embodiments may, for example, take the form of hardware, software or any combination thereof, and/or may exist as part of an overall system architecture within which the software will exist. The present detailed description is, therefore, not intended to be taken in a limiting sense.
It is to be understood that the present invention is not limited to the embodiment(s) described above and illustrated herein, but encompasses any and all variations falling within the scope of the appended claims.
This application is a continuation application and claims the benefit, and priority benefit, of U.S. patent application Ser. No. 17/479,106, filed Sep. 20, 2021, which is a continuation application of, and claims the benefit and priority benefit, of U.S. patent application Ser. No. 17/144,027, filed Jan. 7, 2021, now issued as U.S. Pat. No. 11,128,841, which is a continuation application of, and claims the benefit and priority benefit, of U.S. patent application Ser. No. 16/920,037, filed Jul. 2, 2020, now issued as U.S. Pat. No. 10,911,726, which is a continuation application of, and claims the benefit and priority benefit, of U.S. patent application Ser. No. 16/809,335, filed Mar. 4, 2020, now issued as U.S. Pat. No. 10,750,134, which is a continuation application of, and claims the benefit and priority benefit, of U.S. patent application Ser. No. 16/243,257, filed Jan. 9, 2019, now issued as U.S. Pat. No. 10,594,991, which claims the benefit and priority benefit, of U.S. Provisional Patent Application Ser. No. 62/615,360, filed Jan. 9, 2018, the contents of which are incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
3202305 | Herpich | Aug 1965 | A |
5072833 | Hansen et al. | Dec 1991 | A |
5230393 | Mezey | Jul 1993 | A |
5245137 | Bowman et al. | Sep 1993 | A |
5278914 | Kinoshita et al. | Jan 1994 | A |
5489898 | Shigekusa et al. | Feb 1996 | A |
5762461 | Frohlingsdorf | Jun 1998 | A |
5837945 | Cornwell et al. | Nov 1998 | A |
6097995 | Tipton et al. | Aug 2000 | A |
6408261 | Durbin | Jun 2002 | B1 |
6448898 | Kasik | Sep 2002 | B1 |
6510376 | Burnstein et al. | Jan 2003 | B2 |
6563433 | Fujiwara | May 2003 | B2 |
6729540 | Ogawa | May 2004 | B2 |
6811030 | Compton et al. | Nov 2004 | B1 |
7146294 | Waitkus, Jr. | Dec 2006 | B1 |
7330128 | Lombardo et al. | Feb 2008 | B1 |
7383195 | Mallett et al. | Jun 2008 | B2 |
7406402 | Waitkus, Jr. | Jul 2008 | B1 |
7501951 | Maruca et al. | Mar 2009 | B2 |
7511611 | Sabino et al. | Mar 2009 | B2 |
7536457 | Miller | May 2009 | B2 |
7659827 | Gunderson et al. | Feb 2010 | B2 |
7804426 | Etcheson | Sep 2010 | B2 |
7817021 | Date et al. | Oct 2010 | B2 |
7870042 | Maruca et al. | Jan 2011 | B2 |
7878392 | Mayers et al. | Feb 2011 | B2 |
7957937 | Waitkus, Jr. | Jun 2011 | B2 |
7994909 | Maruca et al. | Aug 2011 | B2 |
7999688 | Healey et al. | Aug 2011 | B2 |
8020767 | Reeves et al. | Sep 2011 | B2 |
8056817 | Flood | Nov 2011 | B2 |
8146798 | Flood et al. | Apr 2012 | B2 |
8185277 | Flood et al. | May 2012 | B2 |
8269617 | Cook et al. | Sep 2012 | B2 |
8314708 | Gunderson et al. | Nov 2012 | B2 |
8330059 | Curotto | Dec 2012 | B2 |
8332247 | Bailey et al. | Dec 2012 | B1 |
8373567 | Denson | Feb 2013 | B2 |
8374746 | Plante | Feb 2013 | B2 |
8384540 | Reyes et al. | Feb 2013 | B2 |
8417632 | Robohm et al. | Apr 2013 | B2 |
8433617 | Goad et al. | Apr 2013 | B2 |
8485301 | Grubaugh et al. | Jul 2013 | B2 |
8508353 | Cook et al. | Aug 2013 | B2 |
8542121 | Maruca et al. | Sep 2013 | B2 |
8550252 | Borowski et al. | Oct 2013 | B2 |
8564426 | Cook et al. | Oct 2013 | B2 |
8564446 | Gunderson et al. | Oct 2013 | B2 |
8602298 | Gonen | Dec 2013 | B2 |
8606492 | Botnen | Dec 2013 | B1 |
8630773 | Lee et al. | Jan 2014 | B2 |
8645189 | Lyle | Feb 2014 | B2 |
8674243 | Curotto | Mar 2014 | B2 |
8676428 | Richardson et al. | Mar 2014 | B2 |
8714440 | Flood et al. | May 2014 | B2 |
8738423 | Lyle | May 2014 | B2 |
8744642 | Nemat-Nasser et al. | Jun 2014 | B2 |
8803695 | Denson | Aug 2014 | B2 |
8818908 | Altice et al. | Aug 2014 | B2 |
8849501 | Cook et al. | Sep 2014 | B2 |
8854199 | Cook et al. | Oct 2014 | B2 |
8862495 | Ritter | Oct 2014 | B2 |
8880279 | Plante | Nov 2014 | B2 |
8930072 | Lambert et al. | Jan 2015 | B1 |
8952819 | Nemat-Nasser | Feb 2015 | B2 |
8970703 | Thomas, II et al. | Mar 2015 | B1 |
8996234 | Tamari et al. | Mar 2015 | B1 |
9047721 | Botnen | Jun 2015 | B1 |
9058706 | Cheng | Jun 2015 | B2 |
9098884 | Borowski et al. | Aug 2015 | B2 |
9098956 | Lambert et al. | Aug 2015 | B2 |
9111453 | Alselimi | Aug 2015 | B1 |
9158962 | Nemat-Nasser et al. | Oct 2015 | B1 |
9180887 | Nemat-Nasser et al. | Nov 2015 | B2 |
9189899 | Cook et al. | Nov 2015 | B2 |
9226004 | Plante | Dec 2015 | B1 |
9235750 | Sutton et al. | Jan 2016 | B1 |
9238467 | Hoye et al. | Jan 2016 | B1 |
9240079 | Lambert et al. | Jan 2016 | B2 |
9240080 | Lambert et al. | Jan 2016 | B2 |
9245391 | Cook et al. | Jan 2016 | B2 |
9247040 | Sutton | Jan 2016 | B1 |
9251388 | Flood | Feb 2016 | B2 |
9268741 | Lambert et al. | Feb 2016 | B1 |
9275090 | Denson | Mar 2016 | B2 |
9280857 | Lambert et al. | Mar 2016 | B2 |
9292980 | Cook et al. | Mar 2016 | B2 |
9298575 | Tamari et al. | Mar 2016 | B2 |
9317980 | Cook et al. | Apr 2016 | B2 |
9330287 | Graczyk et al. | May 2016 | B2 |
9341487 | Bonhomme | May 2016 | B2 |
9342884 | Mask | May 2016 | B2 |
9344683 | Nemat-Nasser et al. | May 2016 | B1 |
9347818 | Curotto | May 2016 | B2 |
9358926 | Lambert et al. | Jun 2016 | B2 |
9373257 | Bonhomme | Jun 2016 | B2 |
9389147 | Lambert et al. | Jul 2016 | B1 |
9390568 | Nemat-Nasser et al. | Jul 2016 | B2 |
9396453 | Hynes et al. | Jul 2016 | B2 |
9401985 | Sutton | Jul 2016 | B2 |
9403278 | Van Kampen et al. | Aug 2016 | B1 |
9405992 | Badholm et al. | Aug 2016 | B2 |
9418488 | Lambert | Aug 2016 | B1 |
9428195 | Surpi | Aug 2016 | B1 |
9442194 | Kurihara et al. | Sep 2016 | B2 |
9463110 | Nishtala et al. | Oct 2016 | B2 |
9466212 | Stumphauzer et al. | Oct 2016 | B1 |
9472083 | Nemat-Nasser | Oct 2016 | B2 |
9495811 | Herron | Nov 2016 | B2 |
9501690 | Nemat-Nasser et al. | Nov 2016 | B2 |
9520046 | Call et al. | Dec 2016 | B2 |
9525967 | Mamlyuk | Dec 2016 | B2 |
9546040 | Flood et al. | Jan 2017 | B2 |
9573601 | Hoye et al. | Feb 2017 | B2 |
9574892 | Rodoni | Feb 2017 | B2 |
9586756 | O'Riordan et al. | Mar 2017 | B2 |
9589393 | Botnen | Mar 2017 | B2 |
9594725 | Cook et al. | Mar 2017 | B1 |
9595191 | Surpi | Mar 2017 | B1 |
9597997 | Mitsuta et al. | Mar 2017 | B2 |
9604648 | Tamari et al. | Mar 2017 | B2 |
9633318 | Plante | Apr 2017 | B2 |
9633576 | Reed | Apr 2017 | B2 |
9639535 | Ripley | May 2017 | B1 |
9646651 | Richardson | May 2017 | B1 |
9650051 | Hoye et al. | May 2017 | B2 |
9679210 | Sutton et al. | Jun 2017 | B2 |
9685098 | Kypri | Jun 2017 | B1 |
9688282 | Cook et al. | Jun 2017 | B2 |
9702113 | Kotaki et al. | Jul 2017 | B2 |
9707595 | Ripley | Jul 2017 | B2 |
9721342 | Mask | Aug 2017 | B2 |
9734717 | Surpi et al. | Aug 2017 | B1 |
9754382 | Rodoni | Sep 2017 | B1 |
9766086 | Rodoni | Sep 2017 | B1 |
9778058 | Rodoni | Oct 2017 | B2 |
9803994 | Rodoni | Oct 2017 | B1 |
9824336 | Borges et al. | Nov 2017 | B2 |
9824337 | Rodoni | Nov 2017 | B1 |
9829892 | Rodoni | Nov 2017 | B1 |
9834375 | Jenkins et al. | Dec 2017 | B2 |
9852405 | Rodoni et al. | Dec 2017 | B1 |
10029685 | Hubbard et al. | Jul 2018 | B1 |
10152737 | Lyman | Dec 2018 | B2 |
10198718 | Rodoni | Feb 2019 | B2 |
10204324 | Rodoni | Feb 2019 | B2 |
10210623 | Rodoni | Feb 2019 | B2 |
10255577 | Steves et al. | Apr 2019 | B1 |
10311501 | Rodoni | Jun 2019 | B1 |
10332197 | Kekalainen et al. | Jun 2019 | B2 |
10354232 | Tomlin, Jr. et al. | Jul 2019 | B2 |
10382915 | Rodoni | Aug 2019 | B2 |
10410183 | Bostick et al. | Sep 2019 | B2 |
10594991 | Skolnick | Mar 2020 | B1 |
10625934 | Mallady | Apr 2020 | B2 |
10628805 | Rodatos | Apr 2020 | B2 |
10750134 | Skolnick | Aug 2020 | B1 |
10855958 | Skolnick | Dec 2020 | B1 |
10911726 | Skolnick | Feb 2021 | B1 |
11074557 | Flood | Jul 2021 | B2 |
11128841 | Skolnick | Sep 2021 | B1 |
11140367 | Skolnick | Oct 2021 | B1 |
11172171 | Skolnick | Nov 2021 | B1 |
11222491 | Romano et al. | Jan 2022 | B2 |
11373536 | Savchenko | Jun 2022 | B1 |
11386362 | Kim | Jul 2022 | B1 |
11425340 | Skolnick | Aug 2022 | B1 |
11475416 | Patel et al. | Oct 2022 | B1 |
11475417 | Patel et al. | Oct 2022 | B1 |
11488118 | Patel et al. | Nov 2022 | B1 |
11616933 | Skolnick | Mar 2023 | B1 |
11673740 | Leon | Jun 2023 | B2 |
11715150 | Rodoni | Aug 2023 | B2 |
11727337 | Savchenko | Aug 2023 | B1 |
11790290 | Kim et al. | Oct 2023 | B1 |
11928693 | Savchenko et al. | Mar 2024 | B1 |
20020069097 | Conrath | Jun 2002 | A1 |
20020077875 | Nadir | Jun 2002 | A1 |
20020125315 | Ogawa | Sep 2002 | A1 |
20020194144 | Berry | Dec 2002 | A1 |
20030014334 | Tsukamoto | Jan 2003 | A1 |
20030031543 | Elbrink | Feb 2003 | A1 |
20030069745 | Zenko | Apr 2003 | A1 |
20030191658 | Rajewski | Oct 2003 | A1 |
20030233261 | Kawahara et al. | Dec 2003 | A1 |
20040039595 | Berry | Feb 2004 | A1 |
20040167799 | Berry | Aug 2004 | A1 |
20050038572 | Krupowicz | Feb 2005 | A1 |
20050080520 | Kline et al. | Apr 2005 | A1 |
20050182643 | Shirvanian | Aug 2005 | A1 |
20050209825 | Ogawa | Sep 2005 | A1 |
20050234911 | Hess et al. | Oct 2005 | A1 |
20050261917 | Forget Shield | Nov 2005 | A1 |
20060235808 | Berry | Oct 2006 | A1 |
20070150138 | Plante | Jun 2007 | A1 |
20070260466 | Casella et al. | Nov 2007 | A1 |
20070278140 | Mallett et al. | Dec 2007 | A1 |
20080010197 | Scherer | Jan 2008 | A1 |
20080065324 | Muramatsu et al. | Mar 2008 | A1 |
20080077541 | Scherer et al. | Mar 2008 | A1 |
20080202357 | Flood | Aug 2008 | A1 |
20080234889 | Sorensen | Sep 2008 | A1 |
20090014363 | Gonen et al. | Jan 2009 | A1 |
20090024479 | Gonen et al. | Jan 2009 | A1 |
20090055239 | Waitkus, Jr. | Feb 2009 | A1 |
20090083090 | Rolfes et al. | Mar 2009 | A1 |
20090126473 | Porat et al. | May 2009 | A1 |
20090138358 | Gonen et al. | May 2009 | A1 |
20090157255 | Plante | Jun 2009 | A1 |
20090161907 | Healey et al. | Jun 2009 | A1 |
20100017276 | Wolff et al. | Jan 2010 | A1 |
20100071572 | Carroll et al. | Mar 2010 | A1 |
20100119341 | Flood et al. | May 2010 | A1 |
20100175556 | Kummer et al. | Jul 2010 | A1 |
20100185506 | Wolff et al. | Jul 2010 | A1 |
20100217715 | Lipcon | Aug 2010 | A1 |
20100312601 | Lin | Dec 2010 | A1 |
20110108620 | Wadden et al. | May 2011 | A1 |
20110137776 | Goad et al. | Jun 2011 | A1 |
20110208429 | Zheng et al. | Aug 2011 | A1 |
20110225098 | Wolff et al. | Sep 2011 | A1 |
20110260878 | Rigling | Oct 2011 | A1 |
20110279245 | Hynes et al. | Nov 2011 | A1 |
20110316689 | Reyes et al. | Dec 2011 | A1 |
20120029980 | Paz et al. | Feb 2012 | A1 |
20120029985 | Wilson et al. | Feb 2012 | A1 |
20120047080 | Rodatos | Feb 2012 | A1 |
20120262568 | Ruthenberg | Oct 2012 | A1 |
20120265589 | Whittier | Oct 2012 | A1 |
20120310691 | Carlsson et al. | Dec 2012 | A1 |
20130024335 | Lok | Jan 2013 | A1 |
20130039728 | Price et al. | Feb 2013 | A1 |
20130041832 | Rodatos | Feb 2013 | A1 |
20130075468 | Wadden et al. | Mar 2013 | A1 |
20130332238 | Lyle | Dec 2013 | A1 |
20130332247 | Gu | Dec 2013 | A1 |
20140060939 | Eppert | Mar 2014 | A1 |
20140112673 | Sayama | Apr 2014 | A1 |
20140114868 | Wan et al. | Apr 2014 | A1 |
20140172174 | Poss et al. | Jun 2014 | A1 |
20140214697 | McSweeney | Jul 2014 | A1 |
20140236446 | Spence | Aug 2014 | A1 |
20140278630 | Gates et al. | Sep 2014 | A1 |
20140379588 | Gates et al. | Dec 2014 | A1 |
20150095103 | Rajamani et al. | Apr 2015 | A1 |
20150100428 | Parkinson, Jr. | Apr 2015 | A1 |
20150144012 | Frybarger | May 2015 | A1 |
20150278759 | Harris et al. | Oct 2015 | A1 |
20150294431 | Fiorucci et al. | Oct 2015 | A1 |
20150298903 | Luxford | Oct 2015 | A1 |
20150302364 | Calzada et al. | Oct 2015 | A1 |
20150307273 | Lyman | Oct 2015 | A1 |
20150324760 | Borowski et al. | Nov 2015 | A1 |
20150326829 | Kurihara et al. | Nov 2015 | A1 |
20150348252 | Mask | Dec 2015 | A1 |
20150350610 | Loh | Dec 2015 | A1 |
20160021287 | Loh | Jan 2016 | A1 |
20160044285 | Gasca et al. | Feb 2016 | A1 |
20160179065 | Shahabdeen | Jun 2016 | A1 |
20160187188 | Curotto | Jun 2016 | A1 |
20160224846 | Cardno | Aug 2016 | A1 |
20160232498 | Tomlin, Jr. et al. | Aug 2016 | A1 |
20160239689 | Flood | Aug 2016 | A1 |
20160247058 | Kreiner et al. | Aug 2016 | A1 |
20160292653 | Gonen | Oct 2016 | A1 |
20160300297 | Kekalainen et al. | Oct 2016 | A1 |
20160321619 | Inan et al. | Nov 2016 | A1 |
20160334236 | Mason et al. | Nov 2016 | A1 |
20160335814 | Tamari et al. | Nov 2016 | A1 |
20160372225 | Lefkowitz et al. | Dec 2016 | A1 |
20160377445 | Rodoni | Dec 2016 | A1 |
20160379152 | Rodoni | Dec 2016 | A1 |
20160379154 | Rodoni | Dec 2016 | A1 |
20170008671 | Whitman et al. | Jan 2017 | A1 |
20170011363 | Whitman et al. | Jan 2017 | A1 |
20170029209 | Smith et al. | Feb 2017 | A1 |
20170046528 | Lambert | Feb 2017 | A1 |
20170061222 | Hoye et al. | Mar 2017 | A1 |
20170076249 | Byron et al. | Mar 2017 | A1 |
20170081120 | Liu et al. | Mar 2017 | A1 |
20170086230 | Azevedo et al. | Mar 2017 | A1 |
20170109704 | Lettieri et al. | Apr 2017 | A1 |
20170116583 | Rodoni | Apr 2017 | A1 |
20170116668 | Rodoni | Apr 2017 | A1 |
20170118609 | Rodoni | Apr 2017 | A1 |
20170121107 | Flood et al. | May 2017 | A1 |
20170124533 | Rodoni | May 2017 | A1 |
20170154287 | Kalinowski et al. | Jun 2017 | A1 |
20170176986 | High et al. | Jun 2017 | A1 |
20170193798 | Call et al. | Jul 2017 | A1 |
20170200333 | Plante | Jul 2017 | A1 |
20170203706 | Reed | Jul 2017 | A1 |
20170221017 | Gonen | Aug 2017 | A1 |
20170243269 | Rodini et al. | Aug 2017 | A1 |
20170243363 | Rodini | Aug 2017 | A1 |
20170277726 | Huang et al. | Sep 2017 | A1 |
20170308871 | Tallis | Oct 2017 | A1 |
20170330134 | Botea et al. | Nov 2017 | A1 |
20170344959 | Bostick et al. | Nov 2017 | A1 |
20170345169 | Rodoni | Nov 2017 | A1 |
20170350716 | Rodoni | Dec 2017 | A1 |
20170355522 | Salinas et al. | Dec 2017 | A1 |
20170364872 | Rodoni | Dec 2017 | A1 |
20180012172 | Rodoni | Jan 2018 | A1 |
20180025329 | Podgorny et al. | Jan 2018 | A1 |
20180075417 | Gordon et al. | Mar 2018 | A1 |
20180158033 | Woods et al. | Jun 2018 | A1 |
20180194305 | Reed | Jul 2018 | A1 |
20180224287 | Rodini et al. | Aug 2018 | A1 |
20180245940 | Dong et al. | Aug 2018 | A1 |
20180247351 | Rodoni | Aug 2018 | A1 |
20190005466 | Rodoni | Jan 2019 | A1 |
20190019167 | Candel et al. | Jan 2019 | A1 |
20190050879 | Zhang et al. | Feb 2019 | A1 |
20190056416 | Rodoni | Feb 2019 | A1 |
20190065901 | Amato et al. | Feb 2019 | A1 |
20190121368 | Bussetti et al. | Apr 2019 | A1 |
20190196965 | Zhang et al. | Jun 2019 | A1 |
20190197498 | Gates et al. | Jun 2019 | A1 |
20190210798 | Schultz | Jul 2019 | A1 |
20190217342 | Parr et al. | Jul 2019 | A1 |
20190244267 | Rattner et al. | Aug 2019 | A1 |
20190311333 | Kekalainen et al. | Oct 2019 | A1 |
20190360822 | Rodoni et al. | Nov 2019 | A1 |
20190385384 | Romano et al. | Dec 2019 | A1 |
20200082167 | Shalom et al. | Mar 2020 | A1 |
20200082354 | Kurani | Mar 2020 | A1 |
20200109963 | Zass | Apr 2020 | A1 |
20200175556 | Podgorny | Jun 2020 | A1 |
20200189844 | Sridhar | Jun 2020 | A1 |
20200191580 | Christensen et al. | Jun 2020 | A1 |
20200401995 | Aggarwala et al. | Dec 2020 | A1 |
20210024068 | Lacaze et al. | Jan 2021 | A1 |
20210060786 | Ha | Mar 2021 | A1 |
20210188541 | Kurani et al. | Jun 2021 | A1 |
20210217156 | Balachandran et al. | Jul 2021 | A1 |
20210345062 | Koga et al. | Nov 2021 | A1 |
20210371196 | Krishnamurthy et al. | Dec 2021 | A1 |
20220118854 | Davis et al. | Apr 2022 | A1 |
20230117427 | Turner et al. | Apr 2023 | A1 |
Number | Date | Country |
---|---|---|
2632738 | May 2016 | CA |
2632689 | Oct 2016 | CA |
101482742 | Jul 2009 | CN |
101512720 | Aug 2009 | CN |
105787850 | Jul 2016 | CN |
105929778 | Sep 2016 | CN |
106296416 | Jan 2017 | CN |
209870019 X | Dec 2019 | CN |
69305435 | Apr 1997 | DE |
69902531 | Apr 2003 | DE |
102012006536 | Oct 2013 | DE |
577540 | Oct 1996 | EP |
1084069 | Aug 2002 | EP |
2028138 | Feb 2009 | EP |
2447184 | Sep 2008 | GB |
2508209 | May 2014 | GB |
3662616 | Jun 2005 | JP |
2012-206817 | Oct 2012 | JP |
2013-142037 | Jul 2013 | JP |
9954237 | Oct 1999 | WO |
2007067772 | Jun 2007 | WO |
2007067775 | Jun 2007 | WO |
2012069839 | May 2012 | WO |
2012172395 | Dec 2012 | WO |
2016074608 | May 2016 | WO |
2016187677 | Dec 2016 | WO |
2017070228 | Apr 2017 | WO |
2017179038 | Oct 2017 | WO |
2018182858 | Oct 2018 | WO |
2018206766 | Nov 2018 | WO |
2018215682 | Nov 2018 | WO |
2019051340 | Mar 2019 | WO |
2019150813 | Aug 2019 | WO |
Entry |
---|
US 9,092,921 B2, 07/2015, Lambert et al. (withdrawn) |
Nilopherjan, N. et al.; Automatic Garbage Volume Estimation Using SIFT Features Through Deep Neural Networks and Poisson Surface Reconstruction; International Journal of Pure and Applied Mathematics; vol. 119, No. 14; 2015; pp. 1101-1107. |
Ghongane, Aishwarya et al.; Automatic Garbage Tracking and Collection System; International Journal of Advanced Technology in Engineering and Science; vol. 5, No. 4; Apr. 2017; pp. 166-173. |
Rajani et al.; Waste Management System Based on Location Intelligence; 4 pages; Poojya Doddappa Appa College of Engineering, Kalaburgi. |
Waste Management Review; A Clear Vision on Waste Collections; Dec. 8, 2015; 5 pages; http://wastemanagementreiew.com/au/a-clear-vison-on-waste-collections/. |
Waste Management Surveillance Solutions; Vehicle Video Cameras; Aug. 23, 2017; 6 pages; http://vehiclevideocameras.com/mobile-video-applications/waste-management-camera.html. |
Rich, John I.; Truck Equipment: Creating a Safer Waste Truck Environment; Sep. 2013; pp. 18-20; WasteAdvantage Magazine. |
Town of Prosper; News Release: Solid Waste Collection Trucks Equipped with “Third Eye,” Video System Aboard Trash and Recycling Trucks to Improve Service; Jan. 13, 2017; 1 page; U.S. |
Product News Network; Telematics/Live Video System Increases Driver Safety/Productivity; Mar. 30, 2015; 3 pages; Thomas Industrial Network, Inc. |
Karidis, Arlene; Waste Pro to Install High-Tech Camera Systems in all Trucks to Address Driver Safety; Mar. 10, 2016; 2 pages; Wastedive.com. |
Greenwalt, Megan; Finnish Company Uses IoT to Digitize Trash Bins; Sep. 14, 2016; 21 pages; www.waste360.com. |
Georgakopoulos, Chris; Cameras Cut Recycling Contamination; The Daily Telegraph; Apr. 7, 2014; 2 pages. |
Van Dongen, Matthew; Garbage ‘Gotcha’ Videos on Rise in City: Residents Irked Over Perceived Infractions; Nov. 18, 2015; 3 pages; The Spectator. |
The Advertiser; Waste Service Drives Innovation; Jan. 25, 2016; 2 pages; Fairfax Media Publications Pty Limited; Australia. |
RWP—wasteportal.com; Waste & Recycling Data Portal and Software; 16 pages; printed Oct. 3, 2019. |
Bhargava, Hemant K. et al.; A Web-Based Decision Support System for Waste Disposal and Recycling; pp. 47-65; 1997; Comput., Environ. and Urban Systems, vol. 21, No. 1; Pergamon. |
Kontokosta, Constantine E. et al.; Using Machine Learning and Small Area Estimation to Predict Building-Level Municipal Solid Waste Generation in Cities; pp. 151-162; 2018; Computers, Environment and Urban Systems; Elsevier. |
Ferrer, Javier et al.; BIN-CT: Urban Waste Collection Based on Predicting the Container Fill Level; Apr. 23, 2019; 11 pages; Elsevier. |
Vu, Hoang Lan et al.; Waste Management: Assessment of Waste Characteristics and Their Impact on GIS Vehicle Collection Route Optimization Using ANN Waste Forecasts; Environmental Systems Engineering; Mar. 22, 2019; 13 pages; Elsevier. |
Hina, Syeda Mahlaqa; Municipal Solid Waste Collection Route Optimization Using Geospatial Techniques: A Case Study of Two Metropolitan Cities of Pakistan; Feb. 2016; 205 pages; U.S. |
Kannangara, Miyuru et al.; Waste Management: Modeling and Prediction of Regional Municipal Solid Waste Generation and Diversion in Canada Using Machine Learning Approaches; Nov. 30, 2017; 3 pages; Elsevier. |
Tan, Kah Chun et al.; Smart Land: AI Waste Sorting System; University of Malaya; 2 pages; Keysight Technologies. |
Oliveira, Veronica et al.; Journal of Cleaner Production: Artificial Neural Network Modelling of the Amount of Separately-Collected Household Packaging Waste; Nov. 8, 2018; 9 pages; Elsevier. |
Zade, Jalili Ghazi et al.; Prediction of Municipal Solid Waste Generation by Use of Artificial Neural Network: A Case Study of Mashhad; Winter 2008; 10 pages; Int. J. Environ. Res., 2(1). |
Sein, Myint Myint et al.; Trip Planning Query Based on Partial Sequenced Route Algorithm; 2019 IEEE 8th Global Conference; pp. 778-779. |
A.F., Thompson et al.; Application of Geographic Information System to Solid Waste Management; Pan African International Conference on Information Science, Computing and Telecommunications; 2013; pp. 206-211. |
Malakahmad, Amirhossein et al.; Solid Waste Collection System in Ipoh City, A Review; 2011 International Conference on Business, Engineering and Industrial Applications; pp. 174-179. |
Ali, Tariq et al.; IoT-Based Smart Waste Bin Monitoring and Municipal Solid Waste Management System for Smart Cities; Arabian Journal for Science and Engineering; Jun. 4, 2020; 14 pages. |
Alfeo, Antonio Luca et al.; Urban Swarms: A New Approach for Autonomous Waste Management; Mar. 1, 2019; 8 pages. |
Jwad, Zainab Adnan et al.; An Optimization Approach for Waste Collection Routes Based on GIS in Hillah-Iraq; 2018; 4 pages; Publisher: IEEE. |
Chaudhari, Sangita S. et al.; Solid Waste Collection as a Service using IoT-Solution for Smart Cities; 2018; 5 pages; Publisher: IEEE. |
Burnley, S.J. et al.; Assessing the composition of municipal solid waste in Wales; May 2, 2006; pp. 264-283; Elsevier B.V. |
Lokuliyana, Shashika et al.; Location based garbage management system with IoT for smart city; 13th ICCSE; Aug. 8-11, 2018; pp. 699-703. |
Ma, Enlin et al.; Review of Cutting-Edge Sensing Technologies for Urban Underground Construction; Measurement 167; Jan. 2021; pp. 1-16. |
Number | Date | Country |
---|---|---|
62615360 | Jan 2018 | US |

Relation | Number | Date | Country |
---|---|---|---|
Parent | 17479106 | Sep 2021 | US |
Child | 17892971 | | US |
Parent | 17144027 | Jan 2021 | US |
Child | 17479106 | | US |
Parent | 16920037 | Jul 2020 | US |
Child | 17144027 | | US |
Parent | 16809335 | Mar 2020 | US |
Child | 16920037 | | US |
Parent | 16243257 | Jan 2019 | US |
Child | 16809335 | | US |