Field of the Disclosure
The present disclosure is generally related to adaptive traffic management and more particularly to adaptive zone detection for use in adaptive traffic management, where the adaptive zone detection is based on pedestrian and object traffic at an intersection.
Description of the Related Art
Pedestrians usually suffer severe injuries when they are involved in accidents with motor vehicles. Pedestrian accidents occur while people are on sidewalks, stepping off curbs, leaving buses or taxis, walking through parking lots, jogging down the street, or standing on roadsides. According to a report by the National Highway Traffic Safety Administration (NHTSA), 4,432 pedestrians were killed and another 69,000 were injured on public streets and roadways in the United States in 2011. On average, one crash-related pedestrian death occurred every two hours, and an injury every eight minutes. More than one-third of those accidents occurred on crosswalks.
Crosswalks are often insufficiently marked and lack proper lighting, stop signs, or other warning signs that alert drivers well in advance to slow down. Historically, intersections have been designed with more focus on moving traffic efficiently than on the safety of pedestrians. In addition to poorly designed or maintained crosswalks, a significant number of pedestrian accidents are caused by drivers who are inexperienced, distracted, or, in many cases, driving over the legal speed limit.
Thus, an improved traffic optimization system is needed to help improve pedestrian safety at crosswalks.
In a first claimed embodiment, a method of automatic zone creation and modification includes applying default zone parameters to define detection zones at one or more sensors installed at an intersection, where the detection zones are used by the one or more sensors for monitoring and detecting traffic conditions at the intersection. The method further includes determining a current vehicular traffic flow rate and a current pedestrian traffic flow rate at the intersection, determining whether a triggering condition for adjusting one or more of the default zone parameters is met, and adjusting the one or more of the default zone parameters if the triggering condition is met.
In a second claimed embodiment, a controller includes a memory with computer-readable instructions therein and one or more processors configured to execute the computer-readable instructions to apply default zone parameters to define detection zones at one or more sensors installed at an intersection, where the detection zones are used by the one or more sensors for monitoring and detecting traffic conditions at the intersection; determine a current vehicular traffic flow rate and a current pedestrian traffic flow rate at the intersection; determine whether a triggering condition for adjusting one or more of the default zone parameters is met; and adjust the one or more of the default zone parameters if the triggering condition is met.
In a third claimed embodiment, one or more non-transitory computer-readable media have computer-readable instructions stored therein that can be executed by one or more processors of a controller for the controller to apply default zone parameters to define detection zones at one or more sensors installed at an intersection, where the detection zones are used by the one or more sensors for monitoring and detecting traffic conditions at the intersection; determine a current vehicular traffic flow rate and a current pedestrian traffic flow rate at the intersection; determine whether a triggering condition for adjusting one or more of the default zone parameters is met; and adjust the one or more of the default zone parameters if the triggering condition is met.
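The three claimed embodiments share the same core method. A minimal sketch in Python (all names, thresholds, and the scaling factor are illustrative assumptions, not claim language) might look like:

```python
def manage_zones(default_params, vehicle_rate, pedestrian_rate,
                 vehicle_threshold=10.0, pedestrian_threshold=10.0,
                 scale=1.5):
    """Apply default zone parameters, then enlarge each zone's
    dimensions by `scale` when both current flow rates exceed their
    thresholds (the triggering condition).  Returns the parameters in
    effect and whether the trigger fired."""
    params = dict(default_params)  # step 1: apply the defaults
    # steps 2-3: determine current rates and check the triggering condition
    triggered = (vehicle_rate > vehicle_threshold
                 and pedestrian_rate > pedestrian_threshold)
    if triggered:  # step 4: adjust the default zone parameters
        params = {zone: {"width": p["width"] * scale,
                         "length": p["length"] * scale}
                  for zone, p in params.items()}
    return params, triggered
```

How the trigger and the adjustment are actually defined is the subject of the detailed description below; this sketch only mirrors the claim structure.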
The accompanying drawings illustrate various embodiments of systems, methods, and various other aspects of the disclosure. Any person with ordinary skill in the art will appreciate that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one example of the boundaries. In some examples, one element may be designed as multiple elements, or multiple elements may be designed as one element. In some examples, an element shown as an internal component of one element may be implemented as an external component in another, and vice versa. Furthermore, elements may not be drawn to scale. Non-limiting and non-exhaustive descriptions are described with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating principles.
Adaptive traffic management relies, in part, on proper detection of traffic conditions (number and types of objects, traffic flow rates, etc.), at an intersection in real time to adjust, also in real time, traffic control parameters such as signal durations to respond to the changed traffic conditions. The proper detection of traffic conditions at an intersection is made possible by using various types of smart cameras, sensors, etc.
A smart camera's ability to detect objects and traffic conditions improves as its zone detection and modification capabilities improve. In one aspect, such zone detection and modification capabilities can be improved by taking into consideration not only the vehicular traffic conditions at an intersection but also pedestrian traffic conditions and correlations thereof with vehicular traffic conditions.
Hereinafter, examples will be described for improving zone identification and detection for smart traffic cameras based on vehicular and pedestrian traffic conditions. The disclosure begins with a description of an example system.
There may be more than one smart traffic camera 103 or traffic light 117 installed at the intersection 101. The smart traffic camera 103 may be one of various types of cameras, including but not limited to, fisheye traffic cameras to detect and optimize traffic flows at the intersection 101 and/or at other intersections that are part of the same local network or corridor. The smart traffic camera 103 can be any combination of cameras or optical sensors, such as but not limited to fish-eye cameras, directional cameras, infrared cameras, etc. The smart traffic camera 103 can allow other types of sensors to be connected thereto (e.g., via various known or to be developed wired and/or wireless communication schemes) for additional data collection. The smart traffic camera 103 can collect video and other sensor data at the intersection 101 and convey the same to the traffic controller 106 for further processing, as will be described below.
The light controller 102 can manage and control traffic for all zones (directions) at which traffic enters and exits the intersection 101. Examples of different zones of the intersection 101 are illustrated in
The system 100 may further include network 104. The network 104 can enable the light controller 102 to communicate with a remote traffic control system 106 (which may be referred to as a centralized traffic control system or simply a traffic controller 106). The network 104 can be any known or to be developed cellular, wireless access network and/or a local area network that enables communication (wired or wireless) among components of the system 100. As mentioned above, the light controller 102 and the traffic controller 106 can communicate via the network 104 to exchange data, created traffic rules or control settings, etc., as will be described below.
The traffic controller 106 can be a centralized system used for managing and controlling traffic lights and conditions at multiple intersections (in a given locality, neighbourhood, an entire town, city, state, etc.).
The traffic controller 106 can be communicatively coupled (e.g., via any known or to be developed wired and/or wireless network connection such as network 104) to one or more databases such as a zone setting database 108, a tracking and ID database 110, a pedestrian database 112 and a 3rd party database 114.
The zone setting database 108 may be configured to store settings for vehicular zones and pedestrian zones, as will be described below. The tracking and ID database 110 may be configured to store identity of detected objects (e.g., vehicles) and pedestrians determined by traffic controller 106, as will be described below. The pedestrian database 112 may be configured to store machine-learning based rules created for zone detection and modification using detected pedestrian traffic and correlation thereof with vehicular traffic. Finally, the 3rd party database 114 may be configured to store additional contextual data to be used for zone identification and detection including, but not limited to, information about mobile devices carried by detected pedestrians, distances of such devices to respective pedestrians, scheduled nearby public events or events and nearby points of interest, etc. In one example, 3rd party database 114 can be provided by a 3rd party and can be publicly or privately (subscription based) accessible to traffic controller 106.
The traffic controller 106 can provide a centralized platform for network operators to view and manage traffic conditions, set traffic control parameters and/or manually override any traffic control mechanisms at any given intersection. An operator can access and use the traffic controller 106 via a corresponding graphical user interface 116 after providing login credentials and authentication of the same by the traffic controller 106. The traffic controller 106 can be controlled, via the graphical user interface 116, by an operator to receive traffic control settings and parameters to apply to one or more designated intersections. The traffic controller 106 can also perform automated and adaptive control of traffic at the intersection 101 or any other associated intersection based on analysis of traffic conditions, data and statistics at a given intersection(s) using various algorithms and computer-readable programs such as known or to be developed machine learning algorithms. The components and operations of traffic controller 106 will be further described below.
Traffic controller 106 can be a cloud based component running on a public, private and/or a hybrid cloud service/infrastructure provided by one or more cloud service providers.
The system 100 can also have additional intersections and corresponding light controllers associated therewith. Accordingly, not only is the traffic controller 106 capable of adaptively controlling the traffic at an intersection based on traffic data at that particular intersection, but it can further adapt traffic control parameters for that intersection based on traffic data and statistics at nearby intersections communicatively coupled to the traffic controller 106.
As shown in
The intersections 120 can be any number of intersections adjacent to the intersection 101, within the same neighbourhood or city as the intersection 101, intersections in another city, etc.
In one or more examples, the light controller 102 and the traffic controller 106 can be the same (one component implementing the functionalities of both). In such an example, components of each described below with reference to
As mentioned above, the components of the system 100 can communicate with one another using any known or to be developed wired and/or wireless network. For example, for wireless communication, techniques such as Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE), Fifth Generation (5G) cellular, Wireless Local Area Network (WLAN), Infrared (IR) communication, Public Switched Telephone Network (PSTN), Radio waves, and other communication techniques known, or to be developed, in the art may be utilized.
While certain components of the system 100 are illustrated in
Having described components of the system 100 as an example, the disclosure now turns to description of one or more examples of components of the light controller 102 and the traffic controller 106.
The interface(s) 204 may assist an operator in interacting with the traffic controller 106. The interface(s) 204 of the traffic controller 106 can be used instead of or in addition to the graphical user interface 116 described above with reference to
The memory 206 may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, Compact Disc Read-Only Memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, Random Access Memories (RAMs), Programmable Read-Only Memories (PROMs), Erasable PROMs (EPROMs), Electrically Erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions.
The memory 206 may include computer-readable instructions, which when executed by the processor 202 cause the traffic controller 106 to perform dynamic detection zone creation and modification for adaptive traffic control. The computer-readable instructions stored in the memory 206 can be identified as tracking and ID module (service) 208, learning module (service) 210 and zone setup module (service) 212. The functionalities of each of these modules, when executed by the processor 202 will be further described below.
The light controller 102 can comprise one or more processors such as a processor 302, interface(s) 304, sensor(s) 306, and one or more memories such as a memory 308. The processor 302 may execute an algorithm stored in the memory 308 for adaptive traffic control, as will be described below. The processor 302 may also be configured to decode and execute any instructions received from one or more other electronic devices or server(s). The processor 302 may include one or more general purpose processors (e.g., INTEL® or Advanced Micro Devices® (AMD) microprocessors, ARM) and/or one or more special purpose processors (e.g., digital signal processors, Xilinx® System on Chip (SoC) Field Programmable Gate Array (FPGA) processors, and/or Graphics Processing Units (GPUs)). The processor 302 may be configured to execute one or more computer-readable program instructions, such as program instructions to carry out any of the functions described in this description.
The interface(s) 304 may assist an operator in interacting with the light controller 102. The interface(s) 304 of the light controller 102 may be used instead of or in addition to the graphical user interface 116 described with reference to
The sensor(s) 306 can be one or more smart cameras such as fish-eye cameras mentioned above or any other type of sensor/capturing device that can capture various types of data (e.g., audio/visual data) regarding activities and traffic patterns at the intersection 101. Any one sensor 306 can be located at the intersection 101 and coupled to the traffic controller 106 and/or the traffic light 117.
As mentioned, the sensor(s) 306 may be installed to capture objects moving across the roads. The sensor(s) 306 used may include, but are not limited to, optical sensors such as the fish-eye camera mentioned above, Closed Circuit Television (CCTV) cameras, and infrared cameras. Further, the sensor(s) 306 can include, but are not limited to, induction loops, Light Detection and Ranging (LIDAR), radar/microwave sensors, weather sensors, motion sensors, audio sensors, pneumatic road tubes, magnetic sensors, piezoelectric cables, and weigh-in-motion sensors, which may be used in combination with the optical sensor(s) or alone.
The memory 308 may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, Compact Disc Read-Only Memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, Random Access Memories (RAMs), Programmable Read-Only Memories (PROMs), Erasable PROMs (EPROMs), Electrically Erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions.
The memory 308 may include computer-readable instructions, which when executed by the processor 302 cause the light controller 102 to perform dynamic zone creation and modification for adaptive traffic control. The computer-readable instructions stored in the memory 308 can be identified as normal traffic module (service) 310, pedestrian activity module (service) 312 and zone change module (service) 314, the functionalities of which, when executed by the processor 302, will be further described below.
As mentioned above, light controller 102 and traffic controller 106 may form a single physical unit, in which case system components of each, as described with reference to
While certain components have been shown and described with reference to
Having described an example system and example components of one or more elements thereof with reference to
Furthermore,
At step 400, the traffic controller 106 may receive traffic data at the intersection 101 captured by smart traffic camera 103 and/or sensor(s) 306 associated therewith. The received traffic data may include an image of the intersection 101 including any objects (moving or still) present therein.
At step 410, the traffic controller 106 may display (output) the received video data on a screen of the GUI 116 for a user (e.g., a network operator). In response, the user, using tools made available via the GUI 116, may visually draw parameters of one or more zones at the intersection 101. Alternatively, the user may provide dimensional information for the one or more zones at the intersection 101 instead of drawing them. The parameters specified by the user may be referred to as default zone parameters.
Optionally, the user, via the GUI 116, can also specify one or more alternative zone parameters (which can also be referred to as extended zone parameters, per
Additionally, such alternative zone parameters may have one or more corresponding triggering conditions. For example, the user may enlarge one or more zones in a given direction at the intersection 101 (e.g., in the north-south bound direction) during a rush hour period (e.g., daily between 4 PM and 6 PM) or during a scheduled road closure or public event.
At step 420, the traffic controller 106 may receive the default zone parameters and/or any specified alternative zone parameters for the intersection 101 via the GUI 116.
At step 430, the traffic controller 106 may store the specified default zone parameters and/or any alternative zone parameters for the intersection 101 in the zone setting database 108.
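The stored zone settings thus pair each zone's default parameters with any alternative parameters and their triggering conditions. One possible record shape for the zone setting database 108, sketched with hypothetical field names and values:

```python
# Hypothetical record format: each zone stores default parameters plus
# optional alternatives, each alternative paired with the condition
# that activates it (here, a time-of-day window for rush hour).
zone_settings = {
    "intersection_101": {
        "north_vehicular": {
            "default": {"width_m": 3.5, "length_m": 20.0},
            "alternatives": [
                {"params": {"width_m": 3.5, "length_m": 30.0},
                 "trigger": {"type": "time_of_day",
                             "start": "16:00", "end": "18:00"}},
            ],
        },
    },
}

def active_params(zone, now_hhmm):
    """Return the alternative parameters whose time window covers
    `now_hhmm` (zero-padded HH:MM string), else the defaults."""
    for alt in zone.get("alternatives", []):
        t = alt["trigger"]
        if t["type"] == "time_of_day" and t["start"] <= now_hhmm < t["end"]:
            return alt["params"]
    return zone["default"]
```

For instance, querying the record at 17:00 would return the enlarged (extended) parameters, while a midday query would return the defaults.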
Optionally, at step 440, the traffic controller 106 can send the default zone parameters and any alternative zone parameters, together with corresponding triggering conditions, to the smart traffic camera 103 for implementation. Alternatively, the smart traffic camera 103 may query the zone setting database 108 for default/alternative zone parameters.
Furthermore,
At step 600 and after the default and/or any applicable alternative zone parameters have been applied to the smart traffic camera 103 and/or any applicable ones of the sensors 306, the traffic controller 106 may receive traffic data at the intersection 101 captured by smart traffic camera 103 and/or sensor(s) 306 associated therewith. The received traffic data may include an image of the intersection 101 including any objects (moving or still) present therein.
At step 610, the traffic controller 106 may analyse the traffic data received at step 600. The analysis can be for identification and tracking of objects and pedestrians at the intersection 101, determining pedestrians, pedestrian types (e.g., adults, children, animals, etc.), vehicles and their types, pedestrian and vehicular flow rates at each zone of the intersection 101, locations of existing zones of the intersection 101, light phases and timing at the time of data capture by the smart traffic camera 103 and/or sensor(s) 306, etc. Objects and vehicles can include any one or more of various types of vehicles, bicycles, motorcycles, autonomous vehicles, etc.
In one example, the traffic controller 106 performs step 610 by implementing computer-readable instructions stored on the memory 206 thereof that correspond to the tracking and ID module (service) 208. By execution of computer-readable instructions corresponding to the tracking and ID module 208, the processor 202 of the traffic controller 106 can utilize known or to be developed methods of image/video processing, such as salient point optical flow, for determining and tracking pedestrians and their types, vehicles and their types, etc. Traffic flow rates can also be determined based on the vehicle tracking data according to any known or to be developed method for doing so.
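As one illustration of the last point, a zone's flow rate could be estimated from the timestamps at which tracked objects cross the zone; the function below is an assumed, simplified formulation for illustration, not the method prescribed by the disclosure:

```python
def flow_rate_per_minute(crossing_timestamps, window_s=60.0):
    """Estimate a zone's flow rate as crossings per minute.

    `crossing_timestamps` are seconds at which tracked objects crossed
    the zone; the span is floored at `window_s` so sparse observations
    are not inflated into large rates."""
    if len(crossing_timestamps) < 2:
        return 0.0
    span = max(crossing_timestamps) - min(crossing_timestamps)
    span = max(span, window_s)
    return 60.0 * len(crossing_timestamps) / span
```

For example, three crossings observed over one minute would yield a rate of 3 per minute.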
Thereafter, at step 620, the traffic controller 106 stores the results of analysis at step 610 in the tracking and ID database 110. This storing may be in a tabular form, an example of which is described below with reference to
In one example, the traffic controller 106 may continuously perform the process of
In one example and for purposes of resource conservation, the traffic controller 106 may overwrite data stored in the tracking and ID database 110 every predetermined period of time. For example, data may be stored for a period of 7 days and thereafter new data is written over the “old” data starting with the oldest stored data.
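The overwrite policy described above can be sketched as a bounded buffer that evicts the oldest interval once the retention window is full; the interval granularity and class name below are illustrative assumptions:

```python
from collections import deque

# Assumed granularity: 15-minute intervals over a 7-day window.
INTERVALS_PER_DAY = 96
RETENTION_DAYS = 7

class TrackingStore:
    """Bounded store of per-interval tracking results; once full, each
    new interval overwrites the oldest stored one."""

    def __init__(self):
        self._buf = deque(maxlen=INTERVALS_PER_DAY * RETENTION_DAYS)

    def record(self, interval_id, rows):
        self._buf.append((interval_id, rows))  # oldest entry auto-evicted

    def oldest_interval(self):
        return self._buf[0][0] if self._buf else None
```

A fixed-size `deque` gives the "write over the oldest data" behavior without an explicit deletion pass.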
The above results of tracking and identification of objects and pedestrians at the intersection 101 can be used as training datasets to train a machine learning algorithm that monitors and identifies correlations between pedestrian traffic flow rates and vehicular or object traffic flow rates at the intersection 101 and creates rules that adjust and modify zone parameters for the intersection 101 to perform better recognition and subsequently better traffic control at the intersection 101.
Furthermore,
At step 800, the traffic controller 106 may retrieve stored data from the tracking and ID database 110. For example, the traffic controller 106 can retrieve the stored data for a single time interval (e.g., time interval 1 per table 700 of
The retrieved data can include information on vehicular and pedestrian traffic flow rates for each vehicular zone and pedestrian zone at the intersection 101.
At step 810, the traffic controller 106 may determine if one or more pedestrian traffic flow rates within the selected time interval is greater than a pedestrian threshold. The pedestrian threshold may be a configurable parameter that may be set based on experiments and/or empirical studies. For example, the pedestrian threshold may be set to 10 individuals (or 10 counts). In other words, at step 810, the traffic controller 106 may determine if there are more than 10 pedestrians in one or more pedestrian zones at the intersection 101.
If there is no pedestrian zone with a pedestrian traffic flow rate that is greater than the pedestrian threshold, the process reverts back to step 800, where steps 800 to 820 are repeated for the next time interval in the table 700. However, if there is at least one pedestrian traffic flow rate that is greater than the pedestrian threshold, then at step 820, the traffic controller 106 may determine if there are one or more vehicular traffic flow rates that are greater than a vehicular threshold. The vehicular threshold may also be a configurable parameter that can be determined based on experiments and/or empirical studies. For example, the vehicular threshold may be set to 10 vehicles per minute.
If there is no vehicular traffic zone with a corresponding vehicular traffic flow that is greater than the vehicular threshold, the process reverts back to step 800, where steps 800 to 820 are repeated for the next time interval in the table 700.
However, if at step 820, the traffic controller 106 determines that there is at least one vehicular traffic zone with a corresponding vehicular traffic flow rate that is greater than the vehicular threshold, then at step 830, the traffic controller 106 may determine a correlation (a correlation value) between the pedestrian traffic flow rates and vehicular traffic flow rates at the intersection 101. This may be done according to any known or to be developed statistical analysis method.
At step 840, the traffic controller 106 may determine if the correlation value is greater than a correlation threshold. The correlation threshold may be a configurable parameter determined based on experiments and/or empirical studies. For example, the correlation threshold may be set to an R2 (R-squared) value equal to or greater than 0.9, with an R2 of 1 being indicative of complete correlation and an R2 value of 0 being indicative of no correlation.
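Under the R2 interpretation given above, the correlation check of steps 830 and 840 could be computed as the squared Pearson correlation of the two rate series; this sketch assumes that formulation (the disclosure permits any statistical method):

```python
def r_squared(xs, ys):
    """Coefficient of determination (R^2) for a simple linear fit of
    ys on xs, i.e. the squared Pearson correlation of the two series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    if sxx == 0 or syy == 0:
        return 0.0  # a constant series carries no correlation signal
    return (sxy * sxy) / (sxx * syy)

CORRELATION_THRESHOLD = 0.9  # example value from the description

def correlated(ped_rates, veh_rates):
    """Step 840: is the pedestrian/vehicular correlation strong enough?"""
    return r_squared(ped_rates, veh_rates) >= CORRELATION_THRESHOLD
```

A perfectly linear relationship between the two series yields R2 of 1, while unrelated series yield values near 0, matching the interpretation in the text.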
After step 840, the traffic controller may optionally perform steps 850 and 860. Otherwise, the process proceeds to step 870, which will be described below.
At step 850, the traffic controller 106 can validate the correlation between the pedestrian traffic flow rate(s) and the vehicular traffic flow rate(s) by repeating steps 800 to 840 for at least one subsequent (consecutive) time interval (e.g., time interval 2 of table 700) and determining whether the correlation value remains above the correlation threshold.
Thereafter, at step 860, the traffic controller 106 can determine if step 850 validates the correlation. As noted above, steps 850 and 860 are optional. If the correlation is not validated, the process reverts back to step 800 and the traffic controller 106 repeats steps 800 to 860. Otherwise, at step 870, the traffic controller 106 creates a rule (pedestrian rule) for adjusting zone parameters (e.g., default zone parameters) for the zones of the intersection 101. This rule creation can be machine-learning based, implemented by executing computer-readable instructions corresponding to the learning module 210 stored on the memory 206 of the traffic controller 106.
The created rule can be for example to change the default zone parameters for one or more of the vehicular and/or pedestrian zones to one or more corresponding alternative (extended) zone parameters as provided in table 500 of
In another example, the created rule can take factors such as scheduled public events and/or road closures obtained via 3rd party database 114 into consideration when determining the pedestrian rule. For example, given that a Sunday outdoor mass is scheduled every Sunday at a church near the intersection 101, which can result in an increase in pedestrian and/or vehicular traffic flow rates, the created rule (that can be machine learned) can enlarge all zone parameters for pedestrian and vehicular zones in the direction of the church to ensure that corresponding pedestrian and/or traffic lights are extended to let the church traffic through and improve pedestrian safety.
Thereafter, the traffic controller 106 can store the created pedestrian rule for adjusting zone parameters in the pedestrian database 112.
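A pedestrian rule stored in the pedestrian database 112 might be represented as a small record tying the validated correlation to the zones and alternative parameters to apply; the field names and helper below are hypothetical:

```python
def make_pedestrian_rule(intersection_id, correlation, zones,
                         alternative_set="extended"):
    """Build a rule record linking a validated correlation to the
    zones whose parameters should switch to an alternative set."""
    return {
        "intersection": intersection_id,
        "correlation_r2": round(correlation, 3),
        "zones": list(zones),
        "apply": alternative_set,  # which alternative parameter set to use
    }
```

Keeping the rule as a reference into the zone setting database (rather than copying parameter values) matches the retrieval flow described later, where the light controller looks up the indicated parameters at application time.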
Examples described with reference to
With a table of default zones and a pedestrian database of pedestrian rules for adjusting zone parameters, as described above, the disclosure now turns to
Furthermore,
At step 1000, the light controller 102 may query the zone setting database 108 to obtain default zone parameters to be implemented at the smart traffic camera(s) 103 (and/or any other applicable one of the sensor(s) 306) at the intersection 101. Alternatively, instead of querying the zone setting database 108, at step 1000, the light controller 102 may receive the default zone parameters from the zone setting database 108.
At step 1010, the light controller 102 may apply the default zone parameters to the smart traffic camera(s) 103 (and/or any other applicable one of sensor(s) 306) at the intersection 101.
In one example, the light controller 102 implements the steps 1000 and/or 1010 using the computer-readable instructions corresponding to the normal traffic module 310 stored on the memory 308 of the light controller 102.
At step 1020, the light controller 102 may monitor the current pedestrian and vehicular traffic flow rates at the intersection 101. This monitoring may include determining such current pedestrian and vehicular traffic flow rates in the same manner as described above using known image/video processing methods. This monitoring can be for the purpose of determining if a triggering condition is met for querying the pedestrian database 112 for zone modification rules. The triggering condition can be the same as the pedestrian and vehicular thresholds of
At step 1030, the light controller 102 can determine if the triggering condition is met. In one example and when the triggering condition includes two thresholds, at step 1030, the light controller 102 determines if both thresholds are met. In another example, the light controller 102 determines if one of the thresholds (e.g., pedestrian threshold) is met.
If at step 1030, the light controller 102 determines that the triggering condition is not met, the process reverts back to step 1020 and the light controller 102 can continuously monitor the pedestrian and vehicular traffic conditions at the intersection 101 until the triggering condition is met. In one example, steps 1020 and 1030 may be performed by implementing computer-readable instructions corresponding to the pedestrian activity module 312 stored on memory 308 of the light controller 102.
However, if at step 1030 the triggering condition is met, then at step 1040, the light controller 102 may query the pedestrian database 112 to see if there is a zone modification rule (pedestrian rule) (e.g., table 900 of
At step 1050, the light controller 102 determines if a pedestrian rule exists or not. If a rule does not exist, the process reverts back to step 1020 and the light controller 102 repeats steps 1020 to 1050. However, if a rule exists, then at step 1060, the light controller 102 can query the zone setting database 108 to retrieve zone modification parameters stored therein and indicated in the pedestrian rule. As described above, rules stored in the pedestrian database 112 (e.g., Table 900 of
At step 1070, the light controller 102 may apply the corresponding alternative (extended or reduced) zone parameters to the smart traffic camera(s) 103 (and/or any other applicable sensor(s) 306) installed at the intersection 101. In one example, steps 1040-1070 may be performed by implementing computer-readable instructions corresponding to the zone change module 314 stored on memory 308 of the light controller 102.
Thereafter, at step 1080, the light controller 102 determines if current pedestrian and/or vehicular traffic flow rates at the intersection 101 has/have returned to normal and if so, the process reverts back to step 1000. If not, the process reverts back to step 1020.
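A single pass of the monitoring loop of steps 1020 to 1080 can be condensed as follows; this is a sketch with assumed threshold values and a combined-thresholds trigger, not the light controller 102's actual implementation:

```python
def control_step(camera_params, default_params, alt_params, rule_exists,
                 ped_rate, veh_rate, ped_threshold=10, veh_threshold=10):
    """One pass of the monitoring loop: decide which zone parameters
    the smart traffic camera should run with next."""
    # step 1030: check the triggering condition (both thresholds here)
    triggered = ped_rate > ped_threshold and veh_rate > veh_threshold
    if triggered and rule_exists:
        return alt_params        # steps 1040-1070: apply the alternatives
    if not triggered:
        return default_params    # step 1080: traffic returned to normal
    return camera_params         # trigger met but no rule: keep monitoring
```

Repeatedly calling this with fresh flow rates reproduces the revert-to-default behavior of step 1080 once traffic subsides.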
In one example, monitoring and detection of pedestrian traffic flow rate at the intersection 101 using smart traffic cameras such as the smart traffic camera 103 can help improve pedestrian safety at the intersection 101. For example, detection of a pedestrian together with a correspondingly low pedestrian traffic flow rate during night time can trigger the light controller 102 to highlight the few pedestrians present using one or more lighting systems installed at the intersection 101, such as the traffic light 117. Accordingly, the present pedestrians can be highlighted for incoming vehicular traffic at the intersection 101, helping reduce accidents and pedestrian-related incidents.
Example embodiments of the present disclosure may be provided as a computer program product, which may include a computer-readable medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The computer-readable medium may include, but is not limited to, fixed (hard) drives, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware). Moreover, embodiments of the present disclosure may also be downloaded as one or more computer program products, wherein the program may be transferred from a remote computer to a requesting computer by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
This application claims priority to U.S. Provisional Patent Application No. 62/545,279 filed on Aug. 14, 2017, the entire content of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4833469 | David | May 1989 | A |
5111401 | Everette et al. | May 1992 | A |
5208584 | Kaye et al. | May 1993 | A |
5406490 | Braegas | Apr 1995 | A |
5444442 | Sadakata et al. | Aug 1995 | A |
6064139 | Ozawa et al. | May 2000 | A |
6075874 | Higashikubo et al. | Jun 2000 | A |
6249241 | Jordan et al. | Jun 2001 | B1 |
6317058 | Lemelson et al. | Nov 2001 | B1 |
6366219 | Hoummady | Apr 2002 | B1 |
6405132 | Breed et al. | Jun 2002 | B1 |
6505046 | Baker | Jan 2003 | B1 |
6526352 | Breed et al. | Feb 2003 | B1 |
6720920 | Breed et al. | Apr 2004 | B2 |
6741926 | Zhao et al. | May 2004 | B1 |
6751552 | Minelli | Jun 2004 | B1 |
6768944 | Breed et al. | Jul 2004 | B2 |
6862524 | Nagda et al. | Mar 2005 | B1 |
6937161 | Nishimura | Aug 2005 | B2 |
7110880 | Breed et al. | Sep 2006 | B2 |
7610146 | Breed | Oct 2009 | B2 |
7630806 | Breed | Dec 2009 | B2 |
7698055 | Horvitz et al. | Apr 2010 | B2 |
7698062 | McMullen et al. | Apr 2010 | B1 |
7821421 | Tamir et al. | Oct 2010 | B2 |
7835859 | Bill | Nov 2010 | B2 |
7899611 | Downs et al. | Mar 2011 | B2 |
7979172 | Breed | Jul 2011 | B2 |
8050863 | Trepagnier et al. | Nov 2011 | B2 |
8135505 | Vengroff et al. | Mar 2012 | B2 |
8144947 | Kletter | Mar 2012 | B2 |
8212688 | Morioka et al. | Jul 2012 | B2 |
8255144 | Breed et al. | Aug 2012 | B2 |
8373582 | Hoffberg | Feb 2013 | B2 |
8566029 | Lopatenko et al. | Oct 2013 | B1 |
8589069 | Lehman | Nov 2013 | B1 |
8682812 | Ran | Mar 2014 | B1 |
8706394 | Trepagnier et al. | Apr 2014 | B2 |
8825350 | Robinson | Sep 2014 | B1 |
8903128 | Shet et al. | Dec 2014 | B2 |
9043143 | Han | May 2015 | B2 |
9387928 | Gentry et al. | Jul 2016 | B1 |
9418546 | Whiting | Aug 2016 | B1 |
9720412 | Zhu | Aug 2017 | B1 |
9965951 | Gallagher | May 2018 | B1 |
20040155811 | Albero et al. | Aug 2004 | A1 |
20050187708 | Joe et al. | Aug 2005 | A1 |
20070162372 | Anas | Jul 2007 | A1 |
20070208494 | Chapman et al. | Sep 2007 | A1 |
20070273552 | Tischer | Nov 2007 | A1 |
20080094250 | Myr | Apr 2008 | A1 |
20080195257 | Rauch | Aug 2008 | A1 |
20090048750 | Breed | Feb 2009 | A1 |
20090051568 | Corry et al. | Feb 2009 | A1 |
20110037618 | Ginsberg et al. | Feb 2011 | A1 |
20110205086 | Lamprecht et al. | Aug 2011 | A1 |
20120038490 | Verfuerth | Feb 2012 | A1 |
20120112928 | Nishimura et al. | May 2012 | A1 |
20120307065 | Mimeault et al. | Dec 2012 | A1 |
20130048795 | Cross | Feb 2013 | A1 |
20140136414 | Abhyanker | May 2014 | A1 |
20140159925 | Mimeault et al. | Jun 2014 | A1 |
20140277986 | Mahler et al. | Sep 2014 | A1 |
20160027299 | Raamot | Jan 2016 | A1 |
20160225259 | Harris | Aug 2016 | A1 |
20170169309 | Reddy et al. | Jun 2017 | A1 |
20180004210 | Iagnemma | Jan 2018 | A1 |
20180075739 | Ginsberg et al. | Mar 2018 | A1 |
20190049264 | Malkes | Feb 2019 | A1 |
20190050647 | Malkes | Feb 2019 | A1 |
20190051152 | Malkes | Feb 2019 | A1 |
20190051160 | Malkes | Feb 2019 | A1 |
20190051161 | Malkes | Feb 2019 | A1 |
20190051162 | Malkes | Feb 2019 | A1 |
20190051163 | Malkes | Feb 2019 | A1 |
20190051164 | Malkes | Feb 2019 | A1 |
20190051171 | Malkes | Feb 2019 | A1 |
Number | Date | Country |
---|---|---|
100533151 | Aug 2009 | CN |
101799987 | Aug 2010 | CN |
101944295 | Jan 2011 | CN |
0 464 821 | Jan 1992 | EP |
2013-0067847 | Jun 2013 | KR |
Entry |
---|
U.S. Appl. No. 16/030,396 Office Action dated Nov. 20, 2018. |
U.S. Appl. No. 16/058,343 Office Action dated Nov. 19, 2018. |
Dehghan et al., Afshin; “Automatic Detection and Tracking of Pedestrians in Videos with Various Crowd Densities”, Pedestrian and Evacuation Dynamics 2012. |
Dubska et al.; “Automatic Camera Calibration for Traffic Understanding”, bmva.org, 2014. |
Grosser, Kari; "Smart IoT Technologies for Adaptive Traffic Management Using a Wireless Mesh Sensor Network", Advantech Industrial IoT Blog, Feb. 3, 2017. |
https://www.flir.com, 2018. |
Halper, Mark; "Smart Cameras Will Help Spokane Light Its World More Intelligently (Updated)", LEDs Magazine, Business/Energy/Technology Journalist, Apr. 19, 2017. |
Heaton, Brian; “Smart Traffic Signals Get a Green Light”, Government Technology Magazine, Feb. 15, 2012. |
Kolodny, Lora; "Luminar reveals sensors that could make self-driving cars safer than humans", TechCrunch, Apr. 13, 2017. |
McDermott, John; "Google's newest secret weapon for local ads", Digiday, Jan. 29, 2014. |
Resnick, Jim; “How Smart Traffic Signals May Ease Your Commute”, BBC, Autos, Mar. 18, 2015. |
Sun et al., M.; “Relating Things and Stuff via Object Property Interactions”, cvgl.stanford.edu, Sep. 4, 2012. |
Whitwam, Ryan; “How Google's self-driving cars detect and avoid obstacles”, ExtremeTech, Sep. 8, 2014. |
Yamaguchi, Jun'ichi; "Three Dimensional Measurement Using Fisheye Stereo Vision", Advances in Theory and Applications of Stereo Vision, Dr. Asim Bhatti (Ed.), ISBN: 978-953-307-516-7, InTech, Jan. 8, 2011. |
U.S. Appl. No. 16/030,396, William A. Malkes, System and Method Adaptive Controlling of Traffic Using Camera Data, Jul. 9, 2018. |
U.S. Appl. No. 16/044,891, William A. Malkes, System and Method for Controlling Vehicular Traffic, Jul. 25, 2018. |
U.S. Appl. No. 16/032,886, William A. Malkes, Adaptive Traffic Control Using Object Tracking and Identity Details, Jul. 11, 2018. |
U.S. Appl. No. 16/058,106, William A. Malkes, System and Method for Managing Traffic by Providing Recommendations to Connected Objects, Aug. 8, 2018. |
U.S. Appl. No. 16/059,814, William A. Malkes, Systems and Methods of Navigating Vehicles, Aug. 9, 2018. |
U.S. Appl. No. 16/059,886, William A. Malkes, System and Method of Adaptive Traffic Optimization Using Unmanned Aerial Vehicles, Aug. 9, 2018. |
U.S. Appl. No. 16/058,214, William A. Malkes, System and Method of Adaptive Traffic Management at an Intersection, Aug. 8, 2018. |
U.S. Appl. No. 16/101,766, William A. Malkes, System and Method for Retail Revenue Based Traffic Management, Aug. 10, 2018. |
U.S. Appl. No. 16/101,933, William A. Malkes, Adaptive Optimization of Navigational Routes Using Traffic Data, Aug. 13, 2018. |
U.S. Appl. No. 16/058,343, William A. Malkes, System and Method of Adaptive Controlling of Traffic Using Zone Based Occupancy, Aug. 8, 2018. |
Number | Date | Country | |
---|---|---|---|
20190051167 A1 | Feb 2019 | US |
Number | Date | Country | |
---|---|---|---|
62545279 | Aug 2017 | US |