The present disclosure relates generally to railroad track inspection systems, and more particularly, to systems and methods for inspecting railroad tracks using one or more cameras that are automatically actuated responsive to movement of a transport device.
Railroad tracks (e.g., subway tracks, elevated train tracks, high speed rail tracks, monorail tracks, tram tracks, etc.) often have defects (e.g., cracks, pitting, misalignment, missing track elements, etc.) that develop over continued use. These defects may negatively impact the quality of the ride along the railroad tracks and/or pose a safety hazard to trains, including derailment. Often, inspecting railroad tracks requires a user to manually operate various devices located on a transport device (e.g., railcar, railroad vehicle, etc.) to collect data to identify the presence or absence of railroad track features of interest, some of which may be considered as defects requiring repair or maintenance. In other words, a user must be physically present on the transport device in order to manually actuate cameras and/or lights of the inspection system. The present disclosure is directed to solving these and other problems.
According to some implementations of the present disclosure, a method for inspecting a railroad track includes detecting, using one or more sensors of an inspection system, movement of a transport device along the railroad track; responsive to the detecting, automatically actuating one or more cameras of the inspection system; receiving, from at least one of the one or more cameras, first image data reproducible as a first image of a first portion of the railroad track; analyzing the first image data to identify the presence or absence of one or more features of interest on the first portion of the railroad track; determining a first location of the transport device; and associating the first location of the transport device with the first image data.
According to some implementations of the present disclosure, a system for inspecting a railroad track includes one or more sensors coupled to a wheel of a transport device configured to travel along the railroad track, the one or more sensors being configured to detect movement of the transport device along the railroad track, one or more cameras coupled to the transport device and being configured to generate first image data reproducible as a first image of a first portion of the railroad track, a wireless communication module, one or more memory devices coupled to the transport device and being configured to receive and store the first image data therein, and one or more processors coupled to the transport device and being configured to: automatically actuate the one or more cameras responsive to movement of the transport device along the railroad track and associate a first location of the transport device with the first image data.
According to some implementations of the present disclosure, a method for inspecting a railroad track includes detecting, using one or more sensors of an inspection system, movement of a transport device along the railroad track; responsive to the detecting, automatically actuating one or more cameras of the inspection system; receiving, from at least one of the one or more cameras, first image data reproducible as a first image of a first portion of the railroad track; analyzing the first image data to identify the presence or absence of one or more features of interest on the first portion of the railroad track; determining a first location of the transport device; associating the first location of the transport device with the first image data; receiving, from at least one of the one or more cameras, second image data reproducible as a second image of a second portion of the railroad track; analyzing the second image data to identify the presence or absence of one or more features of interest on the second portion of the railroad track; determining a second location of the transport device; and associating the second location of the transport device with the second image data.
The above summary is not intended to represent each embodiment or every aspect of the present invention. Additional features and benefits of the present invention are apparent from the detailed description and figures set forth below.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that it is not intended to limit the invention to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
Referring generally to
The railroad inspection system 100 is coupled to a transport device 150 that is moveable along one or more rails of a railroad track. As shown in
The optical sensor(s) 112 of the railroad inspection system 100 are configured to detect movement of the transport device 150 along the railroad track 160. More specifically, in some implementations, the optical sensor(s) 112 includes an optical encoder that detects rotational position changes and converts the angular position or motion to analog or digital signal outputs. As shown in
In other implementations, the optical sensor(s) 112 are generally aimed at the ground (e.g., aimed generally in the direction of the railroad track) and can determine motion, distance traveled, and/or speed of the transport device 150. As described in further detail herein, the optical sensor(s) 112 are communicatively coupled to the processor(s) 132, which can determine, for example, position, speed, and/or distance traveled by the transport device 150 based on the signal output of the optical sensor(s) 112. The processor(s) 132 automatically actuate the camera(s) 120 responsive to receiving signals or data from the optical sensor(s) 112 indicating that the transport device 150 is moving along the railroad track 160.
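The encoder-driven actuation described above can be illustrated with a minimal sketch. The wheel circumference, encoder resolution, and speed threshold below are assumed values for illustration only, not taken from the disclosure:

```python
# Illustrative encoder-based movement detection; all constants are assumed.
WHEEL_CIRCUMFERENCE_M = 0.9   # assumed wheel circumference (meters)
PULSES_PER_REV = 1024         # assumed encoder resolution (pulses/revolution)

def distance_traveled_m(pulse_count: int) -> float:
    """Convert cumulative encoder pulses to linear distance in meters."""
    return (pulse_count / PULSES_PER_REV) * WHEEL_CIRCUMFERENCE_M

def is_moving(prev_pulses: int, curr_pulses: int, dt_s: float,
              min_speed_m_s: float = 0.05) -> bool:
    """Report movement when wheel speed exceeds a small noise threshold."""
    speed = abs(distance_traveled_m(curr_pulses - prev_pulses)) / dt_s
    return speed >= min_speed_m_s
```

In such a sketch, the processor(s) would poll the encoder periodically and actuate the camera(s) whenever `is_moving` returns true.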
In addition to, or in the alternative of, the optical sensor(s) 112, the radar-based sensor(s) 114, the GPS sensor(s) 116, or both, can be used to detect movement of the transport device 150 along the railroad track 160. The radar-based sensor(s) 114 are configured to detect movement, speed, and/or distance traveled by the transport device 150 by emitting radar signals generally in the direction of the ground underneath the transport device 150 while the transport device 150 is moving. These radar signals are reflected by the ground back to the radar-based sensor(s) 114, which are configured to analyze the reflected signals to detect movement, speed, and/or distance traveled by the transport device 150.
The GPS sensor(s) 116 can be used to determine a location (e.g., latitude and longitude, or other coordinates) of the transport device 150 in real-time. For example, the GPS sensor(s) 116 can determine a first location of the transport device 150 at a first time and a second location of the transport device 150 at a second time. If the first location is different than the second location, this indicates that the transport device 150 has moved, and this information can be used to actuate the camera(s) 120. The GPS sensor(s) 116 can also be used to determine a distance traveled by the transport device 150 along the railroad track within a given time interval.
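Comparing two GPS fixes reduces to a great-circle distance check. The following sketch uses the standard haversine formula; the 2-meter movement threshold is an assumed stand-in for the GPS noise floor:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude fixes, in meters."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def gps_indicates_movement(fix_a, fix_b, threshold_m=2.0):
    """Two fixes farther apart than the noise threshold imply movement."""
    return haversine_m(*fix_a, *fix_b) > threshold_m
```

Summing successive fix-to-fix distances over a time interval would likewise yield the distance traveled described above.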
The RFID reader(s) 118 of the railroad inspection system 100 include an antenna configured to receive location information from RFID tags positioned on or along the railroad track 160. Referring to
While the RFID reader(s) 118 are shown as being coupled to the underside of the transport device 150, the RFID reader(s) 118 can more generally be coupled to any suitable location on the transport device 150 (e.g., on either side of the transport device 150, on either end of the transport device 150, etc.) such that they are within sufficient proximity to receive location information from RFID tags positioned on or along the railroad track 160.
During operation of the railroad inspection system 100, the transport device 150 may travel through a tunnel or other structure such that the GPS sensor(s) 116 cannot obtain an adequate signal (e.g., satellite or cellular signal) to reliably and accurately determine the location of the transport device 150 such that the features of interest can be associated with a location. Advantageously, the RFID reader(s) 118 can be used to determine a location of the transport device 150 by obtaining location data from RFID tags (e.g., RFID tag 164 of
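One way to realize this fallback is to prefer a GPS fix when one is available and otherwise map the most recently read tag to a surveyed position. The tag identifiers and coordinates below are hypothetical:

```python
# Hypothetical map of surveyed RFID tag positions (tag id -> lat, lon).
TAG_LOCATIONS = {
    "tag-0164": (40.7128, -74.0060),
    "tag-0165": (40.7140, -74.0071),
}

def resolve_location(gps_fix, last_tag_id):
    """Return the best available location estimate and its source.

    gps_fix is a (lat, lon) tuple or None when no adequate signal exists,
    e.g., inside a tunnel; last_tag_id is the most recently read RFID tag.
    """
    if gps_fix is not None:
        return gps_fix, "gps"
    if last_tag_id in TAG_LOCATIONS:
        return TAG_LOCATIONS[last_tag_id], "rfid"
    return None, "unknown"
```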
The camera(s) 120 and the light source(s) 122 of the railroad inspection system 100 are generally used to produce image data reproducible as an image of a portion of one or more rails of the railroad, from which the system 100 can identify the presence or absence of one or more features of interest on the railroad track. The defects that can be identified include cracks, pitting defects (which may be depressions defined in the rail head surface), grinding marks (e.g., marks left by rail grinding machines used to perform maintenance on the rails), flaking or spalling (e.g., pieces of surface material detaching from the rail head surface), missing track elements, cracked or broken track elements, fouled ballast, or any combination thereof.
Referring to
The light source(s) 122 are generally used to illuminate the rail 162 underneath the transport device 150 to aid the camera(s) 120 in generating image data reproducible as one or more images of the surface of the rail 162 from which the presence or absence of one or more features of interest on the rail can be identified by the processor(s) 132, as described in further detail herein.
The optical sensor(s) 112, radar-based sensor(s) 114, GPS sensor(s) 116, RFID reader(s) 118, camera(s) 120, light source(s) 122, or any combination thereof are coupled to the processor(s) 132, the memory device(s) 134, or both. As shown in
The processor(s) 132 are configured to automatically actuate the camera(s) 120 and the light source(s) 122 responsive to determining that the transport device 150 is moving via data from the optical sensor(s) 112, the radar-based sensor(s) 114, and/or GPS sensor(s) 116. Likewise, the processor(s) 132 are configured to automatically turn off the camera(s) 120 and the light source(s) 122 responsive to determining that the transport device 150 is no longer moving via data from the optical sensor(s) 112, the radar-based sensor(s) 114, and/or GPS sensor(s) 116. The processor(s) 132 of the inspection system 100 are coupled to the transport device 150 (e.g., the processor(s) 132 of the inspection system 100 are physically on the transport device 150 and/or directly attached to the transport device 150).
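This on/off behavior amounts to a small state machine that issues an action only when the movement state changes. The class and action names below are illustrative, not from the disclosure:

```python
class CameraController:
    """Toggles cameras/lights based on a movement flag, issuing an
    action only on state transitions (illustrative sketch)."""

    def __init__(self):
        self.active = False

    def update(self, moving: bool):
        """Return "actuate" or "shut_off" on a transition, else None."""
        if moving and not self.active:
            self.active = True
            return "actuate"
        if not moving and self.active:
            self.active = False
            return "shut_off"
        return None
```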
Advantageously, automatically actuating the camera(s) 120 in response to movement of the transport device 150 aids in preventing large, unnecessary amounts of image data from being produced, or consuming power, while the transport device 150 is stationary. Further, automatically actuating the camera(s) 120 responsive to movement, and automatically turning the camera(s) 120 off responsive to the end of movement, obviates the need for an operator to manually actuate the camera(s) 120, meaning that the railroad inspection system 100 does not require an operator to be physically present on the transport device 150 during operation. As such, in some implementations, the railroad inspection system 100 can be coupled to any suitable transport device (e.g., a passenger car, a freight car, a locomotive, etc.), rather than a separate, specialized inspection transport device.
In some implementations, the processor(s) 132 are also configured to analyze image data received from the camera(s) 120 to identify the presence or absence of one or more features of interest on a portion of the railroad track, as described in further detail below.
The memory device(s) 134 are configured to receive the image data from the camera(s) 120. As described in further detail herein, in some implementations, the memory device(s) 134 store a reference map including locations of a plurality of landmarks (e.g., mileposts, chain markers, signals, stations, other railroad assets, etc.) such that the image data can be associated with one or more of the plurality of landmarks (e.g., a distance from a landmark is associated with the feature(s) of interest).
The communication module(s) 136 of the railroad inspection system 100 are configured to wirelessly communicate with a remote computer system 170 that is not located on (e.g., not coupled to) the transport device 150. The remote computer system 170 can be a computer, a laptop, a tablet, a smartphone, a server, or the like, and includes one or more processors 172 and one or more memory devices 174. Because the remote computer system 170 is not located on the transport device 150, a user can receive and analyze data from the railroad inspection system 100 without having to be physically located on the transport device 150. As a result, the railroad inspection system 100 can be coupled to a revenue generating railroad car (e.g., a locomotive, a passenger car, a freight car, etc.), rather than a separate, specialized railroad car or vehicle that requires additional personnel to be physically present on the railroad throughout the inspection process.
In some implementations, the remote computer system 170 receives raw image data from the wireless communication module 136 of the railroad inspection system 100, and the processor(s) 172 of the remote computer system 170 analyze the received image data to identify the presence or absence of features of interest. In other implementations, as described in further detail herein, the processor(s) 132 of the railroad inspection system 100 are configured to analyze the image data from the camera(s) 120 to identify the presence or absence of one or more features of interest on a portion of the railroad track and associate the features of interest with a location of the transport device 150. In such implementations, responsive to identifying the presence of one or more features of interest, the communication module(s) 136 of the railroad inspection system 100 are configured to wirelessly communicate identified features of interest and associated locations to the remote computer system 170.
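In the on-board analysis case, a minimal sketch of building the transmitted payload might filter the analysis results down to only frames where features were identified. The record structure is an assumption for illustration:

```python
def build_report(frames):
    """From (location, features) analysis results, keep only the frames
    in which at least one feature of interest was identified.

    frames is an iterable of (location, feature_list) pairs; the returned
    records would then be handed to the wireless communication module.
    """
    return [
        {"location": location, "features": features}
        for location, features in frames
        if features  # drop frames with no identified features
    ]
```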
While the railroad inspection system 100 has been shown and described herein as including the optical sensor(s) 112, radar-based sensor(s) 114, GPS sensor(s) 116, RFID reader(s) 118, camera(s) 120, light source(s) 122, processor(s) 132, memory device(s) 134, and communication module(s) 136, in some implementations, the railroad inspection system 100 can include more, or fewer, components. For example, in some implementations, the railroad inspection system 100 includes the optical sensor(s) 112, but not the radar-based sensor(s) 114.
Referring to
Step 201 of the method 200 includes determining, using one or more sensors, movement of a transport device along the railroad track. As described herein, in some implementations, an optical sensor (e.g., the optical sensor(s) 112) detects movement of the transport device (e.g., transport device 150 in
Step 202 of the method 200 includes automatically actuating one or more cameras (e.g., camera(s) 120) and one or more lights (e.g., light source(s) 122) of the inspection system responsive to determining that the transport device is moving. More specifically, in some implementations, one or more processors of the inspection system are communicatively coupled to the optical sensor and determine that the transport device is moving based on the received signals. The processor(s) are also communicatively coupled to the camera(s) and light(s) and automatically actuate the camera(s) and light(s) responsive to receiving signals indicative of movement. If the processor(s) determine that the transport device is no longer moving based on signals from the optical sensor, the processor(s) turn off the camera(s) and light(s) to aid in preventing the capture of large amounts of unnecessary data.
Step 203 of the method 200 includes receiving image data reproducible as an image of a portion of a surface of a railroad track from the camera. The image data from the camera is stored in a memory device of the inspection system. Step 204 includes analyzing the received image data to identify the presence or absence of one or more features of interest, as described in further detail below.
Step 205 of the method 200 includes determining a location of the transport device using the optical sensor (e.g., optical sensor(s) 112), a radar-based sensor (e.g., radar-based sensor(s) 114), a GPS sensor (e.g., GPS sensor(s) 116), an RFID reader (e.g., RFID reader(s) 118), or any combination thereof. For example, in some implementations, step 205 includes obtaining the location (e.g., GPS coordinates such as latitude and longitude) of the transport device using a GPS sensor.
In other implementations, step 205 includes using an RFID reader that receives location information from an RFID tag positioned adjacent to (e.g., coupled to) the railroad track (e.g., RFID tag 164 of
In still other implementations, step 205 includes using the optical sensor and/or radar-based sensor to determine a distance traveled relative to a reference point to determine the location of the transport device. For example, an initial reference point can be obtained using a GPS sensor and/or the RFID reader. The optical sensor and/or radar-based sensor can be used to determine a distance traveled relative to the initial reference point, and thus the location of the transport device can be determined. In some implementations, a reference map stored in the memory device of the inspection system can be used in conjunction with the initial reference point and calculated distance to determine the location of the transport device.
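Dead reckoning from a known reference point can be sketched as the reference position plus the encoder-derived distance. The wheel circumference and pulse resolution below are assumed values:

```python
def dead_reckoned_position_km(ref_milepost_km: float, pulses_since_ref: int,
                              pulses_per_rev: int = 1024,
                              wheel_circumference_m: float = 0.9) -> float:
    """Position along the line (km) from a known reference point plus
    encoder-counted distance. Constants are illustrative assumptions."""
    dist_m = (pulses_since_ref / pulses_per_rev) * wheel_circumference_m
    return ref_milepost_km + dist_m / 1000.0
```

The reference milepost itself would come from a GPS fix or RFID read, as described above, and could be cross-checked against a stored reference map.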
In some implementations, step 205 also includes using a reference map stored in a memory device of the inspection system to further associate the image data with other location-related information. In other words, the determined location of the transport device can be compared to the reference map such that information/data other than an absolute or relative position can be associated with the image data. For example, the image data can be associated with a track number, a switch number, a crossing number, a station number, a signal number, etc. to further specify the location of the transport device. As described herein, this additional information can aid in prioritizing repairs to identified defects along the railroad track.
Generally, determining the location in step 205 can occur before, substantially simultaneously with, or after the analyzing step 204. In some implementations, if no features of interest are identified in a given image data set during step 204, then step 205 is not performed for that image data set. In other words, in such implementations, step 205 occurs responsive to identifying the presence of a feature of interest in the image data during step 204. Advantageously, only associating the location with image data containing a feature of interest may reduce the processing requirements of the system. In other implementations, step 205 is performed regardless of whether the presence of a feature of interest is identified during step 204. In other words, in such implementations, the image data that does not contain a feature of interest is still associated with the location of the transport device.
In implementations where step 205 is performed before, or substantially simultaneously with, the analyzing step 204, the analyzing step 204 can include using machine vision algorithms that are trained to recognize features of interest in the image data. Alternatively, the analyzing step 204 can include using an image processing algorithm that matches patterns in the image data with a well-developed set of training or reference images. In other words, the image data is compared to a database of reference or training images (e.g., stored in the memory device of the inspection system or the memory device of the remote computer) to identify the presence or absence of features of interest.
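As a greatly simplified stand-in for the pattern-matching approach, a reference patch can be slid over the image data and scored by sum-of-absolute-differences; real implementations would use trained models or library routines, and the grid-of-lists image representation here is purely illustrative:

```python
def sad(patch_a, patch_b):
    """Sum of absolute differences between two equally sized patches."""
    return sum(abs(a - b) for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def best_match(image, template):
    """Slide the template over the image (lists of pixel rows) and
    return the (row, col) of the lowest-difference position."""
    th, tw = len(template), len(template[0])
    best = None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            window = [row[c:c + tw] for row in image[r:r + th]]
            score = sad(window, template)
            if best is None or score < best[0]:
                best = (score, (r, c))
    return best[1]
```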
In some implementations, and preferably, analyzing the image data to identify the presence or absence of features of interest during step 204 does not include comparing the image data to a library of images containing features of interest. In other implementations, the image data is compared to a library of features of interest during step 204 in order to identify features of interest in the image data.
Step 206 includes associating the determined location of the transport device with the image data. Step 206 can occur before, or after, step 204. In other words, in some implementations, the image data can be associated with the determined location (step 206) before the analyzing to identify the presence or absence of one or more features of interest (step 204). Alternatively, the image data can be associated with the determined location (step 206) after the analyzing to identify the presence or absence of one or more features of interest (step 204). For example, in some implementations, step 206 occurs responsive to identifying the presence of one or more features of interest in a portion of the image data.
Step 207 of the method 200 includes transmitting at least a portion of the image data and associated location to a remote computer system (e.g., the remote computer system 170 of
In other implementations, step 207 includes transmitting at least a portion of the image data and associated location responsive to identifying a feature of interest in the image data. In other words, image data is only transmitted if a feature of interest is present. Further, in such implementations, step 207 includes transmitting an image snippet of an identified feature of interest. Rather than sending all of the image data, the image snippet includes all or substantially all of the feature of interest and excludes areas of the image data where there is no identified feature of interest (e.g., the image snippet is limited to an identified crack on an edge of the rail, rather than an image of the entire surface of the rail). Advantageously, in such implementations, the volume of data that is transmitted to the remote computer system is reduced compared to sending image data for every location despite the fact that many locations do not have a feature of interest.
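Producing such a snippet amounts to cropping the image to the feature's bounding box plus a small margin. The bounding-box convention and margin value below are assumptions for illustration:

```python
def crop_snippet(image, bbox, margin=4):
    """Crop an image (list of pixel rows) to a feature's bounding box
    plus a small margin, so only the feature region is transmitted.

    bbox = (top, left, bottom, right), with bottom/right exclusive.
    """
    top, left, bottom, right = bbox
    top = max(0, top - margin)
    left = max(0, left - margin)
    bottom = min(len(image), bottom + margin)
    right = min(len(image[0]), right + margin)
    return [row[left:right] for row in image[top:bottom]]
```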
Generally, once the image data and associated locations are transmitted to the remote computer system (step 207), this information is available to users of the remote computer system to review. In some implementations, the remote computer system can be configured to transmit a notification to one or more recipients responsive to receiving image data in which a feature of interest has been identified. For example, in such implementations, the remote computer system may automatically transmit a notification (e.g., e-mail, text message, etc.) to a third party indicating what feature(s) of interest were identified at a location along the railroad track. Thus, recipients can be notified in near real-time of defects that are identified by the inspection system as the transport device is traveling along the railroad track, enabling those recipients to make decisions about the need for urgent repairs, short term remedial actions, or long-term railroad track maintenance planning.
As the transport device continues to move along the railroad track (e.g., from a first location to a second location, from the second location to a third location, and so forth), steps 203 through 207 can be repeated one or more times such that the image data and associated locations that are transmitted to the remote computer system include data reproducible as images of the entire length of the railroad track. In this manner, all (or substantially all of) the railroad track can be inspected.
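The repetition of steps 203 through 207 can be sketched as a loop over successive track portions; the callables here are placeholders for the camera, analysis, location, and transmission subsystems described above:

```python
def inspect(portions, camera, analyze, locate, transmit):
    """Repeat steps 203-207 for each successive track portion.

    camera, analyze, locate, and transmit are placeholder callables
    standing in for the subsystems described in the disclosure.
    """
    for portion in portions:
        image = camera(portion)          # step 203: receive image data
        features = analyze(image)        # step 204: identify features
        location = locate(portion)       # step 205: determine location
        if features:                     # steps 206-207: associate + transmit
            transmit({"location": location, "features": features})
```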
Optional step 208 of the method 200 includes prioritizing repairs of the railroad tracks based on the transmitted image data. As described herein, there are many different types of features of interest (e.g., defects) that can be identified in the image data, and each feature of interest may have a different severity. For example, it may be determined that a first crack identified at a first location is larger than a second crack identified at a second location. In this case, repair of the first crack can be prioritized over repair of the second crack. Likewise, priority of the first or second crack may be determined based on the associated location. For example, if it is determined that the first crack is in close proximity to repair assets, the repair of the first crack can be prioritized over the second crack to quickly eliminate one potential safety hazard.
Similarly, prioritization of repairs in optional step 208 can be based on other factors, such as, for example, the location associated with the feature of interest. If a first crack is identified at a first location and a second crack is identified at a second location, further information about the first location and the second location can be used to prioritize repair of the first or second crack. For example, as described herein, a track number can be associated with the image data and associated location. In turn, the track number can be associated with railroad traffic data, such as, for example, rail traffic volume (e.g., a heavily used stretch of track or a rarely used stretch of track), rail traffic type (e.g., passenger, freight, hazardous or dangerous freight, etc.), and rail traffic speed (e.g., a high-speed straightaway, a low speed curve, etc.). Prioritization of the repairs of the first crack and second crack can be determined based on this railroad traffic data. For example, if it is determined that the first crack is located in a heavily used, high speed stretch of track, and the second crack is located in a rarely used, low speed stretch of track, repair of the first crack is prioritized over the second crack.
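One hypothetical way to combine these factors is a score of defect severity weighted by traffic volume and speed; the severity weights and record format are assumptions, not from the disclosure:

```python
# Hypothetical severity weights per defect type.
SEVERITY = {"crack": 3, "pitting": 2, "grinding_mark": 1}

def priority_score(defect_type, traffic_volume, track_speed_kmh):
    """Severity weighted by traffic volume and speed on the segment."""
    return SEVERITY.get(defect_type, 1) * traffic_volume * track_speed_kmh

def prioritize(defects):
    """Sort (defect_type, volume, speed, location) records, most urgent first."""
    return sorted(defects, key=lambda d: -priority_score(d[0], d[1], d[2]))
```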
In some implementations, rather than transmitting the image data and associated location to the remote computer system (step 207) subsequent to analyzing the image data (step 204) to identify the presence or absence of features of interest, the method 200 includes step 210. Step 210 includes transmitting the image data to the remote computer system prior to analyzing the image data to identify the presence or absence of features of interest. As described herein, the remote computer system is not located on (e.g., coupled to) the transport device. In such implementations, the method 200 also includes step 212. Step 212 includes analyzing the received image data to identify the presence or absence of features of interest.
Associating the image data with the determined location of the transport device (step 206) can occur before, or after, the image data is transmitted to the remote computer system (step 210). In other words, the image data can be associated with the location of the transport device by the processors of the inspection system coupled to the transport device, and this image data and associated location is then transmitted to the remote computer system for analysis (steps 210 and 212). Alternatively, the image data and raw data for determining the location of the transport device can be transmitted to the remote computer system, and the processors of the remote computer system can be used to determine the location of the transport device and associate that location with the image data.
Advantageously, as compared to analyzing the image data using the processor(s) of the inspection system (step 204), the use of steps 210 and 212 aid in reducing the processing and power requirements of the inspection system coupled to the transport device.
Referring to
Referring to
Camera 420A is generally aimed only at rail 462A, while camera 420B is generally aimed only at rail 462B. However, in some implementations, one or more of the cameras could be aimed at multiple rails. Each camera 420A, 420B is spaced apart from its respective rail 462A, 462B along a respective vertical axis A1 and A2. These axes extend upwards from the rail head surfaces 486A, 486B of rails 462A, 462B, respectively, and are each generally perpendicular to the rail head surfaces 486A, 486B. Because the rail head surfaces 486A, 486B are generally curved, axes A1 and A2 may not be perpendicular to the rail head surfaces 486A, 486B at every point along the rail head surfaces 486A, 486B; however, axes A1 and A2 are generally perpendicular at least to the center of the rail head surfaces 486A, 486B. Furthermore, while axes A1 and A2 are referred to as vertical axes for ease of understanding, depending on the layout of the track and its relation to the ground, axes A1 and A2 may not always be completely vertical.
Axes A1 and A2 are also each generally perpendicular to both a transverse axis B (which extends between and connects the rails 462A, 462B) and a longitudinal axis C (which extends along the length of the rails 462A, 462B). Cameras 420A and 420B are generally mounted inwardly relative to rails 462A and 462B, respectively. In this manner, camera 420A is further spaced apart from rail 462A along the transverse axis B towards rail 462B. Similarly, camera 420B is further spaced apart from rail 462B along the transverse axis B towards rail 462A. Cameras 420A and 420B are thus positioned at angles θ1 and θ2 relative to axes A1 and A2, respectively. In some implementations, angles θ1 and θ2 can be between about 0.1° and about 20°, between about 5.0° and about 12.0°, or about 5.0°. Both of the cameras 420A and 420B can be mounted at the same angle relative to their respective vertical axes A1 and A2, or they could each be mounted at separate angles relative to axes A1 and A2. In some implementations, one or both of cameras 420A, 420B could be mounted at an angle of 0° relative to their respective vertical axes A1 and A2, e.g., the cameras can be mounted directly over the rail head surfaces 486A, 486B.
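The relationship between mounting angle and viewing position is simple trigonometry: a camera tilted by θ from vertical at a given height above the rail head looks at a point offset laterally by height x tan(θ). The mounting height used below is an assumed example value:

```python
from math import tan, radians

def lateral_offset_m(mount_height_m: float, angle_deg: float) -> float:
    """Lateral offset of the viewed point for a camera mounted at
    mount_height_m above the rail head and tilted angle_deg from
    vertical. Values are illustrative."""
    return mount_height_m * tan(radians(angle_deg))
```

For example, at an assumed 1.0 m mounting height, a 5° tilt shifts the viewed point by roughly 0.09 m, which is on the order of a rail head's width.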
Regardless of the angle θ1, θ2 at which cameras 420A, 420B are disposed relative to the vertical axis, cameras 420A, 420B are generally aimed at the rail head surfaces 486A, 486B, respectively. Thus, cameras 420A and 420B are not always aimed directly downward from where they are coupled to the transport device (e.g., parallel to axes A1 and A2, respectively). Rather, cameras 420A and 420B are generally aimed outward towards rails 462A, 462B, respectively. Camera 420A is thus aimed at an angle back towards rail head surface 486A, while camera 420B is aimed at an angle back towards rail head surface 486B. These angles are generally equal to angles θ1 and θ2, respectively. However, in some implementations, these angles are different from angles θ1 and θ2, respectively.
The cameras 420A, 420B are generally positioned inwardly and aimed outwardly with respect to rails 462A and 462B to ensure that the images that are produced from the generated image data clearly show the areas of the rail head surfaces 486A, 486B closer to the inner edges of rails 462A, 462B, e.g., the right edge of the left rail 462A and the left edge of the right rail 462B. Most vehicles that travel along tracks, such as a train car traveling along a railroad track, have wheels with inner flanges that extend downward along the inner side of the rails 462A, 462B. As a result, there is generally more contact between the wheels and the inner areas of the rail head surfaces 486A, 486B that are closer to the inner edges of the rails 462A, 462B. These inner areas of the rail head surfaces 486A, 486B can be damaged more easily, and thus it is important to produce images that fully show these inner areas.
More generally, cameras 420A and 420B can generate image data reproducible as one or more images of rail 462A, rail base 482A, rail web 483A, rail head 484A, rail head surface 486A, rail 462B, rail base 482B, rail web 483B, rail head 484B, rail head surface 486B, cross-tie 488, fasteners (e.g., fasteners that are the same as, or similar to, fasteners 368 of
Unlike cameras 420A, 420B, however, light sources 422A-422D are generally not spaced apart from the rails along axis B, e.g., are not angled inwardly or outwardly with respect to rails 462A, 462B. However, light sources 422A-422D are spaced apart along axis C from the portion of the rail head surfaces 486A, 486B that light sources 422A-422D are illuminating. Light sources 422A-422D are thus aimed at an angle toward rail head surfaces 486A, 486B. As best shown in
Other implementations of the system can have any number or arrangement of lights and cameras. For example, in some implementations, the system could include a single camera with a field of view wide enough to generate image data that can be reproduced into an image of both rails. Other implementations could include multiple cameras per rail. Still other implementations could modify the angle at which the cameras are positioned relative to the rail. For example, some implementations could have the cameras mounted directly above the rails and aimed straight down, or mounted outwardly from the rails and aimed inwardly. Implementations including multiple cameras per rail could have each camera for a rail disposed between the rails, or could have one camera disposed at an inward angle, one camera disposed directly above the rails, one camera disposed at an outward angle, etc. In other implementations, each of the cameras (e.g., two cameras, three cameras, four cameras, six cameras, etc.) can be mounted and aimed at its own respective angle, which may be the same as or different from the angle of one or more of the other cameras of the system.
In some implementations, the system may include only a single light source per rail. Other implementations could include a single light source that illuminates both rails. The light sources can have a focused beam of light that has a width approximately equal to the width of the rail head surfaces. In other implementations, the width of the beam of light produced by the light sources is wider than the width of the rail head surfaces. Similar to the cameras, each light source can be mounted and aimed at an angle that can be the same or different from the angle of each of the other light sources.
While the present disclosure has been described with reference to one or more particular embodiments or implementations, those skilled in the art will recognize that many changes may be made thereto without departing from the spirit and scope of the present disclosure. Each of these embodiments or implementations and obvious variations thereof is contemplated as falling within the spirit and scope of the present disclosure. It is also contemplated that additional embodiments or implementations according to aspects of the present disclosure may combine any number of features from any of the embodiments described herein.
This application claims the benefit of and priority to U.S. Provisional Patent Application No. 62/538,524, filed on Jul. 28, 2017, which is hereby incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2018/044149 | 7/27/2018 | WO | 00

Number | Date | Country
---|---|---
62538524 | Jul 2017 | US