Embodiments of the subject matter described herein relate to imaging systems, such as imaging systems onboard or near vehicle systems.
Vehicle systems such as trains or other rail vehicles can include cameras disposed on or near the vehicle systems. These cameras can be used to record actions occurring outside of the vehicle systems. For example, forward facing cameras can continuously record video of the locations ahead of a train. If a collision between the train and another vehicle occurs (e.g., an automobile is struck at a crossing), then this video can later be reviewed to determine liability for the collision, whether the other vehicle improperly moved through a gate or signal, whether the train was moving too fast, or the like. But, the image data obtained by these cameras typically is only saved on a temporary loop. Older image data is discarded when no accidents occur, even though this image data may represent one or more other problems with the vehicle and/or track.
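As a non-limiting sketch, loop recording of this kind can be pictured as a fixed-capacity buffer in which the oldest image data is silently discarded as new image data arrives. The class name and capacity below are hypothetical, offered only to illustrate why older image data is lost:

```python
from collections import deque

class LoopRecorder:
    """Minimal sketch of loop recording: a fixed-capacity buffer in which
    the oldest frames are silently discarded as new frames arrive."""

    def __init__(self, capacity_frames: int):
        self._buffer = deque(maxlen=capacity_frames)

    def record(self, frame: bytes) -> None:
        # When the deque is full, appending evicts the oldest frame,
        # so image data older than the loop window is lost.
        self._buffer.append(frame)

    def dump(self) -> list:
        # Only the most recent `capacity_frames` frames survive.
        return list(self._buffer)

recorder = LoopRecorder(capacity_frames=3)
for i in range(5):
    recorder.record(f"frame-{i}".encode())
print(recorder.dump())  # frames 0 and 1 have been overwritten
```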
Additionally, in order to inspect the image data, some known systems require an operator to review large portions of the image data to find one or more smaller segments of interest. For example, the image data acquired over a long trip may be reviewed by an operator in an attempt to find the segment that may have captured video or an image of a signal the operator wants to check. Searching through the entire set of image data can be time intensive.
Multiple vehicle systems may include multiple cameras, all capturing image data. But, the image data acquired by the different vehicle systems is not correlated, such that it is difficult for operators to find all portions of the image data that may include video or images of a certain location that the vehicle systems moved past.
Finally, the image data acquired by the vehicle systems usually is stored onboard the vehicle systems. As a result, the image data may not be accessible to a remotely located operator until the vehicle system ends a current trip and is at a location where the vehicle system can upload or otherwise send the image data to the operator.
In one example of the inventive subject matter, a system (e.g., an image management system) includes a controller and one or more analysis processors. The controller and the one or more analysis processors may be embodied in a single processor, or may be embodied in two or more processors. For example, the controller may include one or more microprocessors, and/or hardware circuits or circuitry that include and/or are connected with one or more microprocessors, and the one or more analysis processors may be software modules executed by the controller.
The controller can be configured to receive search parameters, the search parameters specifying at least one of operational data or a range of the operational data of one or more vehicle systems. The one or more analysis processors are configured to search remotely stored image data based on the search parameters to identify matching image data. The remotely stored image data is obtained by one or more imaging systems disposed onboard the one or more vehicle systems. The remotely stored image data is associated with the operational data of the one or more vehicle systems that was current when the remotely stored image data was acquired. The one or more analysis processors also are configured to obtain the matching image data having the operational data specified by the search parameters and to present the matching image data to an operator.
In another embodiment, a method (e.g., an image management method) includes receiving search parameters that specify at least one of operational data or a range of the operational data of one or more vehicle systems and searching (with one or more processors) remotely stored image data based on the search parameters to identify matching image data. The remotely stored image data can be obtained by one or more imaging systems disposed onboard the one or more vehicle systems, and can be associated with the operational data of the one or more vehicle systems that was current when the remotely stored image data was acquired. The method also can include obtaining the matching image data having the operational data specified by the search parameters and presenting the matching image data to an operator.
In another embodiment, another method (e.g., another image management method) includes acquiring first image data from one or more cameras disposed onboard a vehicle system as the vehicle system moves along a route. The first image data represents at least one of images or videos of a field of view of the one or more cameras. The method also may include determining operational data of the vehicle system when the first image data was acquired. The operational data includes at least one of operational settings of the vehicle system when the first image data was acquired and/or operational conditions to which the vehicle system was exposed when the first image data was acquired. The method also can include associating the operational data with the first image data, identifying one or more segments of the first image data as matching image data responsive to receiving search parameters that specify at least some of the operational data associated with the first image data, and communicating the matching image data to an off-board location responsive to identifying the matching image data.
The subject matter described herein will be better understood from reading the following description of non-limiting embodiments, with reference to the attached drawings.
One or more embodiments of the inventive subject matter described herein relate to imaging systems and methods for vehicle systems. While several examples of the inventive subject matter are described in terms of rail vehicles (e.g., trains, locomotives, locomotive consists, and the like), not all embodiments of the inventive subject matter are limited to rail vehicles. At least some of the inventive subject matter may be used in connection with other off-highway vehicles (e.g., vehicles that are not permitted or designed for travel on public roadways, such as mining equipment), automobiles, marine vessels, airplanes, or the like.
In one aspect of the subject matter described herein, vehicle systems acquire image data representative of actions and objects in a field of view of one or more cameras of the vehicle systems. The cameras can be internal cameras located inside the vehicle systems, such as inside cabs of the vehicle systems where operators of the vehicle systems are located to control operations of the vehicle systems. These cameras can monitor and record the actions of the operator to assist in accident reconstruction, to provide evidence of liability in an accident, to ensure the operator is present and performing appropriate tasks, or the like. Optionally, the cameras can include external cameras mounted to exterior surfaces of the vehicle systems. The field of view of the cameras (internal or external) can capture events or objects disposed outside of the vehicle systems. For example, the field of view of an internal camera can encompass objects disposed alongside a route being traveled by the vehicle system via a window or opening of the vehicle system.
The image data that is acquired and output by the cameras can be locally saved in memory devices of the vehicle systems. The image data can be embedded or otherwise associated with operational data of the vehicle system in which or on which the camera is disposed. This operational data represents operational conditions and/or operational settings of the vehicle system, as described below.
The image data acquired by one or more of the vehicle systems may be remotely accessed by one or more locations that are off-board the vehicle systems that acquired the image data. For example, an off-board facility such as a dispatch facility, repair station, refueling station, scheduling facility, another vehicle system (that did not acquire the image data), or the like, can obtain access to and view the image data stored onboard the vehicle systems that acquired the image data. The off-board facility can filter through the image data acquired by one or more vehicle systems using the operational data to find certain segments of the image data that are or were obtained at or near a location of interest to an operator at the off-board facility. For example, from an off-board facility, the operational data may be used to search through the image data stored remotely on one or more vehicle systems to find image data that was obtained at or near the location of a signal that has been identified as potentially having a burned out bulb in a light of the signal, to find image data obtained at or near a reported rock slide, or the like. The off-board facility may then identify relevant image data relatively quickly in order to determine one or more responsive actions to take. Otherwise, the off-board facility may be forced to wait until the vehicle systems that traveled near the location of interest return to a location where the image data is downloaded and/or communicated to the off-board facility, where one or more operators may then need to sift through (e.g., view) lengthy files of image data in an attempt to find the image data acquired at or near the location of interest.
As described below, one or more of the vehicles 104, 106 include an imaging system that generates image data representative of images and/or video captured by a camera disposed onboard (e.g., inside and/or external to the vehicles 104, 106). The image data can be communicated between the vehicle system 100 and an off-board facility where the management system 102 is located. For example, the vehicle system 100 and the management system 102 can include communication systems 112, 114 that permit the vehicle system 100 and the management system 102 to wirelessly communicate the image data. Optionally, the image data can be communicated through one or more wired connections, such as by conducting electric signals through rails, catenaries, power lines, or the like, between the vehicle system 100 and the management system 102.
The management system 102 can be located at a stationary building, such as a dispatch facility, scheduling facility, repair facility, refueling facility, or the like. Optionally, the management system 102 may be located at a mobile location, such as another vehicle system or another vehicle 104, 106 of the same vehicle system 100 that generated the image data, and/or in a laptop computer or other portable computer that is configured to communicate via a wireless network and/or via a wired connection (e.g., via an Ethernet connection).
The management system 102 includes the communication system 114 referred to above, which includes hardware circuits or circuitry that include and/or are connected with one or more computer processors (e.g., microprocessors) and communication devices (e.g., a wireless antenna 116 and/or one or more wired connections described above) that operate as transmitters and/or transceivers for communicating signals with the communication system 112 of the vehicle system 100. For example, the communication system 114 may wirelessly communicate (e.g., receive) signals that include image data via the antenna 116 and/or communicate the signals over the wired connection from the vehicle system 100 that acquired and/or locally stores the image data.
A controller 118 of the management system 102 includes or represents hardware circuits or circuitry that includes and/or is connected with one or more computer processors, such as one or more computer microprocessors. The controller 118 can be used to control operations of the management system 102, such as by receiving input from an operator of the management system 102 to search for image data that illustrates a location of interest (e.g., a location being investigated by the operator), image data acquired when certain operational conditions were present at the vehicle system 100, image data acquired when certain operational settings were used in the vehicle system 100, or the like.
A memory device 120 of the management system 102 includes one or more computer readable media used to at least temporarily store the image data obtained from one or more of the vehicle systems 100. Without limitation, the memory device 120 can include a computer hard drive, flash drive, optical disk, or the like. An analysis processor 122 of the management system 102 includes hardware circuits and/or circuitry that include and/or are connected with one or more computer processors, such as one or more computer microprocessors. The analysis processor 122 receives the input provided by the operator to search for and obtain certain segments of the image data acquired by the imaging system of one or more vehicle systems 100. For example, the operator of the management system 102 can provide certain search parameters, such as location, time, date, operational settings, operational conditions, or the like, and the analysis processor 122 can search through the image data stored remotely onboard the one or more vehicle systems 100, stored locally in the memory device 120, and/or elsewhere, to find one or more segments of image data having operational data that matches the search parameters (or at least more closely matches the search parameters than one or more other segments of the image data). The image data that is found through this search may include image data acquired by different vehicle systems 100. The image data having operational data that matches or more closely matches the search parameters than other image data can be referred to herein as matching image data.
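As a non-limiting illustration, the "more closely matches" criterion can be pictured as ranking segments by a closeness score over the search parameters rather than demanding that every parameter match exactly. The scoring scheme and field names below are assumptions, not a prescribed implementation:

```python
def closeness(op_data: dict, params: dict) -> int:
    """Count how many search parameters a segment's operational data
    satisfies; higher scores match the search more closely."""
    return sum(1 for key, wanted in params.items() if op_data.get(key) == wanted)

def best_matches(segments: list, params: dict) -> list:
    """Rank segments so the most closely matching image data comes first."""
    return sorted(segments, key=lambda s: closeness(s, params), reverse=True)

segments = [
    {"file": "a.mp4", "date": "2014-02-17", "throttle": 4},
    {"file": "b.mp4", "date": "2014-02-17", "throttle": 2},
]
params = {"date": "2014-02-17", "throttle": 4}
print(best_matches(segments, params)[0]["file"])  # 'a.mp4'
```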
The analysis processor 122 can relay the matching image data to a display device 124. The display device 124 may be a monitor, television, touchscreen, or other output device that visually presents the matching image data. The operator can use the controller 118 to control presentation of the matching image data.
In one aspect, the image data is stored remotely from the analysis processor 122, such as onboard one or more of the vehicle systems 100. The image data can be stored on the vehicle systems 100 in such a way that the image data is associated with operational data described herein. For example, the operational data may be embedded, modulated into, or otherwise included in the same electronic files that include the image data. In one aspect, the operational data can be stored in the image data as metadata of the image data. Alternatively, the operational data can be separately stored from the electronic files that include the image data, but associated with the image data. For example, one or more tables, lists, file pointers, databases, or the like, can be used to associate different operational data with different image data so that, when operational data is found to match the search parameters, the image data associated with, but separately stored from, this operational data can then be retrieved for presentation on the display device 124.
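As a non-limiting sketch of the "separately stored but associated" option, the following uses an in-memory SQLite table to associate operational data records with pointers to separately stored image files. The schema and field names are hypothetical:

```python
import sqlite3

# Hypothetical schema associating operational data with separately
# stored image files (one row per image-data segment).
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE segment_metadata (
           image_file   TEXT,   -- pointer to the stored image data
           acquired_utc TEXT,   -- time the segment was acquired
           latitude     REAL,
           longitude    REAL,
           throttle     INTEGER,
           brake        INTEGER)"""
)
conn.execute(
    "INSERT INTO segment_metadata VALUES (?, ?, ?, ?, ?, ?)",
    ("cam204_000123.mp4", "2014-02-17T10:05:00Z", 41.85, -87.65, 4, 0),
)

# Searching the operational data yields file pointers to retrieve:
rows = conn.execute(
    "SELECT image_file FROM segment_metadata WHERE throttle >= ?", (3,)
).fetchall()
print(rows)  # [('cam204_000123.mp4',)]
```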
The camera 202 may be referred to as an internal camera or cab camera because the camera 202 is disposed inside a cab of the vehicle 104 where an operator of the vehicle 104 is located to control and/or monitor operations of the vehicle 104. The camera 202 can be positioned and oriented so that the field of view 208 of the camera 202 includes the interior space of the cab in the vehicle 104, as well as a portion of the exterior of the vehicle 104. This portion of the exterior of the vehicle 104 can be the space outside of the vehicle 104 that is viewable through one or more windows 214 of the vehicle 104. In the illustrated example, the camera 202 is oriented so that at least a portion of the route 110 that is ahead of the vehicle 104 is viewable in the field of view 208 of the camera 202.
The camera 204 may be referred to as an external or exterior camera because the camera 204 is outside of the vehicle 104. The field of view 210 of the camera 204 is oriented ahead of the vehicle 104, but optionally may be oriented in another direction. The camera 206 can be referred to as a route monitoring camera because the field of view 212 of the camera 206 includes the route 110. The image data provided by the camera 206 can be used to inspect the route 110.
The route 110, one or more wayside devices (e.g., equipment, systems, assemblies, and the like, that are located outside of the vehicle system 100 at, near, or alongside the route 110), actions of an onboard operator, and other objects may be imaged by the cameras 202, 204, 206 during travel of the vehicle 104. The images and/or video captured and output by the cameras 202, 204, 206 can be referred to herein as image data.
One or more of the cameras 202, 204, 206 may be a digital camera capable of obtaining relatively high quality image data (e.g., static or still images and/or videos). For example, one or more of the cameras 202, 204, 206 may be Internet protocol (IP) cameras that generate packetized image data. One or more of the cameras 202, 204, 206 can be a high definition (HD) camera capable of obtaining image data at relatively high resolutions. For example, one or more of the cameras 202, 204, 206 may obtain image data having at least 480 horizontal scan lines, at least 576 horizontal scan lines, at least 720 horizontal scan lines, at least 1080 horizontal scan lines, or an even greater resolution. The image data generated by the cameras 202, 204, 206 can include still images and/or videos.
A controller 216 of the vehicle 104 includes hardware circuits or circuitry that includes and/or is connected with one or more computer processors, such as one or more computer microprocessors. The controller 216 is used to autonomously and/or manually control operations of the vehicle 104 and/or vehicle system 100. For example, the controller 216 may receive inputs from one or more input devices 226, such as one or more levers, pedals, buttons, switches, touchscreens, keyboards, styluses, or the like. The inputs may be used by the controller 216 to change throttle settings, brake settings, or the like, of the vehicle 104 and/or vehicle system 100.
The controller 216 can report at least some of the operational settings of the vehicle 104 and/or vehicle system 100 to an onboard memory device 218 and/or the camera 202, 204, and/or 206 so that the operational settings can be stored with the image data as operational data. For example, the throttle settings, brake settings, amount of fuel and/or electric energy that is onboard the vehicle 104 and/or vehicle system 100, the amount of fuel and/or electric energy consumed by the vehicle 104 and/or vehicle system 100, or the like, can be embedded in the image data or otherwise associated with the image data. The operational settings that are current (e.g., the settings being used at the same time that the image data is obtained) may be embedded in or otherwise associated with the image data in the memory device 218. For example, the throttle setting being used to propel the vehicle 104 at the same time that image data is generated by the camera 204 may be embedded in that image data.
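One non-limiting way to picture embedding the settings that are current at acquisition time is to stamp each captured segment with a snapshot of the vehicle state taken at the moment of capture. The function and field names below are hypothetical:

```python
import time

def snapshot_settings(controller_state: dict) -> dict:
    """Copy the operational settings that are current right now, so the
    stored values reflect the moment the image data was acquired."""
    return {
        "timestamp": time.time(),
        "throttle_notch": controller_state["throttle_notch"],
        "brake_setting": controller_state["brake_setting"],
        "fuel_level_pct": controller_state["fuel_level_pct"],
    }

def tag_segment(frame_bytes: bytes, controller_state: dict) -> dict:
    # The image data and its operational data travel together in one record.
    return {"image": frame_bytes,
            "operational_data": snapshot_settings(controller_state)}

state = {"throttle_notch": 5, "brake_setting": 0, "fuel_level_pct": 62.5}
record = tag_segment(b"\x00\x01...", state)
print(record["operational_data"]["throttle_notch"])  # 5
```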
The memory device 218 includes one or more computer readable media used to store the image data provided by one or more of the cameras 202, 204, 206 and/or the operational data associated with the image data. Without limitation, the memory device 218 can include a computer hard drive, flash drive, optical disk, or the like. The memory device 218 may be disposed entirely onboard the vehicle 104, or may be at least partially disposed off-board the vehicle 104.
As described above, operational data that is included in or otherwise associated with the image data can include operational conditions. Operational conditions represent the state of the conditions in and/or around the vehicle 104 and/or vehicle system 100. Examples of operational conditions include a date, time, location of the vehicle 104 and/or vehicle system 100, acceleration of the vehicle 104 and/or vehicle system 100, vibrations of the vehicle 104 and/or vehicle system 100, forces exerted on the couplers 108 of the vehicle system 100, or the like.
Various components of the vehicle 104 and/or vehicle system 100 can provide the operational data for storage in or association with the image data. For example, the controller 216 can include an internal clock or otherwise determine the date and/or time at which image data is acquired. This date and/or time can be stored in or associated with the image data acquired at the date and/or time. A location determining device 220 generates operational data representative of where the vehicle 104 and/or vehicle system 100 is located when image data is obtained. The location determining device 220 can represent a global positioning system (GPS) receiver, a radio frequency identification (RFID) transponder that communicates with RFID tags or beacons disposed alongside the route 110, a computer that triangulates the location of the vehicle 104 and/or vehicle system 100 using wireless signals communicated with cellular towers or other wireless signals, a speed sensor (that outputs data representative of speed, which is translated into a distance from a known or entered location by the controller 216), or the like. The location determining device 220 can include an antenna 222 (and associated hardware receiver or transceiver circuitry) for determining the location. The location of the vehicle 104 and/or vehicle system 100 can be embedded in or otherwise associated with the image data acquired at that location.
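The speed-sensor variant described above, in which speed is translated into a distance from a known or entered location, amounts to dead reckoning along the route. A minimal one-dimensional sketch, with hypothetical sampling values:

```python
def dead_reckon(known_milepost: float, speed_samples_mps, dt_s: float) -> float:
    """Integrate speed over time to estimate distance traveled along the
    route from a known or entered location (a 1-D dead-reckoning sketch)."""
    distance_m = sum(v * dt_s for v in speed_samples_mps)
    return known_milepost + distance_m / 1609.344  # meters -> miles

# Starting at milepost 12.0, sampled at 1 Hz while moving ~20 m/s:
print(round(dead_reckon(12.0, [20.0] * 60, 1.0), 3))  # ~12.746 miles
```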
One or more sensors 224 can generate operational data representative of operational conditions of the vehicle 104 and/or vehicle system 100 for storage in or association with the image data. For example, the sensors 224 can include accelerometers that measure acceleration, vibrations, or forces exerted on the vehicle 104 and/or vehicle system 100. The sensors 224 can include force sensors that measure or estimate forces exerted on the couplers 108 of the vehicle system 100.
The operational data generated by the sensors 224 can be stored with the image data to indicate the operational conditions when the image data was obtained. For example, the location of the vehicle 104 and/or vehicle system 100 when the image data was obtained, the time and/or date when the image data was obtained, or the like, can be stored in or otherwise associated with the image data in the memory device 218.
The image data and embedded or otherwise associated operational data can be accessible by the remotely located management system 102, as described above.
In operation, the management system 102 permits an operator to provide search parameters to search for segments of interest in the image data obtained by one or more of the vehicle systems 100, to remotely access the matching image data (e.g., from the memory devices 218 on the vehicle systems 100 via the communication systems 112, 114), and to present the matching image data to the operator on the display device 124. For example, the operator of the management system 102 can specify a date and/or time, and the management system 102 reviews the operational data associated with the image data to find the image data that was acquired at the specified date and/or time. In one embodiment, the management system 102 can review the operational data stored in the remotely located memory devices 218 of the vehicle systems 100. Alternatively, the operational data can be stored in the memory device 120 of the management system 102.
In one embodiment, the use of digital cameras 202, 204, 206 allows for digital storage of the image data in digital files saved in the onboard memory device 218. The files may be randomly accessible in one aspect. The operational data that is embedded in or otherwise associated with the files can be available both in the vehicle system 100 and the remotely located management system 102. For example, the vehicle 104 may include a display device similar to the display device 124 of the management system 102. An onboard operator of the vehicle 104 can use the controller 216 to input search parameters to search for image data having or associated with operational data that matches or otherwise corresponds to the search parameters, and to view the matching image data onboard the vehicle 104 or another vehicle of the vehicle system 100.
The management system 102 can be used by a remote fleet control and operations center (e.g., a facility that monitors and/or controls movements of multiple vehicle systems 100) to remotely obtain image data of interest from the imaging systems 200 of the vehicle systems 100. For example, the communication systems 112, 114 can communicate the image data as the image data is being acquired so that the management system 102 can view the image data in near real time (e.g., relatively shortly after the image data is acquired), or at a later time. The management system 102 can recall image data of local conditions at or near the vehicle systems 100 by specifying specific dates, times, and/or locations on demand.
The image data that has the operational data matching the search parameters can be presented to an operator of the management system 102 on the display device 124 with the operational data overlaid on the image data. This can allow the operator to review or to assess the conditions existing at the specified operational data used to search for the image data (e.g., the date, time, and/or location of interest). In one aspect, the management system 102 can obtain multiple different sets of image data acquired by multiple different imaging systems 200 on different vehicle systems 100. An operator at the management system 102 can then specify a location in a transportation network (e.g., formed by two or more roads, tracks, or the like) in which different vehicle systems 100 are traveling and/or previously traveled while acquiring image data. The operator-specified location can then be used by the management system 102 to search through the image data acquired by the different imaging systems 200 of the different vehicle systems 100. The management system 102 can find the matching image data acquired at or near (e.g., within a designated distance threshold) the operator-specified location using the operational data embedded in or otherwise associated with the image data. The operator optionally can limit this matching image data by specifying other operational data, such as a time and/or date when the image data was acquired, throttle settings used when the image data was acquired, brake settings used when the image data was acquired, and the like. The matching image data that is found by the management system 102 and then presented to the operator can then include the image data representative of the operator-specified location.
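As a non-limiting sketch, the "within a designated distance threshold" test can be implemented as a great-circle distance comparison, assuming each segment's operational data carries a latitude/longitude fix. The threshold value and record layout below are hypothetical:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon fixes."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

def match_near(segments, lat, lon, threshold_m=200.0):
    """Return segments whose embedded location is within the designated
    distance threshold of the operator-specified location of interest."""
    return [s for s in segments
            if haversine_m(s["lat"], s["lon"], lat, lon) <= threshold_m]

segments = [  # image-data segments from different vehicle systems
    {"file": "sys302_a.mp4", "lat": 41.8500, "lon": -87.6500},
    {"file": "sys304_b.mp4", "lat": 41.9000, "lon": -87.7000},
]
print(match_near(segments, 41.8501, -87.6502))  # only sys302_a.mp4 is near
```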
During travel of the vehicle systems 302, 304, 306, the imaging systems 200 onboard the vehicle systems acquire image data, at least some of which may be acquired at or near a location of interest 314.
The operator at the management system 102 can then enter the location of interest 314 and/or temporal ranges (e.g., times and/or dates during which image data at or near the location of interest 314 may have been acquired) as search parameters for the image data. Optionally, other search parameters can be used. The management system 102 can then search the image data previously and/or currently acquired by the imaging systems 200 of the vehicle systems 302, 304, 306 to find the image data stored onboard the vehicle systems 302, 304, 306 that was acquired at or near the location of interest 314.
The segments of the image data acquired at or near the location of interest 314 can be communicated from the vehicle systems 302, 304, 306 (or from another location) to the management system 102. These segments of the image data can be presented to the operator so that the operator is able to view the image data obtained at or near the location of interest 314 in order to examine the location of interest 314.
In one aspect, the management system 102 can filter the image data by providing the operator with the segments of the image data that were acquired at or near the location of interest 314, and by not providing the operator with the image data that was not acquired at or near the location of interest 314.
For example, the segment of interest 408 of the image data 402 can represent the portion of the image data 402 acquired at or near the location of interest 314. Similarly, the segments of interest 410, 412 can represent the portions of the image data 404, 406 acquired at or near the location of interest 314.
In one embodiment, instead of providing all of the image data 402, 404, 406 to an operator when the management system 102 searches for image data acquired at or near the location of interest 314, the management system 102 may only provide the segments of interest 408, 410, 412 to the operator. In doing so, the operator may be able to more efficiently review the portions of the image data that are relevant to the location of interest 314. Alternatively, a greater amount of the image data may be provided to the operator.
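As a non-limiting sketch, this filtering can be pictured as grouping the contiguous matching portions of a long recording into segments of interest and returning only their time ranges. The chunk layout below is hypothetical:

```python
def segments_of_interest(chunks, is_match):
    """Group contiguous matching chunks of a long recording into segments,
    so only the relevant portions are presented to the operator."""
    segments, current = [], []
    for chunk in chunks:
        if is_match(chunk):
            current.append(chunk)
        elif current:
            segments.append((current[0]["t"], current[-1]["t"]))
            current = []
    if current:
        segments.append((current[0]["t"], current[-1]["t"]))
    return segments  # list of (start_time, end_time) ranges

# Hypothetical recording sampled every 5 s, near the location from t=40..55:
chunks = [{"t": t, "near": 40 <= t <= 55} for t in range(0, 100, 5)]
print(segments_of_interest(chunks, lambda c: c["near"]))  # [(40, 55)]
```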
Optionally, the search parameters provided by an operator can include a camera perspective or field of view. As described above, different cameras of an imaging system may have different fields of view. The field of view may be stored as operational data with the image data that is acquired for that field of view. The operator can select one or more fields of view as searching parameters, and the management system can search through the image data from one or more imaging systems to find matching image data for presentation to the operator (e.g., image data of the selected field of view).
At 504, operational settings and/or operational conditions of the vehicle system are determined. These settings and/or conditions are referred to as operational data, as described above. The operational data that is current at the times the image data is acquired is determined. For example, the location, time, date, speed, throttle settings, brake settings, vibrations, wheel slip, and the like, of the vehicle system are determined for the times at which the image data is obtained.
At 506, the operational data is embedded or otherwise associated with the image data. The operational data can be saved in the same electronic file as the image data, and/or in another location while being associated with the image data. For example, different segments of the image data can be embedded with the different locations, times, dates, and the like, of the vehicle system when the different segments of the image data were obtained.
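For the option of saving the operational data in the same electronic file as the image data, one hypothetical container layout simply prefixes the image bytes with a length-delimited block of operational data. A standard container's metadata tracks could serve the same purpose; the layout below is an assumption for illustration only:

```python
import json
import struct

def embed(image_bytes: bytes, op_data: dict) -> bytes:
    """Store operational data in the same electronic file as the image
    data: [4-byte metadata length][metadata JSON][image bytes]."""
    meta = json.dumps(op_data).encode()
    return struct.pack(">I", len(meta)) + meta + image_bytes

def extract(blob: bytes):
    """Recover the operational data and image data from one file."""
    (meta_len,) = struct.unpack(">I", blob[:4])
    meta = json.loads(blob[4:4 + meta_len])
    return meta, blob[4 + meta_len:]

blob = embed(b"\xff\xd8...", {"location": [41.85, -87.65], "time": "10:05:00"})
print(extract(blob)[0]["location"])  # [41.85, -87.65]
```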
At 508, search parameters are received, such as from an operator of the management system. The search parameters can include operational data, ranges of operational data, or the like, for which the operator wants to find corresponding image data. For example, the search parameters can include locations (when the operator wants to find image data acquired at or near specified locations), times (when the operator wants to find image data acquired at or during specified times), brake settings (when the operator wants to find image data acquired while those brake settings were being applied), or the like.
At 510, the operational data of the image data acquired by imaging systems onboard one or more vehicle systems is examined to determine if the operational data matches or otherwise corresponds to the search parameters. For example, the management system can examine the operational data of the acquired image data to determine if any image data was acquired at the locations or times specified by the operator, or when the brake settings specified by the operator were being applied.
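As a non-limiting sketch, the comparison at 510 can be pictured as testing each segment's operational data against search parameters expressed as exact values or as ranges. The parameter names below are hypothetical:

```python
def matches(op_data: dict, params: dict) -> bool:
    """True if a segment's operational data satisfies every search
    parameter, where each parameter is an exact value or a (lo, hi) range."""
    for key, wanted in params.items():
        value = op_data.get(key)
        if value is None:
            return False
        if isinstance(wanted, tuple):          # a range of operational data
            lo, hi = wanted
            if not (lo <= value <= hi):
                return False
        elif value != wanted:                  # exact operational data
            return False
    return True

op_data = {"brake_setting": 2, "speed_mph": 31.0}
print(matches(op_data, {"brake_setting": 2, "speed_mph": (25.0, 40.0)}))  # True
```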
If at least some image data includes or is associated with operational data that matches the search parameters (or that more closely matches the search parameters than other operational data of the image data), then the image data can be identified as matching image data for presentation to the operator. As a result, flow of the method 500 can proceed to 514.
On the other hand, if the image data does not include operational data that matches the search parameters (or does not more closely match the search parameters than other operational data of the image data), then no image data may be identified as matching image data. As a result, flow of the method 500 can proceed to 512.
At 512, a notification is communicated to the operator that little or no image data has been acquired that includes or is associated with the search parameters. This notification can be an audible, textual, or other notification communicated to the operator. In response, the operator may specify different search parameters to find other image data.
At 514, the matching image data is identified. For example, the location of where the matching image data is stored may be determined. The image data may be stored onboard one or more vehicle systems or other locations that are remote (e.g., not in the same building, town, county, or the like) from the management system. The management system may remotely search the image data by sending request signals to the imaging systems of the vehicle systems, with the request signals identifying the search parameters. The imaging systems (e.g., the controllers of the imaging systems) can review the image data stored onboard the vehicle systems to find the matching image data.
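As a non-limiting sketch, the request signals can be pictured as small structured messages that carry the search parameters to each onboard imaging system, which answers with identifiers of its matching segments. The message layout below (JSON over an arbitrary transport) is an assumption for illustration:

```python
import json

def make_request_signal(search_params: dict) -> bytes:
    # The management system encodes the search parameters in the request.
    return json.dumps({"type": "search_request", "params": search_params}).encode()

def handle_request_signal(raw: bytes, onboard_index: list) -> bytes:
    # The onboard controller reviews its locally stored image data and
    # replies with the segments whose operational data matches.
    request = json.loads(raw)
    wanted = request["params"]
    hits = [seg["file"] for seg in onboard_index
            if all(seg.get(k) == v for k, v in wanted.items())]
    return json.dumps({"type": "search_reply", "matches": hits}).encode()

index = [{"file": "cam202_0042.mp4", "date": "2014-02-17"},
         {"file": "cam202_0043.mp4", "date": "2014-02-18"}]
reply = handle_request_signal(make_request_signal({"date": "2014-02-17"}), index)
print(json.loads(reply)["matches"])  # ['cam202_0042.mp4']
```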
At 516, the matching image data is sent from the imaging systems to the management system. For example, the remotely stored image data can be transmitted and/or broadcast by the communication systems of the vehicle systems to the management system in response to receiving the search parameters and finding the matching image data.
At 518, the matching image data is presented to the operator of the management system. The matching image data is received from the vehicle systems and can be shown on a display device to the operator.
The one or more vehicle systems 100 may be configured in any of the manners shown and described above; the vehicle systems may all be configured the same, or there may be a mix of different configurations.
The server unit 602 is configured, in operation, to receive the search parameters 620, which specify operational data and/or a range of the operational data of one or more vehicle systems 100. For example, as noted above, the search parameters 620 may be received as data over the network 612, by way of an operator entering, selecting, etc. the search parameters 620 on the website 614 displayed on the remote operator terminal 618. The server unit 602 is further configured to search remotely stored image data 622 (e.g., stored on the one or more vehicle systems 100) based on the search parameters to identify matching image data 624 (data of the stored image data that matches the search parameter(s)). The server unit 602 is further configured to obtain the matching image data having the operational data specified by the search parameters. In embodiments, remotely searching and obtaining the matching image data is carried out by the server unit 602 in communication with the client unit 604 on board the one or more vehicle systems. For example, the client unit 604 may be configured to authenticate communications from the server unit 602, and to process commands received from the server unit 602 for searching the image data 622 and/or communicating the matching image data (or information relating thereto) to the server unit 602.
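The manner in which the client unit 604 authenticates communications from the server unit 602 is not prescribed; as one hypothetical illustration, the sketch below uses a shared-secret message authentication code to verify that a search command genuinely originated from the server unit:

```python
import hashlib
import hmac

SHARED_SECRET = b"hypothetical-shared-secret"  # provisioned out of band

def sign_command(command: bytes) -> bytes:
    """Server unit attaches a MAC so the client unit can authenticate it."""
    return hmac.new(SHARED_SECRET, command, hashlib.sha256).digest() + command

def authenticate_command(signed: bytes) -> bytes:
    """Client unit verifies the MAC before acting on a search command;
    raises if the command did not come from the holder of the secret."""
    mac, command = signed[:32], signed[32:]
    expected = hmac.new(SHARED_SECRET, command, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        raise PermissionError("command failed authentication")
    return command

cmd = authenticate_command(sign_command(b'{"search": {"date": "2014-02-17"}}'))
print(cmd.decode())
```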
The server unit 602 is further configured to present the matching image data 624 to an operator. For example, the matching image data 624 may be displayed on the website 614, and/or available for download on the website 614; the website 614 is displayed and operable on the operator's remote terminal 618.
In embodiments, the website 614 is hosted on a third-party server 602, with the service provider and/or operator of the website (i.e., entity that controls and/or owns the website and related functionality) accessing the website (for servicing, updating, etc.) from a remote terminal 616 via the network 612.
In one embodiment, a system (e.g., an image management system) includes a controller and one or more analysis processors. The controller is configured to receive search parameters, the search parameters specifying at least one of operational data or a range of the operational data of one or more vehicle systems. The one or more analysis processors are configured to search remotely stored image data based on the search parameters to identify matching image data. The remotely stored image data is obtained by one or more imaging systems disposed onboard the one or more vehicle systems. The remotely stored image data is associated with the operational data of the one or more vehicle systems that was current when the remotely stored image data was acquired. The one or more analysis processors also are configured to obtain the matching image data having the operational data specified by the search parameters and to present the matching image data to an operator.
In one aspect, the remotely stored image data is remotely stored onboard one or more vehicles of the one or more vehicle systems that do not include the one or more analysis processors.
In one aspect, the controller and one or more analysis processors are located off board the one or more vehicle systems at an off-board facility.
In one aspect, the remotely stored image data is embedded with the operational data that was current for the one or more vehicle systems when the remotely stored image data was acquired.
In one aspect, the operational data includes operational settings of the one or more vehicle systems that are used to control operations of the one or more vehicle systems.
In one aspect, the operational data includes operational conditions to which the one or more vehicle systems were exposed when the remotely stored image data was acquired.
In one aspect, the operational data includes at least one of locations of the one or more vehicle systems when different segments of the remotely stored image data were acquired, times at which the different segments of the remotely stored image data were acquired, and/or dates on which the different segments of the remotely stored image data were acquired.
In one aspect, the controller is configured to receive at least one of an operator specified location, an operator specified time, an operator specified range of times, an operator specified date, and/or an operator specified range of dates as the search parameters. The one or more analysis processors can be configured to search the operational data associated with the remotely stored image data to select one or more segments of the remotely stored image data that were acquired at the at least one of the operator specified location, the operator specified time, the operator specified range of times, the operator specified date, and/or the operator specified range of dates as the matching image data.
In one aspect, the one or more analysis processors are configured to identify the matching image data by determining which of the remotely stored image data are associated with the operational data that matches or more closely matches the search parameters than other remotely stored image data.
In another embodiment, a method (e.g., an image management method) includes receiving search parameters that specify at least one of operational data or a range of the operational data of one or more vehicle systems and searching (with one or more processors) remotely stored image data based on the search parameters to identify matching image data. The remotely stored image data can be obtained by one or more imaging systems disposed onboard the one or more vehicle systems, and can be associated with the operational data of the one or more vehicle systems that was current when the remotely stored image data was acquired. The method also can include obtaining the matching image data having the operational data specified by the search parameters and presenting the matching image data to an operator.
In one aspect, the remotely stored image data is remotely stored from the one or more processors.
In one aspect, the remotely stored image data is searched by examining the operational data that is embedded in the remotely stored image data and that was current for the one or more vehicle systems when the remotely stored image data was acquired.
In one aspect, the operational data includes operational settings of the one or more vehicle systems that are used to control operations of the one or more vehicle systems.
In one aspect, the operational data includes operational conditions to which the one or more vehicle systems were exposed when the remotely stored image data was acquired.
In one aspect, the operational data includes at least one of locations of the one or more vehicle systems when different segments of the remotely stored image data were acquired, times at which the different segments of the remotely stored image data were acquired, and/or dates on which the different segments of the remotely stored image data were acquired.
In one aspect, the search parameters that are received include at least one of an operator specified location, an operator specified time, an operator specified range of times, an operator specified date, and/or an operator specified range of dates.
In one aspect, the remotely stored image data is searched by examining the operational data associated with the remotely stored image data to select one or more segments of the image data that were acquired at the at least one of the operator specified location, the operator specified time, the operator specified range of times, the operator specified date, and/or the operator specified range of dates as the matching image data.
In one aspect, the matching image data is identified by determining which of the remotely stored image data are associated with the operational data that matches or more closely matches the search parameters than other remotely stored image data.
In another embodiment, another method (e.g., another image management method) includes acquiring first image data from one or more cameras disposed onboard a vehicle system as the vehicle system moves along a route. The first image data represents at least one of images or videos of a field of view of the one or more cameras. The method also may include determining operational data of the vehicle system when the first image data was acquired. The operational data includes at least one of operational settings of the vehicle system when the first image data was acquired and/or operational conditions to which the vehicle system was exposed when the first image data was acquired. The method also can include associating the operational data with the first image data, identifying one or more segments of the first image data as matching image data responsive to receiving search parameters that specify at least some of the operational data associated with the first image data, and communicating the matching image data to an off-board location responsive to identifying the matching image data.
In one aspect, the operational data can be associated with the first image data by embedding the operational data in the first image data in a memory device disposed onboard the vehicle system.
In one aspect, the operational data includes at least one of locations of the vehicle system when the first image data was acquired, times at which the first image data was acquired, and/or dates on which the first image data was acquired.
It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the inventive subject matter without departing from its scope. While the dimensions and types of materials described herein are intended to define the parameters of the inventive subject matter, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to one of ordinary skill in the art upon reviewing the above description. The scope of the inventive subject matter should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. §112(f), unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
This written description uses examples to disclose several embodiments of the inventive subject matter and also to enable a person of ordinary skill in the art to practice the embodiments of the inventive subject matter, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the inventive subject matter is defined by the claims, and may include other examples that occur to those of ordinary skill in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
The foregoing description of certain embodiments of the inventive subject matter will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. Thus, for example, one or more of the functional blocks (for example, processors or memories) may be implemented in a single piece of hardware (for example, a general purpose signal processor, microcontroller, random access memory, hard disk, and the like). Similarly, the programs may be stand-alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. The various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
As used herein, an element or step recited in the singular and preceded with the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one embodiment" of the inventive subject matter are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising," "including," or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.
This application claims priority to U.S. Provisional Application No. 61/940,696, which was filed on 17 Feb. 2014, and is titled “Vehicle Image Data Management System And Method” (the “'696 Application”), to U.S. Provisional Application No. 61/940,813, which was filed on 17 Feb. 2014, and is titled “Portable Camera System And Method For Transportation Data Communication” (the “'813 Application”), to U.S. Provisional Application No. 61/940,660, which was filed on 17 Feb. 2014, and is titled “Route Imaging System And Method” (the “'660 Application”), and to U.S. Provisional Application No. 61/940,610, which was filed on 17 Feb. 2014, and is titled “Wayside Imaging System And Method” (the “'610 Application”). The entire disclosures of these applications (the '696 Application, the '813 Application, the '660 Application, and the '610 Application) are incorporated by reference herein.