This disclosure relates to a system and method for viewing video, images, and data from a real-time data acquisition and recording system used in high value mobile assets.
High value mobile assets such as locomotives, aircraft, mass transit systems, mining equipment, transportable medical equipment, cargo, marine vessels, and military vessels typically employ onboard data acquisition and recording “black box” systems and/or “event recorder” systems. These data acquisition and recording systems, such as event data recorders or flight data recorders, log a variety of system parameters used for incident investigation, crew performance evaluation, fuel efficiency analysis, maintenance planning, and predictive diagnostics. A typical data acquisition and recording system comprises digital and analog inputs, as well as pressure switches and pressure transducers, which record data from various onboard sensor devices. Recorded data may include such parameters as speed, distance traveled, location, fuel level, engine revolutions per minute (RPM), fluid levels, operator controls, pressures, and current and forecasted weather and ambient conditions. In addition to the basic event and operational data, video and audio event/data recording capabilities are also deployed on many of these same mobile assets. Typically, data is extracted from a data recorder after an incident involving the asset has occurred and an investigation is required, once the data recorder has been recovered. Certain situations may arise where the data recorder cannot be recovered or the data is otherwise unavailable. In these situations, the data, such as event and operational data, video data, and audio data, acquired by the data acquisition and recording system is needed promptly regardless of whether physical access to the data acquisition and recording system or the data is available.
This disclosure relates generally to real-time data acquisition and recording systems used in high value mobile assets. The teachings herein can provide real-time, or near real-time, access to data, such as event and operational data, video data, and audio data, recorded by a real-time data acquisition and recording system on a high value mobile asset. One implementation of a method for processing, storing, and transmitting data from at least one mobile asset described herein includes receiving, using a multimedia management system onboard the mobile asset, data based on at least one data signal from at least one of: at least one 360 degree camera; at least one fixed camera; and at least one microphone; receiving, using a data recorder onboard the mobile asset, the data; encoding, using a data encoder of the data recorder, a record comprising a bit stream based on the data; and storing, using an onboard data manager of the data recorder, at least one of the data and the record at a configurable first predetermined rate in at least one local memory component of the data recorder.
One implementation of a method for displaying data from at least one mobile asset described herein includes receiving, using a web server, a request comprising specified multimedia data of the at least one mobile asset and a specified view mode; receiving, using the web server, the specified multimedia data of the at least one mobile asset from a remote memory component; and displaying, using a display device, the specified multimedia data of the at least one mobile asset in the specified view mode.
One implementation of a real-time data acquisition and recording system described herein includes at least one of at least one 360 degree camera, at least one fixed camera, and at least one microphone; a multimedia management system onboard the mobile asset configured to receive data based on at least one data signal from the at least one of the at least one 360 degree camera, at least one fixed camera, and at least one microphone; a data recorder onboard the mobile asset comprising at least one local memory component, an onboard data manager, and a data encoder, the data recorder configured to receive the data from the multimedia management system; the data encoder configured to encode a record comprising a bit stream based on the data; and the onboard data manager configured to store at least one of the data and the record at a configurable first predetermined rate in the at least one local memory component.
Variations in these and other aspects of the disclosure will be described in additional detail hereafter.
The description herein makes reference to the accompanying drawings wherein like reference numerals refer to like parts throughout the several views, and wherein:
A real-time data acquisition and recording system described herein provides real-time, or near real-time, access to a wide range of data, such as event and operational data, video data, and audio data, of a high value asset to remotely located users such as asset owners, operators and investigators. The data acquisition and recording system records data, via a data recorder, relating to the asset and streams the data to a remote data repository and remotely located users prior to, during, and after an incident has occurred. The data is streamed to the remote data repository in real-time, or near real-time, making information available at least up to the time of an incident or emergency situation, thereby virtually eliminating the need to locate and download the “black box” in order to investigate an incident involving the asset and eliminating the need to interact with the data recorder on the asset to request a download of specific data, to locate and transfer files, and to use a custom application to view the data. The system of the present disclosure retains typical recording capabilities and adds the ability to stream data to a remote data repository and remote end user prior to, during, and after an incident. In the vast majority of situations, the information recorded in the data recorder is redundant and not required as data has already been acquired and stored in the remote data repository.
Prior to the system of the present disclosure, data was extracted from the “black box” or “event recorder” after an incident had occurred and an investigation was required. Data files containing time segments recorded by the “black box” had to be downloaded and retrieved from the “black box” and then viewed by a user with proprietary software. The user would have to obtain physical or remote access to the asset, select the desired data to be downloaded from the “black box,” download the file containing the desired information to a computing device, and locate the appropriate file with the desired data using a custom application that operates on the computing device. The system of the present disclosure has eliminated the need for the user to perform these steps, only requiring the user to use a common web browser to navigate to the desired data.
The remotely located user, such as an asset owner, operator, and/or investigator, may access a common web browser to navigate to live and/or historic desired data relating to a selected asset to view and analyze the operational efficiency and safety of assets in real-time or near real-time. The ability to view operations in real-time, or near real-time, enables rapid evaluation and adjustment of behavior. During an incident, for example, real-time information and/or data can facilitate triaging the situation and provide valuable information to first responders. During normal operation, for example, real-time information and/or data can be used to audit crew performance and to aid network wide situational awareness.
The system of the present disclosure uses 360 degree cameras in, on, or in the vicinity of a mobile asset as part of a data acquisition and recording system. Prior to the system of the present disclosure, “black box” and/or “event recorder” systems did not include 360 degree cameras in, on, or in the vicinity of the mobile asset. The system of the present disclosure adds the ability to use and record video using 360 degree cameras as part of the data acquisition and recording system, providing 360 degree views in, on, or in the vicinity of the mobile asset to a remote data repository, remote users, and investigators prior to, during, and after an incident involving the mobile asset has occurred. The ability to view operations and/or 360 degree video in real-time, or near real-time, enables rapid evaluation and adjustment of crew behavior. Owners, operators, and investigators can view and analyze the operational efficiency and the safety of people, vehicles, and infrastructure, and can investigate or inspect an incident. During an incident, for example, 360 degree video can facilitate triaging the situation and provide valuable information to first responders and investigators. During normal operation, for example, 360 degree video can be used to audit crew performance and to aid network wide situational awareness. The 360 degree cameras and fixed cameras provide a complete picture of situations, supporting surveillance video for law enforcement and/or rail police, inspection of critical infrastructure, monitoring of railroad crossings, viewing of track work progress, crew auditing both inside the cab and in the yard, and real-time remote surveillance.
Prior systems required users to download video files containing time segments of data that were only viewable using a proprietary software application or other external video playback applications, which the user had to purchase separately. The data acquisition and recording system of the present disclosure provides 360 degree video and image information and audio information that can be displayed to a remotely located user in various modes through the use of a virtual reality device and/or through a standard web client, such as a web browser, thereby eliminating the need to download and use external applications to watch the videos.
Data may include, but is not limited to, video and image information from cameras located at various locations in, on, or in the vicinity of the asset and audio information from microphones located at various locations in, on, or in the vicinity of the asset. A 360 degree camera is a camera that provides a 360 degree spherical field of view and/or a 360 degree hemispherical field of view. Using 360 degree cameras in, on, or in the vicinity of an asset provides the ability to use and record video using the 360 degree cameras as part of the data acquisition and recording system (DARS), thereby making the 360 degree view in, on, or in the vicinity of the asset available to a remote data repository, remotely located users, and investigators prior to, during, and after an incident.
Data recorder 108 gathers video data, audio data, and other data and/or information from a wide variety of sources, which can vary based on the asset's configuration, through onboard data links. In this implementation, data recorder 108 receives data from a video management system 104 that continuously records video data and audio data from 360 degree cameras 102 and fixed cameras 106 placed in, on, or in the vicinity of the asset 130. The video management system 104 stores the video and audio data to the crash hardened memory module 110 and can also store the video and audio data in the non-crash hardened removable storage device of the second embodiment. Different versions of the video data are created using different bitrates and/or spatial resolutions, and these versions are separated into segments of variable length, such as thumbnails, five minute low resolution segments, and five minute high resolution segments.
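The multi-version segmenting described above can be sketched as follows. Only the thumbnails and the five minute low and high resolution segments come from the text; the profile names, bitrates, and the segmenting helper are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SegmentProfile:
    name: str
    duration_s: int    # segment length in seconds (assumed unit)
    bitrate_kbps: int  # target encoding bitrate (illustrative values)

# Hypothetical profiles for the three versions the text describes.
PROFILES = [
    SegmentProfile("thumbnail", 1, 64),
    SegmentProfile("low_res", 300, 500),    # five minute low resolution
    SegmentProfile("high_res", 300, 4000),  # five minute high resolution
]

def segment_boundaries(total_s: int, profile: SegmentProfile) -> list[tuple[int, int]]:
    """Split a recording of total_s seconds into (start, end) segments;
    the final segment may be shorter than the profile's duration."""
    return [(t, min(t + profile.duration_s, total_s))
            for t in range(0, total_s, profile.duration_s)]
```

For example, an 11 minute 40 second recording yields two full five minute segments plus a shorter trailing segment per profile.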
The data encoder 114 encodes at least a minimum set of data that is typically defined by a regulatory agency. The data encoder 114 receives video and audio data from the video management system 104, compresses or encodes the data, and time synchronizes the data in order to facilitate efficient real-time transmission and replication to a remote data repository 120. The data encoder 114 transmits the encoded data to the onboard data manager 112, which then sends the encoded video and audio data to the remote data repository 120 via a remote data manager 118 located in the data center 132 in response to an on-demand request by a remotely located user 134 or in response to certain operating conditions being observed onboard the asset 130. The onboard data manager 112 and the remote data manager 118 work in unison to manage the data replication process. The remote data manager 118 in the data center 132 can manage the replication of data from a plurality of assets. The video and audio data stored in the remote data repository 120 is available to a web server 122 for the remotely located user 134 to access.
The onboard data manager 112 also sends data to a queueing repository (not shown). The onboard data manager 112 monitors the video and audio data stored in the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, by the video management system 104 and determines whether it is in near real-time mode or real-time mode. In near real-time mode, the onboard data manager 112 stores the encoded data, including video data, audio data, and any other data or information, received from the data encoder 114 and any event information in the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, and in the queueing repository. After five minutes of encoded data has accumulated in the queueing repository, the onboard data manager 112 stores the five minutes of encoded data to the remote data repository 120 via the remote data manager 118 in the data center 132 through a wireless data link 116. In real-time mode, the onboard data manager 112 stores the encoded data, including video data, audio data, and any other data or information, received from the data encoder 114 and any event information to the remote data repository 120 via the remote data manager 118 in the data center 132 through the wireless data link 116. The onboard data manager 112 and the remote data manager 118 can communicate over a variety of wireless communications links. Wireless data link 116 can be, for example, a wireless local area network (WLAN), wireless metropolitan area network (WMAN), wireless wide area network (WWAN), a private wireless system, a cellular telephone network or any other means of transferring data from the data recorder 108 to, in this example, the remote data manager 118. The process of sending and retrieving video data and audio data remotely from the asset 130 requires a wireless data connection between the asset 130 and the data center 132. 
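The queueing behavior described above — accumulate roughly five minutes of encoded data before replicating in near real-time mode, replicate immediately in real-time mode, and always persist locally first — can be sketched as follows. The class and method names are illustrative assumptions, not the disclosure's API.

```python
import collections

NEAR_REAL_TIME_BATCH_S = 300  # five minutes of encoded data, per the text

class OnboardDataManager:
    """Minimal sketch of the onboard data manager's queueing repository."""
    def __init__(self, send_to_remote, store_local):
        self.send_to_remote = send_to_remote  # e.g. link to the remote data manager
        self.store_local = store_local        # e.g. crash hardened memory module
        self.queue = collections.deque()      # models the queueing repository
        self.real_time = False
        self.queued_seconds = 0

    def on_encoded_record(self, record, duration_s=1):
        self.store_local(record)  # always persist onboard first
        self.queue.append(record)
        self.queued_seconds += duration_s
        # Real-time mode sends immediately; near real-time mode waits for a
        # full five-minute batch to accumulate in the queueing repository.
        if self.real_time or self.queued_seconds >= NEAR_REAL_TIME_BATCH_S:
            self.flush()

    def flush(self):
        while self.queue:
            self.send_to_remote(self.queue.popleft())
        self.queued_seconds = 0
```

A real implementation would also handle loss of the wireless data link by leaving the queue intact, as the following paragraph describes.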
When a wireless data connection is not available, the data is stored and queued in the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, until wireless connectivity is restored. The video, audio, and any other additional data retrieval process resumes as soon as wireless connectivity is restored.
In parallel with data recording, the data recorder 108 continuously and autonomously replicates data to the remote data repository 120. The replication process has two modes, a real-time mode and a near real-time mode. In real-time mode, the data is replicated to the remote data repository 120 every second. In near real-time mode, the data is replicated to the remote data repository 120 every five minutes. The rate used for near real-time mode is configurable and the rate used for real-time mode can be adjusted to support high resolution data by replicating data to the remote data repository 120 every 0.10 seconds. Near real-time mode is used during normal operation, under most conditions, in order to improve the efficiency of the data replication process.
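The replication intervals above (one second in real-time mode, adjustable to 0.10 seconds for high resolution data, and a configurable five minutes in near real-time mode) can be expressed as a small selection function. The function and parameter names are illustrative assumptions.

```python
def replication_interval_s(mode: str,
                           high_resolution: bool = False,
                           near_real_time_s: float = 300.0) -> float:
    """Return the replication interval, in seconds, for the given mode."""
    if mode == "real-time":
        # 1 s normally; can be tightened to 0.10 s for high resolution data.
        return 0.10 if high_resolution else 1.0
    if mode == "near-real-time":
        # Five minutes by default; configurable per the disclosure.
        return near_real_time_s
    raise ValueError(f"unknown replication mode: {mode}")
```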
Real-time mode can be initiated based on events occurring onboard the asset 130 or by a request initiated from the data center 132. A typical data center 132 initiated request for real-time mode is initiated when the remotely located user 134 has requested real-time information from a web client 126. A typical reason for real-time mode to originate onboard the asset 130 is the detection of an event or incident such as an operator initiating an emergency stop request, emergency braking activity, rapid acceleration or deceleration in any axis, or loss of input power to the data recorder 108. When transitioning from near real-time mode to real-time mode, all data not yet replicated to the remote data repository 120 is replicated and stored in the remote data repository 120 and then live replication is initiated. The transition between near real-time mode and real-time mode typically occurs in less than five seconds. After a predetermined amount of time has passed since the event or incident, a predetermined amount of time of inactivity, or when the user 134 no longer desires real-time information from the asset 130, the data recorder 108 reverts to near real-time mode. The predetermined amount of time required to initiate the transition is configurable and is typically set to ten minutes.
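The mode transitions above — an onboard event or a data center request switches the system to real-time mode, and a configurable inactivity period (typically ten minutes) reverts it — can be sketched as a small controller. The class, method, and event names are illustrative assumptions; a fuller sketch would also flush the un-replicated backlog when entering real-time mode.

```python
import time

REVERT_AFTER_S = 600  # ten minutes: the typical, configurable revert period

class ReplicationModeController:
    """Minimal sketch of the real-time / near real-time mode transitions."""
    def __init__(self, now=time.monotonic):
        self.now = now
        self.mode = "near-real-time"
        self.last_activity = None

    def on_event(self, kind):
        # Triggers include an emergency stop request, emergency braking,
        # rapid acceleration or deceleration in any axis, loss of input
        # power, or an on-demand request from the data center.
        self.mode = "real-time"
        self.last_activity = self.now()

    def tick(self):
        # Revert to near real-time after the configured period of inactivity.
        if (self.mode == "real-time"
                and self.now() - self.last_activity >= REVERT_AFTER_S):
            self.mode = "near-real-time"
```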
When the data recorder 108 is in real-time mode, the onboard data manager 112 attempts to continuously empty its queue to the remote data manager 118, storing the data to the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, and sending the data to the remote data manager 118 simultaneously.
Upon receiving video data, audio data, and any other data or information to be replicated from the data recorder 108, the remote data manager 118 stores the data to the remote data repository 120 in the data center 132. The remote data repository 120 can be, for example, cloud-based data storage or any other suitable remote data storage. When data is received, a process is initiated that causes a data decoder (not shown) to decode the recently replicated data from the remote data repository 120 and send the decoded data to a remote event detector (not shown). The remote data manager 118 stores vehicle event information in the remote data repository 120. When the remote event detector receives the decoded data, it processes the decoded data to detect events, incidents, or other predefined situations involving the asset 130. Upon detecting an event of interest in the decoded data previously stored in the remote data repository 120, the remote event detector stores the event information and supporting data in the remote data repository 120.
Video data, audio data, and any other data or information is available to the user 134 in response to an on-demand request by the user 134 and/or is sent by the onboard data manager 112 to the remote data repository 120 in response to certain operating conditions being observed onboard the asset 130. Video data, audio data, and any other data or information stored in the remote data repository 120 is available on the web server 122 for the user 134 to access. The remotely located user 134 can access the video data, audio data, and any other data or information relating to the specific asset 130, or a plurality of assets, stored in the remote data repository 120 using the standard web client 126, such as a web browser, or a virtual reality device 128 which, in this implementation, can display thumbnail images of selected cameras. The web client 126 communicates the user's 134 request for video, audio, and/or other information to the web server 122 through a network 124 using common web standards, protocols, and techniques. Network 124 can be, for example, the Internet. Network 124 can also be a local area network (LAN), metropolitan area network (MAN), wide area network (WAN), virtual private network (VPN), a cellular telephone network or any other means of transferring data from the web server 122 to, in this example, the web client 126. The web server 122 requests the desired data from the remote data repository 120. The web server 122 then sends the requested data to the web client 126 that provides playback and real-time display of standard video and 360 degree video. The web client 126 plays the video data, audio data, and any other data or information for the user 134 who can interact with the 360 degree video data for viewing and analysis.
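The request path described above — web client asks the web server for a specific asset's data in a specified view mode, the server fetches it from the remote data repository and returns it — can be sketched as a single handler. The function signature and response keys are illustrative assumptions, not the disclosure's protocol.

```python
def handle_web_request(repository: dict, asset_id: str, view_mode: str) -> dict:
    """Sketch of the web server 122 fulfilling a web client 126 request.

    repository models the remote data repository 120 as a mapping from
    asset identifier to that asset's recorded multimedia data.
    """
    data = repository.get(asset_id)
    if data is None:
        return {"status": 404, "error": f"no data for asset {asset_id}"}
    # The web client renders the returned data in the requested view mode
    # (e.g. a 360 degree projection or a standard video view).
    return {"status": 200, "asset": asset_id, "view_mode": view_mode, "data": data}
```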
The user 134 can also download the video data, audio data, and any other data or information using the web client 126 and can then use the virtual reality device 128 to interact with the 360 degree video data for viewing and analysis.
The web client 126 can be enhanced with a software application that provides the playback of 360 degree video in a variety of different modes. The user 134 can elect the mode in which the software application presents the video playback such as, for example, fisheye view as shown in
In another implementation, the encoded record is then sent to the onboard data manager 112 that sequentially combines a series of records in chronological order into record blocks that include up to five minutes of data. An interim record block includes less than five minutes of data while a full record block includes a full five minutes of data. Each record block includes all the data required to fully decode the included signals, including a data integrity check. At a minimum, a record block must start with a start record and end with an end record.
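The record block framing above can be sketched as follows. The disclosure requires only that a block begin with a start record, end with an end record, and carry a data integrity check; the byte-level framing and the use of CRC32 here are illustrative assumptions.

```python
import zlib

def build_record_block(records: list[bytes]) -> bytes:
    """Combine records, already in chronological order, into one block
    framed by start/end markers with a CRC32 integrity check appended."""
    body = b"".join(records)
    block = b"START" + body + b"END"          # hypothetical framing markers
    return block + zlib.crc32(block).to_bytes(4, "big")

def verify_record_block(block: bytes) -> bool:
    """Check the framing and the integrity check of a record block."""
    payload, crc = block[:-4], int.from_bytes(block[-4:], "big")
    return (zlib.crc32(payload) == crc
            and payload.startswith(b"START")
            and payload.endswith(b"END"))
```

A corrupted block (for example, one truncated by a power loss mid-write) fails verification, which is what makes the alternating interim-block scheme described next recoverable.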
In order to ensure that all of the encoded signal data is saved to the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, should the data recorder 108 lose power, the onboard data manager 112 stores interim record blocks in the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, at a predetermined rate, where the predetermined rate is configurable and/or variable. Interim record blocks are saved at least once per second but can also be saved as frequently as once every tenth of a second. The rate at which interim record blocks are saved depends on the sampling rates of each signal. Every interim record block includes the full set of records since the last full record block. The data recorder 108 can alternate between two temporary storage locations in the crash hardened memory module 110 when recording each interim record block to prevent the corruption or loss of more than one second of data when the data recorder 108 loses power while storing data to the crash hardened memory module 110. Each time a new interim record block is saved to a temporary crash hardened memory location it will overwrite the existing previously stored interim record block in that location.
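The alternating ("ping-pong") scheme above — each new interim record block overwrites the older of two temporary locations, so a power loss during a write corrupts at most one location — can be sketched as follows. The class and method names are illustrative assumptions.

```python
class InterimBlockWriter:
    """Sketch of alternating between two temporary crash hardened locations."""
    def __init__(self):
        self.slots = [None, None]  # two temporary storage locations
        self.next_slot = 0

    def save_interim(self, block):
        # Overwrite the older slot; the other slot keeps an intact block,
        # so an interrupted write loses at most one save interval of data.
        self.slots[self.next_slot] = block
        self.next_slot = 1 - self.next_slot

    def latest_intact(self):
        # After a power loss mid-write, the slot written before the
        # interrupted save still holds a complete interim record block.
        return self.slots[1 - self.next_slot]
```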
Every five minutes, in this implementation, when the data recorder 108 is in near real-time mode, the onboard data manager 112 stores a full record block including the last five minutes of encoded signal data into a record segment in the crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, and sends a copy of the full record block, comprising five minutes of video data, audio data, and/or information, to the remote data manager 118 to be stored in the remote data repository 120 for a predetermined retention period such as two years. The crash hardened memory module 110, and the optional non-crash hardened removable storage device of the second embodiment, stores a record segment of the most recent record blocks for a mandated storage duration, which in this implementation is the federally mandated duration that the data recorder 108 must store operational or video data in the crash hardened memory module 110 with an additional 24 hour buffer, and is then overwritten.
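The retention behavior above — keep the most recent full record blocks for the mandated duration plus a 24 hour buffer, then overwrite the oldest — can be modeled with a bounded buffer. The function and parameter names are illustrative assumptions; a bounded deque stands in for the circular overwrite in the crash hardened memory module.

```python
import collections

def make_crash_hardened_segment(mandated_hours: float,
                                block_minutes: float = 5.0,
                                buffer_hours: float = 24.0):
    """Return a bounded buffer sized to hold full record blocks for the
    mandated storage duration plus the additional 24 hour buffer; once
    full, appending a new block discards (overwrites) the oldest one."""
    capacity = int((mandated_hours + buffer_hours) * 60 / block_minutes)
    return collections.deque(maxlen=capacity)
```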
For simplicity of explanation, process 200 and process 300 are depicted and described as a series of steps. However, steps in accordance with this disclosure can occur in various orders and/or concurrently. Additionally, steps in accordance with this disclosure may occur with other steps not presented and described herein. Furthermore, not all illustrated steps may be required to implement a method in accordance with the disclosed subject matter.
While the present disclosure has been described in connection with certain embodiments, it is to be understood that the disclosure is not to be limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
This application claims priority to U.S. Provisional Application No. 62/337,225, filed May 16, 2016, claims priority to U.S. Provisional Application No. 62/337,227, filed May 16, 2016, claims priority to U.S. Provisional Application No. 62/337,228, filed May 16, 2016, claims priority to and is a continuation-in-part of U.S. Non-provisional application Ser. No. 15/595,650, filed May 15, 2017, now U.S. Pat. No. 9,934,623, issued Apr. 3, 2018, claims priority to and is a continuation-in-part of U.S. Non-provisional application Ser. No. 15/907,486, filed Feb. 28, 2018, and claims priority to and is a divisional of U.S. Non-provisional application Ser. No. 15/595,689, filed May 15, 2017, to the extent allowed by law and the contents of which are incorporated herein by reference in their entireties.
Number | Date | Country
---|---|---
20190244447 A1 | Aug 2019 | US
Number | Date | Country
---|---|---
62337225 | May 2016 | US
62337227 | May 2016 | US
62337228 | May 2016 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 15595689 | May 2017 | US
Child | 16385745 | | US

Relation | Number | Date | Country
---|---|---|---
Parent | 15595650 | May 2017 | US
Child | 15907486 | | US

Relation | Number | Date | Country
---|---|---|---
Parent | 15907486 | Feb 2018 | US
Child | 15595689 | | US