In large-scale industries, such as the oil and gas industry, large volumes and varieties of information are collected, processed, and presented to support decision making. In addition, information must be provided or exchanged for the safety and security of workers in environments that are not amenable to conventional forms of communication. For example, when a hazardous condition arises on an oil rig, a broadcast over a public address system may not be effective due to noisy equipment. Further, grease and other debris on workers' hands may prevent effective use of communication devices that require typing or touchscreens.
According to an exemplary embodiment, a wearable information gathering and processing system includes an information obtaining device, the information obtaining device including at least one of a radio frequency identification (RFID) reader, an infrared (IR) detector, a global positioning system (GPS) receiver, a laser measurement device, a microphone, or a camera; a processing device, the processing device including at least one of a voice recognition processor, a gesture recognition processor, or a data processor; and an information-providing device coupled to the processing device, the information-providing device including at least one of a heads-up display, a speaker, or a vibrator.
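By way of illustration only, and not as part of the claimed subject matter, the following minimal sketch (in Python, with hypothetical class and attribute names) shows one way the three groups of components recited above might be composed; each group holds optional members, mirroring the "at least one of" language.

```python
# Illustrative sketch only; class and attribute names are assumptions, not part
# of the disclosure.
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class InformationObtainingDevice:
    rfid_reader: Optional[Any] = None
    ir_detector: Optional[Any] = None
    gps_receiver: Optional[Any] = None
    laser_measurement_device: Optional[Any] = None
    microphone: Optional[Any] = None
    camera: Optional[Any] = None


@dataclass
class ProcessingDevice:
    voice_recognition_processor: Optional[Any] = None
    gesture_recognition_processor: Optional[Any] = None
    data_processor: Optional[Any] = None


@dataclass
class InformationProvidingDevice:
    heads_up_display: Optional[Any] = None
    speaker: Optional[Any] = None
    vibrator: Optional[Any] = None


@dataclass
class WearableSystem:
    obtaining: InformationObtainingDevice
    processing: ProcessingDevice  # the information-providing device is coupled to this
    providing: InformationProvidingDevice
```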
Referring now to the drawings wherein like elements are numbered alike in the several Figures:
As noted above, information collection and processing is important in many industries including the oil and gas industry. The development of wearable technologies such as GOOGLE GLASS, for example, facilitates the integration and management of information in ways that could not have previously been imagined. Embodiments of the systems and methods described herein relate to collection, processing, and presentation of information.
For example, the devices 140 may include a radio frequency identification (RFID) chip 500 as well as an RFID reader 600. That is, according to one embodiment, the wearer of the system 100 may be identified based on an RFID chip 500. The system 100 may include an automatic identification and data capture (AIDC) capability that facilitates identification of the system 100 (and, in turn, its wearer) without human intervention. Additionally, as part of the AIDC capability, the system 100 may include other devices 140 (e.g., a global positioning system (GPS) receiver 700) that provide location as well as identification. Various uses of the location information for the system 100 are discussed below. Alternately or additionally, the system 100 may read RFID data from other objects based on including an RFID reader 600. According to this embodiment, the system 100 could perform inventory control or invoicing, for example. The system 100 could also obtain information (e.g., about the security level of an individual with an RFID chip 500) based on reading that information with the RFID reader 600. Two or more systems 100 may be used for triangulation to obtain a more accurate location for an object that may have been detected by the RFID reader 600, for example. Two or more systems 100 may be synchronized with each other and with other components of the site in which the wearers of the systems 100 are located. The synchronization might facilitate data sharing or shared completion of a document. For example, if each wearer of each system 100 completed part of an electronic checklist, synchronizing the systems 100 would fill the uncompleted portion of the checklist for each wearer and result in one comprehensive document. The synchronization may serve as a proximity alert, as well.
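By way of illustration only, the following minimal sketch shows one possible merge rule for the electronic-checklist synchronization described above; the ChecklistItem structure and its field names are assumptions and not part of the disclosure.

```python
# Illustrative sketch only; the data model is assumed. Each wearer's system 100
# holds a partially completed checklist; synchronization fills each blank entry
# with an entry completed by another wearer, yielding one comprehensive document.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChecklistItem:
    task: str
    completed_by: Optional[str] = None  # e.g., wearer identified via RFID chip 500
    value: Optional[str] = None         # recorded reading or confirmation


def synchronize(checklists: list[dict[str, ChecklistItem]]) -> dict[str, ChecklistItem]:
    """Merge partially completed checklists into one comprehensive checklist."""
    merged: dict[str, ChecklistItem] = {}
    for checklist in checklists:
        for task_id, item in checklist.items():
            # Prefer a completed entry over a blank one; otherwise keep the existing entry.
            if task_id not in merged or (merged[task_id].completed_by is None
                                         and item.completed_by is not None):
                merged[task_id] = item
    return merged
```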
Devices 140 may include data gathering devices for use by the system 100 or, additionally or alternatively, for transmission by the system 100 over a wireless network 150, for example. Exemplary devices 140, in addition to the cameras 130 and RFID reader 600, include a laser measurement device 800, a gesture sensor 900 (which may also be among the sensors 210 associated with the glove 200), a voice recognition processor 1000, and a touch sensor 1300, for example.
Any of the devices 140 may perform continuous data collection and, thus, surveillance of a site. The status of tools in an area may be determined and monitored based on this data collection, for example. The tool status monitoring may include interaction between the system 100 and the tool being monitored. Integration among devices 140 may include a context-camera (CTX) capability such that an image obtained by one of the cameras 130 is integrated with stored information (stored in memory 1210, for example) to provide a correlated image. That is, generally, an image or video may be captured with a camera 130 to determine (with a processing device 1200 that is one of the devices 140 of the system 100 or associated with the network 150) location and the presence of individuals or objects regarding which context information is available. For example, a stored animated image corresponding in some way with the image being captured by a camera 130 may be overlaid on the glasses 120 (i.e., the glasses 120 facilitate augmented reality). Exits and the status of exits (e.g., a green display if the exit is safe for use, a red display if the exit is not usable) may be displayed. During an emergency, additional information (e.g., safety protocol, procedure) or operational alarms may be displayed as overlaid information. Any and all of the information from the various devices 140 may be integrated. For example, location information obtained from a GPS receiver 700 may be combined with the camera 130 data and context-camera functionality such that the exit or emergency information provided, for example, is specific to the location of the wearer of the system 100.
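By way of illustration only, the sketch below combines a GPS position with stored exit information to select location-specific overlays (green for a usable exit, red for a blocked exit) as described above; the exit data model and the 100 m distance threshold are assumptions, not part of the disclosure.

```python
# Illustrative sketch only; exit records and the distance threshold are assumptions.
import math


def distance_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))


def exit_overlays(wearer_position, exits, max_distance_m=100.0):
    """Return (exit_id, color) pairs for exits near the wearer's position."""
    overlays = []
    for exit_id, info in exits.items():
        if distance_m(wearer_position, info["position"]) <= max_distance_m:
            color = "green" if info["usable"] else "red"
            overlays.append((exit_id, color))
    return overlays


# Example: one nearby usable exit; a distant exit is filtered out.
exits = {
    "north_stair": {"position": (29.7604, -95.3698), "usable": True},
    "dock_door":   {"position": (29.7700, -95.3600), "usable": False},
}
print(exit_overlays((29.7605, -95.3699), exits))  # [('north_stair', 'green')]
```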
The location information from the GPS receiver 700 may be combined with information received over the network 150 (e.g., map information with identified zones) or identification information gathered with other devices 140 to provide a proximity alarm, for example, based on the wearer of the system 100 entering a hazardous or unauthorized area of a site. Automated processes may be coupled to information gathered by the devices 140. Based on identification or location determined by one or more of the devices 140, parameters measured by one or more devices 140, or other information transmitted by a wearer of another system 100, one or more components of the site where the wearer of the system 100 is located may be automatically shut down, for example. Another automated process may be job tracking. That is, devices 140 of the system 100 may track tasks associated with a particular job with or without explicit input from the wearer of the system 100. According to embodiments, certain gestures may be recognized (using the gesture sensor 900) as being associated with completion of tasks, or image processing may be used based on images captured by the cameras 130. Based on determining completion of the job, an automated process to submit a bill or invoice may be initiated (e.g., by a processing device 1200). Images or other proof of completion gathered by one or more devices 140 may be submitted along with the invoice. A running total of work to date may be maintained and a signal provided when a credit or similar financial limit is reached.
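By way of illustration only, the following sketch checks a GPS position against hazardous or unauthorized zones (represented here, as an assumption, by simple polygons received over the network 150) to raise the proximity alarm described above.

```python
# Illustrative sketch only; the polygon zone representation is an assumption.
def point_in_polygon(x, y, polygon):
    """Ray-casting test; polygon is a list of (x, y) vertices (e.g., lon, lat)."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside


def restricted_zones(position, zones):
    """Return names of hazardous or unauthorized zones containing the position."""
    x, y = position
    return [z["name"] for z in zones if point_in_polygon(x, y, z["polygon"])]


# Example: a square hazardous zone and a position inside it.
zones = [{"name": "crane_radius", "polygon": [(0, 0), (0, 10), (10, 10), (10, 0)]}]
if restricted_zones((5, 5), zones):
    print("Proximity alarm: wearer has entered a restricted zone")
```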
The system 100 may be used to perform interactive processes. According to one embodiment, a training video may be displayed on the glasses 120 and completion of a test, via interaction of the wearer with one or more devices 140 (e.g., touch sensor 1300, voice recognition processor 1000, gesture sensor 900), may be required before the wearer may proceed to a process or a location. The training may include two-way communication with a subject matter expert via the network 150. While all the interaction and information presentation discussed above may be beneficial in most cases, there may be situations when potential distractions must be minimized for the safety of the wearer of the system 100, glove 200, boots 300, and suit 400. Thus, based on location determined according to the devices 140 or information regarding the existence of a hazardous condition received via the network 150, for example, the display on the glasses 120 (heads-up display) may be shut down until the location or condition indicated by the information changes.
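By way of illustration only, the following sketch shows logic, under assumed interface and field names, for shutting down the heads-up display while a location- or network-indicated hazardous condition persists and restoring it when the condition clears.

```python
# Illustrative sketch only; the display interface and alert fields are assumptions.
class HeadsUpDisplay:
    def __init__(self):
        self.active = True

    def shut_down(self):
        self.active = False

    def restore(self):
        self.active = True


def update_heads_up_display(display, current_zones, network_alerts):
    """Suppress the display during a hazardous condition; restore it afterward."""
    in_hazardous_zone = bool(current_zones)                    # e.g., from restricted_zones()
    hazard_reported = any(a.get("hazardous") for a in network_alerts)
    if in_hazardous_zone or hazard_reported:
        display.shut_down()   # potential distractions minimized for wearer safety
    else:
        display.restore()     # location or condition has changed; resume overlays


# Example: a network alert suppresses the display until the condition clears.
hud = HeadsUpDisplay()
update_heads_up_display(hud, [], [{"hazardous": True}])
print(hud.active)  # False
update_heads_up_display(hud, [], [])
print(hud.active)  # True
```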
While one or more embodiments have been shown and described, modifications and substitutions may be made thereto without departing from the spirit and scope of the invention. Accordingly, it is to be understood that the present invention has been described by way of illustration and not limitation.
This application claims the benefit of an earlier filing date from U.S. Provisional Application Ser. No. 62/183,894 filed Jun. 24, 2015, the entire disclosure of which is incorporated herein by reference.