Handheld devices have been used to support industrial field operations and maintenance, but such devices suffer from small user interfaces, including small format touchscreens and keypads. It is difficult for users to enter data via touch input, the text is small and hard to read, and bright outdoor light further obscures the display. Moreover, while performing operations that require the use of both hands, the device must be continually stowed and retrieved. Because some operations are performed on ladders or platforms, and sometimes in inclement weather, interaction with the device can be very frustrating and can distract the user from performing field operations.
A system includes one or more hands free mobile communication devices. Software stored on a machine readable storage device is executed to cause the hands free mobile communication devices to communicate audibly with field operators performing field operations. The operators are instructed regarding operations to be performed. Oral communications are received from the operators, interpreted by the system and routed appropriately. The interpreted communications may automatically trigger further instructions.
Oral communications may include comments from the operator. The system in one embodiment may route or otherwise process the comments as a function of the context in which the comments are generated and the content of the comments.
In some embodiments, the system supports the synchronous execution of procedures by multiple operators and by multiple operators in coordination with an automated control system.
In the following description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments which may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that structural, logical and electrical changes may be made without departing from the scope of the present invention. The following description of example embodiments is, therefore, not to be taken in a limited sense, and the scope of the present invention is defined by the appended claims.
The functions or algorithms described herein may be implemented in software or a combination of software and human implemented procedures in one embodiment. The software may consist of computer executable instructions stored on computer readable media such as memory or other types of storage devices. Further, such functions correspond to modules, which may be software stored on a storage device, hardware, firmware or any combination thereof. Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples. The software may be executed on a digital signal processor, ASIC, microprocessor, or other type of processor operating on a computer system, such as a personal computer, server or other computer system.
The procedure may be written as a machine executable program that is stored on a storage device and provides speech based instructions to the operator. The system generates the speech based instructions and receives oral data and commands from the field operator 110, as shown in a dialog box 130. The speech generated instructions from the system are provided to the operator. The first instruction in this example dialog 130 is “Confirm Tag number.” The operator responds orally with “11AC12”. The system interprets the oral response and acts in accordance with the procedure, such as checking the tag number for accuracy and then providing a question for the operator to answer: “Is discharge pressure greater than 90 psi?” The operator responds with “No”. This response is also interpreted and may affect the flow of the procedure. Given the response of “No”, the next instruction is “Check delta P for filters.” The operator responds “Delta P is 50%”. An instruction may then be provided to “Change filters”. The field operator in this example was able to stay on the ladder and complete the procedure without having to physically interact with the device.
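For illustration only, the branching dialog above can be pictured as a small data structure. The following Python sketch encodes the example dialog 130 as steps whose successor depends on the operator's recognized response; the step names and the Python form are assumptions made for this sketch and are not the procedure format actually used by the system (described later as an XML script). A loop that could drive such steps is sketched after the controller description below.

```python
# Hypothetical, minimal representation of the example dialog 130 as branching
# steps; the real system stores procedures as machine executable scripts
# (e.g., XML) rather than Python literals.

EXAMPLE_PROCEDURE = {
    "start": {
        "prompt": "Confirm Tag number.",
        "expect": "tag_number",          # any spoken tag, e.g. "11AC12"
        "next": lambda resp: "check_pressure",
    },
    "check_pressure": {
        "prompt": "Is discharge pressure greater than 90 psi?",
        "expect": ("Yes", "No"),
        "next": lambda resp: "done" if resp == "Yes" else "check_delta_p",
    },
    "check_delta_p": {
        "prompt": "Check delta P for filters.",
        "expect": "percentage",          # e.g. "Delta P is 50%"
        "next": lambda resp: "change_filters",
    },
    "change_filters": {
        "prompt": "Change filters.",
        "expect": ("Done",),
        "next": lambda resp: "done",
    },
}
```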
Many different procedures may be facilitated by the system, such as structured tasks, maintenance rounds, procedural operations such as unit start-ups and shut-downs, product installations, and servicing. The system is fully speech interactive and wireless, using a body worn mobile device.
In one embodiment, as shown in block diagram 200 in FIG. 2, the system includes a controller 210 coupled to a distributed control system 212.
Controller 210 in one embodiment communicates with multiple operators via devices 215, 220 and 225. Each device may be a wireless mobile communications platform with a wireless or wired headset having a speaker and microphone device indicated at 230, 235 and 240 respectively to provide a hands free mobile communication device. In further embodiments, noise cancellation features may be included, as many industrial plants like refineries can have a high level of ambient noise that would otherwise interfere with communications. In still further embodiments, the mobile communications platform may be integrated with the headset.
In further embodiments, controller 210 may have a console operator interface 250 coupled to it via a wired or wireless network. Console operator interface 250 may be a common workstation in one embodiment, having a display, keyboard, and a data entry device (e.g., mouse, touchpad, etc.), and is also coupled to the distributed control system 212. Still further, operator interface 250 may include voice capabilities for communicating directly with operators and the control device in a manner similar to that provided by the mobile communication devices 215, 220, and 225. Operator interface 250 may also be used as part of the execution of a procedure program being executed by controller 210. For instance, the operator interface 250 may be used to allow an operator to interact directly with a control system to cause actions to occur in the plant that are relevant to tasks being performed by one or more field operators. In one simple example, a valve may need to be closed using the controller 210 prior to the field operator performing a measurement or other action.
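As a sketch of the kind of interlock in the valve example, the following Python fragment assumes a hypothetical distributed control system client with read_tag and write_tag methods and an invented valve tag; it requests valve closure and waits for confirmation before the next audible field instruction is issued.

```python
import time

def wait_for_valve_closed(dcs, valve_tag, timeout_s=120, poll_s=2.0):
    """Poll the (hypothetical) DCS client until the valve reports closed."""
    deadline = time.monotonic() + timeout_s
    dcs.write_tag(valve_tag + ".CMD", "CLOSE")            # request closure
    while time.monotonic() < deadline:
        if dcs.read_tag(valve_tag + ".STATUS") == "CLOSED":
            return True
        time.sleep(poll_s)
    return False

def run_precondition_then_instruct(dcs, speak):
    # The valve is closed via the control system first...
    if wait_for_valve_closed(dcs, "11XV12"):              # "11XV12" is invented
        # ...and only then is the field operator instructed to act.
        speak("Valve 11XV12 is closed. Take the discharge pressure measurement.")
    else:
        speak("Valve 11XV12 did not confirm closed. Hold position and stand by.")
```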
In one embodiment, controller 210 may execute a procedure that includes instructions for causing the hands free mobile communication device to communicate audibly with a field operator performing field operations. The mobile communication device may instruct the operator regarding operations to be performed, receive oral communications from the operator, interpret and process the received oral communications by means of speech recognition, and provide further instructions responsive to the received oral communications.
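A minimal sketch of such an execution loop is shown below, assuming placeholder speak() and listen() functions in place of whatever text to speech and speech recognition services the controller or device actually uses.

```python
def speak(text):
    # Placeholder for a text to speech engine or playback of a pre-recorded file.
    print(f"SYSTEM: {text}")

def listen():
    # Placeholder for speech recognition of the operator's oral response;
    # typed input is used here so the sketch can be run anywhere.
    return input("OPERATOR: ").strip()

def run_procedure(steps, start="start"):
    """Walk branching steps (the format sketched after the dialog example) to completion."""
    step_id = start
    while step_id != "done":
        step = steps[step_id]
        speak(step["prompt"])
        response = listen()
        step_id = step["next"](response)
    speak("Procedure complete.")

if __name__ == "__main__":
    # A one-question demonstration; EXAMPLE_PROCEDURE from the earlier sketch
    # could be passed instead.
    demo = {
        "start": {
            "prompt": "Is discharge pressure greater than 90 psi?",
            "next": lambda resp: "done",
        },
    }
    run_procedure(demo)
```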
In some procedures, multiple field operators may be helping to perform the procedure. The activities of the operators may need to be synchronized. One operator may need to complete a task prior to another operator starting a succeeding task. Some tasks may be performed in parallel by two or more operators. Operators may need to perform tasks in synchrony with tasks performed by the automatic process control system. In this case, an operator may have to wait for the control system to perform an action, such as ramping up temperature or pressure in a vessel to a prescribed level, before he or she opens a valve, introduces another chemical to the process, etc. A procedure authoring tool 260, such as one called PRIDE, may be used to generate procedures in a high level language, such as an XML script, for execution. The procedure authoring tool allows the programming of if-then scenarios, as illustrated above, where the field operator was instructed to replace a filter when a difference in pressure read in the field exceeded a certain threshold. The XML script, or other type of machine executable language, may also control timing of tasks, recognition of responses, and allow for comments.
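The text does not reproduce an actual PRIDE script, so the XML below is an invented illustration of the kind of if-then structure described, parsed here with Python's standard library; the element and attribute names are assumptions, not the authoring tool's real schema.

```python
import xml.etree.ElementTree as ET

# Invented, illustrative XML only -- it captures the filter example: if delta P
# exceeds a threshold, the field operator is instructed to change the filters.
SCRIPT = """
<procedure name="FilterCheck">
  <step id="1" operator="field">
    <instruction>Check delta P for filters</instruction>
    <response type="percent" var="deltaP"/>
  </step>
  <step id="2" operator="field" if="deltaP &gt; 40">
    <instruction>Change filters</instruction>
    <response type="confirm"/>
  </step>
</procedure>
"""

root = ET.fromstring(SCRIPT)
for step in root.findall("step"):
    cond = step.get("if", "always")
    text = step.find("instruction").text
    print(f"step {step.get('id')} ({step.get('operator')}, when {cond}): {text}")
```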
In one embodiment, the script identifies text to read to a field operator, either by text to speech conversion or by a pre-recorded audio file (e.g., wav, etc.). The script then causes a machine executing it to perform speech recognition on audible information and commands received from the operator. Oral communications are recognized and converted to text commands. The format of the script in one example embodiment may take the form of {Instr.;Response;Comment}. The following is a word list corresponding to commands for hands free interaction, for one example form that the procedure language may take (a short command-matching sketch follows the word list):
# word list file, put each voice command on a separate line
# anything after a ‘#’ symbol is ignored by parser
Ok
Cancel
Phonetic Numbers #Unlimited Digits 0-9
Save and Exit
On
Off
Phonetic Alphabets #Unlimited Letters inc “space”
Clear
Next
Previous
Edit
Clear Text
Check Box
Skip Check
Previous Settings
User Settings
Status Page
Reset Values
All
Auto
Manual
All Data
Message
Comment One
Done
Invalid
Pause
Comment
Resume
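The command-matching sketch mentioned above follows: it parses a word list in the format shown (one command per line, anything after a '#' ignored) and matches a recognized utterance against it. The truncated word list and the matching strategy are illustrative assumptions, not the recognizer actually used.

```python
WORD_LIST = """\
# word list file, put each voice command on a separate line
# anything after a '#' symbol is ignored by parser
Ok
Cancel
Save and Exit
Next
Previous
Comment
Done
Pause
Resume
"""

def load_commands(text):
    commands = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if line:
            commands.append(line)
    return commands

def match_command(utterance, commands):
    """Return the command matching the recognized utterance, if any."""
    utterance = utterance.strip().lower()
    for cmd in commands:
        if utterance == cmd.lower():
            return cmd
    return None

commands = load_commands(WORD_LIST)
print(match_command("save and exit", commands))            # -> 'Save and Exit'
print(match_command("open the pod bay doors", commands))   # -> None
```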
In one embodiment, one of the text commands causes recording of audible comments provided by the field operator, as illustrated in a flowchart 300 in FIG. 3. The recorded comment may be converted to text and used to create a communication that is routed to an appropriate recipient.
Keywords in the comment may be identified in the text and compared to known keywords to identify the recipient and to create the communication based on the comment, as indicated at 340. The communication in one embodiment may include a context in which the comment is entered. In one example, the words “broken” or “needs repair” may result in a work order being prepared as part of the communication. From the context of the task, such as “inspect pipe”, the work order may be routed to a person or department that is responsible for fixing pipes. Many other examples may be generated given a list of potential repairs or other actions known to result from a specific procedure. In some embodiments, the context includes a name of a procedure being performed by the field operator at the time the comment is provided.
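A short sketch of the keyword-and-context routing just described follows; the keyword table, task names, and recipients are invented for illustration.

```python
# Invented keyword and context tables; a real system would draw these from
# plant maintenance data.
KEYWORD_ACTIONS = {
    "broken": "work_order",
    "needs repair": "work_order",
    "leaking": "work_order",
}
CONTEXT_RECIPIENTS = {
    "inspect pipe": "piping maintenance",
    "check pump": "rotating equipment",
}

def route_comment(comment_text, context_task, procedure_name):
    comment = comment_text.lower()
    action = next((a for kw, a in KEYWORD_ACTIONS.items() if kw in comment),
                  "log_only")
    recipient = CONTEXT_RECIPIENTS.get(context_task, "shift supervisor")
    return {
        "action": action,                 # e.g. prepare a work order
        "recipient": recipient,           # routed from the task context
        "procedure": procedure_name,      # context included in the communication
        "comment": comment_text,
    }

print(route_comment("The flange is broken and needs repair",
                    "inspect pipe", "Monthly Coker Round"))
```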
In one embodiment, the procedure may synchronize instructions and commands via multiple hands free communication devices such that multiple operators are instructed to perform tasks in the procedure in a synchronized manner. Furthermore, multiple operators may be instructed to perform tasks in synchrony with tasks performed by the automated process control system. Some tasks may be performed in parallel, while other tasks are performed only after receiving oral commands indicating that a predetermined task is completed. The tasks may be assigned to specific field operators.
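The sequencing described above can be sketched with ordinary thread synchronization primitives, with each thread standing in for a separate operator's device; the task names are invented.

```python
import threading
import time

valve_lined_up = threading.Event()   # set when operator A finishes the first task

def operator_a():
    print("Operator A: lining up suction valve")
    time.sleep(1)                     # stand-in for the actual field work
    print("Operator A: done")
    valve_lined_up.set()              # release the dependent task

def operator_b():
    print("Operator B: waiting for Operator A to finish")
    valve_lined_up.wait()             # succeeding task starts only after A completes
    print("Operator B: starting the pump")

def operator_c():
    print("Operator C: reading local gauges (runs in parallel)")

threads = [threading.Thread(target=f) for f in (operator_a, operator_b, operator_c)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```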
In further embodiments, a handheld mobile communication device may include a graphical user interface, one instance of which is illustrated at 500 in FIG. 5.
The user interface 500 may be used to help a field operator keep track of where they are in a procedure in case the operator becomes distracted during performance of the procedure. It may also be used to enter data via a touchscreen or other non-audio interface if desired. The graphical user interface, touchscreen interface, or other interface works in synchrony with the speech-based interface so that the user can move seamlessly back and forth between interaction modalities. In some embodiments, speech recognition may be performed at the mobile communication device or at the controller 210.
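One way to keep the modalities in step is a single shared procedure state that both the speech handler and the touchscreen handler read and update; the sketch below is an assumption about how that might look, not a description of the device's actual software.

```python
import threading

class ProcedureState:
    """Single source of truth consulted by both the voice and GUI front ends."""
    def __init__(self, steps):
        self._lock = threading.Lock()
        self.steps = steps
        self.index = 0

    def current_step(self):
        with self._lock:
            return self.steps[self.index]

    def advance(self, source):
        # 'source' records whether speech or the touchscreen completed the step,
        # so the other modality can simply refresh from the shared state.
        with self._lock:
            print(f"step {self.index + 1} completed via {source}")
            self.index = min(self.index + 1, len(self.steps) - 1)

state = ProcedureState(["Confirm tag number", "Check delta P", "Change filters"])
state.advance("speech")        # operator says "Done"
state.advance("touchscreen")   # operator taps the check box instead
print("current step:", state.current_step())
```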
In one example procedure, illustrated in a chart 800 in FIG. 8, the console operator is first instructed to contact the field operator to inform him or her about the procedure that will be executed and to ask the field operator to confirm arrival at the Coker. The field operator confirms arrival and may be provided with an indication to wait for the next instruction. Waiting music may be provided, or further information regarding the procedure to be performed may be provided, in various embodiments. The console operator then performs several operations responsive to instructions provided by execution of the procedure. Steps in chart 800 are numbered in the left-most column. Once parameters reach a predetermined level, as indicated in step 5a, a beep or other communication may be provided to the field operator to get ready to perform a task. The first task for the field operator is identified in an audible instruction to verify that a furnace has tripped to natural draft. The operator may reply with a confirmatory command, such as “OK” or “Verified”.
As can be seen in this procedure, operations by the console operator and field operator transfer back and forth. In step 9, another operator may be asked to perform a task, followed by further instructions for the field operator in column 830. In a mixed manual-automated procedure, the automated control system may perform tasks as well, exchanging initiative with the field and console operators as dictated by the procedure.
A block diagram of a computer system that executes programming for performing the procedures is shown in FIG. 9.
Computer-readable instructions to execute the methods and algorithms described above may be stored on a computer-readable medium, such as illustrated at a program storage device 925, and are executable by the processing unit 902 of the computer 910. A hard drive, CD-ROM, and RAM are some examples of articles including a computer-readable medium.
The Abstract is provided to comply with 37 C.F.R. §1.72(b) and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.
This application is a continuation of U.S. patent application Ser. No. 12/658,917, filed Feb. 16, 2010, which application is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5619655 | Weng et al. | Apr 1997 | A |
5752232 | Basore et al. | May 1998 | A |
5926790 | Wright | Jul 1999 | A |
5950167 | Yaker | Sep 1999 | A |
6064323 | Ishii et al. | May 2000 | A |
6128594 | Gulli et al. | Oct 2000 | A |
6173192 | Clark | Jan 2001 | B1 |
6298324 | Zuberec et al. | Oct 2001 | B1 |
6456973 | Fado | Sep 2002 | B1 |
6477437 | Hirota | Nov 2002 | B1 |
6480765 | Gardner | Nov 2002 | B2 |
6574672 | Mitchell et al. | Jun 2003 | B1 |
6598018 | Junqua | Jul 2003 | B1 |
6708150 | Hirayama et al. | Mar 2004 | B1 |
6720890 | Ezroni et al. | Apr 2004 | B1 |
6778963 | Yamamoto et al. | Aug 2004 | B2 |
6803860 | Langner et al. | Oct 2004 | B1 |
6839670 | Stammler et al. | Jan 2005 | B1 |
6859773 | Breton | Feb 2005 | B2 |
6968311 | Knockeart et al. | Nov 2005 | B2 |
7010427 | Ebi | Mar 2006 | B2 |
7089108 | Merritt | Aug 2006 | B2 |
7113857 | Ilan et al. | Sep 2006 | B2 |
7240008 | Hitotsumatsu | Jul 2007 | B2 |
7254545 | Falcon et al. | Aug 2007 | B2 |
7277846 | Satoh | Oct 2007 | B2 |
7289890 | Mitchell et al. | Oct 2007 | B2 |
7349851 | Zuberec et al. | Mar 2008 | B2 |
7363229 | Falcon et al. | Apr 2008 | B2 |
7415326 | Komer et al. | Aug 2008 | B2 |
7516077 | Yokoi et al. | Apr 2009 | B2 |
7525448 | Wilson et al. | Apr 2009 | B1 |
7548861 | Nada | Jun 2009 | B2 |
7568662 | Conner | Aug 2009 | B1 |
7580377 | Judd | Aug 2009 | B2 |
7606715 | Krenz | Oct 2009 | B1 |
20020054130 | Abbott, III | May 2002 | A1 |
20020107694 | Lerg | Aug 2002 | A1 |
20020143553 | Migdol et al. | Oct 2002 | A1 |
20040107097 | Lenane et al. | Jun 2004 | A1 |
20040124998 | Dame | Jul 2004 | A1 |
20040267534 | Beiermeister et al. | Dec 2004 | A1 |
20050091036 | Shackleton et al. | Apr 2005 | A1 |
20070033043 | Hyakumoto | Feb 2007 | A1 |
20070038461 | Abbott et al. | Feb 2007 | A1 |
20070083300 | Mukherjee | Apr 2007 | A1 |
20070136069 | Veliu et al. | Jun 2007 | A1 |
20070215745 | Fleury et al. | Sep 2007 | A1 |
20070219805 | Nomura | Sep 2007 | A1 |
20070265849 | Grost et al. | Nov 2007 | A1 |
20070288242 | Spengler et al. | Dec 2007 | A1 |
20080004875 | Chengalvarayan | Jan 2008 | A1 |
20080010057 | Chengalvarayan et al. | Jan 2008 | A1 |
20080037727 | Sivertsen et al. | Feb 2008 | A1 |
20080039988 | Estabrook et al. | Feb 2008 | A1 |
20080046250 | Agapi et al. | Feb 2008 | A1 |
20080048908 | Sato | Feb 2008 | A1 |
20080065275 | Vizzini | Mar 2008 | A1 |
20080077404 | Akamine et al. | Mar 2008 | A1 |
20080083851 | Perry et al. | Apr 2008 | A1 |
20080103781 | Wasson et al. | May 2008 | A1 |
20080114504 | Goodman et al. | May 2008 | A1 |
20080114598 | Prieto et al. | May 2008 | A1 |
20080114603 | Desrochers | May 2008 | A1 |
20080140306 | Snodgrass et al. | Jun 2008 | A1 |
20080144638 | Bay et al. | Jun 2008 | A1 |
20080154607 | Cizio | Jun 2008 | A1 |
20080201148 | Desrochers | Aug 2008 | A1 |
20080255843 | Sun et al. | Oct 2008 | A1 |
20090012785 | Chengalvarayan | Jan 2009 | A1 |
20090018840 | Lutz et al. | Jan 2009 | A1 |
20090018842 | Caire et al. | Jan 2009 | A1 |
20090055180 | Coon et al. | Feb 2009 | A1 |
20090083034 | Hernandez et al. | Mar 2009 | A1 |
20090164216 | Chengalvarayan et al. | Jun 2009 | A1 |
20090182559 | Gerl et al. | Jul 2009 | A1 |
20090182562 | Caire et al. | Jul 2009 | A1 |
20090187406 | Sakuma et al. | Jul 2009 | A1 |
20100070932 | Hur | Mar 2010 | A1 |
20110202351 | Plocher et al. | Aug 2011 | A1 |
Number | Date | Country |
---|---|---|
1233407 | Aug 2002 | EP |
1832850 | Sep 2007 | EP |
1906539 | Apr 2008 | EP |
2040250 | Mar 2009 | EP |
WO-0235518 | May 2002 | WO |
WO-2006082764 | Aug 2006 | WO |
Entry |
---|
“U.S. Appl. No. 12/658,917, Final Office Action mailed Sep. 9, 2013”, 6 pgs. |
“U.S. Appl. No. 12/658,917, Non Final Office Action mailed Apr. 18, 2013”, 7 pgs. |
“U.S. Appl. No. 12/658,917, Notice of Allowance mailed Nov. 18, 2013”, 9 pgs. |
“U.S. Appl. No. 12/658,917, Response filed Jul. 18, 2013 to Non Final Office Action mailed Apr. 18, 2013”, 8 pgs. |
“U.S. Appl. No. 12/658,917, Response filed Nov. 11, 2013 to Final Office Action mailed Sep. 9, 2013”, 7 pgs. |
Number | Date | Country | |
---|---|---|---|
20140136442 A1 | May 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12658917 | Feb 2010 | US |
Child | 14162459 | | US