Embodiments of the disclosure generally relate to control of industrial systems, and more particularly to methods and systems for interfacing with controllers.
In industrial distributed control systems, local controllers with human-machine interfaces (HMIs) may be placed near individual subsystems, or groups thereof, to provide associated control, management, supervision, and operation functions. Examples of applications that include such controllers are those that interface with the machinery and components of power plants, factories, refineries, power distribution sites, and wind or solar farms, among others.
One class of local controllers is embodied as a touch-screen HMI that displays data and control features graphically. Because of the harsh, crowded, and tumultuous physical conditions associated with industrial environments, ruggedized HMIs are often used. These ruggedized HMIs often have impact-resistant designs with limited display areas, resulting in densely arranged graphics, data displays, and controls that are selectable and controllable by an operator. Because of this dense arrangement of display and control elements, and the physical conditions of industrial environments, controls may be mistakenly or inadvertently touched by the operator, causing inconvenience, loss, or improper conditions of the controller in many circumstances.
What are needed are devices, systems and methods that overcome challenges in the present art, some of which are described above.
Exemplified methods and systems provide a ruggedized graphical HMI having an interface that mitigates or prevents touch errors and/or inadvertent touches by requiring multiple touch inputs, at a graphical user interface of a touch-screen input device, to trigger an associated user interface command. In some embodiments, the multiple touch inputs comprise inputs at two locations, one in relative association with a displayed interface command, to trigger the command. The multiple touch inputs may be invoked via two fingers placed at the HMI by the operator, e.g., one finger landed on the touch screen and another finger tapped on the user control on the touch screen to trigger an operation associated with the user control. This may be referred to as a “Land and Tap” input. The command-invocation multiple touch inputs beneficially provide a mistake-proofing mechanism against unintended triggering of a command or operation due to an unintentional finger tap on a user control such as a button.
In some embodiments, the HMI presents a “Set Point” button for triggering the setting of a parameter value on a field device. This Set-Point button is associated with a critical operation of industrial machinery or a subsystem in an industrial control application. If the HMI display is cluttered, or densely arranged, with several user controls on one HMI screen, which often occurs due to the number of controllable inputs associated with such industrial machinery and subsystems, there is always a risk that the operator may mistakenly or inadvertently touch the Set-Point button. The exemplified “Land and Tap” input may be invoked via the thumb of the operator being placed on the screen near, but not touching, the Set-Point button, and the index finger being simultaneously placed on the Set-Point button. To this end, a single input received at the Set-Point button does not invoke or trigger the operation associated with the button.
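For illustration, a minimal sketch of this gating behavior, written in Python with hypothetical names (LandAndTapButton, set_point) that are not part of the disclosure, might read:

```python
# Minimal sketch of the "Land and Tap" gate; names are illustrative only.

class LandAndTapButton:
    """A control that fires only when a 'land' touch is held near it
    while a second 'tap' touch arrives on the control itself."""

    def __init__(self, action):
        self.action = action
        self.land_held = False  # True while a finger rests on the land area

    def on_land_down(self):
        self.land_held = True

    def on_land_up(self):
        self.land_held = False

    def on_tap(self):
        # A single tap with no concurrent land input is ignored, so an
        # accidental touch of the Set-Point button does nothing.
        if self.land_held:
            self.action()

def set_point():
    print("Set-point command issued to field device")

button = LandAndTapButton(set_point)
button.on_tap()        # ignored: no land input is held
button.on_land_down()
button.on_tap()        # fires: land and tap are concurrent
button.on_land_up()
```

The essential design point is that the tap handler consults the state of the land contact rather than firing unconditionally.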
According to an aspect, a method is disclosed of receiving multiple touch inputs, at a graphical user interface of a touch-screen input device in an industrial automation system, to trigger an associated user interface command (e.g., a graphical user interface command). The method includes presenting, by a processor, via a touch-screen display, a graphical element (e.g., an application icon or a control set-point) associated with execution of an application or a control command; and, either, i) upon receipt, via the touch-screen display, of a first input at a first position corresponding to the graphical element, determining, by the processor, receipt of a second associated touch input at a second position associated with the activation of the graphical element, or ii) upon receipt, via the touch-screen display, of the second associated touch input at the second position associated with the activation of the graphical element, determining receipt of the first input at the first position associated with the graphical element; in response to the first input and second associated touch input not being concurrently received, maintaining, by the processor, the graphical element associated with execution of the application or the control command in a non-activated state; and in response to the first input and second associated touch input being concurrently received, causing, by the processor, activation of the graphical element associated with execution of the application or the control command.
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, a second graphical element for receipt of the second associated touch input.
In some embodiments, the second associated touch input comprises a point-based input received at one or more pre-defined virtual regions (e.g., lower right or lower left of the icons, for each of right-handed and left-handed operators) located proximal to the graphical element associated with execution of the application or control command.
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, a graphical element associated with selection of a location for the pre-defined virtual region (e.g., to select left hand control or right hand control).
In some embodiments, the second associated touch input comprises a point-based input received for at least a minimum time parameter.
In some embodiments, the second associated touch input comprises a point-based input received between a minimum time parameter and a maximum time parameter, wherein receipt of an input outside the minimum and maximum time parameters is ignored as a non-activated input.
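A hedged sketch of such a hold-time window follows; the bounds and helper name are illustrative choices, not values taken from the disclosure:

```python
# Illustrative hold-time window for the land (second associated) input.
MIN_HOLD_S = 0.15   # shorter contacts are likely accidental brushes
MAX_HOLD_S = 10.0   # longer contacts are likely a resting palm or object

def is_valid_land(press_time_s: float, release_time_s: float) -> bool:
    """Return True only if the contact was held within the time window."""
    held = release_time_s - press_time_s
    return MIN_HOLD_S <= held <= MAX_HOLD_S

assert is_valid_land(0.0, 1.0)       # deliberate hold: accepted
assert not is_valid_land(0.0, 0.05)  # brief brush: ignored
assert not is_valid_land(0.0, 60.0)  # resting object: ignored
```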
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, a second graphical element for receipt of the second associated touch input at one of a lower or upper corner (e.g., lower left hand or lower right hand corner) of the touch-screen display (e.g., to require two hands—one to touch the unlock button and one to activate a command).
In some embodiments, the graphical element is displayed in a dense matrix of graphical elements.
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, an indicia (e.g., a screen color change) of the second associated touch input being received.
In some embodiments, the method includes, in response to a third touch input concurrently received with the first input and the second input, maintaining, by the processor, the graphical element associated with execution of the application or control command in the non-activated state.
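This veto rule can be sketched as a simple contact-count guard (the function name and structure are illustrative):

```python
# Any extra concurrent contact keeps the control non-activated.

def should_activate(active_contact_ids: set) -> bool:
    # Exactly two contacts (land + tap) are required; a third finger or a
    # palm edge anywhere on the screen vetoes activation.
    return len(active_contact_ids) == 2

assert should_activate({1, 2})
assert not should_activate({1, 2, 3})  # third touch present: stay inactive
```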
According to another aspect, a system is disclosed (e.g., in an industrial automation system) to trigger an associated user interface command using multiple concurrently-received touch inputs, at a graphical user interface of a touch-screen input device. The system includes a touch-screen display; a processor operatively coupled to the touch-screen display; and a memory operatively coupled to the processor, the memory having instructions stored thereon, wherein execution of the instructions causes the processor to: present, via the touch-screen display, a graphical element (e.g., an application icon or a control set-point) associated with execution of an application or a control command; either i) upon receipt, via the touch-screen display, of a first input at a first position associated with the graphical element, determine receipt of a second associated touch input at a second position associated with the activation of the graphical element, or ii) upon receipt, via the touch-screen display, of the second associated touch input at the second position associated with the activation of the graphical element, determine receipt of the first input at the first position associated with the graphical element; in response to the first input and the second associated touch input not being concurrently received, maintain the graphical element associated with execution of the application or the control command in a non-activated state; and in response to the first input and the second associated touch input being concurrently received, cause activation of the graphical element associated with execution of the application or the control command.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: present via the touch-screen display, a second graphical element for receipt of the second associated touch input.
In some embodiments, the second associated touch input comprises a point-based input received at one or more pre-defined virtual regions (e.g., lower right or lower left of the icons, for each of right-handed and left-handed operators) located proximal to the graphical element associated with execution of the application or control command.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: present, via the touch-screen display, a graphical element associated with selection of a location for the pre-defined virtual region (e.g., to select left hand control or right hand control).
In some embodiments, the second associated touch input comprises a point-based input received for at least a minimum time parameter.
In some embodiments, the second associated touch input comprises a point-based input received between a minimum time parameter and a maximum time parameter, wherein receipt of an input outside the minimum and maximum time parameters is ignored as a non-activated input.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: present, via the touch-screen display, a second graphical element for receipt of the second associated touch input at one of a lower or upper corner of the touch-screen display.
In some embodiments, the graphical element is displayed in a dense matrix of graphical elements.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: in response to a third touch input concurrently received with the first input and the second input, maintain the graphical element associated with execution of the application or control command in the non-activated state.
According to another aspect, a non-transitory computer readable medium to trigger an associated user interface command using multiple concurrently-received touch inputs, at a graphical user interface of a touch-screen input device, is disclosed. The computer readable medium has instructions stored thereon, wherein execution of the instructions causes a processor to: present, via a touch-screen display associated with a computing device, a graphical element (e.g., an application icon or a control set-point) associated with execution of an application or a control command; either i) upon receipt, via the touch-screen display, of a first input at a first position corresponding to the graphical element, determine receipt of a second associated touch input at a second position associated with the activation of the graphical element, or ii) upon receipt, via the touch-screen display, of the second associated touch input at the second position associated with the activation of the graphical element, determine receipt of the first input at the first position associated with the graphical element; in response to the first input and the second associated touch input not being concurrently received, maintain the graphical element associated with execution of the application or the control command in a non-activated state; and in response to the first input and the second associated touch input being concurrently received, cause activation of the graphical element associated with execution of the application or the control command.
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. Methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present disclosure.
As used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Ranges may be expressed herein as from “about” one particular value, and/or to “about” another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
“Optional” or “optionally” means that the subsequently described event or circumstance may or may not occur, and that the description includes instances where said event or circumstance occurs and instances where it does not.
Throughout the description and claims of this specification, the word “comprise” and variations of the word, such as “comprising” and “comprises,” mean “including but not limited to,” and are not intended to exclude, for example, other additives, components, integers or steps. “Exemplary” means “an example of” and is not intended to convey an indication of a preferred or ideal embodiment. “Such as” is not used in a restrictive sense, but for explanatory purposes.
Disclosed are components that can be used to perform the disclosed methods and systems. These and other components are disclosed herein, and it is understood that when combinations, subsets, interactions, groups, etc. of these components are disclosed that while specific reference of each various individual and collective combinations and permutation of these may not be explicitly disclosed, each is specifically contemplated and described herein, for all methods and systems. This applies to all aspects of this application including, but not limited to, steps in disclosed methods. Thus, if there are a variety of additional steps that can be performed it is understood that each of these additional steps can be performed with any specific embodiment or combination of embodiments of the disclosed methods.
It is understood that throughout this specification the identifiers “first”, “second”, “third”, and such, are used solely to aid in distinguishing the various components and steps of the disclosed subject matter. The identifiers “first”, “second”, “third”, and such, are not intended to imply any particular order, sequence, amount, preference, or importance to the components or steps modified by these terms.
The present methods and systems may be understood more readily by reference to the following detailed description of preferred embodiments and the Examples included therein and to the Figures and their previous and following description.
Because of the dense presentation of widgets on an HMI for an industrial controller, mistakes may occur when an operator touches the wrong area of the HMI. In addition, the HMI may be located in space-confined areas that increase the risk of inadvertent touching of the graphical inputs of the HMI.
To mitigate errors in input or unintentional touches, the exemplified system and method use multiple touch inputs, which may be specified in a given sequence and for a given duration, to activate an operation associated with a widget presented on a control screen of the HMI.
In some embodiments, the widget associated with a control function is associated with a land area. To this end, the HMI activates the widget when a first input is received at the widget (e.g., a button) and a second input is received at the HMI at a second location that enables the operation of the widget. In other embodiments, the widget is associated with a tap area. To this end, the HMI activates the widget when a first input is received at a land area associated with enabling the operation of a tap area, which is associated with the widget.
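One plausible way to realize the land/tap association is hit-testing against rectangular screen regions; the Rect type and layout values below are invented for illustration:

```python
# Sketch: map a touch point to the widget's tap area or its land area.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

# Tap area on the widget itself; land area offset to its lower right so a
# thumb can rest beside the control without touching it.
tap_area = Rect(100, 100, 80, 40)
land_area = Rect(190, 150, 60, 60)

def classify(px: float, py: float) -> str:
    if tap_area.contains(px, py):
        return "tap"
    if land_area.contains(px, py):
        return "land"
    return "none"

print(classify(120, 110))  # 'tap'
print(classify(200, 160))  # 'land'
```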
In some embodiments, the HMI may present a visual indicator to the operator that the control widget has been touched and activated. In some embodiments, the screen may change color, or the HMI may generate a sound, or provide other visual, tactile (e.g., vibration), or acoustic notification.
In some embodiments, the land area and tap area may have the same spatial size. In other embodiments, the tap area may have an area smaller than the land area. In another embodiment, the tap area may have an area larger than the land area. In some embodiments, the tap area may change based on a failed attempt, and/or instructions regarding the tap area may be presented to the operator.
In some embodiments, the land area corresponds in spatial size to a presented widget associated with a control function.
In other embodiments, the area to receive the second input (i.e., the tap area) is not presented on the HMI.
In other embodiments, the HMI provides feedback to a user that the user has landed on a land area (e.g., a sound, a change in screen color, a touch area highlighted, etc.).
In some embodiments, the land area may be enabled via touch gestures besides a tap. For example, the land area may be touched to activate gesture control of virtual knobs or sliders that may require movements of the finger on the touch screen (i.e., movement other than a tap).
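A sketch of this land-to-arm gesture variant appears below; the class name and the assumed scaling of 200 pixels per full-scale travel are illustrative:

```python
# Sketch: drag gestures adjust a virtual slider only while a land touch
# is held, so incidental swipes across the screen change nothing.

class GatedSlider:
    def __init__(self):
        self.value = 0.0
        self.land_held = False

    def on_land(self, held: bool):
        self.land_held = held

    def on_drag(self, dy_pixels: float):
        # Upward drag (negative dy) raises the value, clamped to [0, 1].
        if self.land_held:
            self.value = min(1.0, max(0.0, self.value - dy_pixels / 200.0))

slider = GatedSlider()
slider.on_drag(-50)   # ignored: not armed by a land touch
slider.on_land(True)
slider.on_drag(-50)   # armed: value rises to 0.25
print(slider.value)
```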
Thus, to activate the control widget, the tap input (shown as input 602c) must occur entirely within the duration of the land input 606, with the land input 606 received prior to the tap input 602c.
In some embodiments, the land input 606 may be rejected if the duration time for the input exceeds a pre-defined maximum time value. The maximum time value may be modified via a configuration panel of the HMI. In some embodiments, the maximum time value may be between 10 and 30 seconds. In some embodiments, the maximum time value may be between 5 and 10 seconds.
In some embodiments, the HMI may only cause activation of the control widget if the tap input 602c is received within a predefined time (shown as time 605) from the contact time 601 of the land input 606. In some embodiments, this activation time is between 1 and 5 seconds.
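Combining the ordering and timing rules of this and the preceding paragraphs, a sketch of the acceptance test might be as follows; the constants are illustrative values chosen within the ranges noted above:

```python
# Sketch: the land contact (cf. 606, contact time 601) must precede the
# tap (cf. 602c), and the tap must arrive within an activation window.
ACTIVATION_WINDOW_S = 3.0   # within the 1-5 second range noted above
MAX_LAND_HOLD_S = 10.0      # within the ranges noted above

def tap_activates(land_down_s: float, land_up_s: float, tap_s: float) -> bool:
    if land_up_s - land_down_s > MAX_LAND_HOLD_S:
        return False                      # land held too long: rejected
    if not (land_down_s < tap_s <= land_up_s):
        return False                      # tap must occur while landed
    return tap_s - land_down_s <= ACTIVATION_WINDOW_S

assert tap_activates(0.0, 4.0, 1.5)       # land, then tap inside window
assert not tap_activates(0.0, 4.0, 3.5)   # tap after window closes
assert not tap_activates(2.0, 4.0, 1.0)   # tap preceded the land input
```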
In some embodiments, the GUI receives input via a touch class, e.g., classes of the System.Windows.Input namespace in PresentationCore.dll (for Windows). In some embodiments, the GUI receives input via the libinput library on Linux. In some embodiments, the GUI may operate in conjunction with a multi-touch gesture program, such as Touchegg or other multi-touch gesture programs, that runs in the background as a user-level process and adds multi-touch support to the window manager.
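Since the platform APIs named above differ in shape, an implementation would typically normalize raw touch events before applying the land-and-tap gate. The event tuple, region labels, and dispatcher below are assumptions for illustration, not any real library's API:

```python
# Sketch: normalized touch events, already hit-tested into regions, drive
# the gate regardless of which platform produced them.
from typing import Callable, NamedTuple

class TouchEvent(NamedTuple):
    finger_id: int
    phase: str    # 'down' or 'up'
    region: str   # 'land', 'tap', or 'none', after hit-testing

def run(events, on_activate: Callable[[], None]) -> None:
    land_fingers: set = set()
    for ev in events:
        if ev.region == "land":
            if ev.phase == "down":
                land_fingers.add(ev.finger_id)
            else:
                land_fingers.discard(ev.finger_id)
        elif ev.region == "tap" and ev.phase == "down" and land_fingers:
            on_activate()

# Synthetic stream standing in for events from libinput, Touchegg, or
# System.Windows.Input: land with finger 1, then tap with finger 2.
stream = [
    TouchEvent(1, "down", "land"),
    TouchEvent(2, "down", "tap"),
    TouchEvent(2, "up", "tap"),
    TouchEvent(1, "up", "land"),
]
run(stream, lambda: print("widget activated"))
```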
Example HMI
Processor 721 may include one or more processors, each configured to execute instructions and process data to perform one or more functions associated with HMI controller 720. Processor 721 may be communicatively coupled to RAM 722, ROM 723, storage 724, database 725, I/O devices 726, and interface 727. Processor 721 may be configured to execute sequences of computer program instructions to perform various processes. The computer program instructions may be loaded into RAM 722 for execution by processor 721. As used herein, processor refers to a physical hardware device that executes encoded instructions for performing functions on inputs and creating outputs.
RAM 722 and ROM 723 may each include one or more devices for storing information associated with operation of processor 721. For example, ROM 723 may include a memory device configured to access and store information associated with HMI controller 720, including information for identifying, initializing, and monitoring the operation of one or more components and subsystems. RAM 722 may include a memory device for storing data associated with one or more operations of processor 721. For example, ROM 723 may load instructions into RAM 722 for execution by processor 721.
Storage 724 may include any type of mass storage device configured to store information that processor 721 may need to perform processes consistent with the disclosed embodiments. For example, storage 724 may include one or more magnetic and/or optical disk devices, such as hard drives, CD-ROMs, DVD-ROMs, or any other type of mass media device.
Database 725 may include one or more software and/or hardware components that cooperate to store, organize, sort, filter, and/or arrange data used by HMI controller 720 and/or processor 721. For example, database 725 may store hardware and/or software configuration data associated with input-output hardware devices and controllers, as described herein. It is contemplated that database 725 may store additional and/or different information than that listed above.
I/O devices 726 may include one or more components configured to communicate information with a user associated with HMI controller 720. For example, I/O devices may include a console with an integrated keyboard and mouse to allow a user to maintain a database of images, update associations, and access digital content. I/O devices 726 may also include a display including a graphical user interface (GUI) for outputting information on a monitor. I/O devices 726 may also include peripheral devices such as, for example, a printer for printing information associated with HMI controller 720, a user-accessible disk drive (e.g., a USB port, a floppy, CD-ROM, or DVD-ROM drive, etc.) to allow a user to input data stored on a portable media device, a microphone, a speaker system, or any other suitable type of interface device.
Interface 727 may include one or more components configured to transmit and receive data via a communication network, such as the Internet, a local area network, a workstation peer-to-peer network, a direct link network, a wireless network, or any other suitable communication platform. For example, interface 727 may include one or more modulators, demodulators, multiplexers, demultiplexers, network communication devices, wireless devices, antennas, modems, and any other type of device configured to enable data communication via a communication network.