Electronic devices (such as mobile phones) may generally include one or more applications (such as a gaming application, a banking application, etc.) to perform one or more functions (such as games, banking activities, etc.). The one or more functions may correspond to, or may be accessed by, one or more user interface elements (such as icons) associated with the electronic device. In some cases, the electronic device may display such user interface elements so that a user may manually select a function of an application of interest from the displayed user interface elements. At times, it may be difficult for the user to identify the location of the displayed user interface element that corresponds to the function, or to quickly access the function or the related application. Further, it may be time consuming for the user to manually search the displayed user interface elements for the one that corresponds to the function.
An exemplary aspect of the disclosure provides an electronic device. The electronic device may include a memory which may store gesture mapping information that may indicate an association between a plurality of gestures and a plurality of functions. The plurality of functions may be associated with a plurality of applications configured in the electronic device. The electronic device may further include a processor that may be communicably coupled with the memory. The processor may receive first information about a first gesture of the plurality of gestures, from a first user interface of the electronic device. The processor may further determine a first set of functions that may be related to a first set of applications of the plurality of applications based on the stored gesture mapping information and the received first information. The first set of functions may be associated with the first gesture. The processor may further receive second information about a second gesture of the plurality of gestures, from the first user interface. The second information about the second gesture may be received within a predefined time from the receipt of the first information and may be associated with a first function of the first set of functions. The processor may further control an execution of the first function associated with the second gesture.
Another exemplary aspect of the disclosure provides a method for a gesture-based application control executed by an electronic device. The method may include storing gesture mapping information that may indicate an association between a plurality of gestures and a plurality of functions. The plurality of functions may be associated with a plurality of applications that may be configured in the electronic device. The method may further include receiving first information about a first gesture of the plurality of gestures, from a first user interface of the electronic device. The method may further include determining a first set of functions, which may be related to a first set of applications of the plurality of applications, based on the stored gesture mapping information and the received first information. The first set of functions may be associated with the first gesture. The method may further include receiving second information about a second gesture of the plurality of gestures from the first user interface. The second information about the second gesture may be received within a predefined time from the receipt of the first information and may be associated with a first function of the first set of functions. The method may further include controlling an execution of the first function associated with the second gesture.
Another exemplary aspect of the disclosure provides a non-transitory computer-readable medium. The non-transitory computer-readable medium may store thereon computer-executable instructions which, when executed by an electronic device, cause the electronic device to execute operations. The operations may include storing gesture mapping information that may indicate an association between a plurality of gestures and a plurality of functions. The plurality of functions may be associated with a plurality of applications that may be configured in the electronic device. The operations may further include receiving first information about a first gesture of the plurality of gestures, from a first user interface of the electronic device. The operations may further include determining a first set of functions, which may be related to a first set of applications of the plurality of applications, based on the stored gesture mapping information and the received first information. The first set of functions may be associated with the first gesture. The operations may further include receiving second information about a second gesture of the plurality of gestures, from the first user interface. The second information about the second gesture may be received within a predefined time from the receipt of the first information and may be associated with a first function of the first set of functions. The operations may further include controlling an execution of the first function associated with the second gesture.
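By way of illustration only, the two-stage flow summarized above may be expressed as a short Kotlin sketch. The names below (FunctionRef, GestureMapper, the gesture identifiers, and the three-second window) are illustrative assumptions and do not correspond to any element of the disclosure:

```kotlin
// Illustrative sketch only: a two-stage, gesture-based control flow.
// FunctionRef and GestureMapper are hypothetical names, not disclosed elements.

data class FunctionRef(val appId: String, val functionId: String, val action: () -> Unit)

class GestureMapper(
    // Gesture mapping information: an association between gestures and functions.
    private val mapping: Map<String, List<FunctionRef>>,
    private val predefinedTimeMs: Long = 3_000 // assumed window length
) {
    private var firstSet: List<FunctionRef> = emptyList()
    private var firstGestureAt: Long = 0L

    // First gesture: determine the first set of functions associated with it.
    fun onFirstGesture(gestureId: String): List<FunctionRef> {
        firstSet = mapping[gestureId].orEmpty()
        firstGestureAt = System.currentTimeMillis()
        return firstSet // e.g., displayed as a set of user interface icons
    }

    // Second gesture: honored only within the predefined time from the first.
    fun onSecondGesture(gestureId: String): Boolean {
        if (System.currentTimeMillis() - firstGestureAt > predefinedTimeMs) return false
        val first = firstSet.firstOrNull { it.functionId == gestureId } ?: return false
        first.action() // control the execution of the first function
        return true
    }
}
```

For instance, a first gesture recognized as the character "M" might surface the functions of a music application, after which a second gesture received within the predefined time executes the selected function.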
The foregoing summary, as well as the following detailed description of the present disclosure, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the preferred embodiment are shown in the drawings. However, the present disclosure is not limited to the specific methods and structures disclosed herein. The description of a method step or a structure referenced by a numeral in a drawing is applicable to the description of that method step or structure shown by that same numeral in any subsequent drawing herein.
The following described implementations may be found in a disclosed electronic device (such as a mobile phone or a vehicle infotainment system). Exemplary aspects of the disclosure may provide the electronic device that may be configured to perform a gesture-based application control. The gesture-based application control may relate to a control of an application (such as, but not limited to, a location tracking application, or other applications) configured or installed in the electronic device, based on a gesture (such as a touch gesture) and gesture mapping information. The gesture mapping information may indicate an association between a plurality of gestures and a plurality of functions that may be associated with a plurality of applications of the electronic device.
For example, if a user or a user device (such as a touch pen or stylus) selects or accesses a first set of functions related to a first set of applications of interest, using a first gesture of the plurality of gestures from a first user interface (such as a touch screen) of the electronic device, the electronic device may receive first information about the first gesture. The first gesture may be a touch gesture that creates a pattern of a character, a symbol, or a shape. Based on the received first information and the gesture mapping information, the electronic device may determine the first set of functions related to the first set of applications of interest and control the execution of the first set of functions based on the determination. Hence, such gesture-based application control may improve the time efficiency of accessing the required set of functions, as compared to traditional methods of application control (such as manually identifying the location of an icon associated with an application configured in the electronic device, hovering over the icon, and selecting the icon to access the application). Details of such gesture-based application control of the electronic device are further described, for example, in FIGS. 3A-3C, 4A-4C, 5A-5D, 6A-6C, 7A-7D, and 8A-8D.
In an embodiment, the electronic device may also receive second information about a second gesture (such as a touch gesture or a motion gesture) of the plurality of gestures from the first user interface (such as the touch screen or a motion interface). The second information may be associated with a first function of the first set of functions of the first set of applications of interest. In a scenario, the first set of functions may be associated with a single application of the plurality of applications. In such a scenario, the first function determined based on the second gesture and the first set of functions determined based on the first gesture may be part of the single application. Hence, the electronic device may also facilitate a secondary gesture-based application control to directly access the first function associated with the application of interest and may further improve the time efficiency of accessing the first function, as compared to the traditional methods of application control. Details of such secondary gesture-based application control of the electronic device are further described, for example, in FIGS. 3A-3C, 4A-4C, 5A-5D, 6A-6C, 7A-7D, and 8A-8D.
In another scenario, the first set of functions may be associated with multiple applications of the plurality of applications. In one example, at least one function associated with the first gesture may relate to a first application. Further, the first function associated with the second gesture may relate to a second application different from the first application. In such a scenario, as the first function is not related to the first application, the electronic device may be configured to search outside the first application and identify or access the corresponding first function in the second application, and thereby reduce the time that may have been taken to switch between different applications. Thus, the electronic device may also be configured to perform a global search on the stored gesture mapping information based on the received second information about the second gesture, in addition to the received first information about the first gesture. Details of such secondary gesture-based application controls to swiftly switch between applications of the electronic device are further described, for example, in
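A minimal sketch of such a global-search fallback, reusing the hypothetical FunctionRef type from the earlier sketch, may look as follows:

```kotlin
// Illustrative sketch only: prefer a match within the first set of functions,
// otherwise fall back to a global search across all mapped applications.
fun resolveSecondGesture(
    mapping: Map<String, List<FunctionRef>>, // full gesture mapping information
    firstSet: List<FunctionRef>,             // functions surfaced by the first gesture
    secondGestureId: String
): FunctionRef? {
    // Prefer a function already surfaced by the first gesture.
    firstSet.firstOrNull { it.functionId == secondGestureId }?.let { return it }
    // Otherwise search globally, enabling a direct switch to another application.
    return mapping.values.flatten().firstOrNull { it.functionId == secondGestureId }
}
```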
In yet another scenario, the second information about the second gesture may be received based on a trigger of a user interface element (such as a button) that may be associated with the first user interface (such as the touch screen). For example, the second information about the second gesture may be received from a portion (such as a pop-up window) that may overlay the first user interface based on the trigger of the user interface element. Based on user requirements, the overlaid portion may have an opaque or a transparent background when the portion overlays the first user interface. Therefore, the overlaid portion may enhance the user experience during the receipt of the second information about the second gesture. Details of such secondary gesture-based application control of the electronic device are further described, for example, in
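On an Android implementation of the electronic device, such an overlaid portion could, for example, be realized with a PopupWindow. The sketch below is illustrative only; the content and anchor views are assumed to be supplied by the caller:

```kotlin
import android.graphics.Color
import android.graphics.drawable.ColorDrawable
import android.view.Gravity
import android.view.View
import android.view.ViewGroup
import android.widget.PopupWindow

// Illustrative Android sketch only: an overlaid portion for the second gesture.
fun showGestureOverlay(content: View, anchor: View, transparent: Boolean): PopupWindow {
    val overlay = PopupWindow(
        content,                               // portion that captures the second gesture
        ViewGroup.LayoutParams.MATCH_PARENT,
        ViewGroup.LayoutParams.MATCH_PARENT,
        true                                   // focusable, so the portion receives input
    )
    // A transparent background keeps the first user interface visible underneath;
    // an opaque background may be chosen instead, based on user requirements.
    overlay.setBackgroundDrawable(ColorDrawable(if (transparent) Color.TRANSPARENT else Color.WHITE))
    overlay.showAtLocation(anchor, Gravity.CENTER, 0, 0)
    return overlay
}
```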
Reference will now be made in detail to specific aspects or features, examples of which are illustrated in the accompanying drawings. Wherever possible, corresponding or similar reference numbers will be used throughout the drawings to refer to the same or corresponding parts.
The network environment 100 may be an exemplary representation of components that may be associated with the electronic device 102. In an embodiment, the network environment 100 may include more or fewer elements than those illustrated and described in the present disclosure. For example, the network environment 100 may not include the server 114, without deviation from the scope of the disclosure. In such a scenario, the gesture mapping information may be directly stored (as described in
The electronic device 102 may include suitable logic, circuitry, and interfaces that may be configured to receive information about at least one gesture of the plurality of gestures 106 from the first user interface 104 of the electronic device 102. The electronic device 102 may further determine at least one function of the one or more functions 112 associated with the plurality of applications 110, based on the pre-stored gesture mapping information and the received information. The electronic device 102 may be further configured to perform a control of the determined function of the one or more functions 112, based on the determination. The electronic device 102 may be further configured to display the one or more user interface elements (or other information) related to the one or more functions 112, based on the control of the at least one function of the one or more functions 112. Examples of the electronic device 102 may include, but are not limited to, a computing device, a mainframe machine, a server, a computer workstation, and/or a consumer electronic (CE) device.
In an embodiment, the electronic device 102 may be implemented as a mobile device. In such an implementation, the electronic device 102 may include both the first user interface 104 and the second user interface 108. The mobile device may include suitable logic, circuitry, interfaces, and/or code that may be configured to present at least a user interface to receive the information about at least one gesture of the plurality of gestures 106 and control at least one function of the plurality of applications 110, based on the received information about the at least one gesture. In one embodiment, the mobile device may include the functionality of the server 114, at least partially or in its entirety, without a deviation from the scope of the present disclosure. Examples of the mobile device may include, but are not limited to, a computing device, a smartphone, a cellular phone, a mobile phone, a gaming device, a camera device, and other portable devices.
In another embodiment, the electronic device 102 may be implemented as an in-vehicle infotainment system that may be integrated with a vehicle. The in-vehicle infotainment system may include suitable logic, circuitry, and/or interfaces that may be configured to present at least a user interface to receive the information about at least one gesture of the plurality of gestures 106 and control at least one function of the plurality of applications 110, based on the received information about the at least one gesture. In one embodiment, the in-vehicle infotainment system may include the functionality of the server 114, at least partially or in its entirety, without a deviation from the scope of the present disclosure. Examples of the in-vehicle infotainment system may include, but are not limited to, an entertainment system, a navigation system, a vehicle user interface system, an Internet-enabled communication system, an in-car entertainment (ICE) system, an automotive Head-up Display (HUD), an automotive dashboard, a human-machine interface (HMI), and other entertainment systems. In an embodiment, the in-vehicle infotainment system may receive the information about at least one gesture of the plurality of gestures, from the first user interface 104.
The first user interface 104 may include suitable logic, circuitry, and interfaces that may be configured to receive user inputs (for example, a touch-based gesture). The first user interface 104 may be a touch screen, which may be configured to receive the user inputs related to the plurality of gestures 106. For example, the first user interface 104 (such as the touch screen or a touch pad) may be coupled with the electronic device 102 (such as the mobile phone or the in-vehicle infotainment system) to receive the user inputs related to the plurality of gestures 106. In one embodiment, the first user interface 104 may include the functionality of the second user interface 108 (i.e. a display screen), at least partially or in its entirety, without a deviation from the scope of the present disclosure. Examples of the first user interface 104 may include, but are not limited to, a resistive touchscreen, a capacitive touchscreen, a conductive touchscreen, a tactile touchscreen, a wireless touchscreen (such as an external touchpad), and the like. In an embodiment, the electronic device 102 may control the first user interface 104 to receive first information about a first gesture 106A of the plurality of gestures 106, from the first user interface 104. In an additional embodiment, the electronic device 102 may control the first user interface 104 to receive second information about a second gesture 106B of the plurality of gestures 106, from the first user interface 104.
The plurality of gestures 106 may correspond to the user inputs provided on the first user interface 104 to control the one or more functions 112 associated with the plurality of applications 110. In some embodiments, the plurality of gestures 106 may be provided on one or more user interface elements (i.e. rendered on or controlled via the first user interface 104) to control or access the one or more functions 112 associated with the plurality of applications 110. In an embodiment, the plurality of gestures 106 may include the first gesture 106A, a second gesture 106B, and an Nth gesture 106N, as shown in
In another scenario, the plurality of gestures 106 may also relate to a body gesture. The body gesture may include, but is not limited to, a facial gesture or a hand gesture. In an embodiment, the electronic device 102 may include an image capturing device (not shown) that may be configured to capture a plurality of images of a user 118 over a specified time period. The captured plurality of images may be utilized to determine the body gesture associated with the user 118. The body gesture may indicate one or more motions or positions of the user 118 (such as of a hand, an eye, a mouth, a head, a nose, or eyebrows of the user 118). Based on the body gesture, the electronic device 102 may be configured to determine at least one function associated with the plurality of applications 110 and display information of the one or more user interface elements related to the determined function.
The second user interface 108 may include suitable logic, circuitry, and interfaces that may be configured to display the information of the one or more user interface elements related to the one or more functions 112 associated with the plurality of applications 110. For example, the second user interface 108 (such as a display screen) may be coupled with the electronic device 102 (such as the mobile device or the in-vehicle infotainment system) to display the information of the one or more user interface elements related to the one or more functions 112, based on the user inputs received from the first user interface 104. In one embodiment, the second user interface 108 may include the functionality of the first user interface 104, at least partially or in its entirety, without a deviation from the scope of the present disclosure. Examples of the second user interface 108 may include, but are not limited to, at least one of a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, a plasma display, an Organic LED (OLED) display, or other display devices. In accordance with an embodiment, the second user interface 108 may refer to a display screen of a head mounted device (HMD), a smart-glass device, a see-through display, a projection-based display, an electro-chromic display, or a transparent display screen. In some embodiments, the display functionality of the second user interface 108 may be integrated in the touch-based interface of the first user interface 104. In such an embodiment, the first user interface 104 may be a touchscreen that receives the gesture-based user inputs and displays the information of the one or more user interface elements related to the one or more functions 112.
The plurality of applications 110 may be a program or a group of programs that may be stored or installed in the electronic device 102. Each application of the plurality of applications 110 may include one or more functions 112. In an embodiment, the electronic device 102 may be configured to perform one or more instructions associated with each function of the one or more functions 112. In one example, the plurality of applications 110 may be specific to a hardware capability or a software capability (such as an operating system) of the mobile device or the in-vehicle infotainment system. The plurality of applications 110 may be executed on the operating system associated with the electronic device 102. In another embodiment, the plurality of applications 110 may be executed on any platform, which may be different from that of the electronic device 102. For example, one or more of the plurality of applications 110 may be executed or operational on the server 114 communicably coupled with the electronic device 102. The electronic device 102 may transmit information about the received user inputs (i.e. the plurality of gestures 106) to the server 114 and further receive information about the one or more functions 112 of the related applications from the server 114. In an embodiment, the plurality of applications 110 may include a first application 110A, a second application 110B, and an Nth application 110N. The number of applications shown in
Examples of the plurality of applications 110 may include, but are not limited to, a web browser, a vehicle related application, an email application, a chatting application, a social networking application, an audio-video communication application, a camera application, a financial application, a media player application, a file/document viewer application, a content sharing application, a simulator application, a gaming application, a photo editor application, a location tracking application, a navigation application, an entertainment application, a health-related application, a sports-related application, an educational application, a text processing application, a data processing application, an accounting application, a customer-care application, a transport application, a service provider-based application, and other applications that may be executed or configured on the electronic device 102.
The one or more functions 112 of each application of the plurality of applications 110 may relate to one or more instructions associated with that application. A function of the one or more functions 112 may be a feature, a functionality, a program, a module, a component, or a part of the application. In an embodiment, the electronic device 102 may be configured to execute the one or more instructions associated with each function to perform the corresponding function. In one example, the one or more functions 112 may be specific to the mobile device as the electronic device 102. In another example, the one or more functions 112 may be specific to the in-vehicle infotainment system as the electronic device 102. In an embodiment, the one or more functions 112 may be specific to a hardware capability or a software capability of the electronic device 102. In some embodiments, the one or more functions 112 may be executed on the server 114 and information about the one or more functions 112 may be rendered on the electronic device 102. For example, if an application of the plurality of applications 110 is a vehicle maintenance application, then the one or more functions 112 associated with such a vehicle maintenance application may include, but are not limited to, a vehicle service booking function, a notification feature, a payment feature, a vehicle telematics data access function, an audio-video communication feature, a vehicle door access feature, and other functions that may be related to the determined application from the plurality of applications 110. In another example, if the application is a social networking application, the one or more functions 112 may include, but are not limited to, a message transmission function, a search function, a tagging function, a document uploading function, a group formation function, a comment function, a like/dislike function, a security function, or other profile-based functions.
The server 114 may include suitable logic, circuitry, interfaces, and/or code that may be configured to collaborate with the electronic device 102, so that the electronic device 102 may retrieve the gesture mapping information from the server 114, based on gesture-based user inputs received on the first user interface 104. For example, the electronic device 102 may determine a first set of functions, related to a first set of applications of the plurality of applications 110, based on the gesture mapping information that may be stored in the server 114 and based on the received first information about the first gesture 106A. In an embodiment, the first set of functions may be associated with the first gesture 106A. In some embodiments, the server 114 may execute one or more functions 112 related to the first gesture 106A and provide information about the executed one or more functions 112 to the electronic device 102. In some embodiments, the server 114 may store one or more applications of the plurality of applications 110 and execute one or more functions 112 related to the stored applications, based on a receipt of the first information about the first gesture 106A or the second information about the second gesture 106B from the electronic device 102. In an example, the server 114 may store one or more webpages related to different applications executed on the server 114 or executed directly on the electronic device 102. Based on the received first information or the second information about the first gesture 106A and the second gesture 106B, respectively, the server 114 may provide the associated one or more webpages to the electronic device 102 to be presented to the user 118.
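By way of illustration only, retrieval of the gesture mapping information from a server such as the server 114 could be sketched as a simple HTTP fetch. The "/gesture-mapping" path and the plain-text payload are assumptions for the sketch and are not part of the disclosure:

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Illustrative sketch only: fetch gesture mapping information from a server.
// The endpoint path and payload format are assumed for this example.
fun fetchGestureMapping(baseUrl: String): String {
    val conn = URL("$baseUrl/gesture-mapping").openConnection() as HttpURLConnection
    return try {
        conn.requestMethod = "GET"
        conn.inputStream.bufferedReader().use { it.readText() } // e.g., a serialized map
    } finally {
        conn.disconnect()
    }
}
```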
In an embodiment, the server 114 may be a cloud server, which may be utilized to execute various operations through web applications, cloud applications, HTTP requests, repository operations, file transfer, and the like. Examples of the server 114 may include, but are not limited to, an event server, a database server, a file server, a web server, a media server, a content server, an application server, a mainframe server, or a combination thereof. In one or more embodiments, the server 114 may be implemented as a plurality of distributed cloud-based resources. In a specific embodiment, the server 114 may be configured to communicate with at least one electronic device (such as, the electronic device 102), via the communication network 116.
The communication network 116 may include a communication medium through which the electronic device 102 and the server 114 may communicate with each other. The communication network 116 may be one of a wired connection or a wireless connection. Examples of the communication network 116 may include, but are not limited to, the Internet, a cloud network, a Wireless Fidelity (Wi-Fi) network, a Personal Area Network (PAN), a Local Area Network (LAN), or a Metropolitan Area Network (MAN). Various devices in the network environment 100 may be configured to connect to the communication network 116 in accordance with various wired and wireless communication protocols. Examples of such wired and wireless communication protocols may include, but are not limited to, at least one of a Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, IEEE 802.11, light fidelity (Li-Fi), 802.16, IEEE 802.11s, IEEE 802.11g, multi-hop communication, wireless access point (AP), device to device communication, cellular communication protocols, and Bluetooth (BT) communication protocols.
In some embodiments, the communication network 116 may be an in-vehicle network. The in-vehicle network may include a medium through which various control units, components, and/or systems (for example, the electronic device 102, the first user interface 104, and the second user interface 108) of a vehicle (not shown) may communicate with each other. In accordance with an embodiment, in-vehicle communication of audio/video data may occur by use of a Media Oriented Systems Transport (MOST) multimedia network protocol of the in-vehicle network or other suitable network protocols for vehicle communication. The MOST-based network may be a separate network from the controller area network (CAN). The MOST-based network may use a plastic optical fiber (POF) medium. In accordance with an embodiment, the MOST-based network, the CAN, and other in-vehicle networks may co-exist in the vehicle. The in-vehicle network may facilitate access control and/or communication between the control circuitry and other ECUs, such as an engine control module (ECM) or a telematics control unit (TCU) of the vehicle. Various devices or components in the vehicle may connect to the in-vehicle network, in accordance with various wired and wireless communication protocols. Examples of the wired and wireless communication protocols for the in-vehicle network may include, but are not limited to, a vehicle area network (VAN), a CAN bus, Domestic Digital Bus (D2B), Time-Triggered Protocol (TTP), FlexRay, IEEE 1394, Carrier Sense Multiple Access With Collision Detection (CSMA/CD) based data communication protocol, Inter-Integrated Circuit (I2C), Inter Equipment Bus (IEBus), Society of Automotive Engineers (SAE) J1708, SAE J1939, International Organization for Standardization (ISO) 11992, ISO 11783, Media Oriented Systems Transport (MOST), MOST25, MOST50, MOST150, Plastic optical fiber (POF), Power-line communication (PLC), Serial Peripheral Interface (SPI) bus, and/or Local Interconnect Network (LIN).
In operation, the electronic device 102 may be operated to access or to control at least one function of an application of the plurality of applications 110. The electronic device 102 may be configured to receive the first information about the first gesture 106A (such as the touch gesture) that may be received on the first user interface 104 (such as a touch screen). The first gesture 106A may be received from a finger of the user 118 or from the user device (such as a touch stylus, not shown). The first gesture 106A may be a touch gesture or a touch pattern related to a character, a symbol, or a custom shape, drawn by the user 118 on the first user interface 104. Details of the first gesture 106A are provided, for example, in
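Recognition of such a drawn touch pattern may, for example, be approximated by template matching over the captured touch points. The following sketch is a deliberately simplified, illustrative approach (production recognizers, such as those of the "$1" family, additionally resample, scale, and rotate strokes); it assumes non-empty strokes of comparable point counts, and the template identifiers are hypothetical:

```kotlin
import kotlin.math.hypot

// Illustrative sketch only: match a drawn stroke (character, symbol, or custom
// shape) against stored templates by average point-to-point distance.
data class TouchPoint(val x: Float, val y: Float)

fun normalize(stroke: List<TouchPoint>): List<TouchPoint> {
    val minX = stroke.minOf { it.x }
    val minY = stroke.minOf { it.y }
    val w = (stroke.maxOf { it.x } - minX).coerceAtLeast(1e-6f)
    val h = (stroke.maxOf { it.y } - minY).coerceAtLeast(1e-6f)
    return stroke.map { TouchPoint((it.x - minX) / w, (it.y - minY) / h) }
}

// Returns the identifier of the closest stored template, e.g. "M".
fun recognize(stroke: List<TouchPoint>, templates: Map<String, List<TouchPoint>>): String? {
    val drawn = normalize(stroke)
    return templates.minByOrNull { (_, template) ->
        val t = normalize(template)
        val n = minOf(drawn.size, t.size)
        (0 until n).sumOf { i ->
            hypot((drawn[i].x - t[i].x).toDouble(), (drawn[i].y - t[i].y).toDouble())
        } / n
    }?.key
}
```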
Based on the received first information and the gesture mapping information (which may be stored in the electronic device 102 or the server 114), the electronic device 102 may determine the first set of functions (including one or more functions 112) that may be related to the first set of applications of the plurality of applications 110. The gesture mapping information may indicate an association or relationship between the plurality of gestures 106 (including the first gesture 106A) and a plurality of functions associated with the plurality of applications 110 (i.e. including the first set of applications) that may be configured/installed on the electronic device 102 or on the server 114. The electronic device 102 may further control the execution of the first set of functions based on the determination. For example, the execution may include, but is not limited to, display of icons or user interface elements associated with the first set of functions determined for the received first information about the first gesture 106A. The details of the first set of functions are provided, for example, at
In an embodiment, the electronic device 102 may also receive the second information about the second gesture 106B (such as a touch gesture or a motion gesture), from the first user interface 104 (such as the touch screen or a motion interface associated with the first user interface 104). In an embodiment, the second information about the second gesture 106B may be received based on the receipt of the first information about the first gesture 106A. For example, the second information about the second gesture 106B may be received within a predefined time (for example, within a few milliseconds or seconds) from the receipt of the first information. The details about the second gesture 106B are provided, for example, in
In another scenario, the first set of functions may be associated with multiple and different applications (such as the first application 110A and the second application 110B) of the plurality of applications 110. In one example, at least one second function (i.e. different from the first function) of the first set of functions (i.e. associated with the first gesture 106A) may be related to the first application 110A of the first set of applications. Further, the first function associated with the second gesture 106B may relate to the second application 110B of the first set of applications, where the second application 110B is different from the first application 110A. In such a scenario, as the first function (i.e. determined based on the second gesture 106B) is not related to the first application 110A, the electronic device 102 may be configured to search outside the first application 110A and identify the corresponding first function in the second application 110B, and thereby reduce the time that may have been taken to switch between applications (i.e. from the first application 110A to the second application 110B) of the plurality of applications 110. Thus, the electronic device 102 may also perform a global search (i.e. in all applications configured in the electronic device 102) based on the stored gesture mapping information and based on the combination of the second information about the second gesture 106B and the received first information about the first gesture 106A. Details of such secondary gesture-based application control to swiftly switch between applications of the electronic device 102 are further described, for example, in
The processor 202 may include suitable logic, circuitry, and/or interfaces that may be configured to execute program instructions associated with different operations to be executed by the electronic device 102. For example, some of the operations may include, but are not limited to, reception of the first information about the first gesture 106A of the plurality of gestures 106 from the first user interface 104 of the electronic device 102, determination of the first set of functions related to the first set of applications (such as the first application 110A and the second application 110B) of the plurality of applications 110, reception of the second information about the second gesture 106B of the plurality of gestures 106 from the first user interface 104, and a control of the execution of the first function (i.e. one of the first set of functions) associated with the second gesture 106B of the plurality of gestures 106. The execution of operations may be further described, for example, in
The processor 202 may include any suitable special-purpose or general-purpose computer, computing entity, or processing device including various computer hardware or software modules and may be configured to execute instructions stored on any applicable computer-readable storage media (for example the memory 204). The processor 202 may be implemented based on several processor technologies known in the art. For example, the processor 202 may include a microprocessor, a microcontroller, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a Field-Programmable Gate Array (FPGA), or any other digital or analog circuitry configured to interpret and/or to execute program instructions and/or to process data. The processor 202 may include any number of processors configured to, individually or collectively, perform any number of operations of the electronic device 102, as described in the present disclosure. Examples of the processor 202 may include a Central Processing Unit (CPU), a Graphical Processing Unit (GPU), an x86-based processor, an x64-based processor, a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other hardware processors.
The memory 204 may include suitable logic, circuitry, interfaces, and/or code that may be configured to store the set of instructions executable by the processor 202. In an embodiment, the memory 204 may be configured to store the gesture mapping information that may indicate the association or relationship between the plurality of gestures 106 and the plurality of functions (such as including the one or more functions 112). The plurality of functions may be associated with the plurality of applications 110 configured or installed in the electronic device 102. In an embodiment, the processor 202 may retrieve information about the first set of functions associated with the first gesture 106A, and information about the first function associated with the second gesture 106B, from the gesture mapping information stored in the memory 204. The memory 204 may also store the first information about the first gesture 106A, the second information about the second gesture 106B, information about the one or more functions 112 associated with each of the plurality of applications 110, and information about the one or more user interface elements related to the one or more functions 112. In an embodiment, the memory 204 may also store information associated with one or more control instructions to control the execution of the first function associated with the second gesture 106B. Examples of implementation of the memory 204 may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card.
The I/O interface 206 may include suitable logic, circuitry, interfaces, and/or code that may be configured to receive the user inputs (such as the plurality of gestures 106) from the user 118 or from the user device (such as a touch stylus or pen) and may render output in response to the received user inputs. In an embodiment, the I/O interface 206 may be integrally coupled to the electronic device 102 to receive the user inputs from the first user interface 104 and control the first function associated with at least one application of the plurality of applications 110. In another embodiment, the I/O interface 206 may be communicably coupled to the electronic device 102 to receive the user inputs. In some embodiments, the I/O interface 206 may include the first user interface 104 and the second user interface 108. In another embodiment, the I/O interface 206 may include various input and output devices that may be configured to communicate with the processor 202. Examples of such input and output devices may include, but are not limited to, a touch screen, a touch pad, a keyboard, a mouse, a joystick, a microphone, a display device, a speaker, and/or an image sensor.
The timer 208 may include suitable logic, circuitry, interfaces, and/or code that may be configured to set a predefined time period (for example, in milliseconds or seconds) to receive the second information about the second gesture 106B after the receipt of the first information about the first gesture 106A. In an embodiment, the electronic device 102 may be configured to receive the second information about the second gesture 106B within the predefined time period set by the timer 208, from the receipt of the first information. In an example, the timer 208 may include a digital counter or clock to count down to the predefined time period which may be set by the processor 202. Examples of the timer 208 may include, but are not limited to, a software timer, a digital clock, or an internal clock associated with the electronic device 102. In an embodiment, the electronic device 102 may activate the timer 208 for the predefined time period based on the reception of the first information about the first gesture 106A.
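As an illustrative sketch only, such a predefined-time window could be modeled with a Kotlin coroutine (assuming the kotlinx.coroutines library); the class and callback names are hypothetical:

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Job
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

// Illustrative sketch only: a predefined-time window modeled with a coroutine.
class SecondGestureTimer(private val scope: CoroutineScope, private val windowMs: Long) {
    private var countdown: Job? = null

    // Called when the first information about the first gesture is received.
    fun start(onExpired: () -> Unit) {
        countdown?.cancel() // a new first gesture restarts the window
        countdown = scope.launch {
            delay(windowMs)  // the predefined time period
            onExpired()      // e.g., discard the stored first information
        }
    }

    // Called when the second information arrives within the window.
    fun stop() {
        countdown?.cancel()
    }
}
```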
The network interface 210 may include suitable logic, circuitry, and interfaces that may be configured to facilitate communication between the processor 202 and the communication network 116. The network interface 210 may be implemented by use of various technologies to support wired or wireless communication of the electronic device 102 with the communication network 116. The network interface 210 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, or a local buffer circuitry. The network interface 210 may be configured to communicate via wireless communication with networks, such as the Internet, an Intranet or a wireless network, such as a cellular telephone network, a wireless local area network (LAN), and a metropolitan area network (MAN). The wireless communication may be configured to use one or more of a plurality of communication standards, protocols and technologies, such as Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), Long Term Evolution (LTE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (such as IEEE 802.11a, IEEE 802.11b, IEEE 802.11g or IEEE 802.11n), voice over Internet Protocol (VoIP), light fidelity (Li-Fi), Worldwide Interoperability for Microwave Access (Wi-MAX), a protocol for email, instant messaging, and a Short Message Service (SMS).
Although in
Referring to
Referring to
The processor 202 of the electronic device 102 may search the first information about the first gesture 302A in the gesture mapping information and retrieve information about a first set of functions that may be related to a first set of applications of the plurality of applications 110. The first set of applications may be a subset of the plurality of applications 110 and the first set of functions may be one or more functions 112 of each of the first set of applications. In an embodiment, the first set of functions may be related to the first gesture 302A drawn on the user interface 302 as shown, for example, in
In an embodiment, based on the determined first set of functions, the processor 202 may further determine a set of user interface icons 306 (shown in
Further, based on the determined first set of functions and/or the display of the set of user interface icons 306, the electronic device 102 may display a subsidiary user interface 304 on the user interface 302 as shown, for example, in
The user interface 302 (i.e. including the subsidiary user interface 304) may be further configured to recognize the second gesture 304A and generate the second information, which may indicate recognized information for the second gesture 304A. The processor 202 of the electronic device 102 may be further configured to receive the second information (i.e. the recognized information) about the second gesture 304A from the user interface 302 including the subsidiary user interface 304. In an embodiment, the second information about the second gesture 106B may be received within the predefined time (in milliseconds or seconds) from the receipt of the first information. The processor 202 may discard the second information when the second gesture 304A is not received within the predefined time. For example, the second gesture 304A drawn by the user 118 may be similar or substantially similar to the first user interface icon 306A (i.e. the vehicle maintenance function or application). In such a case, the second information may indicate the drawn second gesture 304A and the processor 202 may determine at least one function (i.e. a first function) from the first set of functions based on the second information. In an embodiment, the processor 202 may compare the second information about the second gesture 304A with the gesture mapping information to retrieve (or search for) information about the first function associated with the second gesture 304A. With respect to
In another embodiment, based on the displayed set of user interface icons 306 related to the first set of functions for the first gesture 302A, the electronic device 102 may control the user interface 302 to receive the motion gesture (such as a swipe gesture, a tilt gesture, or a tap gesture as described in
With reference to
Referring to
In
It may be noted that the access to the first application 310 (or related functions) as a vehicle maintenance application, based on the combination of the first gesture 302A and the second gesture 304A, is described in
Referring to
Referring to
In an embodiment, the tilt gesture may also be an orientation gesture (such as a portrait orientation or a landscape orientation) associated with the electronic device 102. One skilled in the art may understand that the user interface instruction 404 (i.e. a notification for the user 118) to tilt or orient the electronic device 102 for the selection of a corresponding user interface icon (or function/application) is shown merely as an example. The electronic device 102 may provide different types of notifications (such as text-based or audio-based) to the user 118 to provide the motion gesture (i.e. tilt or orientation) to select the corresponding function based on the second gesture 106B as the motion gesture. In an embodiment, the electronic device 102 may further recognize the received second gesture 106B, and further access or control the corresponding first function or application (such as the first application 310) associated with the received second information, as described, for example, in
In an embodiment, the motion gesture may relate to a shake gesture (i.e. shaking the electronic device 102) as the second gesture 106B. In such a case, the processor 202 may discard the received first information about the first gesture 106A, based on a determination that the second gesture 106B is the shake gesture as the motion gesture. For example, if the user 118 inadvertently provided the first gesture 106A, which is not of interest, the user 118 may perform the shake gesture (as the second gesture 106B) on the electronic device 102 to discard the received first information about the first gesture 106A and may further remove the set of user interface icons 306 displayed corresponding to the first gesture 106A. In such a case, the electronic device 102 may provide a user interface (such as a home screen or desktop screen) to receive the first gesture 106A (i.e. the first gesture 302A shown in
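On an Android implementation, such a shake gesture could, for example, be detected from accelerometer samples. The sketch below is illustrative only, and the 2.5 g threshold is an assumed value, not taken from the disclosure:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Illustrative Android sketch only: detect a shake gesture from the accelerometer.
class ShakeDetector(private val onShake: () -> Unit) : SensorEventListener {
    private val thresholdG = 2.5f // assumed shake threshold, in multiples of gravity

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values // acceleration along the three device axes
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        if (gForce > thresholdG) onShake() // e.g., discard the first gesture information
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```

A caller would register the detector via SensorManager.registerListener with a Sensor.TYPE_ACCELEROMETER sensor.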
Referring to
In an embodiment, the user interface instruction 406 may provide a notification for the user 118 to tap on one of the set of user interface icons 306 of the electronic device 102 for the selection of a corresponding user interface icon (or function/application). For example, the notification may include an icon (such as a pointing finger icon as shown in
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Further, based on the display of the set of user interface icons 704, the electronic device 102 may further configure the user interface 302 to display a set of user interface instructions 706 for the set of user interface icons 704. In an embodiment, the electronic device 102 may allow the user 118 to provide a second navigation gesture (such as at least one of: a touch gesture, a swipe gesture, a tilt gesture, a tap gesture), based on the displayed set of user interface instructions 706, as described in
Referring to
Referring to
Referring to
Further, based on the display of the set of user interface icons 804, the electronic device 102 may further configure the user interface 302 to display a set of user interface instructions 806 for the set of user interface icons 804. In an embodiment, the electronic device 102 may allow the user 118 to provide a sub-function gesture (such as at least one of: a touch gesture, a swipe gesture, a tilt gesture, a tap gesture, or any other type of second gesture), based on the displayed set of user interface instructions 806, as described in
Referring to
In some embodiments, one or more of the first set of functions associated with the new gesture 802 (such as the first gesture 106A in
It may be noted that different sequence of operations shown in
At 902, the gesture mapping information that may indicate the association between the plurality of gestures 106 and the plurality of functions may be stored. In an embodiment, the processor 202 may control the memory 204 to store the gesture mapping information that may indicate the association between the plurality of gestures 106 and the plurality of functions, as described, for example, in
At 904, the first information about the first gesture 106A of the plurality of gestures 106 may be received, from the first user interface 104 of the electronic device 102. In an embodiment, the electronic device 102 or the processor 202 may receive the first information about the first gesture 106A of the plurality of gestures 106, from the first user interface 104 of the electronic device 102, as described, for example, in
At 906, the first set of functions (such as the one or more functions 112), related to the first set of applications (such as the first application 110A and the second application 110B) of the plurality of applications 110 may be determined, based on the stored gesture mapping information and the received first information. In an embodiment, the electronic device 102 or the processor 202 may determine the first set of functions (such as the one or more functions 112), related to the first set of applications (such as the first application 110A and the second application 110B) of the plurality of applications 110, as described in
At 908, the second information about the second gesture 106B of the plurality of gestures 106 may be received, from the first user interface 104. In an embodiment, the electronic device 102 or the processor 202 may receive the second information about the second gesture 106B of the plurality of gestures 106, from the first user interface 104, as described in
At 910, the execution of the first function associated with the second gesture 106B may be controlled. In an embodiment, the electronic device 102 may control the execution of the first function associated with the second gesture 106B, as described in
The flowchart 900 is illustrated as discrete operations, such as 902, 904, 906, 908, and 910. However, in certain embodiments, such discrete operations may be further divided into additional operations, combined into fewer operations, eliminated, or rearranged, depending on the implementation, without detracting from the essence of the disclosed embodiments.
Various embodiments of the disclosure may provide a non-transitory computer-readable medium and/or storage medium, and/or a non-transitory machine-readable medium and/or storage medium, having stored thereon a set of instructions executable by a machine and/or a computer (for example, the electronic device 102) for the gesture-based application control. The set of instructions may be executable by the machine and/or the computer (for example, the electronic device 102) to perform operations that may include reception of the first information about a first gesture (such as the first gesture 106A) of the plurality of gestures 106, from a first user interface (such as the first user interface 104) of the electronic device 102. The operations may further include determination of the first set of functions, related to the first set of applications (such as the first application 110A and the second application 110B) of the plurality of applications 110, based on stored gesture mapping information and the received first information (such as from the first user interface 104). The gesture mapping information may indicate an association between the plurality of gestures 106 and a plurality of functions associated with the plurality of applications 110 configured in the electronic device 102. The operations may further include reception of the second information about the second gesture 106B of the plurality of gestures 106, from the first user interface 104. The operations may further include the control of the execution of a first function associated with the second gesture 106B of the plurality of gestures 106.
The foregoing description of embodiments and examples has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the forms described. Numerous modifications are possible in light of the above teachings. Some of those modifications have been discussed and others will be understood by those skilled in the art. The embodiments were chosen and described in order to illustrate various aspects of the disclosure. The scope is, of course, not limited to the examples or embodiments set forth herein, but can be employed in any number of applications and equivalent devices by those of ordinary skill in the art. Rather, it is hereby intended that the scope be defined by the claims appended hereto. Additionally, the features of various implementing embodiments may be combined to form further embodiments.
For the purposes of the present disclosure, expressions such as “including”, “comprising”, “incorporating”, “consisting of”, “have”, “is” used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. Further, all joinder references (e.g., attached, affixed, coupled, connected, and the like) are only used to aid the reader's understanding of the present disclosure, and may not create limitations, particularly as to the position, orientation, or use of the systems and/or methods disclosed herein. Therefore, joinder references, if any, are to be construed broadly. Moreover, such joinder references do not necessarily infer that two elements are directly connected to each other.
The present disclosure may be realized in hardware, or a combination of hardware and software. The present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems. A computer system or other apparatus adapted to carry out the methods described herein may be suitable. A combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein. The present disclosure may be realized in hardware that includes a portion of an integrated circuit that also performs other functions. It may be understood that, depending on the embodiment, some of the steps described above may be eliminated, while other additional steps may be added, and the sequence of steps may be changed.
The present disclosure may also be embedded in a computer program product, which includes all the features that enable the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program, in the present context, means any expression, in any language, code or notation, of a set of instructions intended to cause a system with an information processing capability to perform a particular function either directly, or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form. While the present disclosure has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made, and equivalents may be substituted without departing from the scope of the present disclosure. In addition, many modifications may be made to adapt a situation or material to the teachings of the present disclosure without departing from its scope. Therefore, it is intended that the present disclosure is not limited to the embodiment disclosed, but that the present disclosure will include all embodiments that fall within the scope of the appended claims.