The present disclosure relates to an electronic device, and more specifically, to a method and an electronic device for single-handed operation assistance.
Nowadays, modern electronic devices such as foldable smartphones and tablets have large screens to enhance the viewing experience of a user. Due to the large size of the screen, the user may often struggle while performing a single-handed operation on the screen. In the single-handed operation mode, the user may want to provide inputs to the electronic device while holding the electronic device in one hand. In such scenarios, the user may face difficulty in reaching every corner of the screen while holding the electronic device in one hand.
In order to overcome this challenge, the state-of-the-art electronic device (12) is configured to provide a User Interface (UI) (13) that can be pulled down as shown in (11) of
However, when the UI is pulled down as shown in (11) of
Accordingly, the embodiments herein provide a single-handed operation assistance method for an electronic device. The method includes detecting, by the electronic device, a single-hand use state of the electronic device. The method includes determining, by the electronic device, a location of a finger of a hand of a user relative to a location of a UI element being displayed on a screen of the electronic device. The method further includes altering, by the electronic device, the location of the UI element to be closer to the finger based on the location of the finger of the hand of the user.
In an embodiment, the single-hand use state of the electronic device is detected using a UWB sensor of the electronic device.
In an embodiment, the method comprises automatically activating a single-handed operation mode based on detecting the single-hand use state of the electronic device.
In an embodiment, the electronic device determines the location of the finger of the hand relative to the location of the UI element being displayed on the screen of the electronic device, by receiving a UWB signal reflected from the finger; estimating, by the electronic device, a type of the hand by providing the UWB signal to an AI model; determining, by the electronic device, a direction of the finger pointing towards the UI element based on the UWB signal and the type of the hand; identifying, by the electronic device, the UI element being displayed on the screen based on the direction of the finger; and determining, by the electronic device, the location of the finger of the hand relative to the location of the identified UI element.
In an embodiment, the electronic device determines the direction of the finger pointing towards the UI element based on the UWB signal and the type of the hand by determining, by the electronic device, a permittivity of one or more parts of the finger based on the UWB signal; identifying, by the electronic device, the one or more parts of the finger based on the permittivity; determining, by the electronic device, proximity between the one or more parts of the finger based on the permittivity; determining, by the electronic device, a projection of the finger relative to the electronic device based on the UWB signal; and determining, by the electronic device, the direction of the finger pointing towards the UI element based on the one or more parts of the finger, the proximity between the one or more parts of the finger, and the projection of the finger.
In an embodiment, the electronic device further alters the location of the UI element such that the UI element is closer to the finger after the alteration, by determining a location on the screen closer to the finger based on the location of the finger; and altering, by the electronic device, the location of the UI element to the determined location on the screen.
Accordingly, the embodiments herein provide the electronic device with the single-handed operation assistance method. The electronic device includes a single-hand mode assistance engine, a memory, at least one processor, a UWB sensor, and a screen, where the single-hand mode assistance engine is coupled to the memory and the at least one processor. The single-hand mode assistance engine is configured to detect the single-hand use state of the electronic device. The single-hand mode assistance engine is configured to determine the location of the finger of the hand relative to the location of the UI element being displayed on the screen of the electronic device. The single-hand mode assistance engine is configured to alter the location of the UI element such that the UI element is closer to the finger, based on the location of the finger of the hand of the user.
These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments, and the embodiments herein include all such modifications.
This disclosure is illustrated in the accompanying drawings, throughout which reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
As in related art, embodiments may be described and illustrated in terms of blocks which carry out a described function or functions. These blocks, which may be referred to herein as managers, units, modules, hardware components or the like, are physically implemented by analog and/or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits and the like, and may optionally be driven by firmware. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the disclosure. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the disclosure.
The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents and substitutes in addition to those which are particularly set out in the accompanying drawings.
One or more embodiments provide a method and an electronic device for single-handed operation assistance. The electronic device detects a direction of a finger (e.g. thumb) of a hand of a user pointing towards a UI element displayed on a screen of the electronic device using an Ultra-Wide Band (UWB) sensor during a single-handed operation mode. Further, the electronic device automatically brings the UI element closer to the finger to enable the user to easily access/interact with the UI element. The method further allows the user to reach every corner of the screen in a full-screen mode without minimizing the screen or scaling down and/or pulling down a UI of the electronic device, which improves the user experience.
Accordingly, the embodiments herein provide a single-handed operation assistance method for an electronic device. The method includes detecting, by the electronic device, a single-hand use state of the electronic device. The method includes determining, by the electronic device, a location of a finger of a hand of a user relative to a location of a UI element being displayed on a screen of the electronic device. The method includes altering, by the electronic device, the location of the UI element such that the UI element is closer to the finger after the alteration.
Accordingly, the embodiments herein provide the electronic device with the single-handed operation assistance method. The electronic device includes a single-hand mode assistance engine, a memory, a processor, a UWB sensor, and the screen, where the single-hand mode assistance engine is coupled to the memory and the processor. The single-hand mode assistance engine is configured to detect the single-hand use state of the electronic device. The single-hand mode assistance engine is configured to determine the location of the finger of the hand of the user relative to the location of the UI element being displayed on the screen of the electronic device. The single-hand mode assistance engine is configured to alter the location of the UI element such that the UI element is closer to the finger after the alteration.
Unlike existing methods and systems, the electronic device detects a direction of the finger (e.g., thumb) of the hand of the user pointing towards the UI element displayed on the screen of the electronic device using the UWB sensor during the single-handed operation mode. Further, the electronic device automatically brings the UI element closer to the location of the finger to enable the user to easily access/interact with the UI element. The method further allows the user to reach every corner of the screen in a full-screen mode without minimizing the screen or scaling down and/or pulling down a UI of the electronic device, which improves the user experience.
Referring now to the drawings, and more particularly to
In an embodiment, the single-hand mode assistance engine (110) includes a single-hand detector (111), a hand type & finger direction estimator (112), a UI positioning engine (113), and an Artificial Intelligence (AI) model (114). The single-hand detector (111), the hand type & finger direction estimator (112), the UI positioning engine (113), and the Artificial Intelligence (AI) model (114) are implemented by processing circuitry such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like.
The single-hand detector (111) detects a single-hand use state of the electronic device (100). In an embodiment, the single-hand use state of the electronic device (100) is detected using the UWB sensor (160). The UWB sensor (160) transmits a UWB signal that hits the hand of the user holding the electronic device (100). The UWB signal is then reflected back to the UWB sensor (160). Thus, the UWB sensor (160) receives the reflected UWB signal. The single-hand detector (111) automatically activates a single-handed operation mode in response to detecting the single-hand use state of the electronic device (100). The hand type & finger direction estimator (112) determines the location of the finger of the hand relative to the location of the UI element being displayed on the screen (150). The UI positioning engine (113) alters the location of the UI element such that the UI element is closer to the finger after the alteration.
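As a non-limiting illustration of the detection described above, the following Python sketch thresholds the reflected UWB energy in the range bins closest to the device housing to decide the single-hand use state and activate the single-handed operation mode; the range-bin indices, the threshold value, and the function names are assumptions made for illustration and are not taken from the disclosure.

    import numpy as np

    # Illustrative only: decide the single-hand use state from the reflected UWB
    # energy close to the device. The range-bin indices and the threshold are
    # assumed values, not values from the disclosure.
    GRIP_BINS = slice(0, 8)        # range bins closest to the housing (assumed)
    GRIP_ENERGY_THRESHOLD = 0.5    # empirical detection threshold (assumed)

    def detect_single_hand_use(cir_frames: np.ndarray) -> bool:
        """cir_frames: complex array of shape (frames, range_bins) holding the
        reflected UWB signal received by the UWB sensor (160)."""
        # Mean reflected energy from objects (e.g., a gripping hand) very close to the device.
        near_energy = np.mean(np.abs(cir_frames[:, GRIP_BINS]) ** 2)
        return near_energy > GRIP_ENERGY_THRESHOLD

    def maybe_activate_single_hand_mode(cir_frames: np.ndarray) -> str:
        # Automatically activate the single-handed operation mode on detection.
        return "single_hand_mode" if detect_single_hand_use(cir_frames) else "normal_mode"

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        frames = rng.normal(0, 1, (10, 64)) + 1j * rng.normal(0, 1, (10, 64))
        print(maybe_activate_single_hand_mode(frames))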
In an embodiment, for determining the location of the finger of the hand of the user relative to the location of the UI element, the single-hand detector (111) receives the UWB signal reflected from the finger. In response to receiving the UWB signal, the single-hand detector (111) pre-processes the received UWB signal and forwards the pre-processed UWB signal to the hand type & finger direction estimator (112). The hand type & finger direction estimator (112) estimates a type of the hand (i.e., left hand or right hand) by providing the pre-processed UWB signal to the AI model (114). The hand type & finger direction estimator (112) determines the direction of the finger pointing towards the UI element based on the UWB signal and the type of the hand. The hand type & finger direction estimator (112) identifies the UI element being displayed on the screen (150) based on the direction of the finger. The hand type & finger direction estimator (112) determines the location of the finger of the hand relative to the location of the identified UI element.
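The following Python sketch illustrates one possible arrangement of the processing chain described above, from the reflected UWB signal to the location of the finger relative to the identified UI element; the pre-processing, the AI-model inference, and the geometry helpers are hypothetical stand-ins, and the numeric values are assumptions for illustration.

    import math
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class UIElement:
        name: str
        center: Tuple[float, float]  # screen coordinates in pixels

    def preprocess(raw_signal):
        # Stand-in for pre-processing of the reflected UWB signal (e.g., clutter removal).
        return raw_signal

    def estimate_hand_type(signal) -> str:
        # Stand-in for the AI model (114) inference: "left" or "right".
        return "right"

    def estimate_finger_direction(signal, hand_type: str) -> float:
        # Stand-in for the finger-direction estimate, in degrees on the screen plane.
        return 35.0

    def identify_pointed_element(direction_deg: float, elements: List[UIElement],
                                 finger_xy: Tuple[float, float]) -> UIElement:
        # Pick the element whose bearing from the finger best matches the pointing direction.
        def angular_error(e: UIElement) -> float:
            dx, dy = e.center[0] - finger_xy[0], e.center[1] - finger_xy[1]
            bearing = math.degrees(math.atan2(dy, dx))
            return abs((bearing - direction_deg + 180.0) % 360.0 - 180.0)
        return min(elements, key=angular_error)

    def finger_location_relative_to(element: UIElement,
                                    finger_xy: Tuple[float, float]) -> Tuple[float, float]:
        return (finger_xy[0] - element.center[0], finger_xy[1] - element.center[1])

    if __name__ == "__main__":
        elements = [UIElement("back", (80.0, 2300.0)), UIElement("share", (1000.0, 120.0))]
        signal = preprocess(None)                 # stand-in for the received UWB signal
        hand = estimate_hand_type(signal)
        direction = estimate_finger_direction(signal, hand)
        finger_xy = (900.0, 2200.0)               # assumed fingertip position on the screen
        target = identify_pointed_element(direction, elements, finger_xy)
        print(target.name, finger_location_relative_to(target, finger_xy))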
In an embodiment, for determining the direction of the finger pointing towards the UI element based on the UWB signal and the type of the hand, the hand type & finger direction estimator (112) determines a permittivity (i.e., a dielectric constant) of one or more parts of the finger based on the UWB signal. The hand type & finger direction estimator (112) identifies the one or more parts of the finger based on the determined permittivity. The hand type & finger direction estimator (112) further determines a proximity between the one or more parts of the finger based on the determined permittivity. The hand type & finger direction estimator (112) determines a projection of the finger relative to the electronic device (100) based on the UWB signal. The hand type & finger direction estimator (112) further determines the direction of the finger pointing towards the UI element based on the one or more parts of the finger, the proximity between the one or more parts of the finger, and the projection of the finger.
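The following Python sketch illustrates, under simplifying assumptions, how permittivity estimates may be mapped to parts of the finger and combined with the projected positions of those parts to yield a pointing direction; the reference permittivity values and the example positions are assumptions for illustration, not values from the disclosure.

    import numpy as np

    # Assumed reference permittivity (dielectric constant) values for illustration only.
    REFERENCE_PERMITTIVITY = {"nail": 8.0, "bone": 12.0, "soft_tissue": 30.0}

    def label_parts(permittivities: np.ndarray) -> list:
        # Identify each resolved part of the finger by its nearest reference permittivity.
        return [min(REFERENCE_PERMITTIVITY,
                    key=lambda k: abs(REFERENCE_PERMITTIVITY[k] - eps))
                for eps in permittivities]

    def finger_direction(part_positions: np.ndarray, labels: list) -> float:
        """part_positions: (n, 2) projected positions of the finger parts relative to
        the device; the nail is taken to lead the pointing direction."""
        nail = part_positions[labels.index("nail")]
        others = np.array([p for p, label in zip(part_positions, labels) if label != "nail"])
        vector = nail - others.mean(axis=0)
        return float(np.degrees(np.arctan2(vector[1], vector[0])))

    if __name__ == "__main__":
        eps = np.array([31.0, 29.5, 11.8, 8.2])                           # assumed estimates
        pos = np.array([[0.0, 0.0], [0.5, 0.3], [1.0, 0.6], [1.6, 1.0]])  # assumed, in cm
        labels = label_parts(eps)
        print(labels, round(finger_direction(pos, labels), 1))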
In an embodiment, the UI positioning engine (113) determines a location on the screen (150) closer to the finger based on the location of the finger. The UI positioning engine (113) alters the location of the UI element to the determined location on the screen (150).
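A minimal Python sketch of this relocation step is given below, assuming a fixed screen size and margin; the determined location is simply the point on the visible screen area nearest to the fingertip, and the numeric values are assumptions for illustration.

    from typing import Tuple

    SCREEN_W, SCREEN_H = 1080, 2400    # screen resolution in pixels (assumed)
    MARGIN = 48                        # margin keeping the element fully visible (assumed)

    def nearest_reachable_point(finger_xy: Tuple[float, float]) -> Tuple[float, float]:
        # The on-screen location closest to the finger, clamped to the visible area.
        x = min(max(finger_xy[0], MARGIN), SCREEN_W - MARGIN)
        y = min(max(finger_xy[1], MARGIN), SCREEN_H - MARGIN)
        return (x, y)

    def altered_element_location(finger_xy: Tuple[float, float]) -> Tuple[float, float]:
        # The UI element is redrawn at this location so that it lies closer to the finger.
        return nearest_reachable_point(finger_xy)

    if __name__ == "__main__":
        print(altered_element_location((1200.0, 2500.0)))   # fingertip near the bottom-right corner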
The single-hand mode assistance engine (110) recognizes the location of the finger, i.e., an angle of projection of a nail, by determining the permittivity (i.e., dielectric constant) of various parts of the finger from the reflected UWB signals, reconstructing the structure of the finger based on the determined permittivity of the various parts, and identifying the positioning of the nail of the finger based on the determined permittivity of the nail, where the fingernail has a different permittivity compared to the other soft tissues of the finger and the bone. Further, the single-hand mode assistance engine (110) uses the determined location of the fingernail as a benchmark to render a pointer for a selection of the UI element.
The memory (120) stores instructions to be executed by the processor (130). The memory (120) may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory (120) may, in some examples, be considered a non-transitory storage medium. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted that the memory (120) is non-movable. In some examples, the memory (120) can be configured to store larger amounts of information than its storage space. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache). The memory (120) can be an internal storage unit or it can be an external storage unit of the electronic device (100), cloud storage, or any other type of external storage.
The processor (130) is configured to execute instructions stored in the memory (120). The processor (130) may be a general-purpose processor, such as a Central Processing Unit (CPU), an Application Processor (AP), or the like, a graphics-only processing unit such as a Graphics Processing Unit (GPU), a Visual Processing Unit (VPU) and the like. The processor (130) may include multiple cores to execute the instructions. The communicator (140) is configured for communicating internally between hardware components in the electronic device (100). Further, the communicator (140) is configured to facilitate communication between the electronic device (100) and other devices via one or more networks (e.g. Radio technology). The communicator (140) includes an electronic circuit specific to a standard that enables wired or wireless communication.
A function associated with the AI model (114) may be performed through the non-volatile/volatile memory (120), and the processor (130). One or more processors (130) control the processing of the input data in accordance with a predefined operating rule or the AI model (114) stored in the non-volatile/volatile memory (120). The predefined operating rule or the AI model (114) is provided through training or learning. Here, being provided through learning means that, by applying a learning method to a plurality of learning data, the predefined operating rule or the AI model (114) having a desired characteristic is made. The learning may be performed in the electronic device (100) itself in which the AI model (114) according to an embodiment is performed, and/or may be implemented through a separate server/system. The AI model (114) may consist of a plurality of neural network layers. Each layer has a plurality of weight values, and performs a layer operation through the calculation of a previous layer and an operation of a plurality of weights. Examples of neural networks include, but are not limited to, convolutional neural network (CNN), deep neural network (DNN), recurrent neural network (RNN), restricted Boltzmann Machine (RBM), deep belief network (DBN), bidirectional recurrent deep neural network (BRDNN), generative adversarial networks (GAN), and deep Q-networks. The learning method is a method for training a predetermined target device (for example, a robot) using a plurality of learning data to cause, allow, or control the target device to make a determination or prediction. Examples of the learning method include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
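As a non-limiting illustration only, the following PyTorch sketch shows one possible form of such an AI model: a small convolutional network that takes a two-channel UWB spectrogram and outputs the hand type and a coarse finger direction; the layer sizes and the eight-way direction discretization are assumptions and do not represent the architecture of the AI model (114).

    import torch
    import torch.nn as nn

    class HandFingerNet(nn.Module):
        # Two heads: hand type (left/right) and a coarse, discretized finger direction.
        def __init__(self, num_directions: int = 8):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
                nn.Flatten(),
            )
            self.hand_head = nn.Linear(32, 2)
            self.direction_head = nn.Linear(32, num_directions)

        def forward(self, spectrogram: torch.Tensor):
            # spectrogram: (batch, 2, freq, time); channels are the real/imaginary spectrograms.
            features = self.features(spectrogram)
            return self.hand_head(features), self.direction_head(features)

    if __name__ == "__main__":
        model = HandFingerNet()
        x = torch.randn(1, 2, 64, 64)
        hand_logits, direction_logits = model(x)
        print(hand_logits.shape, direction_logits.shape)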
Although
The various actions, acts, blocks, steps, operations, or the like in the flow diagram (300) may be performed in the order presented, in a different order than presented, or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, operations, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the disclosure.
The direction (θ1) indicated by the finger (401) is fed to a track-pointer-movement function of an operating system framework, where the location of the pointer on the screen (150) is further controlled by the electronic device (100) using an Application Programming Interface (API) of the operating system framework based on a change in the direction (θ1) indicated by the finger (401). The electronic device (100) selects the UI element (503) upon locating the pointer on the UI element (503) for a time duration (e.g., 1 second). The electronic device (100) changes the pointer location as the user points towards different locations on the screen (150), i.e., the pointer follows the user's finger-pointing direction, which is determined by establishing the permittivity of the fingernail. Alternatively, the electronic device (100) identifies the location on the screen (150) nearest to the finger (401) and moves the UI element (503) to the identified nearest location as shown in 505.
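The following Python sketch illustrates the dwell-based selection described above, assuming hypothetical helpers for reading the pointer location (get_pointer_xy), hit-testing the UI element under the pointer (hit_test), and issuing the selection (select); these helpers stand in for the operating system framework API and are not part of the disclosure.

    import time

    DWELL_SECONDS = 1.0   # example dwell time from the description above

    def run_dwell_selection(get_pointer_xy, hit_test, select, poll_interval: float = 0.05):
        """get_pointer_xy(): current pointer location; hit_test(xy): UI element under the
        pointer or None; select(element): performs the selection. All three are
        hypothetical stand-ins for the operating system framework API."""
        current_target, dwell_start = None, None
        while True:
            target = hit_test(get_pointer_xy())
            if target != current_target:
                # Pointer moved onto a different element (or off all elements): restart the dwell timer.
                current_target = target
                dwell_start = time.monotonic() if target is not None else None
            elif target is not None and time.monotonic() - dwell_start >= DWELL_SECONDS:
                select(target)   # the pointer rested on the element for the dwell time
                current_target, dwell_start = None, None
            time.sleep(poll_interval)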
Consider that the finger (502) is pointing towards the UI element (504) displayed on the screen (150) of the electronic device (100) as shown in 506. Further, the electronic device (100) determines the direction (θ2) indicated by the finger (401) using the UWB sensor (160), determines the initial location of the pointer based on the determined direction (θ2), and locates the pointer on the UI element (504). The direction (θ2) indicated by the finger (401) is fed to the track-pointer-movement function of the operating system framework, where the location of the pointer on the screen (150) is further controlled by the electronic device (100) using the API of the operating system framework based on the change in the direction (θ2) indicated by the finger (401).
The electronic device (100) selects the UI element (504) upon locating the pointer on the UI element (504) for the time duration. The electronic device (100) changes the pointer location as the user points towards different locations on the screen (150), i.e., the pointer follows the user's finger-pointing direction, which is determined by establishing the permittivity of the fingernail. Alternatively, the electronic device (100) identifies the location on the screen (150) nearest to the finger (401) and moves the UI element (504) towards the nearest location as shown in 507.
The hand type & finger direction estimator (112) extracts real and imaginary parts of the pre-processed UWB signal and generates a spectrogram using the real and imaginary parts. Further, the hand type & finger direction estimator (112) provides the spectrogram to the AI model (114), where the AI model (114) is trained on the signal spectrograms to infer/predict the hand type (left hand or right hand) and the finger direction (i.e., the direction of the finger (614)). Upon receiving the prediction of the hand type and the finger direction from the AI model (114), the hand type & finger direction estimator (112) provides the hand type and the finger direction to the UI positioning engine (113). Further, the UI positioning engine (113) performs UI operation mapping and application (612) and moves the UI element close to the finger (613).
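A possible Python sketch of this spectrogram generation is shown below, where the real and imaginary parts of the pre-processed UWB signal are converted into a two-channel time-frequency image suitable as input to the AI model; the sample rate and window length are assumptions for illustration.

    import numpy as np
    from scipy.signal import spectrogram

    FS = 1e6          # sample rate of the pre-processed UWB signal (assumed)
    NPERSEG = 128     # STFT window length (assumed)

    def uwb_spectrogram(iq_signal: np.ndarray) -> np.ndarray:
        """iq_signal: 1-D complex array. Returns an array of shape (2, freq, time) with
        the spectrograms of the real and imaginary parts as separate channels."""
        _, _, s_real = spectrogram(np.real(iq_signal), fs=FS, nperseg=NPERSEG)
        _, _, s_imag = spectrogram(np.imag(iq_signal), fs=FS, nperseg=NPERSEG)
        return np.stack([s_real, s_imag])

    if __name__ == "__main__":
        t = np.arange(4096) / FS
        synthetic = np.exp(2j * np.pi * 5e4 * t)   # synthetic stand-in for a UWB return
        print(uwb_spectrogram(synthetic).shape)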
where d is the thickness of each part of the finger (701), c is the speed of light in free space, and ΔT is the time difference between two consecutive reflected UWB signals, such that d = c·ΔT/2. Using the pre-processed UWB signal, the electronic device (100) determines ΔT and the thickness of a particular tissue (e.g., a fingernail), i.e., a part of the finger. The electronic device (100) provides the estimated dielectric constant of the particular tissue as an input to the AI model (114) to identify that particular tissue.
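The following Python sketch gives a worked example of the relation d = c·ΔT/2 as reconstructed above, converting the time gap between the two strongest consecutive reflections into a thickness estimate; the peak-picking step, the sampling rate, and the neglect of the slower propagation inside the tissue are simplifying assumptions for illustration.

    import numpy as np

    C = 299_792_458.0   # speed of light in free space, in m/s

    def thickness_from_reflections(reflected: np.ndarray, sample_rate: float) -> float:
        """reflected: 1-D magnitude profile of the reflected UWB signal. The two strongest
        returns are taken as consecutive reflections from the front and back of the part."""
        peaks = np.argsort(reflected)[-2:]
        delta_t = abs(int(peaks[0]) - int(peaks[1])) / sample_rate
        return C * delta_t / 2.0    # d = c * dT / 2, free-space propagation assumed

    if __name__ == "__main__":
        fs = 20e9                               # assumed effective sampling rate of the receiver
        profile = np.zeros(256)
        profile[100], profile[101] = 1.0, 0.8   # two consecutive reflections, one sample apart
        print(f"estimated thickness: {thickness_from_reflections(profile, fs) * 1e3:.2f} mm")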
As shown in the
As shown in the
As shown in the
The embodiments disclosed herein can be implemented using at least one hardware device and performing network management functions to control the elements.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the scope of the embodiments as described herein.
Number | Date | Country | Kind |
---|---|---|---|
202241060990 | Oct 2022 | IN | national |
This application is a continuation of International Application No. PCT/KR2023/012729, filed on Aug. 28, 2023, with the Korean Intellectual Property Office, which claims priority from Indian Patent Application No. 202241060990, filed on Oct. 26, 2022, with the Indian Intellectual Property Office, the disclosures of which are incorporated herein by reference in their entireties.
Number | Date | Country
---|---|---
Parent PCT/KR2023/012729 | Aug 2023 | WO
Child 19024553 | | US