1. Technical Field
The present disclosure relates to an electronic device and a method of opening a user interface on a screen.
2. Description of Related Art
With the fast development of the electronics industry and information technology, electronic products have become more popular. Conventionally, many electronic devices, such as computers or mobile phones, have screens.
For a small electronic device, the touch screen is limited in size. Because the user must operate within this small touch area, errors in operation are common. In view of the foregoing, there is a need in the related field to provide a way to operate the screen ergonomically.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the present invention or delineate the scope of the present invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
In one or more various aspects, the present disclosure is directed to an electronic device and a method of operating a screen.
According to one embodiment of the present invention, the electronic device includes a screen, a user interface module and a processing module. The screen is capable of displaying a working window and an executing window. When a pointer is positioned on the executing window, the user interface module generates a first sensing signal for displaying at least one item on the screen. When the pointer selects the item, the user interface module generates a second sensing signal. When the pointer drags the item to the working window, the user interface module generates a third sensing signal. The processing module can continuously receive the first, second and third sensing signals that are sequentially generated by the user interface module to open a user interface corresponding to the item in the working window, where the user interface is adjacent to the pointer.
According to another embodiment of the present invention, a screen is capable of displaying a working window and an executing window, and a user interface module is capable of generating first, second, and third sensing signals. The method for opening a user interface on a screen includes following steps:
(a) When a pointer is positioned on the executing window, a first sensing signal is generated, and at least one item is displayed;
(b) When the pointer selects the item, a second sensing signal is generated;
(c) When the pointer drags the item to the working window, a third sensing signal is generated; and
(d) When a processing module continuously receives the first, second and third sensing signals that are sequentially generated by the user interface module, a user interface corresponding to the item is opened in the working window, wherein the user interface is adjacent to the pointer.
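The steps above can be sketched as a small state machine that opens the user interface only after the three sensing signals arrive in order. The signal constants and the `UIController` class below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the sensing-signal sequence described in steps (a)-(d).
# Signal names and the UIController class are illustrative assumptions.

HOVER_EXECUTING, SELECT_ITEM, DRAG_TO_WORKING = 1, 2, 3

class UIController:
    """Opens a user interface only after receiving the three sensing
    signals in order: hover -> select -> drag."""

    EXPECTED = [HOVER_EXECUTING, SELECT_ITEM, DRAG_TO_WORKING]

    def __init__(self):
        self.received = []

    def on_signal(self, signal, pointer_pos=None):
        # Signals must arrive in the expected order; otherwise reset.
        if signal == self.EXPECTED[len(self.received)]:
            self.received.append(signal)
        else:
            self.received = []
        if self.received == self.EXPECTED:
            self.received = []
            return self.open_user_interface(pointer_pos)
        return None

    def open_user_interface(self, pointer_pos):
        # The disclosure places the opened interface adjacent to the pointer.
        return {"opened": True, "position": pointer_pos}
```

An out-of-order signal (for example, a selection without a preceding hover on the executing window) resets the sequence, matching the requirement that the processing module receive the three signals sequentially.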
When using the electronic device and the method for operating the user interface, a user moves the pointer to the executing window and then drags the item to the working window for opening the user interface corresponding to the item, where the user interface is adjacent to the pointer. This operating manner conforms to ergonomics, so as to provide convenience in use.
Many of the attendant features will be more readily appreciated, as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawing, wherein:
FIGS. 1a-1b are schematic drawings of an electronic device according to one or more embodiments of the present invention;
FIGS. 2a-2d are schematic drawings of opening a user interface on a screen of the electronic device;
FIGS. 3a-3b are schematic drawings of the electronic device according to a first embodiment of the present invention;
FIGS. 4a-4b are schematic drawings of the electronic device according to a second embodiment of the present invention;
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to attain a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
FIG. 1a is a block diagram of an electronic device 100 according to one or more embodiments of the present invention. As shown in
The screen 110 is capable of displaying a working window 112 and an executing window 114. As shown in
Moreover, the screen 110 may display the executing window 114 that is overlapped on the working window 112. Alternatively, the working window 112 is reduced from a first window range (as shown in
Additionally or alternatively, the screen 110 may display the working window 112 and the executing window 114 simultaneously without having the predetermined executing window area 116.
While the screen 110 displays a frame, the working window 112 is used for displaying an application program interface, an icon or the like, so that a user can operate the electronic device 100 through the working window 112 of the screen 110. The executing window 114 functions as a menu for displaying a special item instruction or an express instruction that the user has defined.
Please refer to
If the screen 110 is a non-touch screen, the user interface module 130 could be a mouse or a touch pad that controls the pointer's movement. Alternatively, the user interface module 130 could be an image capture apparatus that captures the user's gesture and analyzes image variation to generate a control signal for controlling the pointer's movement.
In use, as shown in
As shown in
As shown in
As shown in
Furthermore, the processing module 120 opens the user interface corresponding to the item 150 in a predetermined window range. The predetermined window range may be equal to the entire display region of the screen 110, so that the processing module 120 can open the user interface 170 in full screen mode.
In this way, the user moves the pointer to the executing window 114 to select the item 150 and then drags the item 150 to the working window 112 for opening the user interface 170. This operating manner conforms to ergonomics, so as to provide convenience in use.
For a more complete understanding of opening the user interface and the interaction between the screen 110, the user interface module 130, and the processing module 120, the description will be made as to the first, second and third embodiments of the present disclosure in conjunction with the accompanying drawings.
As shown in
As shown in
In the first embodiment, the screen 110 has the preset trigger positions A1, A2 and A3. Therefore, the processing module 120 opens the user interface corresponding to the item 150, where the user interface is adjacent to the trigger position A1. It should be appreciated that the aforesaid three trigger positions A1, A2 and A3 corresponding to the items 150, 152 and 154 illustrated in
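As a rough illustration of the first embodiment, the trigger-position check could be a simple hit test on the drop point. The coordinates and hit radius below are assumed values for illustration, not taken from the drawings.

```python
import math

# Illustrative trigger positions A1, A2 and A3; the coordinates and the
# hit radius are assumptions, not taken from the disclosure.
TRIGGER_POSITIONS = {"A1": (100, 300), "A2": (200, 300), "A3": (300, 300)}
TRIGGER_RADIUS = 20  # assumed hit radius in pixels

def hit_trigger(pointer_pos):
    """Return the name of the trigger position the dragged item lands on,
    or None if the drop point is outside every trigger region."""
    px, py = pointer_pos
    for name, (tx, ty) in TRIGGER_POSITIONS.items():
        if math.hypot(px - tx, py - ty) <= TRIGGER_RADIUS:
            return name
    return None
```

Under this sketch, the third sensing signal would be generated whenever `hit_trigger` returns a trigger-position name rather than `None`.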
In the second embodiment, when determining that the pointer is positioned on the executing window 114 and selects the item 150, the user interface module 130 generates the first and second sensing signals. As shown in
As shown in
As shown in
In the third embodiment, the conditions for generating the first and second sensing signals are disclosed in the first and second embodiments and, thus, are not repeated herein. As shown in
In practice, when the pointer drags the item from a first direction to a second direction, and an included angle between the first and second directions is larger than 90°, the user interface module 130 generates the third sensing signal. If the included angle is less than 90°, the pointer may move back onto the executing window 114; this motion signifies that the user does not want to open the user interface corresponding to the item. Therefore, requiring the included angle to be larger than 90° conforms to ergonomics, so as to facilitate operation.
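The included-angle condition can be checked without computing the angle itself: the angle between two drag-direction vectors exceeds 90° exactly when their dot product is negative. A minimal sketch (the function name is illustrative):

```python
def direction_change_exceeds_90(first_dir, second_dir):
    """True when the included angle between the first and second drag
    directions is larger than 90 degrees, i.e. when the dot product of
    the two direction vectors is negative."""
    dot = first_dir[0] * second_dir[0] + first_dir[1] * second_dir[1]
    return dot < 0
```

For example, dragging right and then reversing to the left gives a negative dot product (angle larger than 90°), while a right-angle turn gives a dot product of exactly zero and would not trigger the third sensing signal under this condition.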
The processing module 120 may be hardware, software, and/or firmware. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary.
In step S310, when a pointer is positioned on the executing window, the first sensing signal is generated, and at least one item is displayed;
In step S320, when the pointer selects the item, the second sensing signal is generated;
In step S330, when the pointer drags the item to the working window, the third sensing signal is generated; and
In step S340, when a processing module continuously receives the first, second and third sensing signals that are sequentially generated by the user interface module, a user interface corresponding to the item is opened in the working window, where the user interface is adjacent to the pointer.
In this method, first, second and third operating modes are proposed in accordance with the aforesaid first, second and third embodiments of the electronic device. In the first operating mode, at least one trigger position is preset in the working window, and the third sensing signal is generated when the pointer drags the item to the trigger position. In the second operating mode, the third sensing signal is generated when the pointer stops dragging the item. In the third operating mode, the third sensing signal is generated when the pointer changes the direction in which the item is dragged. Therefore, the processing module continuously receives the first, second and third sensing signals to open the user interface corresponding to the item, wherein the user interface is adjacent to the pointer. Further details of the first, second and third operating modes are disclosed in the above first, second and third embodiments and, thus, are not repeated herein.
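The three operating modes can be summarized in one hypothetical dispatch function; the mode names and the 0.5-second dwell threshold for "stop dragging" are assumptions for illustration, not values from the disclosure.

```python
def third_signal_condition(mode, *, drop_on_trigger=False,
                           idle_time=0.0, first_dir=None, second_dir=None):
    """Illustrative dispatch of the three operating modes described above.
    The 0.5-second dwell threshold for 'stop dragging' is an assumption."""
    if mode == "trigger_position":   # first operating mode
        return drop_on_trigger
    if mode == "stop_dragging":      # second operating mode
        return idle_time >= 0.5
    if mode == "change_direction":   # third operating mode
        dot = first_dir[0] * second_dir[0] + first_dir[1] * second_dir[1]
        return dot < 0               # included angle larger than 90 degrees
    raise ValueError(f"unknown mode: {mode}")
```

Whichever mode is active, a `True` result corresponds to the user interface module generating the third sensing signal.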
The foresaid method may take the form of a computer program product on a computer-readable storage medium having computer-readable instructions embodied in the medium. Any suitable storage medium may be used including non-volatile memory such as read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), and electrically erasable programmable read only memory (EEPROM) devices; volatile memory such as SRAM, DRAM, and DDR-RAM; optical storage devices such as CD-ROMs and DVD-ROMs; and magnetic storage devices such as hard disk drives and floppy disk drives.
In view of the above, technical advantages are generally achieved by one or more embodiments of the present invention, as follows:
1. The user can intuitively open the user interface corresponding to the item by means of dragging this selected item; and
2. The user can intuitively drag the item to the working window and then open the user interface corresponding to the item by means of dragging the item to the trigger position, stopping dragging the item or changing the direction for dragging the item.
The reader's attention is directed to all papers and documents which are filed concurrently with this specification and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All the features disclosed in this specification (including any accompanying claims, abstract, and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. §112, 6th paragraph. In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. §112, 6th paragraph.
This application claims priority to U.S. provisional Application Ser. No. 61/164,918, filed Mar. 31, 2009, which is herein incorporated by reference.