The following description relates to an input device, and more particularly to a direction input device.
Using an input device, a user is able to manipulate an object displayed on a screen of an electronic device. For example, the user may change a location or direction of a mouse pointer displayed on the screen. Examples of the input device include a mouse, a joystick, a trackball, a touch pad, a track pad, and the like. The mouse is the most commonly used input device. However, a surface is indispensable when using a mouse, so it is difficult to use the mouse in a mobile environment. In addition, because the surface needs to be large and it is inconvenient to use the mouse on a small desk, the work space should be large enough to use the mouse freely.
In the mobile environment, a touch pad and a track pad are commonly used. The two devices are convenient to use, but an incorrect input may occur due to an unintended touch by a user, or even a correct input may not be sensed due to static electricity. For these reasons, people who draw detailed pictures or diagrams, or who perform sensitive tasks requiring precise control, often prefer using a mouse rather than touch input.
Using a joystick or trackball makes it relatively easy to input a direction but inconvenient to control a moving distance; thus, similarly to touch input, it is inappropriate for precision-oriented tasks, for example, drawing pictures or performing CAD work.
Among conventional devices, a joystick uses mechanical operations and a simple sensor and thus is inappropriate for detailed inputs; a mouse is inconvenient because a flat surface is essential and it is necessary to lift up and then put down the mouse in order to extend a moving distance; and a track pad is hard to control with a precise movement due to a varying degree of friction between fingers.
According to an exemplary embodiment, an input device and a method for operating a user interface using the same are proposed which, unlike a mouse, require no surface to input and control a direction or a distance and are not influenced by finger friction or static electricity generated by a touch.
In one general aspect, there is provided a direction input device including: a pad unit configured to comprise a marked surface formed on one side thereof and having marks of different codes, or to be integrated with the marked surface; an optical unit physically connected to the pad unit in a direction toward the marked surface and configured to irradiate light from a light source onto the marked surface of the pad unit, to sense, by using a sensor, light reflected from a specific mark on the marked surface of the pad unit, and to convert the reflected light into an image signal; and a connecting unit configured to connect the pad unit and the optical unit.
In another general aspect, there is provided a method for operating a user interface using a direction input device, the method including: receiving, by a pad of a pad unit, generated light from a light source; in response to a user's force being applied, moving, by a marked surface of the pad unit and an optical unit, in a relative direction so that a specific mark on the marked surface reflects light received from the light source; sensing, by a sensor of the optical unit, the light reflected from the specific mark on the marked surface and converting the reflected light into an image signal; and calculating input parameters, including a user input direction and distance information, by analyzing the image signal that is converted by the sensor.
According to an exemplary embodiment, the input device is portable and convenient to use. That is, it does not need a surface for support, unlike a mouse, and integrating the pad unit and the optical unit as one body allows mobile use in a three-dimensional (3D) space.
In addition, the present disclosure enables precise inputs. That is, unlike a touch pad, the present disclosure is able to precisely respond according to a magnitude of an input signal, without being influenced by finger friction or static electricity generated by a touch.
Further, the input device and its required space may be more compact. Even though mice have become smaller, a mouse requires sufficient space in which to move freely, so an area larger than the mouse itself is necessary; with the present disclosure, however, it is possible to manufacture a compact direction input device.
The invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. In the following description, well-known functions or constructions are not described in detail since they would obscure the invention in unnecessary detail. The terms used herein are defined in consideration of the functions of elements in the present invention. The terms can be changed according to the intentions or the customs of a user and an operator. Therefore, definitions of the terms should be made on the basis of the overall context.
An input device 1 is a pointing device that enables user manipulation of an object displayed on a screen of an electronic device. The object includes a mouse pointer displayed on the screen. The input device 1 may be in a portable form, in a form separate from the electronic device, or in a form embedded in a portable electronic device. The electronic device may change the direction or location of an object displayed on a screen by receiving input parameters, including a magnitude, a direction, a speed, and distance information of an input, from the input device 1. The electronic device includes any device with a display function, for example, any kind of computer, a Personal Digital Assistant (PDA), a portable electronic device, a mobile phone, a smart phone, a notebook computer, and the like.
Referring to
The pad 100, the light source 120, and the sensor 122 are optically connected. Herein, an optical connection indicates any connection that allows light to reach a specific target through air alone, through a light guide member or medium, through a physical channel, or through a combination thereof.
The light source 120 irradiates light, and the irradiated light may include either or both of a visible ray and an invisible ray. One example of an invisible ray is an infrared ray. The light source 120 may be in the form of a light emitting diode (LED). Light irradiated from the light source 120 reaches the pad 100, and some of it may be reflected. At this point, below the pad 100, which is moved by an input object such as a user's fingertip or palm, the sensor 122, at a fixed location and detachable from the pad 100, may receive light reflected from the marked surface 100a of the pad.
As shown in
According to an exemplary embodiment, the optical unit 12 is fixed with respect to the marked surface 100a of the pad 100. Accordingly, the marked surface 100a moves relative to the optical unit 12 according to a force applied by a user's input from the user's fingertip or palm. According to another exemplary embodiment, the marked surface 100a may be fixed and the optical unit 12 may be configured to move. In this case, in response to a force of a user's input applied by a user's fingertip or palm, the optical unit 12 moves in a reverse direction relative to the marked surface 100a.
In response to the user's input, a specific mark among the marks formed on the marked surface 100a reflects light received from the light source 120 toward the sensor 122. Without the marked surface 100a, input control based on a finger-touch movement may be imprecise and inconsistent due to finger friction, and a long moving distance requires repeated finger touches. With the marked surface 100a as described in the present disclosure, however, the current relative location of the marked surface 100a may be identified, so it is possible to recognize an input that has moved relative to the optical unit 12 and to cause the input to continue at a speed corresponding to the magnitude of the vector moved in a specific direction; thereby, repeated touches are unnecessary.
The sensor 122 senses light reflected from the marked surface 100a of the pad 100; that is, the sensor 122 senses light reflected from a specific mark on the marked surface 100a and converts the reflected light into an electric signal. The sensor 122 may be an image sensor or a camera.
According to another exemplary embodiment of the present disclosure, lenses may further be included between the light source 120 and the pad 100 and between the pad 100 and the sensor 122. The lens between the light source 120 and the pad 100 collects light generated by the light source 120, and the lens between the pad 100 and the sensor 122 collects light reflected from the pad 100 and transfers the reflected light to the sensor 122.
Referring to
The input device 1a may be made in a portable form. For example, the input device 1a may be made in a stick form, such as the ballpoint-pen type shown in
As illustrated in
The connecting unit 14 connecting the pad unit 10 and the optical unit 12 may be, for example, a connection member for a joystick or a button. The connecting unit 14 may be configured to have two axes so as to enable horizontal and vertical movement, or may be configured to allow movement in a plane.
According to an exemplary embodiment, the pad unit 10 may be in the form of a button or a capsule. The pad unit 10 may move in a specific direction, such as a horizontal or a vertical direction, or rotate freely regardless of a specific direction.
As shown in
As the marked surface 100a formed on one surface of the pad unit 10, or the optical unit 12, is configured to move, it is possible to calculate not just a relative moving direction of the marked surface 100a but also a relative location thereof. That is, by calculating the direction and magnitude of movement that occurs relative to a center of the marked surface 100a in response to a user's input, it is possible to cause a vector input, such as a mouse-based vector input, to occur. A finger touch allows only measurement of a moving direction of an image by comparing previous and subsequent images, and the movement of the finger touch is not smooth due to friction; using the pad unit 10 with printed marks, however, makes it possible not just to identify a moving direction, as with a joystick, but also to precisely calculate the relative location and distance from a starting point and to make a user input smoother.
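The vector input described above can be illustrated with a minimal sketch. The grid coordinates, function names, and the assumption that decoding an image frame yields the (row, col) of the mark under the sensor are illustrative only and not part of the disclosure:

```python
# Hypothetical sketch: deriving a vector input from a decoded mark position.
# Assumes the marked surface is a grid of coded marks and that decoding an
# image frame yields the (row, col) of the mark currently under the sensor.

def displacement_vector(mark_pos, center_pos):
    """Return (dx, dy) of the marked surface relative to its starting center."""
    dx = mark_pos[0] - center_pos[0]
    dy = mark_pos[1] - center_pos[1]
    return dx, dy

def magnitude(dx, dy):
    """Euclidean length of the displacement, i.e. the input vector magnitude."""
    return (dx * dx + dy * dy) ** 0.5

# Example: the surface has moved 3 cells in one axis and 4 in the other.
dx, dy = displacement_vector((8, 9), (5, 5))
assert (dx, dy) == (3, 4)
assert magnitude(dx, dy) == 5.0
```

Unlike frame-to-frame touch tracking, the absolute mark codes make this displacement recoverable from a single frame.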
The difference between the input device 1b in
According to another exemplary embodiment of the present disclosure, the input device 1b includes a restoration component 16. The restoration component 16 may be formed between the pad 100 of the pad unit 10 and the connecting unit 14. In a case where no force is applied by a user, the restoration component 16 restores the relative locations of the pad unit 10 and the optical unit 12 to a starting point; the restoration component 16 may be, for example, a spring. The starting point may desirably be the center of the marked surface; however, it may be hard to adjust the starting point to the very center of the marked surface due to looseness of the restoration component 16, and thus the relative location may always be reset as the starting point whenever a user's force is not applied.
Referring to
As shown in
Meanwhile, as shown in
According to an exemplary embodiment, the pad unit 10 may be in the form of a joystick having a convex outer surface, as shown in
Referring to
Configurations of the pad 100, the light source 120, and the sensor 122 are described with reference to the above-described drawings, and thus, the following descriptions are provided mainly about the processor 150.
The processor 150 controls the light source 120 to irradiate light. In addition, the processor 150 calculates the current relative location of the pad 100 by analyzing an image signal acquired from the sensor 122 and calculating a location of a mark on the marked surface of the pad 100 at the time when the light is irradiated. The processor 150 also calculates a difference between a previously acquired relative location of the pad 100 and the current relative location of the pad 100, and calculates a moving speed based on the time required to move between the two locations. Then, the processor 150 determines input parameters, which include a magnitude, a speed, and a direction vector of an input, by using the calculated relative location and moving speed.
Using the marked surface of the pad 100, which moves relative to the optical unit in response to a user's input, the processor 150 may calculate an input vector value: the more distant a mark's location is from the starting point, the faster a constant input occurs, which is identical to moving a mouse quickly; and the closer a mark's location is to the starting point, the slower the input occurs, which is identical to slowly moving a mouse in the corresponding vector direction. That is, without constantly moving a mouse, or repeatedly lifting and putting it down to extend a moving distance, the present disclosure allows constant inputs in a corresponding direction through a movement in one direction, in the same manner as a joystick.
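This joystick-style rate control can be sketched as follows. The linear gain, the speed cap, and the function names are illustrative assumptions; the disclosure only requires that speed grow with distance from the starting point:

```python
# Hypothetical sketch: joystick-style rate control. While the marked surface
# is held away from its starting point, the pointer keeps moving in that
# direction, at a speed that grows with the displacement magnitude.

def pointer_velocity(dx, dy, gain=2.0, max_speed=40.0):
    """Map a displacement (in grid cells) to a pointer velocity (px/frame)."""
    dist = (dx * dx + dy * dy) ** 0.5
    if dist == 0:
        return 0.0, 0.0                   # at the starting point: no motion
    speed = min(gain * dist, max_speed)   # farther from center -> faster
    return speed * dx / dist, speed * dy / dist

# Example: displacement of (3, 4) cells -> distance 5 -> speed 10,
# split into components along the unit direction vector.
vx, vy = pointer_velocity(3, 4)
assert (vx, vy) == (6.0, 8.0)
```

Because the input persists while the surface is held off-center, no repeated lifting or touching is needed to cover a long distance.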
The difference between the input device 1 of the present disclosure and a joystick lies in the fact that the input device 1 is capable of precisely controlling a magnitude of a vector value or a moving speed according to a location. Although it is possible for a joystick to input a magnitude using a pressure sensor or a moving distance, this is less precise than using optical characteristics as described in the present disclosure. In addition, the input device 1 may determine an input speed or a magnitude of a direction vector according to the speed at which the marked surface has moved from the previous image (that is, at which its coordinates have changed). That is, the input device 1 may calculate an input vector for a corresponding direction (e.g., a moving speed of a mouse) based on the speed of movement from a starting point to the current coordinates of the marked surface and on a function value for the distance of the marked surface from the starting point.
The difference between the input device 1 of the present disclosure and a touch pad lies in the fact that, unlike the touch pad, the input device 1 is able to precisely respond according to the magnitude of an input signal without being influenced by finger friction or static electricity generated by a touch.
Referring to
In this case, as illustrated in
Then, when analyzing an image signal acquired by the sensor, it is possible to easily identify the location of a valid mark simply through a projection onto the X axis or the Y axis. The location of a valid mark indicates an area in which no three consecutive empty projections exist on either the X axis or the Y axis, and it is easy to read marks in the found area.
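The projection search above can be sketched as follows. The binary-image representation, the function names, and the exact gap threshold (at most two consecutive empty bins inside a span) are illustrative assumptions:

```python
# Hypothetical sketch: locating a valid mark by axis projections. The frame
# is a binary image; summing along rows and columns gives 1-D projections,
# and a valid mark region is a span with no run of three empty bins.

def projections(img):
    """Row and column sums of a binary image (list of equal-length rows)."""
    rows = [sum(r) for r in img]
    cols = [sum(c) for c in zip(*img)]
    return rows, cols

def valid_span(proj, max_gap=2):
    """First span of non-empty bins whose internal gaps are <= max_gap."""
    start = end = None
    gap = 0
    for i, v in enumerate(proj):
        if v:
            if start is None:
                start = i
            end = i
            gap = 0
        elif start is not None:
            gap += 1
            if gap > max_gap:
                break                # three empty bins: span has ended
    return start, end

img = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
rows, cols = projections(img)
assert valid_span(rows) == (1, 3)   # mark occupies rows 1-3
assert valid_span(cols) == (1, 3)   # and columns 1-3
```

Only the cells inside the found span then need to be decoded, which keeps the per-frame work small.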
Meanwhile, a binary value is used in this description, but a code may be designed in a more sophisticated manner if a brightness or color value is used. Various designs are possible according to the performance and characteristics of a sensor. A brightness value may be used as an absolute value or as a difference between relative values, or may be used by defining several levels within one mark.
With reference to
In order to read a 3×3 mark, the sensor 122 needs a resolution of at least 7×7 cells. Of course, high resolution is required to fully cover a corresponding area, but to read a 3×3 mark, the minimum distinguishable resolution is designed by taking into consideration any error on the boundary. For example, it is appropriate for the pixels in charge of one cell to be at least 3×3 in size, and it is desirable for the pixel size of the sensor to be (7×3)×(7×3)=441 or greater. In this case, the precision may be embodied by a grid with 30 rows×3 columns=90. Of course, if a smaller degree of input precision is acceptable, various modifications are possible, including reducing the mark size.
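The pixel-count estimate above can be checked with a short sketch. The function name and parameterization are illustrative; only the arithmetic (a 7×7 cell window at 3×3 pixels per cell) comes from the text:

```python
# Hypothetical sketch of the resolution estimate in the text: a 3x3 mark is
# read through a 7x7 cell window, and each cell should map to at least 3x3
# sensor pixels to tolerate error on the cell boundaries.

def min_sensor_pixels(window_cells=7, pixels_per_cell=3):
    """Minimum total pixel count for a square sensor covering the window."""
    side = window_cells * pixels_per_cell
    return side * side

assert min_sensor_pixels() == 441   # (7*3) * (7*3)
```

Shrinking the mark or the per-cell pixel budget trades input precision for a smaller, cheaper sensor.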
With reference to
Then, the processor 150 determines input parameters, which include magnitude, direction, speed, and distance information of the user's input, in 830 by analyzing the image signal converted by the sensor 122. According to an exemplary embodiment, the processor 150 calculates the current relative location of the pad 100 by analyzing an image signal acquired from the sensor 122 and calculating the location of the mark on the pad 100 that reflected the light, and then calculates a moving speed based on a difference between a previously acquired relative location and the current relative location of the pad 100 and on the time required to move between the two locations. In addition, the processor 150 determines a magnitude, a speed, and a direction vector of an input by using the calculated relative locations and moving speed.
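The parameter-determination step above can be sketched as follows. The dictionary layout, function name, and time step are illustrative assumptions; the inputs are the two relative locations and the elapsed time between frames:

```python
# Hypothetical sketch: turning two successive relative locations of the pad
# into input parameters (distance, speed, unit direction vector), as in the
# processor step described above.
import math

def input_parameters(prev_pos, cur_pos, dt):
    """Compute input parameters from consecutive relative locations."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    dist = math.hypot(dx, dy)                    # moved distance
    speed = dist / dt if dt > 0 else 0.0         # distance over elapsed time
    direction = (dx / dist, dy / dist) if dist else (0.0, 0.0)
    return {"distance": dist, "speed": speed, "direction": direction}

p = input_parameters((0, 0), (6, 8), dt=0.5)
assert p["distance"] == 10.0
assert p["speed"] == 20.0
assert p["direction"] == (0.6, 0.8)
```

These three values correspond to the magnitude, speed, and direction vector the electronic device consumes to move the on-screen object.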
Meanwhile, according to another exemplary embodiment of the present disclosure, reception of a user's input starts once the marked surface is pressed, and stops if the pressure on the marked surface is relieved or if the marked surface moves back to a starting point after the above-described process is performed.
It will be apparent to those skilled in the art that various modifications and variation can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---
10-2012-0052656 | May 2012 | KR | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---
PCT/KR2013/003458 | 4/23/2013 | WO | 00 |