The present invention relates to a joystick, and more particularly, to a joystick capable of outputting different control commands resulting from an edge pressing operation and a laterally shifting operation.
A conventional mechanical joystick includes a sensor, a trackball, a lever arm and a handle. A user presses the handle to move the lever arm, the lever arm can be inclined and rotated via the trackball, and the sensor detects motion of the trackball to control a cursor signal output by the mechanical joystick. The handle is made of solid material and can be pushed and pulled to recline the lever arm for generating the cursor signal. While the mechanical joystick is reclined, the lever arm can be rotated or slanted toward specially designated directions, and the trackball recovers the lever arm via a spring. Therefore, the conventional mechanical joystick can only be operated by limited gestures due to the designated directions, and may easily suffer mechanical fatigue from long-term usage. An advanced joystick may replace the lever arm with a resilient structure; however, the advanced joystick still cannot distinguish an edge pressing operation from a laterally shifting operation.
The present invention provides a joystick capable of outputting different control commands resulting from an edge pressing operation and a laterally shifting operation for solving the above drawbacks.
According to the claimed invention, a joystick includes a first structural component, a second structural component, a light emitter, an optical sensor and a processor. The second structural component is assembled with the first structural component to form a chamber. The light emitter is disposed inside the chamber for illuminating one surface of the second structural component. The optical sensor is disposed inside the chamber for capturing the illuminated surface of the second structural component. The processor is electrically connected to the optical sensor and adapted to analyze an intensity distribution of the illuminated surface for determining if the joystick is obliquely pressed.
According to the claimed invention, a joystick includes a first structural component, a second structural component, an optical sensor and a processor. The second structural component is assembled with the first structural component to form a chamber, and an identification element is formed on the second structural component. The optical sensor is disposed inside the chamber and adapted to acquire a state of the identification element. The processor is electrically connected to the optical sensor and adapted to analyze the state of the identification element for generating a control signal. The first structural component and the second structural component are made of rigid material.
The joystick of the present invention can preset the luminous regions projected onto the illuminated surface and analyze the intensity distribution of the illuminated surface, which is affected by the distribution variation of the luminous regions, to determine, in a relative manner, whether the joystick is obliquely pressed or laterally shifted. Because the arrangement of the light emitter is predefined, the light emitter of the joystick not only illuminates the identification elements so that the optical sensor can capture a clear image of the identification spots, but also projects the luminous regions used to distinguish the laterally shifting operation from the obliquely pressing operation.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Please refer to the accompanying figures.
The joystick 10 can include a first structural component 12, a second structural component 14, a light emitter 16, an optical sensor 18 and a processor 20. The first structural component 12 and the second structural component 14 can be movably assembled with each other to form a chamber 21. In this embodiment, the second structural component 14 is disposed above the first structural component 12; however, the first structural component 12 may instead be disposed inside and covered by the second structural component 14. The first structural component 12 can be separated from the second structural component 14, as shown in the figures.
In the first embodiment, the first structural component 12 can be a base of the joystick 10 and the second structural component 14 can be a cover of the joystick 10; the cover protects the base and can be pushed, pulled, pressed and twisted in many ways to generate a variety of control commands. In another possible embodiment, the first structural component 12 may be the cover and the second structural component 14 may be the base, which means the light emitter 16, the optical sensor 18 and the processor 20 can be disposed on the cover of the joystick 10 for providing the same function as mentioned above.
The joystick 10 may have several light emitters 16 respectively disposed around the optical sensor 18 for uniformly illuminating the surface 22 or projecting some luminous regions onto the surface 22. The plurality of light emitters 16 may emit light of different wavelengths, so the optical sensor 18 can easily identify different luminous regions by the colors of the luminous regions formed in the captured image. The arrangement of the light emitters 16 may project light onto the surface 22 to form at least one luminous region P. A plurality of luminous regions P emitted by the light emitters 16 may partly overlap, and each luminous region P may vary in dimension and position in accordance with motion between the first structural component 12 and the second structural component 14. In addition, a plurality of identification elements 24 can be formed on the surface 22 of the second structural component 14 as identification spots Q. The plurality of identification spots Q can appear as dark identification spots or luminous identification spots within the identification image.
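As an illustration of how differently colored luminous regions could be told apart in a single captured RGB frame, a minimal sketch follows; the channel-dominance rule, the brightness threshold and the function name are assumptions made for this example, not part of the claimed design.

```python
import numpy as np

BRIGHT_LEVEL = 200   # assumed gray-level threshold for "illuminated" pixels

def split_regions_by_color(rgb_image: np.ndarray):
    """Assign bright pixels to the emitter whose color channel dominates them.

    rgb_image: H x W x 3 array; returns one boolean mask per assumed emitter color.
    """
    brightness = rgb_image.max(axis=2)
    bright = brightness > BRIGHT_LEVEL
    dominant = rgb_image.argmax(axis=2)          # 0 = red, 1 = green, 2 = blue
    return {
        "red":   bright & (dominant == 0),
        "green": bright & (dominant == 1),
        "blue":  bright & (dominant == 2),
    }

# Usage sketch: masks = split_regions_by_color(frame); masks["red"] then marks the
# luminous region projected by a red emitter in this frame.
```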
When the surface 22 is moved along a direction perpendicular to the normal direction of the optical sensor 18 (i.e. the joystick 10 is shifted), positions of the identification spots Q in the captured image change accordingly, but positions of the luminous regions P in the captured image do not change.
The luminous region P and the identification spot Q can be differentiated via specific qualities of the captured image. When the identification element 24 is made of a material capable of absorbing the light emitted from the light emitter 16, pixels within the captured image having intensity larger than an intensity threshold and an amount greater than an amount threshold can be regarded as the luminous region P, and other pixels within the captured image having intensity within an intensity scope and an amount within an amount scope can be regarded as the identification spot Q. For example, pixels with intensity larger than gray level 200 can be set as the luminous region P, and pixels with intensity ranging from gray level 30 to gray level 50 and an amount between 2 and 15 can be set as the identification spot Q. The identification element 24 is used to magnify the intensity difference between the luminous region P and the identification spot Q.
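The thresholding scheme in the example above can be illustrated with a short sketch. The gray levels (200, 30 to 50) and the amount scope (2 to 15) come from the preceding paragraph; the amount threshold for the luminous region P, the connected-component grouping and the function names are assumptions for illustration only.

```python
import numpy as np
from scipy import ndimage  # used here for connected-component grouping

# Thresholds taken from the example above; the amount threshold for P is assumed.
LUMINOUS_LEVEL = 200        # pixels brighter than this may belong to a luminous region P
LUMINOUS_MIN_COUNT = 16     # assumed amount threshold for one luminous region P
SPOT_RANGE = (30, 50)       # intensity scope of a dark identification spot Q
SPOT_COUNT_SCOPE = (2, 15)  # amount scope (pixel count) of one identification spot Q

def _centroids(mask: np.ndarray, min_count: int, max_count=None):
    """Centroids (x, y) of connected pixel groups whose size fits the given scope."""
    labels, n = ndimage.label(mask)
    found = []
    for label in range(1, n + 1):
        ys, xs = np.nonzero(labels == label)
        if len(xs) >= min_count and (max_count is None or len(xs) <= max_count):
            found.append((float(xs.mean()), float(ys.mean())))
    return found

def classify_regions(image: np.ndarray):
    """Split a captured gray-level image into luminous regions P and identification spots Q."""
    regions = _centroids(image > LUMINOUS_LEVEL, LUMINOUS_MIN_COUNT)
    spots = _centroids((image >= SPOT_RANGE[0]) & (image <= SPOT_RANGE[1]),
                       SPOT_COUNT_SCOPE[0], SPOT_COUNT_SCOPE[1])
    return regions, spots
```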
In addition, the joystick 10 may switch the light emitter 16 to alternately emit light with different wavelengths for differentiating the luminous region P and the identification spot Q. For a start, the light emitter 16 can emit light having a first wavelength, such as red light, and the optical sensor 18 receives the red light image, so the luminous region P can be identified. Then, the light emitter 16 can be switched to emit light having a second wavelength, such as blue light; because the identification element 24 is made of a material capable of absorbing the blue light, the optical sensor 18 can capture an image showing the dark identification spot Q without interference from the luminous region P.
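A minimal sketch of this time-multiplexed illumination follows, assuming hypothetical `set_wavelength` and `capture` driver calls that the patent does not specify.

```python
def acquire_frames(emitter, sensor):
    """Time-multiplexed illumination: one frame per wavelength.

    `emitter.set_wavelength` and `sensor.capture` are hypothetical driver calls;
    the patent text does not specify a hardware interface.
    """
    emitter.set_wavelength("red")    # luminous regions P are located in this frame
    red_frame = sensor.capture()
    emitter.set_wavelength("blue")   # identification elements absorb blue, so Q appears dark
    blue_frame = sensor.capture()
    return red_frame, blue_frame
```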
When the joystick 10 is not pressed and not shifted, the luminous regions P emitted by the light emitters 16 may be projected onto an area within an identification image I1 captured by the optical sensor 18, and the identification spots Q can be located in the middle of the area, as shown in the figures.
At least one of the position change, brightness change and shape change of the luminous regions P on the illuminated surface 22 can be analyzed for determining if the joystick 10 is obliquely pressed. The distribution of the identification spots Q (which corresponds to the state of the plurality of identification elements 24) can be captured by the optical sensor 18 and used to determine a category, a direction and a scale of motion between the first structural component 12 and the second structural component 14. However, the distribution variation of the identification spots Q resulting from the laterally shifted joystick 10 is similar to the distribution variation of the identification spots Q resulting from the obliquely pressed joystick 10; therefore, the position change or the shape change of the luminous regions P (which corresponds to the intensity distribution of the illuminated surface 22) can be used to distinguish the laterally shifting operation from the obliquely pressing operation. A slanting check design pattern containing the luminous regions P can be projected onto the illuminated surface 22 for this purpose, as shown in the figures.
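The decision rule described above might be sketched as follows, assuming the luminous-region and identification-spot centroids have already been extracted and matched in the same order between a reference frame and the current frame; the tolerance values are illustrative assumptions.

```python
import numpy as np

POSITION_TOLERANCE = 2.0   # pixels; assumed tolerance for "motionless" luminous regions P
SHIFT_TOLERANCE = 1.0      # pixels; assumed tolerance for motion of identification spots Q

def mean_displacement(before, after):
    """Average displacement between two equally ordered lists of centroids."""
    before, after = np.asarray(before), np.asarray(after)
    return float(np.mean(np.linalg.norm(after - before, axis=1)))

def classify_operation(regions_ref, regions_now, spots_ref, spots_now):
    """Return 'pressed', 'shifted' or 'idle' from luminous regions P and spots Q."""
    p_moved = mean_displacement(regions_ref, regions_now) > POSITION_TOLERANCE
    q_moved = mean_displacement(spots_ref, spots_now) > SHIFT_TOLERANCE
    if p_moved:
        return "pressed"   # intensity distribution changed: obliquely pressing operation
    if q_moved:
        return "shifted"   # P motionless but Q displaced: laterally shifting operation
    return "idle"
```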
The processor 20 can analyze the position change of the above-mentioned slanting check design pattern to identify whether the joystick 10 is obliquely pressed or laterally shifted, and then generate a plurality of candidate control commands. If the intensity distribution of the illuminated surface 22 conforms to a predefined condition, such as the slanting check design pattern containing the luminous regions P being motionless, the plurality of candidate control commands can be generated as possible operations of laterally shifting in a right direction, in a left direction, in an upper direction and in a lower direction; then, because the state of the identification elements 24 is changed and detected while the slanting check design pattern remains motionless, the processor 20 can analyze the distribution variation of the identification spots Q within the image to select a final control command from the plurality of candidate control commands. For example, the final control command corresponding to the identification image I1_1 can be selected in this manner.
If the intensity distribution of the illuminated surface 22 does not conform to the predefined condition, such as the slanting check design pattern containing the luminous regions P being moved and/or deformed, the plurality of candidate control commands can be generated as possible operations of obliquely pressing in the right direction, in the left direction, in the upper direction and in the lower direction. Because the state of the identification elements 24 is changed and detected while the slanting check design pattern is moved and/or deformed, the processor 20 can analyze the distribution variation of the identification spots Q within the image to select the final control command from the plurality of candidate control commands. For instance, the final control command corresponding to the identification image I2_1 can be selected in the same manner.
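The two-stage selection described in the last two paragraphs (candidate commands fixed by the state of the slanting check design pattern, final command chosen from the identification spots Q) might be sketched as follows; the direction names and the dominant-axis rule are assumptions made for illustration.

```python
import numpy as np

DIRECTIONS = ("right", "left", "upper", "lower")

def dominant_direction(spots_ref, spots_now):
    """Pick one of the four directions from the average displacement of the spots Q."""
    dx, dy = np.mean(np.asarray(spots_now) - np.asarray(spots_ref), axis=0)
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "lower" if dy > 0 else "upper"   # image y axis grows downward in this sketch

def select_command(operation, spots_ref, spots_now):
    """Stage 1 fixes the four candidate commands, stage 2 picks one from the spot motion."""
    if operation not in ("shifted", "pressed"):
        return None
    candidates = [f"{operation} {d}" for d in DIRECTIONS]   # the candidate control commands
    final = f"{operation} {dominant_direction(spots_ref, spots_now)}"
    assert final in candidates
    return final
```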
Please refer to the figures illustrating the second embodiment and the third embodiment of the joystick.
In the second embodiment, the first structural component 32 is the base of the joystick, and the second structural component 34 is the cover of the joystick; the second structural component 34 can be pressed, pushed and pulled by the user to move relative to the first structural component 32. In the third embodiment, the first structural component 32 is the cover of the joystick, and the second structural component 34 is the base of the joystick; the first structural component 32 can be pressed, pushed and pulled by the user to move relative to the second structural component 34. In addition, the first structural component 32 and the second structural component 34 can be made of rigid material.
In conclusion, the joystick of the present invention can preset the luminous regions projected onto the illuminated surface and analyze the intensity distribution of the illuminated surface, which is affected by the distribution variation of the luminous regions, to determine, in a relative manner, whether the joystick is obliquely pressed or laterally shifted. Because the arrangement of the light emitter is predefined, the light emitter of the joystick not only illuminates the identification elements so that the optical sensor can capture a clear image of the identification spots, but also projects the luminous regions used to distinguish the laterally shifting operation from the obliquely pressing operation.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
This application is a continuation application of U.S. patent application Ser. No. 16/395,226, filed on Apr. 25, 2019, which is a continuation-in-part of U.S. application Ser. No. 15/681,415, filed on Aug. 20, 2017.