This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2018-221259, filed on Nov. 27, 2018, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an input device.
In various applications, input devices are used to convert a user's intentions into electrical signals. In the related art, input devices have been mechanical switches or buttons, and in recent years, electronic input devices such as touch panels have been provided.
A touch panel alone cannot distinguish whether or not an input is intended by a user. Further, when a capacitance switch is used, water drops may be erroneously detected as inputs.
Further, in the related art, it may be said that the touch panel is optimized for fine input because of its high resolution. However, when the touch panel is operated during driving, the user cannot watch the touch panel, and it is therefore difficult for the user to perform fine input. In such a case, an input device suitable for detecting an intuitive or rough input using a palm, a finger, an arm, or the like is desired.
Some embodiments of the present disclosure provide an input device.
According to one embodiment of the present disclosure, there is provided an input device. The input device includes: a housing including a transparent base; a touch sensor installed on the transparent base; and a camera installed inside the housing to monitor an outside of the housing via the transparent base.
Further, arbitrarily combining the foregoing components or substituting the expressions of the present disclosure with each other among a method, an apparatus, a system, and the like is also effective as an embodiment of the present disclosure.
An embodiment of the present disclosure relates to an input device. The input device includes a housing including a transparent base, a touch sensor installed on the base, and a camera which is installed inside the housing and monitors the outside via the base.
According to the present embodiment, it is possible to provide an operation input, which cannot be provided by an input device of the related art, by combining a camera image and an output of the touch sensor.
The input device may further include a processing part for receiving an output of the camera and the output of the touch sensor. Alternatively, the processing by the processing part may be externally performed.
The input device may be used by a predetermined user. In this case, the processing part may determine, based on the output of the camera, whether or not a current input to the touch sensor is provided by the predetermined user. Thus, an input by anyone other than the predetermined user may be invalidated.
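As a minimal sketch of such gating, a touch event might be accepted only when the camera confirms the registered user. The function names, the stubbed camera frame, and the simple "registered user" model below are illustrative assumptions, not the disclosed implementation.

```python
def identify_user(camera_frame: dict) -> str:
    """Hypothetical recognizer: returns a user ID derived from a camera
    frame. The frame is stubbed here as a dict carrying a precomputed ID."""
    return camera_frame.get("user_id", "unknown")

def accept_touch(camera_frame: dict, touch_detected: bool,
                 registered_user: str = "driver") -> bool:
    """Validate a touch-sensor event only if the camera simultaneously
    sees the predetermined (registered) user."""
    if not touch_detected:
        return False
    return identify_user(camera_frame) == registered_user
```

With this gating, a touch reported while an unregistered user is in view is simply discarded.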
The processing part may determine a direction of the user's palm based on the output of the camera. In so doing, it can be distinguished whether the user's palm is directed to the input device, the back of the user's hand is directed to the input device, or the user's palm is directed vertically to the input device, thereby enabling gesture input using the direction of the hand. Alternatively, only a specific state may be treated as a valid input.
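The three palm-direction states named above can be sketched as a small classifier. The visible-landmark heuristic (whether the palm surface or the knuckles appear in the camera image) is an illustrative assumption.

```python
from enum import Enum

class HandPose(Enum):
    PALM = "palm_toward_device"   # palm directed to the input device
    BACK = "back_toward_device"   # back of the hand directed to it
    EDGE = "palm_vertical"        # palm directed vertically (edge-on)

def classify_hand_pose(palm_visible: bool, knuckles_visible: bool) -> HandPose:
    """Map camera-derived visibility flags to one of the three pose states."""
    if palm_visible and not knuckles_visible:
        return HandPose.PALM
    if knuckles_visible and not palm_visible:
        return HandPose.BACK
    return HandPose.EDGE
```

A gesture layer could then treat only one state, say `HandPose.PALM`, as a valid input.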
The processing part may determine a shape of the user's hand based on the output of the camera. Examples of variations of the hand shape include rock, scissors, paper, hand signs, and shapes that differ in the number of bent fingers (or the number of stretched fingers).
The input device may be used by a plurality of users accessing it from different directions. The processing part may determine which user is providing the current input to the touch sensor based on the output of the camera. For example, when two users face each other across the input device, it is possible to determine whose input is being applied to the touch sensor by identifying that user's hand in the camera image.
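For the two-users-facing-each-other case, attribution might reduce to checking which side of the camera frame the hand enters from. The frame-edge heuristic and the normalized coordinate below are illustrative assumptions.

```python
def attribute_input(hand_entry_y: float, frame_height: float = 1.0) -> str:
    """Return which of two opposing users a detected hand belongs to,
    based on where the wrist crosses the edge of the camera frame
    (normalized y coordinate: 0.0 = far edge, frame_height = near edge)."""
    return "user_far" if hand_entry_y < frame_height / 2 else "user_near"
```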
The base may be flat. The base may have a curved surface.
The present disclosure will now be described with reference to the drawings based on an exemplary embodiment. Like or equivalent components, members, and processes illustrated in each drawing are given like reference numerals and a repeated description thereof will be properly omitted. Further, the embodiment is presented by way of example only, and is not intended to limit the present disclosure, and any feature or combination thereof described in the embodiment may not necessarily be essential to the present disclosure.
In the present disclosure, “a state where a member A is connected to a member B” includes a case where the member A and the member B are physically directly connected or even a case where the member A and the member B are indirectly connected through any other member that does not substantially affect an electrical connection state between the members A and B or does not impair functions and effects achieved by combinations of the members A and B.
Similarly, “a state where a member C is installed between a member A and a member B” includes a case where the member A and the member C or the member B and the member C are indirectly connected through any other member that does not affect an electrical connection state between the member A and the member C or the member B and the member C or does not impair function and effects achieved by combinations of the member A and the member C or the member B and the member C, in addition to a case where the member A and the member C or the member B and the member C are directly connected.
The touch sensor 120 is installed on the base 112. The touch sensor 120 may be a capacitance switch in some embodiments, and in the present disclosure, the touch sensor 120 may include a non-contact type proximity sensor.
The input device 100 monitors a state of a hand 2 or finger of the user to determine an occurrence of effective input and a type of the input.
The camera 130 is installed inside the housing 110 and monitors the outside of the housing 110 via the base 112. Therefore, the base 112 and the touch sensor 120 are transparent at least in a wavelength region where the camera 130 has sensitivity.
The processing part 140 receives the output of the camera 130 and the output of the touch sensor 120, integrally processes them, and determines whether there is an input by the user, a type of the input, which user is providing an input when there are a plurality of users, and the like. The processing part 140 may be, for example, a processor.
The basic configuration of the input device 100 has been described above. According to the input device 100, it is possible to provide a novel or new operation input, which could not be provided by the input device of the related art, by combining the image of the camera 130 and the output of the touch sensor 120. Furthermore, it is possible to discriminate a change in capacitance due to adhesion of water droplets or dust and a change in capacitance due to user's input based on the image of the camera.
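The droplet discrimination described above can be sketched as a simple sensor-fusion rule: a capacitance change is accepted only when the camera confirms a hand. The threshold value and the boolean hand detector are illustrative assumptions.

```python
def is_valid_input(capacitance_delta: float, hand_in_view: bool,
                   threshold: float = 0.5) -> bool:
    """Accept a touch-sensor event only when the camera confirms a hand.

    A water droplet or dust may raise capacitance_delta above the
    threshold, but with hand_in_view False the event is discarded."""
    return capacitance_delta > threshold and hand_in_view
```

The same rule rejects spurious capacitance changes from condensation while passing genuine touches unchanged.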
Hereinafter, a specific usage mode of the input device 100 will be described.
For example, when only an input by the palm is permitted, the output of the touch sensor 120 may be validated only when the palm is detected by the camera 130.
For example, consider a case where the input device 100 according to the embodiment is used as an interface of an automobile and is mounted on the center console between a driver seat and a passenger seat. In this case, by determining whether the right hand or the left hand is in use based on the camera image, it is possible to distinguish whether the current input is by the driver or a passenger. Thus, it is possible to perform control such as permitting input only by a specific person (for example, only by the driver) and excluding input by other passengers. Of course, conversely, input by the driver may be excluded and only input by other passengers may be permitted.
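The driver/passenger distinction above can be sketched as a hand-to-seat mapping. The right-hand-drive layout (driver on the right, reaching the console with the left hand) is assumed here purely for illustration; the mapping would be mirrored for a left-hand-drive vehicle.

```python
def seat_of_hand(hand: str, layout: str = "right_hand_drive") -> str:
    """Map a camera-detected hand ('left' or 'right') to the seat it
    most plausibly came from, given the vehicle layout."""
    if layout == "right_hand_drive":
        return "driver" if hand == "left" else "passenger"
    return "driver" if hand == "right" else "passenger"

def permit(hand: str, allowed_seat: str = "driver") -> bool:
    """Permit console input only from the allowed seat."""
    return seat_of_hand(hand) == allowed_seat
```

Swapping `allowed_seat` to `"passenger"` implements the converse policy of excluding the driver.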
The size and arrangement of the electrodes may be determined according to an assumed user's input part (fingertip, palm, or the like).
(Applications)
The input device 100 has various applications, but may be mounted on, for example, an automobile.
Another example of the installation location of the input device 100B is a console between the driver seat and the passenger seat. An operation required for driving, and an operation of an audio system, a car navigation system, an air conditioner, or the like may be accepted by the input device 100B.
The present disclosure has been described above based on the embodiments. The present embodiments are presented by way of example, and it is understood by a person skilled in the art that various modifications may be made in combinations of each component and each processing process, and such modifications are also within the scope of the present disclosure. Hereinafter, the modifications will be described.
(Modification 1)
In the description so far, the base 112 is rectangular and flat, but its shape is not limited thereto.
(Modification 2)
(Modification 3)
(Modification 4)
In the embodiments, the touch sensor 120 of capacitance type is used, but is not limited thereto. When the input is limited to contact, a resistive film sensor may be employed as the touch sensor 120.
(Modification 5)
The application of the input device 100 is not limited to the automobile. For example, the input device 100 is also suitable as an operation interface for medical equipment, game equipment, industrial machines, and industrial vehicles.
(Modification 6)
In the embodiments, the processing part 140 is incorporated in the input device 100, but the present disclosure is not limited thereto, and the processing in the processing part 140 may be assigned to an external processor. In this case, an interface circuit with the external processor may be installed instead of the processing part 140 so that the image data of the camera and the output of the touch sensor 120 can be supplied from the interface circuit to the external processor.
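When processing is offloaded in this way, the interface circuit's role reduces to packaging the camera image data and the touch-sensor output for the external processor. The JSON message layout below is an illustrative assumption (the image payload is represented only by its length, for brevity).

```python
import json

def package_sensor_data(frame_id: int, touch_value: float,
                        image_bytes_len: int) -> str:
    """Serialize one combined sample (camera frame metadata plus
    touch-sensor reading) for transfer to an external processor."""
    return json.dumps({
        "frame": frame_id,
        "touch": touch_value,
        "image_len": image_bytes_len,
    })
```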
According to the present disclosure in some embodiments, it is possible to provide an input device suitable for detecting an intuitive or rough input using a palm, a finger, an arm, or the like.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosures. Indeed, the embodiments described herein may be embodied in a variety of other forms. Furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the disclosures. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the disclosures.
Number | Date | Country | Kind |
---|---|---|---|
2018-221259 | Nov 2018 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
5483261 | Yasutake | Jan 1996 | A |
7420155 | Mizota | Sep 2008 | B2 |
7707001 | Obinata | Apr 2010 | B2 |
20070222766 | Bolender | Sep 2007 | A1 |
20080211779 | Pryor | Sep 2008 | A1 |
20080218515 | Fukushima | Sep 2008 | A1 |
20090267921 | Pryor | Oct 2009 | A1 |
20110181545 | Takahashi | Jul 2011 | A1 |
20120287081 | Akai | Nov 2012 | A1 |
20130030815 | Madhvanath | Jan 2013 | A1 |
20130147743 | Ludwig | Jun 2013 | A1 |
20140201674 | Holz | Jul 2014 | A1 |
20150082897 | Kim | Mar 2015 | A1 |
20160259473 | Kim | Sep 2016 | A1 |
20180300051 | Kim | Oct 2018 | A1 |
20190311190 | Wang | Oct 2019 | A1 |
20200257373 | Jeon | Aug 2020 | A1 |
20210229555 | Salahat | Jul 2021 | A1 |
Number | Date | Country |
---|---|---|
2000040147 | Feb 2000 | JP |
2003216321 | Jul 2003 | JP |
2009252105 | Oct 2009 | JP |
2014209336 | Nov 2014 | JP |
2015512540 | Apr 2015 | JP |
2015092422 | May 2015 | JP |
2016051436 | Apr 2016 | JP |
2016511488 | Apr 2016 | JP |
2017102770 | Jun 2017 | JP |
Entry |
---|
JPO Notice of Reasons for Refusal corresponding to JP Application No. 2018-221259, dated Sep. 20, 2022. |
JPO Notice of Reasons for Refusal corresponding to JP Application No. 2018-221259; dated Mar. 14, 2023. |
Number | Date | Country
---|---|---
20200167035 A1 | May 2020 | US