INPUT DEVICE AND INPUT DEVICE CONTROL METHOD

Information

  • Patent Application Publication Number
    20240036678
  • Date Filed
    May 31, 2021
  • Date Published
    February 01, 2024
Abstract
An input device includes an optical detection mechanism that detects a position of a user's fingertip in an aerial-image display region in which an aerial image is displayed, and a control portion that controls the input device. In the input device, the aerial-image display region is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. When the region, within the display region of a button image displayed in the aerial-image display region, in which the control portion recognizes on the basis of the detection result of the detection mechanism that the image of the predetermined button was pointed at by the user is defined as a recognition region, the recognition region is narrower than the display region in this input device.
Description
TECHNICAL FIELD

The present invention relates to an input device for users to input information using their fingertips. The present invention also relates to a control method of such input devices.


BACKGROUND ART

Conventionally, automated transaction devices such as ATMs (Automated Teller Machines) including an aerial-image display device and a PIN (Personal Identification Number) display and input portion are known (see Patent Literature 1, for example). In the automated transaction device described in Patent Literature 1, the aerial-image display device includes an aerial-image forming mechanism and a display portion. The PIN display and input portion includes a PIN display portion and a PIN input portion. On the display portion, a keypad for inputting the PIN is displayed. The aerial-image forming mechanism projects the keypad displayed on the display portion into a space, forming it as an aerial image that is displayed on the PIN display portion.


In the automated transaction device in Patent Literature 1, the PIN input portion includes a detection mechanism which detects an operation performed by the user on the aerial image of the keypad displayed in the PIN display portion. The detection mechanism is, for example, an infrared sensor, a camera, or the like that detects a position of the user's fingertip in a plane containing the aerial image of the keypad displayed on the PIN display portion. In the automated transaction device of Patent Literature 1, the PIN can be input by the user sequentially moving his/her fingertip to predetermined positions on the aerial image of the keypad displayed on the PIN display portion. In other words, in this automated transaction device, the user can input a PIN by sequentially pointing a fingertip at a predetermined number on the keypad displayed as an aerial image on the PIN display portion.


CITATION LIST
Patent Literature



  • [Patent Literature 1] Japanese Unexamined Patent Application Publication No. 2020-134843



SUMMARY OF THE INVENTION
Problems to be Solved by the Invention

In the automated transaction device described in Patent Literature 1, if PIN input errors are repeated, subsequent procedures can no longer be performed in some cases. Therefore, it is desirable that this automated transaction device be less prone to PIN input errors.


Thus, an object of at least an embodiment of the present invention is to provide, in an input device for a user to input information by using an aerial image displayed in an aerial-image display region, an input device which can suppress input errors of information such as PINs. Moreover, an object of at least an embodiment of the present invention is to provide, for such an input device, a control method which can suppress input errors of information such as PINs.


Means for Solving the Problem

In order to solve the above problem, an input device of an aspect of the present invention is an input device for inputting information by using a user's fingertip, characterized by including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form it as an aerial image, an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, and a control portion for controlling the input device, in which the aerial-image display region is an input portion for inputting information, the aerial image includes images of a plurality of buttons for identifying information input in the input portion, and, when the region within the display region of a button image displayed in the aerial-image display region in which the control portion recognizes, on the basis of a detection result of the detection mechanism, that the image of a predetermined button was pointed at by the user is defined as a recognition region, the recognition region is narrower than the display region.


In the input device of this aspect, the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. Moreover, in this aspect, when the region within the display region of a button image displayed in the aerial-image display region in which the control portion recognizes, on the basis of a detection result of the detection mechanism, that the image of a predetermined button was pointed at by the user is defined as a recognition region, the recognition region is narrower than the display region. Thus, in this aspect, the control portion can be caused to recognize that the image of the predetermined button was pointed at by the user only when the user reliably pointed at the image of the intended button. Therefore, in the input device of this aspect, it becomes possible to suppress input errors of information such as PINs.


In this aspect, it is preferable that a center part of the display region be the recognition region. With this configuration, it becomes easier for the control portion to recognize that the image of a predetermined button was pointed at by the user only when the user reliably pointed at the image of the intended button.
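Purely as an illustrative sketch, and not as part of the disclosed device, the following Python code shows one way such a recognition-region check could be realized; the circular regions, the radius ratio, the units, and the function names are assumptions introduced here.

```python
from dataclasses import dataclass
import math


@dataclass
class ButtonImage:
    """Illustrative model of one button image in the aerial-image display region."""
    label: str
    cx: float                 # center x of the display region (assumed units: mm)
    cy: float                 # center y of the display region
    display_r: float          # radius of the (assumed circular) display region
    recog_ratio: float = 0.5  # recognition radius / display radius (assumed value)

    def in_recognition_region(self, x: float, y: float) -> bool:
        """True only if the detected fingertip position lies inside the narrower
        recognition region at the center of the display region."""
        return math.hypot(x - self.cx, y - self.cy) <= self.display_r * self.recog_ratio


def recognized_button(buttons, x, y):
    """Return the button whose recognition region contains the detected fingertip
    position, or None if the fingertip lies outside every recognition region
    (for example, when it straddles two adjacent button images)."""
    for b in buttons:
        if b.in_recognition_region(x, y):
            return b
    return None
```

Because the recognition radius is smaller than the display radius, a fingertip detected near the edge of a button image, or between two button images, produces no input in this sketch.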


In this aspect, the detection mechanism is, for example, a reflective sensor array.


In order to solve the above problem, an input device of another aspect of the present invention is an input device for inputting information by using a user's fingertip, characterized by including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form it as an aerial image, and an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, in which the aerial-image display region is an input portion for inputting information, the aerial image includes images of a plurality of buttons for identifying information input in the input portion, and, assuming that directions crossing each other in a plane including the aerial image are a first direction and a second direction, the detection mechanism includes a first detection mechanism detecting a position of the user's fingertip in the first direction and a second detection mechanism detecting a position of the user's fingertip in the second direction when each of the images of the plurality of buttons is pointed at by the user's fingertip.


In the input device of this aspect, the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. Moreover, in this aspect, assuming that directions crossing each other in a plane including the aerial image are a first direction and a second direction, the detection mechanism includes a first detection mechanism detecting a position of the user's fingertip in the first direction and a second detection mechanism detecting a position of the user's fingertip in the second direction when each of the images of the plurality of buttons is pointed at by the user's fingertip.


Thus, in this aspect, on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism (that is, on the basis of the detection results of the position in the two directions of the user's fingertip), it is possible to cause the input device to recognize that the image of the predetermined button was pointed by the user's fingertip, only when the user's fingertip reliably pointed at the intended button image. Therefore, in the input device of this aspect, it becomes possible to suppress input errors of information such as PINs.


In this aspect, the first and second detection mechanisms are preferably transmissive sensors having a plurality of light emitting portions and a plurality of light receiving portions. With this configuration, as compared with a case where the first detection mechanism and the second detection mechanism are reflective sensors, the position of the user's fingertip in the input portion can be identified more accurately on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism. Therefore, on the basis of these detection results, it becomes easier to cause the input device to recognize that the image of the predetermined button was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button.


In this aspect, the optical axis of the light emitted from the light emitting portion preferably passes through the center of the image of the button displayed in the aerial-image display region. With this configuration, on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism, it becomes easier to cause the input device to recognize that the image of the predetermined button was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button.


In this aspect, for example, the first direction and the second direction are orthogonal to each other, and images of the plurality of buttons displayed in the aerial-image display region are arranged in a matrix.


In this aspect, for example, the aerial image is an image of a keypad containing an image of a plurality of numeric buttons.


Moreover, in order to solve the above problem, a control method of an input device in a further aspect of the present invention is a control method of an input device including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form it as an aerial image, and an optical detection mechanism detecting a position of a user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, the aerial-image display region serving as an input portion for inputting information by using the user's fingertip, and the aerial image including images of a plurality of buttons for identifying information input in the input portion, in which it is recognized, on the basis of a detection result of the detection mechanism, that the image of the button was pointed at when the user points at a recognition region which lies within the display region of the button image displayed in the aerial-image display region and is narrower than the display region.


In this aspect, the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input at the input portion. Moreover, in the control method of the input device of this aspect, when the user points at a recognition region, which lies within the display region of the image of the button displayed in the aerial-image display region and is narrower than the display region, it is recognized on the basis of the detection result of the detection mechanism that the image of the button was pointed at. Thus, by controlling the input device by the control method of this aspect, it becomes possible to cause the input device to recognize that the image of the predetermined button was pointed at by the user only when the user reliably pointed at the image of the intended button. Therefore, by controlling the input device by the control method of this aspect, it becomes possible to suppress input errors of information such as PINs.


Effect of the Invention

As described above, in the present invention, in an input device for a user to input information using an aerial image displayed in the aerial-image display region, it becomes possible to suppress input errors of information such as PINs.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments will now be described, by way of example only, with reference to the accompanying drawings which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in several Figures, in which:



FIG. 1 is a schematic diagram for explaining a configuration of an input device according to Embodiment 1 of the present invention.



FIG. 2 is a block diagram for explaining a configuration of the input device shown in FIG. 1.



FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device used in the input device shown in FIG. 1.



FIG. 4A is a diagram showing an example of an aerial image displayed in the aerial-image display region shown in FIG. 1, and FIG. 4B is a diagram for explaining a display region of button images and a recognition region shown in FIG. 4A.



FIG. 5 is a schematic diagram for explaining a configuration of an input device according to Embodiment 2 of the present invention.



FIG. 6 is a schematic diagram for explaining a configuration of a detection mechanism according to a variation of Embodiment 2.





MODE FOR CARRYING OUT THE INVENTION

In the following, embodiments of the present invention will be described with reference to the drawings.


Embodiment 1

Configuration of Input Device



FIG. 1 is a schematic diagram for explaining a configuration of an input device 1 according to Embodiment 1 of the present invention. FIG. 2 is a block diagram for explaining a configuration of the input device 1 shown in FIG. 1. FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device 3 used in the input device 1 shown in FIG. 1. FIG. 4A is a diagram showing an example of an aerial image displayed in an aerial-image display region R shown in FIG. 1, and FIG. 4B is a diagram for explaining a display region DR of images of buttons 19 and a recognition region KR shown in FIG. 4A.


The input device 1 in this embodiment is a device for inputting information using a user's fingertip and is used, for example, in ATMs, authentication devices for credit card and other payments, automatic ticketing machines, vending machines, and access control devices. In the input device 1 of this embodiment, a PIN is input. The input device 1 has an aerial-image display device 3 which displays an aerial image in a three-dimensional space, an optical detection mechanism 4 for detecting a position of the user's fingertip in the aerial-image display region R, which is a region in which the aerial image is displayed, and an enclosure 5 in which the aerial-image display device 3 and the detection mechanism 4 are accommodated. Moreover, the input device 1 includes a control portion 8 for controlling the input device 1.


The aerial-image display device 3 has a display mechanism 11 having a display surface 11a for displaying images, and an aerial-image forming mechanism 12 that projects the image displayed on the display surface 11a into a space to form an image as an aerial image. The display mechanism 11 and the aerial-image forming mechanism 12 are accommodated in the enclosure 5. The aerial-image forming mechanism 12 has a beam splitter 13 and a retroreflective material 14. In the following explanation, a Y-direction in FIG. 3, which is orthogonal to an up-down direction (vertical direction), is referred to as a left-right direction, and a direction orthogonal to the up-down direction and the left-right direction is referred to as a front-back direction. In addition, an X1-direction side in FIG. 3, which is one side in the front-back direction, is assumed to be a “front” side, and an X2-direction side in FIG. 3, which is a side opposite to that, is assumed to be a “back” side. In this embodiment, a user standing on a front side of the input device 1 performs a predetermined operation on a front surface side of the input device 1.


The display mechanism 11 is, for example, a liquid crystal display or an organic EL (electroluminescent) display, and the display surface 11a is a display screen. The display surface 11a faces diagonally forward and downward. The beam splitter 13 is formed as a flat plate. The beam splitter 13 is disposed on the front side of the display mechanism 11. The beam splitter 13 reflects a part of light emitted from the display surface 11a. That is, a surface on one side of the beam splitter 13 is a reflective surface 13a that reflects a part of the light emitted from the display surface 11a. The reflective surface 13a faces diagonally rearward and downward.


The retroreflective material 14 is formed as a flat plate. The retroreflective material 14 is disposed on a lower side of the display mechanism 11 and on a rear side of the beam splitter 13. The light reflected by the beam splitter 13 is incident on the retroreflective material 14. The retroreflective material 14 reflects the incident light toward the beam splitter 13 in the same direction as the incident direction. In other words, the surface on the one side of the retroreflective material 14 is a retroreflective surface 14a on which the light reflected by the beam splitter 13 is incident and which reflects the incident light toward the beam splitter 13 in the same direction as the incident direction. A quarter-wavelength plate is attached to the retroreflective surface 14a. The retroreflective surface 14a faces diagonally forward and upward.


A part of the light emitted from the display surface 11a of the display mechanism 11 is reflected by the reflective surface 13a of the beam splitter 13 and enters the retroreflective surface 14a of the retroreflective material 14. The light reflected by the reflective surface 13a is directed diagonally rearward and downward. The light incident to the retroreflective surface 14a is reflected in the same direction as the incident direction of the light to the retroreflective surface 14a. The light reflected by the retroreflective surface 14a goes diagonally forward and upward and passes through the beam splitter 13. In this embodiment, an optical axis L1 of the light emitted from the display surface 11a and an optical axis L2 of the light reflected by the beam splitter 13 are orthogonal. The optical axis of the light reflected by the retroreflective material 14 matches the optical axis L2.


The light transmitted through the beam splitter 13 forms an aerial image in the aerial-image display region R. The aerial-image display region R is disposed diagonally forward and upward of the beam splitter 13. The aerial image formed in the aerial-image display region R is recognized by a user standing in front of the input device 1 as an image inclined downward as it moves toward the front side.


The enclosure 5 is formed, for example, in the shape of a cuboid. The enclosure 5 includes a frame body 17 that surrounds the aerial-image display region R. The frame body 17 is formed in a rectangular or square frame shape and has a flat plate shape. The frame body 17 constitutes a front upper surface of the enclosure 5. The frame body 17 is inclined downward as it goes toward the front side. An inner peripheral side of the frame body 17 is an opening portion 17a that leads to an inside of the enclosure 5. The opening portion 17a is formed in a rectangular or square shape. The aerial-image display region R is formed in the opening portion 17a. The aerial-image display region R serves as an input portion 18 for the user to input information using the fingertips. In this embodiment, the PIN is input in the input portion 18.


The detection mechanism 4 detects a position of the user's fingertip in the aerial-image display region R, as described above. In other words, the input portion 18 is included in a detection range of the detection mechanism 4. The detection mechanism 4 is a reflective sensor array. Specifically, the detection mechanism 4 is a reflective infrared sensor array including a plurality of light emitting portions that emit infrared light and a plurality of light receiving portions that receive the infrared light emitted from the light emitting portions and reflected by the user's fingertip. Moreover, the detection mechanism 4 is a line sensor in which the light emitting portion and the light receiving portion are arranged alternately and in a straight line. The detection mechanism 4 is disposed on the side of the opening portion 17a. The detection mechanism 4 detects the position of the user's fingertip in a plane containing the aerial image (that is, in the plane containing the input portion 18). Moreover, the detection mechanism 4 detects the position of the user's fingertip in the entire aerial-image display region R (that is, in the entire input portion 18).
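The publication does not specify how the readings of the reflective sensor array are converted into a two-dimensional fingertip position in the plane of the input portion 18. The sketch below is therefore only a hypothetical illustration, assuming that each emitter/receiver element of the line sensor reports a range reading into the input portion; the element pitch, the coordinate frame, and the sensor interface are assumptions introduced here.

```python
# Hypothetical mapping from reflective line-sensor readings to a fingertip
# position in the plane of the input portion 18. Assumed interface: ranges[i]
# is the distance (mm) measured by element i of the array into the region,
# or NO_TARGET when nothing reflects toward that element.

SENSOR_PITCH_MM = 10.0    # spacing of emitter/receiver elements along the array (assumed)
NO_TARGET = float("inf")  # reading reported when no reflection is received (assumed)


def fingertip_position(ranges):
    """Return (x, y) of the fingertip in the detection plane, or None."""
    if not ranges:
        return None
    nearest = min(range(len(ranges)), key=lambda i: ranges[i])
    if ranges[nearest] == NO_TARGET:
        return None                   # no fingertip in the detection plane
    x = nearest * SENSOR_PITCH_MM     # position along the sensor array
    y = ranges[nearest]               # distance from the array into the region
    return (x, y)
```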


When a PIN is to be input in the input portion 18, the display mechanism 11 displays an image of a keypad for inputting the PIN on the display surface 11a, and the aerial-image forming mechanism 12 displays the keypad image displayed on the display surface 11a as an aerial image in the aerial-image display region R (see FIG. 4A). In other words, the aerial image displayed in the aerial-image display region R when the PIN is to be input is the image of the keypad. The image of the keypad, which is an aerial image, contains images of a plurality of buttons 19 for identifying the information (that is, PIN) to be input in the input portion 18.


The images of the plurality of buttons 19 include images of a plurality of numeric buttons 19. Specifically, the images of the plurality of buttons 19 include images of 10 numeric buttons 19 from “0” to “9”. Moreover, the images of the plurality of buttons 19 include images of five non-numeric buttons 19. The images of the plurality of buttons 19 are arranged in a matrix. Specifically, the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns.


The user inputs the PIN by using the keypad image displayed in the aerial-image display region R. Specifically, the user inputs the PIN by sequentially moving the fingertip to a position of the images of the predetermined numeric buttons 19 displayed in the aerial-image display region R. In other words, the user inputs the PIN by sequentially pointing the fingertip at the image of the predetermined numeric button 19 in the input portion 18 (pointing operation).


The control portion 8 recognizes the image of the numeric button 19 pointed at in the pointing operation on the basis of the detection result of the detection mechanism 4 (that is, the detection result of the position of the user's fingertip). When the region, within the display region DR of the image of the button 19 displayed in the aerial-image display region R, in which the control portion 8 recognizes on the basis of the detection result of the detection mechanism 4 that the image of the predetermined button 19 was pointed at by the user's fingertip is defined as a recognition region KR, the recognition region KR is narrower than the display region DR in this embodiment (see FIG. 4B). In other words, the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the predetermined button 19 was pointed at when the user's fingertip points at the recognition region KR, which lies within the display region DR and is narrower than the display region DR.


For example, in this embodiment, the entire image of the button 19 surrounded by a solid-line circle is the display region DR, and the region surrounded by a broken-line circle within the image of the button 19 is the recognition region KR (see FIG. 4B). In other words, in this embodiment, the center part of the display region DR is the recognition region KR. The control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the predetermined button 19 was pointed at when the user points at the recognition region KR, that is, the region surrounded by the broken-line circle, in the input portion 18. The recognition region KR is set by application software installed in the control portion 8.
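For concreteness, the following Python sketch lays out the fifteen button images of this embodiment in the matrix of 5 rows and 3 columns and applies a center recognition region KR narrower than the display region DR, as in FIG. 4B. The button pitch, the radii, and the non-numeric labels are assumed values; in the actual device the recognition region KR is set by the application software of the control portion 8.

```python
import math

# Assumed geometry for illustration only.
PITCH = 30.0      # center-to-center spacing of button images (mm, assumed)
DISPLAY_R = 12.0  # radius of the solid-line display region DR (assumed)
RECOG_R = 6.0     # radius of the broken-line recognition region KR (assumed, < DISPLAY_R)

LABELS = [        # 15 buttons: 10 numeric + 5 non-numeric (placeholder labels assumed)
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["F1", "0", "F2"],
    ["F3", "F4", "F5"],
]


def pointed_button(x, y):
    """Map a fingertip position detected by the mechanism 4 to a button label,
    or None when the fingertip is outside every recognition region KR."""
    for row, row_labels in enumerate(LABELS):
        for col, label in enumerate(row_labels):
            cx, cy = col * PITCH, row * PITCH
            if math.hypot(x - cx, y - cy) <= RECOG_R:
                return label
    return None


# A fingertip landing 10 mm from the center of "1" is inside its display
# region DR (radius 12 mm) but outside its recognition region KR (radius
# 6 mm), so no input is registered in this sketch.
assert pointed_button(10.0, 0.0) is None
```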


When the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of a predetermined numeric button 19 was pointed at in the pointing operation, it transmits a control instruction to the display mechanism 11. The display mechanism 11 then displays, for example, an image of a “*” mark in an upper-side part above the images of the plurality of buttons 19 on the display surface 11a, and the aerial-image forming mechanism 12 displays the image with the “*” mark as an aerial image in the aerial-image display region R (see FIG. 4A and FIG. 4B).


Main Effect of this Embodiment

As described above, in this embodiment, the aerial-image display region R is the input portion 18 for inputting PINs, and the aerial image includes the images of the plurality of buttons 19 for identifying the PINs input in the input portion 18. Moreover, in this embodiment, the recognition region KR is narrower than the display region DR of the image of the button 19, and the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the button 19 was pointed at when the user points at the recognition region KR, which is narrower than the display region DR.


Thus, in this embodiment, even if the user points at a space between the images of two buttons 19, for example, straddling the adjacent images of the button 19 with the number “1” and the button 19 with the number “2”, the control portion 8 does not recognize that the image of a predetermined button 19 was pointed at. In other words, in this embodiment, the control portion 8 can be caused to recognize that the image of a predetermined button 19 was pointed at by the user only when the user reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to suppress input errors of PINs.


In particular, in this embodiment, since the center part of the display region DR is the recognition region KR, it becomes easier for the control portion 8 to recognize that the image of the predetermined button 19 was pointed by the user, only when the user reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to effectively suppress input errors of PINs.


Embodiment 2

Configuration of Input Device



FIG. 5 is a schematic diagram for explaining a configuration of the input device 1 according to Embodiment 2 of the present invention. In FIG. 5, the same symbols are given to configurations similar to those of Embodiment 1.


Embodiment 1 and Embodiment 2 differ in the configuration of the detection mechanism 4. Moreover, since the configuration of the detection mechanism 4 differs, the recognition method by which the control portion 8 recognizes that the image of the predetermined button 19 was pointed at by the user also differs. In the following, the configuration of the input device 1 according to Embodiment 2 will be explained, focusing mainly on the differences from Embodiment 1.


Similarly to Embodiment 1, in Embodiment 2, the image of the keypad is displayed as an aerial image in the aerial-image display region R when the PIN is to be input in the input portion 18. In the keypad image, the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns. In the following explanation, in the plane containing the aerial image (that is, the plane containing the keypad image), the vertical direction (row direction, V direction in FIG. 5) of the images of the plurality of buttons 19 arranged in the matrix is assumed to be a first direction, and the lateral direction (column direction, W direction in FIG. 5) of the images of the plurality of buttons 19 arranged in the matrix is assumed to be a second direction. In the plane containing the aerial image, the first direction and the second direction are orthogonal to each other.


The detection mechanism 4 has a first detection mechanism 24 for detecting the position of the user's fingertip in the first direction and a second detection mechanism 25 for detecting the position of the user's fingertip in the second direction when each of the images of the plurality of buttons 19 is pointed at by the user's fingertip. The detection mechanism 4 in this embodiment is constituted by the first detection mechanism 24 and the second detection mechanism 25. The first detection mechanism 24 is a transmissive sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27. The second detection mechanism 25 is a transmissive sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29, similarly to the first detection mechanism 24. The light emitting portions 26 and 28 emit infrared light, and the light receiving portions 27 and 29 receive the infrared light emitted from the light emitting portions 26 and 28. In other words, the first detection mechanism 24 and the second detection mechanism 25 are transmission-type infrared sensors.


The first detection mechanism 24 includes the same number of light emitting portions 26 and light receiving portions 27 as the number of rows of the 15 buttons 19 arranged in the matrix. That is, the first detection mechanism 24 includes five light emitting portions 26 and five light receiving portions 27; in other words, it includes five pairs of the light emitting portion 26 and the light receiving portion 27. The five light emitting portions 26 are arranged in the first direction at a certain pitch, and the five light receiving portions 27 are arranged in the first direction at a certain pitch. The light emitting portion 26 and the light receiving portion 27 forming a pair are disposed at the same position in the first direction. Moreover, the light emitting portion 26 and the light receiving portion 27 forming a pair are opposed to each other so as to sandwich the aerial-image display region R (that is, the input portion 18) in the second direction.


The second detection mechanism 25 includes the same number of light emitting portions 28 and light receiving portions 29 as the number of columns of the 15 buttons 19 arranged in the matrix. That is, the second detection mechanism 25 includes three light emitting portions 28 and three light receiving portions 29; in other words, it includes three pairs of the light emitting portion 28 and the light receiving portion 29. The three light emitting portions 28 are arranged in the second direction at a certain pitch, and the three light receiving portions 29 are arranged in the second direction at a certain pitch. The light emitting portion 28 and the light receiving portion 29 forming a pair are disposed at the same position in the second direction. Moreover, the light emitting portion 28 and the light receiving portion 29 forming a pair are opposed to each other so as to sandwich the aerial-image display region R (that is, the input portion 18) in the first direction.


The arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R. An optical axis L5 of the light emitted from the light emitting portion 26 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L5 passes through the center of the image of the button 19 displayed in the aerial-image display region R.


The arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R. An optical axis L6 of the light emitted from the light emitting portion 28 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L6 passes through the center of the image of the button 19 displayed in the aerial-image display region R.


In this embodiment, when the user performs the pointing operation to point at the center part of the image of the predetermined numeric button 19 in the input portion 18, the infrared light incident on one of the five light receiving portions 27 is shielded, and the infrared light incident on one of the three light receiving portions 29 is also shielded. The control portion 8 recognizes the image of the numeric button 19 that was pointed at in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25.
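As a minimal sketch of this recognition by the control portion 8, the following Python code combines the interrupted beams of the first detection mechanism 24 and of the second detection mechanism 25 to identify the pointed button. The boolean "beam shielded" interface, the rule that exactly one beam must be shielded in each direction, and the non-numeric labels are assumptions introduced for illustration.

```python
# Illustrative recognition logic for Embodiment 2: mechanism 24 reports which
# of its 5 beams (rows) is interrupted and mechanism 25 reports which of its
# 3 beams (columns) is interrupted; the intersection identifies the button.

LABELS = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["F1", "0", "F2"],
    ["F3", "F4", "F5"],
]


def recognize(row_blocked, col_blocked):
    """row_blocked[i] is True when beam i of mechanism 24 is shielded,
    col_blocked[j] likewise for mechanism 25. A button is recognized only
    when exactly one beam is shielded in each direction (assumed rule)."""
    rows = [i for i, blocked in enumerate(row_blocked) if blocked]
    cols = [j for j, blocked in enumerate(col_blocked) if blocked]
    if len(rows) == 1 and len(cols) == 1:
        return LABELS[rows[0]][cols[0]]
    return None  # no beam, or more than one beam, interrupted: no input


# Example: the fourth row beam and the second column beam are shielded -> "0".
assert recognize([False, False, False, True, False],
                 [False, True, False]) == "0"
```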


Main Effect of this Embodiment

As described above, in this embodiment, the detection mechanism 4 includes the first detection mechanism 24 for detecting the position of the user's fingertip in the first direction when each of the images of the plurality of buttons 19 is pointed by the user's fingertip and the second detection mechanism 25 for detecting the position of the user's fingertip in the second direction, and the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25.


Thus, in this embodiment, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 (that is, on the basis of the detection results of the position of the user's fingertip in the two directions), it is possible to cause the control portion 8 to recognize that the image of the predetermined button 19 was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to suppress input errors of PINs similarly to Embodiment 1.


In this embodiment, the first detection mechanism 24 and the second detection mechanism 25 are transmissive sensors having a plurality of the light emitting portions 26, 28 and a plurality of the light receiving portions 27, 29. Thus, in this embodiment, as compared with the case where the first detection mechanism 24 and the second detection mechanism 25 are reflective sensors, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, the position of the fingertip when the user's fingertip points at the image of the predetermined button 19 in the input portion 18 can be identified more accurately. Therefore, in this embodiment, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, it becomes easier to cause the control portion 8 to recognize that the image of the predetermined button 19 was pointed by the user's fingertip, only when the user's fingertip reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to effectively suppress input errors of PINs.


In this embodiment, the optical axes L5 and L6 of the light emitted from the light emitting portions 26, 28 pass through the center of the image of the button 19 displayed in the aerial-image display region R. Thus, in this embodiment, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, it becomes easier to cause the control portion 8 to recognize that the image of the predetermined button 19 was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to more effectively suppress input errors of PINs.


Variation of Detection Mechanism



FIG. 6 is a schematic diagram for explaining a configuration of the detection mechanism 4 according to a variation of Embodiment 2. In FIG. 6, the same reference symbols are used to denote the components similar to those of the embodiment described above.


In Embodiment 2, the first detection mechanism 24 may be a reflective infrared sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27. Moreover, the second detection mechanism 25 may be a reflective infrared sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29. In this case, the first detection mechanism 24 includes the same number of the light emitting portions 26 and the light receiving portions 27 as the number of rows of 15 buttons 19 arranged in a matrix, and the second detection mechanism 25 includes the same number of the light emitting portions 28 and the light receiving portions 29 as the number of columns of 15 buttons 19 arranged in a matrix.


In this variation, too, the arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R, and the optical axis L5 of the light emitted from the light emitting portion 26 passes through the center of the image of the button 19 displayed in the aerial-image display region R. Moreover, the arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R, and the optical axis L6 of the light emitted from the light emitting portion 28 passes through the center of the image of the button 19 displayed in the aerial-image display region R.


In this variation, when the user performs the pointing operation to point at the center part of the image of the predetermined numeric button 19 in the input portion 18, the light amount of the infrared light incident on one of the five light receiving portions 27 fluctuates greatly, and the light amount of the infrared light incident on one of the three light receiving portions 29 also fluctuates greatly. The control portion 8 recognizes the image of the numeric button 19 that was pointed at in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25. In this variation, too, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, it becomes possible to cause the control portion 8 to recognize that the image of the predetermined button 19 was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button 19, and thus input errors of the PIN can be suppressed.
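A minimal sketch, under stated assumptions, of how the fluctuation in received light amount in this variation could be turned into a detection: each receiver's reading is compared with a no-finger baseline, and a change beyond a threshold is treated as the "large fluctuation" described above. The threshold value, the baseline handling, and the per-receiver reading interface are not disclosed in the publication and are assumed here.

```python
# Reflective variation of Embodiment 2: a fingertip at a button center changes
# the light amount reaching one receiver of each detection mechanism. Here a
# relative change from a no-finger baseline beyond a threshold is treated as
# "that receiver fluctuated greatly". All numeric values are assumptions.

FLUCTUATION_THRESHOLD = 0.3  # relative change treated as significant (assumed)


def fluctuating_index(readings, baseline):
    """Return the index of the single receiver whose light amount changed by
    more than the threshold relative to its baseline, else None."""
    hits = [i for i, (r, b) in enumerate(zip(readings, baseline))
            if b > 0 and abs(r - b) / b > FLUCTUATION_THRESHOLD]
    return hits[0] if len(hits) == 1 else None


def recognize(row_readings, row_baseline, col_readings, col_baseline, labels):
    """Combine the fluctuating row receiver and column receiver into a button
    label, or return None when no single receiver fluctuated in a direction."""
    row = fluctuating_index(row_readings, row_baseline)
    col = fluctuating_index(col_readings, col_baseline)
    if row is None or col is None:
        return None
    return labels[row][col]
```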


Other Embodiments

The embodiment described above is an example of a preferred embodiment of the present invention, but it is not limiting, and various modifications can be made within a range not changing the gist of the present invention.


In Embodiment 1, a part shifted from the center in the display region DR may be the recognition region KR. Moreover, in Embodiment 1, the recognition region KR is a circular region, but the recognition region KR may be a non-circular region such as an oval-shaped region or a polygonal region.


In Embodiment 2, the optical axis L5 of the light emitted from the light emitting portion 26 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R, and the optical axis L6 of the light emitted from the light emitting portion 28 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R. In Embodiment 2, the first direction and the second direction are orthogonal to each other in the plane containing the aerial image, but the first direction and the second direction do not have to be orthogonal. Moreover, in Embodiment 2, the first detection mechanism 24 and the second detection mechanism 25 may be transmissive sensor arrays.


In the embodiments described above, the image of the keypad displayed in the aerial-image display region R may be configured by images of 12 buttons 19 arranged in 4 rows and 3 columns, for example. Moreover, in the embodiments described above, information other than the PIN may be input in the input device 1. In this case, the aerial image displayed in the aerial-image display region R may be an image other than the keypad. Even in this case, the aerial image contains images of buttons for identifying the information input in the input portion 18. The above description relates to specific examples according to the present invention, and various modifications are possible without departing from the spirit of the present invention. The appended claims are intended to cover such applications within the true scope and spirit of the invention.


DESCRIPTION OF REFERENCE NUMERALS






    • 1 Input device


    • 4 Detection mechanism


    • 8 Control portion


    • 11 Display mechanism


    • 11a Display surface


    • 12 Aerial-image forming mechanism


    • 18 Input portion


    • 19 Button


    • 24 First detection mechanism


    • 25 Second detection mechanism


    • 26, 28 Light emitting portion


    • 27, 29 Light receiving portion

    • DR Display region

    • KR Recognition region

    • L5, L6 Optical axis of light emitted from light emitting portion

    • R Aerial-image display region

    • V First direction

    • W Second direction




Claims
  • 1. An input device inputting information by using a user's fingertip, the input device comprising: a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, and a control portion for controlling the input device, wherein the aerial-image display region is an input portion for inputting information; the aerial image includes images of a plurality of buttons for identifying information input in the input portion; and by assuming that a region in a display region for the images of the buttons displayed in the aerial-image display region, which is a region recognized by the control portion that the image of a predetermined button was pointed by the user based on a detection result of the optical detection mechanism, is a recognition region, the recognition region is narrower than the display region.
  • 2. The input device according to claim 1, wherein a center part of the display region is the recognition region.
  • 3. The input device according to claim 1, wherein the optical detection mechanism is a reflective sensor array.
  • 4. An input device inputting information by using a user's fingertip, comprising: a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image; and an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, wherein the aerial-image display region is an input portion for inputting information; the aerial image includes images of a plurality of buttons for identifying information input in the input portion; and by assuming that directions crossing each other in a plane including the aerial image are a first direction and a second direction, the optical detection mechanism includes a first detection mechanism detecting a position in the first direction of the user's fingertip and a second detection mechanism detecting a position in the second direction of the user's fingertip, when each of the images of the plurality of buttons was pointed by the user's fingertip.
  • 5. The input device according to claim 4, wherein the first detection mechanism and the second detection mechanism are transmissive sensors having a plurality of light emitting portions and a plurality of light receiving portions.
  • 6. The input device according to claim 5, wherein an optical axis of light emitted from the light emitting portion passes through a center of an image of the button displayed in the aerial-image display region.
  • 7. The input device according to claim 6, wherein the first direction and the second direction are orthogonal to each other; and the images of the plurality of buttons displayed in the aerial-image display region are arranged in a matrix.
  • 8. The input device according to claim 1, wherein the aerial image is an image of a keypad including images of a plurality of numeric buttons.
  • 9. A control method of an input device, the input device including: a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, and an optical detection mechanism detecting a position of a user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, the aerial-image display region serving as an input portion for inputting information by using the user's fingertip, and the aerial image including images of a plurality of buttons for identifying information input in the input portion, wherein when the user points at a recognition region, which is in a display region of an image of the button displayed in the aerial-image display region and is narrower than the display region, it is recognized that the image of the button was pointed based on a detection result of the optical detection mechanism.
  • 10. The input device according to claim 4, wherein the aerial image is an image of a keypad including images of a plurality of numeric buttons.
CROSS REFERENCE TO RELATED APPLICATIONS

This is the U.S. national stage of application No. PCT/JP2021/020743, filed on May 31, 2021. Priority under 35 U.S.C. § 119(e) is claimed to U.S. Provisional Application No. 63/054,799, filed Jul. 22, 2020, the disclosure of which is also incorporated herein by reference.

PCT Information
Filing Document Filing Date Country Kind
PCT/JP2021/020743 5/31/2021 WO
Provisional Applications (1)
Number Date Country
63054799 Jul 2020 US