Embodiments described herein relate generally to an information processing apparatus, a control method and a storage medium.
In recent years, portable, battery-powered information processing apparatuses such as tablet computers and smartphones have become widely used. Such information processing apparatuses comprise, in most cases, touchscreen displays for easier input operation by users.
Users can instruct information processing apparatuses to execute functions related to icons or menus displayed on touchscreen displays by touching them with a finger.
Furthermore, the input operation using touchscreen displays is used not only for giving such operation instructions to the information processing apparatuses but also for handwriting input. When a touch input is performed on the touchscreen display, its locus is displayed on the touchscreen display.
On the touchscreen display, a transparent protective glass of a certain thickness is arranged to protect the display surface from external force, and users in many cases view the touchscreen display from an oblique angle. Thus, users often feel that the point of touch input deviates from, for example, the point of the locus displayed on the screen. There have been various proposals to prevent such apparent deviation.
In recent years, information processing apparatuses with touchscreen displays have also comprised cameras to capture still and moving images. However, there has been no proposal to apply such cameras to solving the above-mentioned apparent deviation problem.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
In general, according to one embodiment, an information processing apparatus comprises a display, a protective glass, a camera, a sensor and a correction module. The protective glass is configured to protect the display. The sensor is configured to detect a touch input on the protective glass and to output positional data. The correction module is configured to correct the touch input position indicated by the positional data obtained by the sensor, by using an image obtained by the camera.
The first embodiment is explained.
An information processing apparatus of the embodiment may be realized as a touch-input operable mobile information processing apparatus such as a tablet terminal or a smartphone.
As shown in
The body 11 comprises a thin box-shaped casing. The touchscreen display 12 comprises a flat-panel display and a sensor configured to detect a touch input position on the touchscreen display 12. The flat-panel display is, for example, a liquid crystal display (LCD) 12A. The sensor is, for example, a capacitance type touch panel (digitizer) 12B. The touch panel 12B is provided to cover the screen of the flat-panel display.
Users use a pen (stylus) 100 to perform a touch input on the touchscreen display 12.
As shown in
Thus, the tablet terminal 10 performs suitable correction using an image obtained by the camera 13. Now, details of this technique are explained.
As shown in
CPU 101 is a processor that controls the operations of various components in the tablet terminal 10. CPU 101 executes various software programs loaded from the nonvolatile memory 106 into the main memory 103. These programs comprise an operating system (OS) 210 and a touch input support application program 220 operated under the control of the OS 210 (this program is described later). The touch input support application program 220 comprises a correction module 221.
Furthermore, CPU 101 executes a basic input/output system (BIOS) stored in the BIOS-ROM 105. The BIOS is a program for hardware control.
System controller 102 is a device used for connection between the local bus of CPU 101 and various components. System controller 102 comprises a memory controller used for access control of the main memory 103. Furthermore, system controller 102 comprises a function to execute communication with the graphics controller 104 via a serial bus of the PCI EXPRESS standard, for example.
The graphics controller 104 is a display controller to control the LCD 12A used as a display monitor of the tablet terminal 10. Display signals generated by the graphics controller 104 are sent to the LCD 12A. LCD 12A displays screen images based on the display signals. The touch panel 12B is disposed on the LCD 12A. The touch panel 12B is, for example, a capacitance type pointing device used for the touch input on the touchscreen display 12. The point at which the stylus 100 touches is detected by the touch panel 12B.
The wireless communication device 107 is a device configured to execute wireless communication such as wireless LAN or 3G mobile communication. EC 108 is a single-chip microcomputer comprising an embedded controller for power management. EC 108 comprises a function to turn on/off the tablet terminal 10 based on a power button operation by the user.
In
The correction module 221 tracks the optical axis using the image captured by the camera 13 to calculate angles α and φ. Since the position of the camera is fixed, the correction module 221 detects the position of the touch input on the touchscreen display 12 to calculate a distance L between the camera 13 and the stylus 100. Furthermore, since the distance between the camera 13 and the user's eyes can be estimated to be 20 to 50 cm, the correction module 221 calculates, based on angles α and φ and distance L, distances a′ and a″ depicted in the figure using trigonometric functions, and then calculates the angle θ0 formed by the normal to the protective glass and the optical axis.
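The geometry described above can be sketched in a simplified 2-D form. The coordinate setup below (camera at the origin on the glass plane, glass normal along +y) and the numeric values are illustrative assumptions, not values from the embodiment; the figure's angles α and φ describe the full 3-D case.

```python
import math

def line_of_sight_angle(alpha_deg, a_cm, L_cm):
    """2-D sketch: camera at the origin on the glass plane, glass normal
    along +y. The eye lies at distance a from the camera, at angle alpha
    from the normal; the pen tip lies at distance L from the camera along
    the glass surface. Returns theta0, the angle between the glass normal
    and the eye-to-pen line of sight, in degrees."""
    eye_x = a_cm * math.sin(math.radians(alpha_deg))
    eye_y = a_cm * math.cos(math.radians(alpha_deg))
    dx = L_cm - eye_x  # horizontal offset from the eye to the pen tip
    return math.degrees(math.atan2(abs(dx), eye_y))

# Illustrative case: eye 30 cm away at 10 degrees off the normal,
# pen tip 12 cm from the camera along the glass surface
theta0 = line_of_sight_angle(10.0, 30.0, 12.0)
```

Looking straight down at the camera position (alpha = 0, L = 0) gives theta0 = 0, as expected: the sight line then coincides with the glass normal and no parallax correction is needed.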
Based on the above, the correction module 221 calculates the degree of correction using the following formulas:
g = h1×tan θ1 + . . . + hm×tan θm
θm = arcsin(nm-1×sin θm-1/nm)
where g is the positional gap, hm (m=1, 2, . . . ) is the thickness of each layer, nm (m=1, 2, . . . ) is the refractive index of each layer, θm (m=1, 2, . . . ) is the angle of incidence of the optical axis with respect to each layer, and θ0 is derived from angles α and φ formed by the camera and the eye, distance a between the eye and the tablet body, and distance L between the pen tip and the camera.
Using the above degree of correction, the correction is performed to reduce the positional gap, and users can perform stress-free writing.
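The correction formula above, which applies Snell's law at each interface and accumulates the lateral shift through each layer, can be sketched as follows. The layer thicknesses and refractive indices used here are illustrative assumptions, not values from the embodiment.

```python
import math

def positional_gap(theta0_deg, layers):
    """Compute the apparent positional gap g for viewing angle theta0.

    layers: list of (thickness_mm, refractive_index) tuples for each layer
    the optical axis passes through (e.g. protective glass, air gap).
    Applies Snell's law at each interface,
        theta_m = arcsin(n_{m-1} * sin(theta_{m-1}) / n_m),
    and accumulates g = h_1*tan(theta_1) + ... + h_m*tan(theta_m).
    """
    theta = math.radians(theta0_deg)
    n_prev = 1.0  # air on the viewer's side of the glass
    g = 0.0
    for thickness, n in layers:
        theta = math.asin(n_prev * math.sin(theta) / n)  # Snell's law
        g += thickness * math.tan(theta)                 # lateral shift in this layer
        n_prev = n
    return g

# Illustrative stack: 1.0 mm protective glass (n = 1.5) over a 0.5 mm air gap,
# viewed at theta0 = 30 degrees from the glass normal
gap = positional_gap(30.0, [(1.0, 1.5), (0.5, 1.0)])
```

At normal incidence (theta0 = 0) the gap is zero, so the correction vanishes when the user looks straight at the pen tip.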
The camera 13 may be used to estimate the position of the eyes from the positional relationship of the nose, mouth, ears, eyebrows, and hair. Furthermore, the range captured by the camera 13 is limited; if the recognition fails, a predetermined gap is used for the correction.
The correction module 221 calculates the angles (α and φ) formed by the position of the camera 13 and the direction of the user's eyes from the image captured by the camera 13 (block A1). Next, the correction module 221 calculates the distance (L) between the pen tip and the camera (block A2). The correction module 221 then calculates the angle (θ0) formed by the normal to the protective glass and the optical axis (block A3). Finally, the correction module 221 calculates the positional gap (g) (block A4).
As can be understood from the above, the tablet terminal 10 can correct the touch input position suitably using the image captured by the camera.
Furthermore, when the sensor is an electromagnetic induction type digitizer and the pen is a digitizer pen, the pen tip can be detected without being affected by the hand, and the correction can be performed with higher accuracy.
Now, the second embodiment is explained.
In this embodiment, the distance between the camera 13 and the user's eyes is measured to improve the accuracy of the positional gap correction.
As can be understood from
Naturally, there are cases where the triangle of the eyes and nose cannot be captured by the camera, and only the eyes, or the nose and mouth, are captured; in such cases, reference values and a correspondence table among the eyes, nose, and mouth may be used to acquire distance a with a certain accuracy.
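Measuring distance a from facial features can be sketched with a pinhole-camera model: the farther the face, the smaller the spacing of its features in the image. The focal length and the average interpupillary spacing below are illustrative assumptions; an actual device would use calibrated values, possibly derived from the eyes-nose triangle rather than the pupils alone.

```python
def estimate_eye_distance(pixel_spacing, focal_length_px=600.0,
                          real_spacing_cm=6.3):
    """Pinhole-camera estimate of the eye-to-camera distance a.

    pixel_spacing: measured spacing between the pupils in the image (pixels).
    focal_length_px: camera focal length expressed in pixels (assumed value).
    real_spacing_cm: average interpupillary distance (assumed value).
    Under the pinhole model, a = f * real_spacing / pixel_spacing.
    """
    return focal_length_px * real_spacing_cm / pixel_spacing

# Illustrative case: pupils measured 126 px apart in the captured image
a = estimate_eye_distance(126.0)
```

With these assumed values the estimate falls inside the 20 to 50 cm range assumed in the first embodiment, but unlike that fixed estimate it tracks the user's actual posture.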
Now, the third embodiment is explained.
In this embodiment, a plurality of cameras are used for better accuracy in the correction of a positional gap.
In a tablet terminal, a plurality of cameras may be provided for viewing 3D images. Using the procedure described above, the correction module 221 calculates angles α and φ formed by the position of camera [1] and the direction of the user's eyes, distance L between the position of camera [1] and the pen position, angles β and δ formed by the position of camera [2] and the direction of the user's eyes, and distance M between camera [2] and the pen position. Since distance O between camera [1] and camera [2] is known, the correction module 221 can eventually calculate angle θ0 using trigonometric functions.
Therefore, the positional gap can be corrected with high accuracy.
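The two-camera geometry can be sketched in a simplified 2-D form by intersecting the rays from each camera toward the eye. The coordinate setup (cameras on the glass plane a known baseline apart, both angles measured from the normal toward the other camera) and the numeric values are illustrative assumptions, not the embodiment's full 3-D construction with angles φ and δ.

```python
import math

def triangulate_eye(alpha_deg, beta_deg, baseline_cm):
    """2-D sketch of two-camera triangulation: camera [1] at the origin,
    camera [2] at (baseline, 0), both on the glass plane, normal along +y.
    alpha and beta are each camera's angle from its normal toward the eye,
    measured toward the other camera. Returns the eye position (x, y), from
    which theta0 can then be computed as in the single-camera case."""
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    # Ray from camera [1]: x = y*ta; ray from camera [2]: x = baseline - y*tb.
    # Setting them equal gives the intersection height y.
    y = baseline_cm / (ta + tb)
    return y * ta, y

# Illustrative case: cameras 10 cm apart, eye seen at 20 degrees from
# camera [1] and 15 degrees from camera [2]
x, y = triangulate_eye(20.0, 15.0, 10.0)
```

Because the eye distance is triangulated rather than assumed, no fixed 20 to 50 cm estimate is needed, which is the source of the improved accuracy.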
As can be understood from the above, the tablet terminal 10 of each of the first to third embodiments can correct a touch input position suitably using the image captured by the camera.
Note that the operation procedures of the embodiments can all be achieved by software. Thus, by introducing the software into an ordinary computer via a computer-readable, non-transitory storage medium, the advantages achieved by the embodiments can easily be obtained.
The various modules of the systems described herein can be implemented as software applications, hardware and/or software modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
This application is a Continuation Application of PCT Application No. PCT/JP2013/057702, filed Mar. 18, 2013, the entire contents of which are incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2013/057702 | Mar 2013 | US
Child | 14617627 | | US