This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2013-073868, filed Mar. 29, 2013, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an electronic device and a method for controlling the electronic device.
Mobile phones, tablets, personal digital assistants (PDAs), small-sized mobile personal computers and the like are in widespread use. These electronic devices have a display panel and an operation panel formed integrally with the display panel as one piece.
The operation panel senses the position on its surface at which a user touches, for example, as a change in capacitance, and generates a sensing signal. The sensing signal is supplied to a dedicated touch signal processing integrated circuit (IC) provided for the operation panel. The touch signal processing IC processes the sensing signal by a computational algorithm prepared in advance to convert the user's touched position into coordinate data, and outputs the data.
As manufacturing technology advances, the display panel increases in resolution and size. Accordingly, the operation panel is required to sense a position with high accuracy. The operation panel is also required to process data input thereto at high speed depending on applications. Furthermore, a device capable of easily changing an application is desired.
Embodiments will be described hereinafter with reference to the accompanying drawings.
According to one embodiment, there are provided an electronic device which is flexibly adaptable to a variety of applications and which is able to provide a number of information items for the applications and a method for controlling the electronic device.
An electronic device according to one embodiment comprises a sensor-integrated display device including a display surface from which display information is output and a sensor surface to which operation information is input, the display surface and the sensor surface being formed integrally with the sensor-integrated display device as one piece, a data transfer unit which generates and outputs three-dimensional information in response to a signal sensed on the sensor surface, an image generation unit which generates three-dimensional image data in a plurality of points sensed on the sensor surface, based on the three-dimensional information output from the data transfer unit, and a coordinate computation unit which computes a coordinate value of a conductor operated on the sensor surface, based on the image data generated by the image generation unit.
According to the embodiment, different coordinate computation algorithms can be achieved by three-dimensional analysis of touch data.
Furthermore, a variety of computations can be achieved by analyzing and computing the touch data using a high-speed application processor that can be combined with a plurality of coordinate computation algorithms.
An embodiment will further be described with reference to the drawings.
The sensor-integrated display device 100, in which the display surface and the sensor surface are formed integrally as one piece, includes a display device component 110 and a sensor component 150.
The sensor-integrated display device 100 is supplied with a display signal (a pixel signal) from a driver 210, which will be described later. When the device 100 receives a gate signal from the driver 210, the pixel signal is input to a pixel of the display device component 110. The voltage between a pixel electrode and a common electrode depends upon the pixel signal. This voltage displaces the alignment direction of the liquid crystal molecules between the electrodes, achieving a brightness corresponding to the direction of displacement of the molecules.
The sensor-integrated display device 100 can be designated as an input sensor-integrated display unit, a user interface or the like.
A display unit that is, for example, formed of a liquid crystal display panel or a light-emitting element such as an LED or organic EL, can be employed as the display device component 110. The display device component 110 can be simply designated as a display. The sensor component 150 can be of a capacitive sensing type, an optical sensing type or the like. The sensor component 150 can be designated as a panel for sensing a touch input.
The sensor-integrated display device 100 is coupled to the application executing device 300 via the data transfer device 200.
The data transfer device 200 includes the driver 210 and a sensor signal detector 250. Basically, the driver 210 supplies the display device component 110 with graphics data that is transferred from the application executing device 300. The sensor signal detector 250 detects a sensor signal output from the sensor component 150.
The driver 210 and sensor signal detector 250 are synchronized with each other, and this synchronization is performed under control of the application executing device 300.
The application executing device 300 is, for example, a semiconductor integrated circuit (LSI) formed as what is called an application processor, which is incorporated into an electronic device such as a mobile phone. The device 300 serves to perform a plurality of functions in combination, such as Web browsing and multimedia processing, using software such as an OS. The application processor can perform high-speed operations and can be configured as a dual core or a quad core. Favorably, the operating speed of the application processor is, for example, 500 MHz; more favorably, it is 1 GHz.
The driver 210 supplies a display signal (a signal into which the graphics data is analog-converted) to the display device component 110 on the basis of an application. In response to a timing signal from the sensor signal detector 250, the driver 210 outputs a sensor drive signal Tx for gaining access to the sensor component 150. In synchronization with the sensor drive signal Tx, the sensor component 150 outputs a sensor signal Rx and supplies it to the sensor signal detector 250.
The sensor signal detector 250 slices the sensor signal Rx, eliminates noise therefrom and supplies the noise-eliminated signal to the application executing device 300 as raw reading image data (three-dimensional image data). In this embodiment, the raw reading image data can be designated as raw data (RAW-D) or sign-eliminated raw data.
When the sensor component 150 is of a capacitive sensing type, the image data is not simply two-dimensional data representing a coordinate; each sensed point may have a plurality of bits (e.g., three to seven bits) whose value varies with the capacitance. Thus, the image data can be designated as three-dimensional data including a physical quantity as well as a coordinate. The capacitance varies with the distance (proximity) between a targeted conductor (e.g., a user's finger) and the touch panel, and this variation can be regarded as a change in physical quantity.
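As a hypothetical illustration of the idea above (the grid size, bit depth and values below are assumptions for illustration, not taken from the embodiment), the "three-dimensional" image data can be pictured as a two-dimensional grid of sensing points whose multi-bit values encode proximity:

```python
# Illustrative sketch: a frame of raw touch data as a 2-D grid whose
# multi-bit cell values (here 7 bits, 0-127) vary with capacitance,
# which in turn varies with conductor proximity. Values are invented.
frame = [
    [0, 0,  2,  1, 0],
    [0, 5, 40, 12, 0],
    [1, 9, 90, 30, 2],
    [0, 3, 25,  8, 0],
    [0, 0,  1,  0, 0],
]

def proximity_at(frame, row, col, full_scale=127):
    """Interpret a cell's count as a rough proximity fraction (an assumption)."""
    return frame[row][col] / full_scale

# The peak cell (row 2, col 2) corresponds to the closest/strongest contact.
peak = proximity_at(frame, 2, 2)
```

A conventional touch IC would reduce this grid to a single (x, y) pair; keeping the whole grid is what preserves the per-point physical quantity.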
The reason the sensor signal detector 250 of the data transfer device 200 directly supplies image data to the application executing device 300, as described above, is as follows.
The application executing device 300 is able to perform its high-speed operating function to use the image data for various purposes.
New applications are stored in the application executing device 300 according to users' different desires. A new application may require changing or selecting an image data processing method, reading timing, a reading format, a reading area or a reading density in accordance with its data processing.
If, in the above case, only coordinate data is acquired as in the prior art device, the amount of acquired information is restricted. In the device of the present embodiment, however, analyzing the raw three-dimensional image data yields, for example, distance information corresponding to the proximity of the conductor as well as coordinate information.
In order to expand the functions performed by the applications, it is desired that the data transfer device 200 should follow different operations under the control of the applications. Thus, as the simplest possible function, the data transfer device 200 is configured to select sensor signal reading timing, a reading area, a reading density or the like arbitrarily under the control of the applications. This will be described later.
In the present embodiment, the application executing device 300 is configured, for example, as a single semiconductor integrated circuit that is designated as what is called an application processor. The semiconductor integrated circuit incorporates a baseband engine having a radio interface (see
As shown in
The common electrode 13 serves as a drive electrode for a sensor (a common drive electrode for a sensor) as well as a common drive electrode for display.
The data transfer device 200 mainly includes the driver 210 and the sensor signal detector 250. The driver 210 and the sensor signal detector 250 can be designated as a display driver IC and a touch IC, respectively. Though the driver 210 and sensor signal detector 250 are separated from each other in
The driver 210 receives display data from the application executing device 300. The display data is time-divided and has a blanking period. The display data is supplied to a timing circuit and digital-to-analog converter 212 through a video random access memory (VRAM) 211 serving as a buffer. In the mobile terminal 1, the VRAM 211 may have a capacity of one frame or smaller.
Display data SigX, indicative of an analog quantity output from the timing circuit and digital-to-analog converter 212, is amplified by an output amplifier 213 and supplied to the sensor-integrated display device 100 to be written to a display element. The timing circuit and digital-to-analog converter 212 detects the blanking period and supplies a blanking detection signal to a timing controller 251 of the sensor signal detector 250. The timing controller 251 can be provided in the driver 210 and designated as a synchronization circuit.
The timing controller 251 generates a sensor access pulse to access the sensor during a given period of the display signal. The sensor access pulse is amplified by an output amplifier 214 and supplied to the sensor-integrated display device 100.
The drive signal Tx drives the sensor sensing electrode via the common electrode, and thus the sensor signal Rx is output from the sensor-integrated display device 100. The sensor signal Rx is input to an integrating circuit 252 in the sensor signal detector 250 and compared with a reference voltage (threshold value) Vref. If the level of the sensor signal Rx is equal to or higher than the reference voltage, the integrating circuit 252 integrates the sensor signal Rx and outputs it as an integral signal. The integrating circuit 252 is reset by a switch for each detection unit time period. In this way, an analog signal can be output from the integrating circuit 252. The output of the integrating circuit 252 is supplied to a sample hold and analog-to-digital converter 253 and converted into digital data. The digital data is supplied as raw data to the application executing device 300 through a digital filter 254.
The digital data is three-dimensional data (multivalued data) including both detected and non-detected data of an input operation. For example, a presence detector 255 operates when the application executing device 300 is in a sleep mode and no coordinates of a touched point on the operation surface are being detected. If a conductor comes close to the operation surface, the presence detector 255 is able to sense the conductor and release the sleep mode.
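The presence-detection behavior can be sketched as follows; this is a hedged illustration, and the wake threshold and frame representation are assumptions, not values from the embodiment:

```python
# Sketch of a presence detector: while the application processor sleeps,
# wake it if any sensing point's value exceeds a proximity threshold.
# WAKE_THRESHOLD is an invented value for illustration.
WAKE_THRESHOLD = 20

def presence_detected(frame, threshold=WAKE_THRESHOLD):
    """Return True if any cell suggests a conductor near the surface."""
    return any(value >= threshold for row in frame for value in row)

idle_frame     = [[0, 1, 0], [2, 0, 1]]   # only noise-level values
approach_frame = [[0, 4, 0], [3, 45, 2]]  # a conductor approaching

assert not presence_detected(idle_frame)
assert presence_detected(approach_frame)  # would release the sleep mode
```

Note that this check needs no coordinate computation at all, which is why it can run while the rest of the processor sleeps.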
The application executing device 300 receives and analyzes the digital data. In accordance with a result of the analysis, the device 300 is able to output the display data or select an operating function of the mobile terminal 1.
The application executing device 300 is able to execute each of the applications to set an operating procedure of the device, select a function, generate a display signal, select a display signal, and the like. Using a sensor signal (raw data) output from the sensor signal detector 250, the device 300 is able to analyze an operating position through a coordinate computation. The sensor signal is processed as image data and thus three-dimensional image data can be formed by an application. The device 300 is also able to, for example, register, erase and confirm the three-dimensional image data. The device 300 is also able to compare the acquired image data with the registered image data to lock or unlock an operating function.
Upon acquiring the sensor signal, the application executing device 300 is able to change the frequency of an access pulse to the sensor sensing electrode output from the timing controller 251 and control the output timing of the access pulse. Accordingly, the device 300 is able to select an access area of the sensor component 150 and set the access speed thereof.
Furthermore, the application executing device 300 is also able to set the sampling density of the sensor signal and add data to the sensor signal.
The application executing device 300 includes different filters (T1) for eliminating noise to flatten image data based on the sensor signal (raw data), and different coordinate computation algorithms (T2) for computing an operating position coordinate on the operation surface from the image data. A plurality of these filters (T1) and algorithms (T2) are prepared on the assumption that the computed coordinate values deviate in accordance with functions and conditions such as the application and the operating position on the sensor surface. One (one set) of the filters (T1) and coordinate computation algorithms (T2) is selected by a user or an application in accordance with usability and the contents of the application. A configuration for selecting the filters (T1) and the coordinate computation algorithms (T2) is shown as Filter A, Filter B, Filter C, Algorithm A, Algorithm B and Algorithm C in
The display data SigX and the sensor drive signal Tx can be separated from each other by the timing circuit and digital-to-analog converter 212. It is also possible for the display data SigX and the sensor drive signal Tx to be supplied from the application executing device 300 to the driver 210 in a time-divisional manner via the same bus. The sensor drive signal Tx is supplied to the common electrode 13, described above, via the timing controller 251 and the amplifier 214. For example, the timing at which the timing controller 251 outputs the sensor drive signal Tx and the frequency of the sensor drive signal Tx can be varied according to an instruction of the application executing device 300. The timing controller 251 is able to supply a reset timing signal to the integrating circuit 252 of the sensor signal detector 250 and also supply a clock to the sample hold and analog-to-digital converter 253 and the digital filter 254.
When an operator places his or her ear on the sensor surface of the mobile terminal 1, the application executing device 300 is able to recognize the shape of the ear (Ia) to judge whether the operator is an authorized user and to control another function. In the judgment, if the operator is identified by the shape of the ear, the function of the mobile terminal 1 can be unlocked. In the function control, if the operator places his or her ear on the sensor surface, it is recognized that the operator is starting a call, making it possible to change a function automatically, namely, to change the operation mode to a call mode (reception state).
When the size of an operator's palm is recognized (Ib), it is possible, for example, to provide applications for each generation or for each user, and to allow or inhibit an operator's use of an apparatus or an application.
When a specific gesture and an operation are combined (Ic), for example, if an operator touches the operation surface two times in succession with his or her index and middle fingers forming a peace sign, a camera application is started to allow a picture to be taken; if the operator touches the operation surface three times in succession with the peace-sign fingers, a music player application is started to allow music to be played back.
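The gesture-plus-operation dispatch described above can be sketched as a lookup table; the gesture names and the dispatch function are hypothetical illustrations mirroring the example in the text, not the embodiment's implementation:

```python
# Illustrative dispatch table: (recognized gesture, tap count) -> action.
# The peace-sign examples follow the text; keys and strings are invented.
GESTURE_ACTIONS = {
    ("peace_sign", 2): "start camera application",
    ("peace_sign", 3): "start music player application",
}

def dispatch(gesture, tap_count):
    """Map a recognized gesture and tap count to an action, if any."""
    return GESTURE_ACTIONS.get((gesture, tap_count), "no action")

print(dispatch("peace_sign", 2))  # start camera application
print(dispatch("peace_sign", 1))  # no action
```

Because the shape classification is done upstream on the three-dimensional image data, adding a new gesture is just a new table entry rather than a firmware change.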
When an operator uses his or her fingers distinctly (Id), scrolling with the thumb, tapping with the index finger and zooming with the little finger, the operating function need not be switched explicitly.
When an operator's touch with a finger's back (Ie) and an operator's touches with a fingertip (If) are distinguished from each other, their respective applications can be started.
The three-dimensional image data for use in the above authentications can smoothly be registered by, for example, either or both of an image registration screen and audio guidance.
The application executing device 300 includes in advance an operating procedure of performing an image registration process and an authentication process to fulfill an application function corresponding to the operations based on the three-dimensional image data.
As an application of the function shown in
In order to achieve the above different application functions, a high-precision position coordinate computation function is required in accordance with the characteristics of the application functions. In recent years, the sensor-integrated display device has increased in precision, requiring very fine operations.
Under the above situation, as a touch user interface of the sensor-integrated display device, a discrepancy in the sense of operation (a difference between the point a user wishes to touch and the touch coordinate recognized by the device) occurs, and it varies from user to user.
To solve the above problem, in the present embodiment, the application executing device 300 uses its high-speed computation capability to compute a correct coordinate adapted to a user's operation, using three-dimensional image data based on the raw data (RAW-D).
In the present embodiment, a plurality of filters (T1) and a plurality of coordinate computation algorithms (T2), which correspond to those as shown in
To use the filters (T1) and coordinate computation algorithms (T2) properly according to an application, a correspondence table is prepared in which the applications in the application executing device 300 are associated with the filters (T1) and coordinate computation algorithms (T2). Referring to the correspondence table in accordance with a starting application, a filter (T1) and a coordinate computation algorithm (T2) are selected, and the operating position coordinate is computed using the selected filter (T1) and coordinate computation algorithm (T2). When a user arbitrarily selects a filter (T1) and a coordinate computation algorithm (T2), a coordinate recognition process most suitable for the user's habit, the user's liking or a specific application operation is carried out.
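The correspondence table can be sketched as below; the application names and the particular filter/algorithm pairings are invented for illustration, while Filter A/B/C and Algorithm A/B/C follow the configuration named in the text:

```python
# Sketch of the correspondence table: starting application -> (filter,
# coordinate computation algorithm). Application names are hypothetical.
CORRESPONDENCE = {
    "drawing_app": ("Filter A", "Algorithm B"),
    "web_browser": ("Filter B", "Algorithm A"),
    "default":     ("Filter C", "Algorithm C"),
}

def select_for(app_name):
    """Pick the (filter, algorithm) pair for a starting application."""
    return CORRESPONDENCE.get(app_name, CORRESPONDENCE["default"])

assert select_for("drawing_app") == ("Filter A", "Algorithm B")
assert select_for("unknown_app") == ("Filter C", "Algorithm C")
```

A user override, as described in the text, would simply replace the table lookup with the user's stored preference.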
The filters and coordinate computation algorithms differ in operating conditions or computing elements, so the coordinate values they compute do not necessarily coincide.
In calculating a touch coordinate from the three-dimensional image data based on the raw data (RAW-D), one filter and one coordinate computation algorithm, adapted to a user or an application, are selected by the user or the application from the prepared plurality of filters (Filter A, Filter B and Filter C) and coordinate computation algorithms (Algorithm A, Algorithm B and Algorithm C).
In the process of obtaining a coordinate from the three-dimensional image data based on the raw data (RAW-D), image data from which noise is eliminated using a selected filter (e.g., Filter A) is acquired, and a touch operation coordinate is computed using a selected coordinate computation algorithm (e.g., Algorithm B). In a high-precision panel that requires a touch operation with a very fine pitch, therefore, a coordinate can be designated correctly in accordance with a user or an application for a variety of application operations, thereby providing a coordinate input function that is decreased in operation error and improved in usability.
Instead of the above coordinate computation configurations, for example, a coordinate value complementary table can be prepared that registers a correction factor for each coordinate value for each of the applications (coordinate computation algorithms) processing coordinate data; the coordinate value is then corrected using the correction factor registered in the coordinate value complementary table.
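The complementary-table alternative can be sketched like this; the per-axis factor representation and the factor values are assumptions made for illustration:

```python
# Sketch of a coordinate value complementary table: per-algorithm
# correction factors applied to a computed coordinate. Values invented.
CORRECTION_TABLE = {
    "Algorithm A": (1.00, 1.00),
    "Algorithm B": (1.02, 0.98),  # assumed per-axis correction factors
}

def correct(coord, algorithm):
    """Apply the registered correction factors to a computed (x, y)."""
    fx, fy = CORRECTION_TABLE.get(algorithm, (1.0, 1.0))
    x, y = coord
    return (x * fx, y * fy)

corrected = correct((100.0, 200.0), "Algorithm B")
```

This trades the cost of selecting among full algorithms for a cheap post-hoc adjustment registered per application.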
The above embodiments of the present disclosure are each described as an example and are not intended to limit the scope of the present disclosure. The embodiments can be reduced to practice in different ways, and their structural elements can be omitted, replaced and modified in various ways without departing from the spirit of the invention. Whether the structural elements are expressed in a divided manner or in a combined manner, they fall within the scope of the present invention. Even when the claims are recited as method claims, step claims or program claims, these claims also apply to the device according to the invention. The embodiments and their modifications fall within the scope and spirit of the invention and within the scope of the invention recited in the claims and its equivalents.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---
2013-073868 | Mar 2013 | JP | national |