This relates generally to the use of optical in-LCD sensing panels, and more particularly, to configurations that allow transmission and/or reception of data, such as data communication and scanning, using optical in-LCD sensing panels.
Many types of input devices are presently available for performing operations in a computing system, such as buttons or keys, mice, trackballs, joysticks, touch sensor panels, touch screens and the like. Touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch screens can include an optical in-LCD sensing panel, which can be a liquid crystal display (LCD) with embedded photodiodes. Touch screens can allow a user to perform various functions by touching the touch sensor panel using a finger, stylus or other object at a location dictated by a user interface (UI) being displayed by the display device. In general, touch screens can recognize a touch event and the position of the touch event on the touch sensor panel, and the computing system can then interpret the touch event in accordance with the display appearing at the time of the touch event, and thereafter can perform one or more actions based on the touch event.
However, with the exception of conventional configurations directed to sensing touch input, optical in-LCD sensing panels have found limited use.
Configurations using optical in-LCD sensing panels can provide transmission and/or reception of data. In one transmit/receive configuration, an optical in-LCD sensing panel can be used for communication. Data can be transmitted by displaying a communication image encoding the data on the sensing panel, and data can be received by capturing, with the electromagnetic (EM) sensors embedded in the panel, a communication image displayed in proximity to the panel. In another configuration, data can be received by scanning an object using an optical in-LCD sensing panel. The motion of a handheld device including the panel can be determined by scanning a surface with the EM sensors at different times and comparing the corresponding scan images. A control signal based on the motion of the handheld device can be transmitted to an external device, for example, to control a mouse cursor. In another configuration, scan images can be combined based on the motion to generate a combined scan image of a surface.
a illustrates an example mobile telephone including an optical in-LCD sensing panel and configured to operate according to embodiments of the invention.
b illustrates an exemplary digital media player including an optical in-LCD sensing panel and configured to operate according to embodiments of the invention.
c illustrates an exemplary personal computer including an optical in-LCD sensing panel and configured to operate according to embodiments of the invention.
In the following description of preferred embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments in which the invention can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the embodiments of this invention.
This relates to the use of optical in-LCD sensing panels, and more particularly, to configurations that transmit and/or receive data using optical in-LCD sensing panels. One transmit/receive configuration can be a communication configuration in which data is transmitted by displaying a communication image encoding the data on the sensing panel. Data can be received by capturing, with the EM sensors, a communication image displayed in proximity to the sensing panel. In another configuration, data can be received by scanning an object using an optical in-LCD sensing panel.
Although embodiments of the invention may be described and illustrated herein in terms of optical in-LCD sensing panels, it should be understood that embodiments of this invention are not so limited, but are additionally applicable to other types of display panels, in addition to LCD-type displays, that include embedded EM sensors.
While some devices may be configured to use optical in-LCD sensing panels to detect touches on the panel, other devices can be configured to operate optical in-LCD sensing panels in different ways. An example device 301 that can include one or more embodiments of the invention will now be described with reference to
Detection processor 413 can access RAM 411, receive sensor signals 421 from panel 303, transmit control signals 422 to panel 303, and communicate with graphics driver 417, decoder 415, stitching module 409, and host processor 403. Detection processor 413 can operate to detect touch input based on sensor signals 421 and to send the detected touch input to host processor 403. Thus, panel 303 can be used as a touch screen user interface. In addition, detection processor 413 can process sensor signals 421 to obtain other information, as described in more detail below. Graphics driver 417 can communicate with host processor 403 to receive image data of an image to be displayed, and can transmit display signals 423 to panel 303 to cause pixels 419 to display the image. For example, graphics driver 417 can display a graphical user interface (GUI) for user input when panel 303 is used as a touch screen input device.
Host processor 403 may perform actions based on inputs received from sensing display subsystem 407 that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. Host processor 403 can also perform additional functions that may not be related to panel processing.
Note that one or more of the functions described above can be performed by firmware stored in memory (e.g. one of the peripherals 406 in
The firmware can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "transport medium" can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.
An example communication operation according to embodiments of the invention will now be described with reference to
Referring again to
Communication by displaying and capturing images with an optical in-LCD sensing panel can take a variety of forms. Communication may be one-way, for example, a first device transferring data, such as an electronic business card, to a second device. Communication may be bidirectional, for example, a first communication image displayed by a first device may be captured by the sensors of a second device, and vice versa over the course of a data transmission. Communication may be encoded in a variety of different ways. For example, a communication image may be an image of text that could be captured and read directly by the recipient with minimal processing. Communication of an image of text may provide an easy way to communicate information between electronic devices without the need for particular protocols or predetermined coding schemes to be coordinated beforehand. In addition, the captured image of text may be processed using, for example, optical character recognition (OCR). In other forms, a communication image may be a coded visual structure, such as a one-dimensional barcode, a QR code, which is essentially a two-dimensional barcode, etc. Using QR codes for communication images may allow more flexibility in the positioning of the communicating devices, because QR codes can allow reading at different angles/orientations.
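As a simple illustration of the image-based coding described above, the following sketch packs a byte string into a black/white bit matrix that could serve as a communication image, and recovers the bytes from a captured matrix. The function names and the 16-pixel row width are illustrative assumptions, not part of any particular device interface; a practical scheme might instead use a barcode or QR code as noted above.

```python
# Hypothetical sketch: encode bytes as a row-major black/white bit matrix
# ("communication image") and decode a captured matrix back into bytes.
# 1 = black pixel, 0 = white pixel. All names here are illustrative.

def encode_image(data: bytes, width: int = 16):
    """Pack each bit of `data` into a grid of the given width."""
    bits = []
    for byte in data:
        for i in range(7, -1, -1):          # most significant bit first
            bits.append((byte >> i) & 1)
    while len(bits) % width:                # pad the final row with white
        bits.append(0)
    return [bits[r:r + width] for r in range(0, len(bits), width)]

def decode_image(image, n_bytes: int) -> bytes:
    """Recover the first `n_bytes` bytes from a captured bit matrix."""
    bits = [b for row in image for b in row]
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for b in bits[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

msg = b"BUSINESS CARD"
img = encode_image(msg)
assert decode_image(img, len(msg)) == msg   # lossless round trip
```

A scheme like this trades simplicity for fragility: unlike a QR code, it has no error correction and assumes the capturing panel is aligned with the displaying panel.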
The data transfer could be done asynchronously, that is, without any synchronization between the devices. In another embodiment, the data transfer could be synchronous. For example, the devices could be synchronized to a common clock, such as a GPS satellite clock, and then the data transfers could occur by displaying/capturing at predetermined times. This might allow the devices to reject or reduce some noise in a similar way that synchronous modulation is used to reduce noise in other applications.
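The synchronous variant above can be sketched as a slotted schedule derived from a shared clock epoch: both devices compute the same slot boundaries, the transmitter displays frame k during slot k, and the receiver samples mid-slot, rejecting light captured at other times. The 50 ms slot period and the function name are assumptions for illustration only.

```python
# Illustrative slotted-transfer timing: both devices derive the same
# display/capture schedule from a shared clock epoch (e.g., a GPS clock).
# The slot period is an arbitrary assumption.

SLOT_PERIOD = 0.050  # seconds per displayed communication image

def next_slot_start(now: float, epoch: float,
                    period: float = SLOT_PERIOD) -> float:
    """Return the start time of the first slot boundary after `now`."""
    elapsed = now - epoch
    slots_done = int(elapsed // period) + 1
    return epoch + slots_done * period

# Transmitter and receiver, given the same epoch, agree on the schedule.
t = next_slot_start(now=12.337, epoch=0.0)
assert abs(t - 12.35) < 1e-6
```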
In another form of communication, a black and white image displayed by a first device, for example, might be captured by the display of a second device and interpreted as if the black areas are occluded areas and the white areas are non-occluded areas (e.g., as if the device were reading ambient light). In this case, black areas may be interpreted as "touches" on the display of the second device. Thus, this form of communication might be used, for example, to perform "touch" inputs, e.g., a single touch, a multiple touch, a gesture, etc., on the second device. Each "touch" input may be received by a GUI displayed on the second device and interpreted as a user input, for example, pressing a button, scrolling a scrollbar, selecting from a list, typing in text, double-clicking an icon, etc. A series of "touch" inputs (i.e., communication images representing touch inputs) may be stored in a memory of the first device and, similar to a macro, may be used to automatically perform a series of tasks on another device. However, unlike a conventional macro, a communication image macro could be entered directly into the second device's GUI without the need to communicate the macro to the second device by other channels of communication. This may be helpful, for example, to automate diagnostic and testing routines, particularly when the testing device and the device under test (DUT) cannot communicate by other means, e.g., the DUT's WiFi is malfunctioning, the communication cable cannot be found, etc. Although the example of black-and-white images is used, other types of images, such as color images, etc., could be used so long as the image would be interpreted as the intended "touch" by the second device, as one skilled in the art would recognize in light of the present disclosure.
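The "communication image macro" idea above can be sketched as a stored list of touch coordinates rendered, one macro step at a time, as black dots on white frames; the second device would read each black dot as an occluded ("touched") region. The frame dimensions, dot radius, and function name below are arbitrary assumptions for illustration.

```python
# Hypothetical sketch: render each step of a stored touch macro as a frame
# with black dots (1) at the intended touch coordinates on a white (0)
# background. Frame size and dot radius are illustrative assumptions.

W, H, RADIUS = 32, 48, 2

def frame_for_touches(touches):
    """Render one frame with a square black dot per (x, y) touch point."""
    frame = [[0] * W for _ in range(H)]
    for (tx, ty) in touches:
        for y in range(max(0, ty - RADIUS), min(H, ty + RADIUS + 1)):
            for x in range(max(0, tx - RADIUS), min(W, tx + RADIUS + 1)):
                frame[y][x] = 1
    return frame

# A stored macro: a single tap at (10, 12), then a two-finger touch.
macro = [[(10, 12)], [(5, 5), (20, 30)]]
frames = [frame_for_touches(step) for step in macro]
assert frames[0][12][10] == 1                     # dot at the tap location
assert frames[1][30][20] == 1 and frames[1][5][5] == 1
```

Displaying the frames in sequence against the second device's panel would then replay the macro through its GUI, with no other communication channel required.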
Referring again to
The effective range of communication (i.e., maximum distance between the two panels) might vary from substantially zero (i.e., the panels are touching) to a distance at which an error rate of the communication becomes unacceptable. Of course, the range of communication may increase or decrease depending on, for example, the form of communication used, including the type of communication images (e.g., black/white versus color), the EM spectrum used, the rate of display of the images, the resolution and/or size of the displayed images, etc. The range of communication may also vary depending on, for example, the physical parameters of the devices, such as panel resolution, brightness, size, sensor sensitivity, etc., and/or external factors, such as the brightness and/or color of ambient light, electromagnetic interference, mechanical vibration of the device, etc.
In other words, the positioning of the devices, along with other factors such as those listed above, may affect the error rate of the communication between the devices. For example, moving the devices further apart may increase the error rate, while reducing the brightness of the ambient light may decrease the error rate.
In some embodiments, the communication operation can sense an error rate of communication and can modify one or more factors affecting the error rate to either increase or decrease the error rate, depending upon a desired result. Some embodiments may detect error rate by, for example, including an error detection code, such as a cyclic redundancy check code (CRC) in the data transfer, and may modify one or more system parameters in order to maintain the detected error rate within a predetermined range. If the detected error rate exceeds the upper bound of the predetermined range, the resolution/size of the displayed communication images may be modified by decreasing the resolution and/or increasing the size. For example, displaying a single “dot” that covers the entire panel (i.e., using the entire panel to display one “pixel”) as a communication image may result in a lower error rate/longer range of communication versus using a higher resolution (more detailed) communication image. However, the effective bandwidth of the communication may also be reduced because only one bit of information would be displayed/captured at a time. Conversely, if the detected error rate drops below the lower bound of the predetermined range, communication image resolution may be increased in order to increase the bandwidth of the communication.
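The error-sensing and resolution back-off described above can be sketched as follows, assuming a simple payload-plus-CRC framing; `zlib.crc32` stands in for whatever checksum the devices agree on, and the resolution ladder and error-rate thresholds are illustrative values, not taken from the disclosure.

```python
# Sketch of CRC-based error detection with adaptive resolution back-off.
# Framing, thresholds, and the resolution ladder are assumptions.
import zlib

RESOLUTIONS = [64, 32, 16, 8, 4, 2, 1]   # pixels per side; last = one "dot"

def frame_with_crc(payload: bytes) -> bytes:
    """Append a 4-byte CRC-32 to the payload before display."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check_frame(frame: bytes):
    """Return the payload if its CRC matches, else None (detected error)."""
    payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    return payload if zlib.crc32(payload) == crc else None

def adjust_resolution(index: int, error_rate: float,
                      lo: float = 0.01, hi: float = 0.10) -> int:
    """Coarsen the image above `hi` error rate; refine it below `lo`."""
    if error_rate > hi and index < len(RESOLUTIONS) - 1:
        return index + 1   # coarser image: lower error rate, less bandwidth
    if error_rate < lo and index > 0:
        return index - 1   # finer image: more bandwidth, higher error rate
    return index

good = frame_with_crc(b"hello")
assert check_frame(good) == b"hello"
corrupted = bytes([good[0] ^ 0xFF]) + good[1:]
assert check_frame(corrupted) is None    # CRC catches the flipped bits
assert RESOLUTIONS[adjust_resolution(0, 0.25)] == 32
```

The single-"dot" end of the ladder corresponds to the one-bit-per-frame case discussed above: the lowest error rate and longest range, at the cost of the lowest effective bandwidth.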
In another embodiment, other factors that affect communication may be detected and the system may adjust accordingly. In one example, if the device senses that the intensity of ambient light is high (e.g., in a brightly lit room, outside in direct sunlight), the device can increase the brightness/contrast of the display panel to increase the readability of displayed communication images. On the other hand, if the intensity of ambient light is low, the device can decrease the brightness/contrast of the display panel to save power.
In other embodiments, the stitching operation can be performed externally. For example, rather than sending the images of surface 701 to stitching module 409, the images can be sent directly to computer 703, which would process the images internally.
It is noted that, unlike a conventional computer mouse, the position data generated by the foregoing example operation can include information about the rotational motion of device 301, in addition to information about the translational motion of the device. The rotational motion data may be useful in applications such as, for example, computer games.
In contrast to conventional handheld scanners, the foregoing example scanner mode may not require the user to maintain a constant velocity when moving the device across the surface to be scanned because, for example, the stitching algorithm can determine the motion of device 301 and compensate for the motion when stitching together the images. In addition, conventional handheld scanners typically require the user to move the scanner in a straight line across a page, and to refrain from rotating the scanner. In contrast, in the foregoing example, the user would be free to move the device around on the page in practically any manner, e.g., the device could be rotated, moved along an irregular path, or moved at different speeds and directions, and the device would be able to compensate for the irregular motion and to generate a scanned image as the user "paints" the page with the device.
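One simple way the motion between successive scan images could be determined, as assumed above, is by registering consecutive frames: search for the shift that minimizes pixel mismatches in the overlapping region. The sketch below handles translation only; a real stitching module would also estimate rotation, as noted above. All names and the search radius are illustrative assumptions.

```python
# Minimal sketch of frame registration for stitching: brute-force search
# for the (dy, dx) shift with the lowest mismatch rate in the overlap
# between two binary scan images. Translation only; rotation is omitted.
import random

def estimate_shift(prev, curr, max_shift=3):
    """Return the (dy, dx) that best maps `prev` onto `curr`."""
    h, w = len(prev), len(prev[0])
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            mismatches = overlap = 0
            for y in range(h):
                for x in range(w):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < h and 0 <= x2 < w:
                        overlap += 1
                        if prev[y][x] != curr[y2][x2]:
                            mismatches += 1
            score = mismatches / overlap
            if best is None or score < best:
                best, best_shift = score, (dy, dx)
    return best_shift

# A random test frame and the same frame moved down 1 row, right 2 columns.
rng = random.Random(0)
prev = [[rng.randint(0, 1) for _ in range(8)] for _ in range(8)]
curr = [[prev[y - 1][x - 2] if 0 <= y - 1 < 8 and 0 <= x - 2 < 8 else 0
         for x in range(8)] for y in range(8)]
assert estimate_shift(prev, curr) == (1, 2)
```

Accumulating these per-frame shifts yields the position data described above, and compensating for them when compositing the frames yields the stitched scan, regardless of the speed or path of the user's hand.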
a illustrates example mobile telephone 1036 including an optical in-LCD sensing panel 1024 and configured to operate according to embodiments of the invention.
b illustrates example digital media player 1040 including an optical in-LCD sensing panel 1026 and configured to operate according to embodiments of the invention.
c illustrates exemplary personal computer 1044 including an optical in-LCD sensing panel 1028 and configured to operate according to embodiments of the invention. The mobile telephone, media player and personal computer of
Although embodiments of this invention have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of embodiments of this invention as defined by the appended claims.