Intelligent stylus

Abstract
An intelligent stylus is disclosed. The stylus can provide a stylus condition in addition to a touch input. The stylus architecture can include multiple sensors to sense information indicative of the stylus condition, a microcontroller to determine the stylus condition based on the sensed information, and a transmitter to transmit the determined condition to a corresponding touch sensitive device so as to cause some action based on the condition.
Description
FIELD

This relates generally to touch sensing and more particularly, to providing an intelligent stylus for use with a touch sensitive device.


BACKGROUND

Many types of input devices are available for performing operations in a computing system, such as buttons or keys, mice, trackballs, touch sensor panels, joysticks, touch pads, touch screens, and the like. Touch sensitive devices, and touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch sensitive devices can include a touch sensor panel, which can be a clear panel with a touch sensitive surface, and a display device such as a liquid crystal display (LCD) that can be positioned partially or fully behind the panel, or integrated with the panel, so that the touch sensitive surface can substantially cover the viewable area of the display device. Touch sensitive devices can generally allow a user to perform various functions by touching or hovering over the touch sensor panel using one or more fingers, a stylus or other object at a location often dictated by a user interface (UI) including virtual buttons, keys, bars, displays, and other elements, being displayed by the display device. In general, touch screens can recognize a touch event and the position of the touch event on the touch sensor panel or a hover event and the position of the hover event on the touch sensor panel, and the computing system can then interpret the touch or hover event in accordance with the display appearing at the time of the event, and thereafter can perform one or more operations based on the event.


When a stylus has been used as an input device, the stylus has traditionally provided simply a touch input to a touch sensitive device without additional information that can be helpful to operation of the touch sensitive device.


SUMMARY

This relates to an intelligent stylus that can provide a stylus condition in addition to a touch input. The stylus sensing circuitry can include multiple sensors to sense information indicative of the stylus condition, a microcontroller to determine the stylus condition based on the sensed information, and a transmitter to transmit the determined condition to a corresponding touch sensitive device to cause some action based on the condition. The stylus's ability to determine its condition and provide that condition to the touch sensitive device advantageously improves touch and hover sensing and increases device capabilities.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an exemplary intelligent stylus for use with a touch sensitive device according to various embodiments.



FIG. 2 illustrates exemplary sensing circuitry of an intelligent stylus according to various embodiments.



FIG. 3 illustrates an exemplary intelligent stylus with a replaceable tip according to various embodiments.



FIG. 4 illustrates an exemplary intelligent stylus with sensing circuitry including a pressure sensor according to various embodiments.



FIG. 5 illustrates an exemplary intelligent stylus with sensing circuitry including a surrounding touch sensor according to various embodiments.



FIG. 6 illustrates an exemplary intelligent stylus with sensing circuitry including a rotation sensor according to various embodiments.



FIG. 7 illustrates an exemplary intelligent stylus with sensing circuitry including a pushbutton according to various embodiments.



FIG. 8 illustrates an exemplary intelligent stylus with sensing circuitry including a camera according to various embodiments.



FIG. 9 illustrates an exemplary intelligent stylus with sensing circuitry including a light emitter according to various embodiments.



FIG. 10 illustrates an exemplary intelligent stylus with sensing circuitry including contact sensors disposed on the stylus and on a removable stylus cap according to various embodiments.



FIG. 11 illustrates an exemplary bristle-tipped intelligent stylus with sensing circuitry including bristle sensors according to various embodiments.



FIG. 12 illustrates an exemplary intelligent stylus with sensing circuitry including a color sensor according to various embodiments.



FIG. 13 illustrates an exemplary lead-tipped intelligent stylus with sensing circuitry according to various embodiments.



FIG. 14 illustrates an exemplary intelligent stylus with sensing circuitry including a microcontroller for selecting multiple sensors according to various embodiments.



FIG. 15 illustrates an exemplary method for detecting a condition of an intelligent stylus according to various embodiments.



FIG. 16 illustrates another exemplary method for detecting a condition of an intelligent stylus according to various embodiments.



FIG. 17 illustrates an exemplary computing system for use with an intelligent stylus according to various embodiments.



FIG. 18 illustrates an exemplary mobile telephone for use with an intelligent stylus according to various embodiments.



FIG. 19 illustrates an exemplary digital media player for use with an intelligent stylus according to various embodiments.



FIG. 20 illustrates an exemplary personal computer for use with an intelligent stylus according to various embodiments.





DETAILED DESCRIPTION

In the following description of example embodiments, reference is made to the accompanying drawings in which it is shown by way of illustration specific embodiments that can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the various embodiments.


This relates to an intelligent stylus that includes sensing circuitry to provide the stylus condition in addition to a touch input. The sensing circuitry can include multiple sensors to sense information indicative of the stylus condition, a microcontroller to determine the stylus condition based on the sensed information, and a transmitter to transmit the determined condition to a corresponding touch sensitive device to cause some action based on the condition. The stylus's ability to determine its condition and provide that condition to the touch sensitive device advantageously improves touch and hover sensing and increases panel capabilities. Additionally, the stylus can be used with the touch sensitive device to provide these advantages without making design modifications to the panel.


Although some embodiments are described herein in terms of a stylus, it is to be understood that other input devices and/or pointing devices can be used according to various embodiments.


Although some embodiments are described herein in terms of a touch sensitive device, it is to be understood that any device capable of sensing a touch or hover event thereat and/or processing the sensed event can be used according to various embodiments. The touch sensitive device (or any other suitable device) can include touch sensing circuitry for touch and hover sensing and, in some instances, a processor and memory for touch and hover data processing.



FIG. 1 illustrates an exemplary intelligent stylus for use with a touch sensitive device according to various embodiments. In the example of FIG. 1, touch sensitive device 120 can include an array of pixels 106 formed at the crossing points of conductive rows 101 and columns 102. Though FIG. 1 depicts the conductive elements 101, 102 in rows and columns, other configurations of conductive elements are also possible according to various embodiments.


When stylus 110 touches or hovers over a surface of the touch sensitive device 120, the stylus can form a capacitance with one or more of the conductive rows 101 and/or columns 102 that can be detected by device sensing circuitry (not shown). The stylus touch can be represented in an image captured at the touch sensitive device 120 and processed for touch input information, e.g., the location on the touch sensitive device that the stylus touched or hovered over.


In addition to providing the touch input information, the stylus 110 can provide information sensed by the stylus regarding its condition, which can be used by the touch sensitive device 120 to perform some action. In some embodiments, the information can be used by the stylus to perform some action. The stylus 110 can include multiple sensors to provide information about its condition, where the sensors can be selectively used alone or in various combinations. The ability to provide information beyond simply touch input information, particularly information about the stylus's condition that the stylus determines itself, gives this stylus an intelligence absent from a traditional stylus.



FIG. 2 illustrates a cross-sectional view of an exemplary intelligent stylus having sensing circuitry according to various embodiments. In the example of FIG. 2, stylus 210 can include contact/proximity sensor 212, motion/orientation sensor 214, microcontroller (MCU) 222, and transmitter 232. The contact/proximity sensor 212 can sense a touch at the stylus tip by an object or a proximity of the stylus to an object. In some embodiments, the contact/proximity sensor 212 can be a capacitive sensor, a resistive sensor, a force sensor, or any other suitable sensor capable of sensing a touch at or proximity to the stylus. The motion/orientation sensor 214 can sense the motion and/or orientation of the stylus. In some embodiments, the motion/orientation sensor 214 can be an accelerometer, a gyroscope, a magnetometer, or any other suitable six degree-of-freedom sensor capable of sensing a motion and/or orientation of the stylus.


In some embodiments, an additional contact/proximity sensor can be disposed at the stylus non-tip end to sense a touch at that end of the stylus by an object or a proximity of that end of the stylus to an object.


The MCU 222 can receive, select, and process stylus sensor measurements to determine a condition of the stylus. For example, the MCU 222 can receive a contact/proximity measurement from the contact/proximity sensor 212 and determine the stylus's condition as touching (contacting) or hovering over (proximate to) a surface. Similarly, the MCU 222 can receive a motion/orientation measurement from the motion/orientation sensor 214 and determine the stylus's condition as moving in a particular direction at a particular pace. The MCU 222 can also receive a motion/orientation measurement and determine the stylus's condition as having a particular orientation. Similarly, the MCU 222 can receive both contact/proximity and motion/orientation measurements and determine the stylus's condition as moving on a surface. In some embodiments, the MCU 222 can be a single application specific integrated circuit (ASIC) that can include one or more programmable processors, random access memory (RAM), and input-output (I/O) ports. In some embodiments, the MCU 222 can also include data compression software and/or hardware.
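
By way of illustration only, the following sketch shows one way such measurements could be combined into a coarse stylus condition. The thresholds, units, and condition labels are assumptions made for the example and are not part of the disclosure.

    # Illustrative sketch: combine contact/proximity and motion readings
    # into a coarse stylus condition. Thresholds and labels are hypothetical.

    def classify_condition(distance_mm, speed_mm_s):
        """Return a coarse stylus condition from two sensor readings."""
        CONTACT_THRESHOLD_MM = 0.5    # assumed: at or below this, treat as touching
        HOVER_THRESHOLD_MM = 20.0     # assumed: at or below this, treat as hovering
        MOVING_THRESHOLD_MM_S = 5.0   # assumed: at or above this, treat as moving

        if distance_mm <= CONTACT_THRESHOLD_MM:
            surface = "touching"
        elif distance_mm <= HOVER_THRESHOLD_MM:
            surface = "hovering"
        else:
            surface = "away"

        motion = "moving" if speed_mm_s >= MOVING_THRESHOLD_MM_S else "stationary"
        return surface, motion

    if __name__ == "__main__":
        # Example: stylus tip in contact and sliding across a surface.
        print(classify_condition(distance_mm=0.2, speed_mm_s=12.0))  # ('touching', 'moving')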


For the additional contact/proximity sensor at the stylus non-tip end, the MCU can determine the stylus's condition at the non-tip end as touching or hovering over a surface, e.g., to emulate an eraser.


The transmitter 232 can transmit stylus information, e.g., the stylus's condition, from the MCU 222 to a touch sensitive device (or some other device in communication with the stylus) and cause the device to perform some action based on the transmitted information. For example, the transmitter 232 can transmit that the stylus is touching or hovering over the touch sensitive device, causing the device to perform some touch operation, such as displaying writing or drawings or executing a program corresponding to the location the stylus touched. The transmitter 232 can transmit that the stylus is moving at an angle along the touch sensitive device, causing the device to display a thicker line corresponding to the stylus angle and the stylus motion. The transmitter 232 can transmit that the stylus has rotated at least 180 degrees so that the eraser end of the stylus is touching the touch sensitive device, causing the device to erase the displayed information. The transmitter 232 can be wireless or wired. Transmission between the transmitter 232 and the touch sensitive device can be via WiFi, Bluetooth, zigbee, RFID, NFC, ultrasound, and the like.
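
As a non-limiting sketch of how a determined condition might be packaged for transmission over such a link, the example below serializes a condition into a small payload. The field names and the JSON encoding are assumptions for illustration; a real link (Bluetooth, ZigBee, etc.) would use whatever framing that transport defines.

    import json

    def encode_condition_message(stylus_id, condition, extra=None):
        """Pack a determined stylus condition into a small byte payload (illustrative)."""
        message = {"stylus_id": stylus_id, "condition": condition}
        if extra:
            message.update(extra)  # e.g., an angle or force value accompanying the condition
        return json.dumps(message).encode("utf-8")

    if __name__ == "__main__":
        payload = encode_condition_message("stylus-01", "eraser_end_touching",
                                           {"angle_deg": 185})
        print(payload)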


In addition to a stylus, other input devices can transmit input information and/or condition information to the touch sensitive device, thereby causing an action based on the information, according to various embodiments. For example, a bar code scanner can transmit bar code data and/or a condition of the scanner to the touch sensitive device to cause the device and/or the scanner to perform some action based on the transmitted information. A camera can transmit images and videos and/or a camera condition to the touch sensitive device to cause the device and/or the camera to perform some action based on the transmitted information. An ultrasound device can transmit ultrasound data and/or a condition of the device to the touch sensitive device to cause the touch sensitive device and/or the ultrasound device to perform some action based on the transmitted information. An NFC/RFID reader can transmit product identification codes and/or a condition of the reader to the touch sensitive device to cause the device and/or the reader to perform some action based on the transmitted information. Other input devices performing similarly can be used according to various embodiments. Transmission between the input devices' transmitters and the touch sensitive device can be via WiFi, Bluetooth, zigbee, RFID, NFC, ultrasound, and the like.


In some embodiments, the transmitter 232 can be replaced with a transceiver, such that the stylus can both send signals to the touch sensitive device and receive signals from the touch sensitive device. For example, after the touch sensitive device performs its action based on the information sent from the stylus, the device can send a signal back to the stylus indicating that the action is complete or requesting some action by the stylus in return.


In some embodiments, the motion/orientation sensor 214 can be used to sense a wake-up motion of the stylus 210. For example, a user can move the stylus 210 in a predetermined motion to power up the stylus. Note that powering up can be from a low power mode to a full power mode and powering down can be the reverse. In some embodiments, the predetermined motion can be programmed into the stylus to be unique to the user. Such a motion can in effect be a stylus password. The motion/orientation sensor 214 can sense the motion and transmit it to the MCU 222 for processing. The MCU 222 can recognize the motion and power up the stylus. A similar process can be done for powering down the stylus 210. The transmitter 232 can transmit that the stylus is powered either up or down to the touch sensitive device.
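
For illustration, a wake-up motion could be recognized by comparing the sensed motion samples against a stored template, as in the sketch below. The sample representation (per-sample acceleration magnitudes) and the tolerance are assumptions, not part of the disclosure.

    def matches_wake_gesture(samples, template, tolerance=0.2):
        """Compare a recorded motion sequence to a stored wake-up gesture (illustrative)."""
        if len(samples) != len(template):
            return False
        return all(abs(s - t) <= tolerance for s, t in zip(samples, template))

    def update_power_state(powered_up, samples, template):
        """Toggle between low-power and full-power modes on a matching gesture."""
        if matches_wake_gesture(samples, template):
            return not powered_up
        return powered_up

    if __name__ == "__main__":
        stored = [0.1, 0.9, 0.2, 0.8]        # hypothetical user-programmed gesture
        observed = [0.12, 0.85, 0.22, 0.78]  # hypothetical sensed motion
        print(update_power_state(False, observed, stored))  # True: powered up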


Although only two sensors are illustrated in FIG. 2, it is to be understood that additional and/or other sensors can be used according to the requirements of the stylus. Examples of additional sensors will be described below. The sensors can be selectively used in the stylus alone or in various combinations. Also, the sensing circuitry is not limited to the sensors, the MCU, and the transmitter illustrated here, but can include additional and/or other components capable of determining a condition of an intelligent stylus according to various embodiments.



FIG. 3 illustrates a cross-sectional view of an exemplary intelligent stylus with a replaceable tip according to various embodiments. In the example of FIG. 3, stylus 310 can include replaceable tip 302 having connector 342 embedded therein to connect to connector 343 in stylus body 304. The tip connector 342 can have a unique identification code that identifies the particular tip 302. The body connector 343 can have an identification sensor 313 to sense the code upon connecting with the tip connector 342. The identification sensor 313 can transmit the code to the MCU for processing. The code can be used by the MCU to determine how the stylus will be used and can be used by the touch sensitive device to perform some action based on that use. For example, if the code indicates that the replaceable tip is a stylus brush, the MCU can determine that the stylus will be used as a brush and the touch sensitive device can perform an action according to brush motions and touches.
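
A minimal sketch of mapping a tip identification code to a usage mode follows; the specific code values and mode names are hypothetical.

    # Hypothetical mapping of replaceable-tip identification codes to usage modes.
    TIP_MODES = {
        0x01: "pen",
        0x02: "brush",
        0x03: "eraser",
    }

    def mode_for_tip(tip_code):
        """Return the usage mode implied by a replaceable tip's identification code."""
        return TIP_MODES.get(tip_code, "unknown")

    if __name__ == "__main__":
        print(mode_for_tip(0x02))  # brush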


The body connector 343 can also have a unique identification code that identifies the stylus owner. The identification sensor 313 can transmit the owner code to the MCU for processing. The owner code can be used by the MCU to identify the owner and by the touch sensitive device to perform some action based on that owner. For example, if two stylus users interact with the touch sensitive device, the panel can differentiate between the two users' inputs and use different colors, symbols, fonts, etc., or otherwise distinguish between the users and/or interact differently with each user according to the owner code. In some embodiments, the body connector 343 can be programmable, so that the current user can program in his/her unique identification code when using the stylus and delete the code when done. The MCU and the touch sensitive device can then operate based on that particular user's code.


The body connector 343 can also have a unique identification code that identifies the stylus itself. The identification sensor 313 can transmit the stylus code to the MCU, which can then transmit the stylus code to the touch sensitive device. The touch sensitive device can then perform some action based on the stylus code. For example, the touch sensitive device can authenticate the stylus as being an acceptable input device for the panel and can accept stylus input if the stylus is authenticated and deny stylus input if the stylus is not. The touch sensitive device can also display or speak the authentication result to the user.



FIG. 4 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry that includes a pressure sensor according to various embodiments. In the example of FIG. 4, stylus 410 can include pressure sensor 414. The pressure sensor 414 can sense a force being applied by the stylus to a surface and can transmit a force measurement to the MCU for processing. The MCU can determine that the stylus is applying force to a surface and/or the amount of force being applied and can transmit that determination to the touch sensitive device. The touch sensitive device can then perform some action based on the applied force. For example, the touch sensitive device can display a darker, thicker image for a heavier force and a lighter, thinner image for a lighter force.
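
As an illustrative sketch, the applied force could be mapped to a stroke width and opacity as follows. The 0-5 N range and the linear mapping are assumptions; the disclosure only states that a heavier force yields a darker, thicker image.

    def stroke_style(force_newtons, max_force=5.0):
        """Map a sensed tip force to a display stroke width (px) and opacity (illustrative)."""
        level = max(0.0, min(force_newtons / max_force, 1.0))  # clamp to 0..1
        width_px = 1 + round(level * 9)   # 1-10 px
        opacity = 0.3 + 0.7 * level       # 30%-100%
        return width_px, opacity

    if __name__ == "__main__":
        print(stroke_style(1.0))  # light touch: thinner, fainter stroke
        print(stroke_style(4.5))  # heavy press: thicker, darker stroke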


In some embodiments, the pressure sensor 414 can act in combination with the motion/orientation sensor 214 of FIG. 2 to provide motion and pressure measurements to the MCU. The MCU can then determine the cadence of the stylus as a user signs with the stylus. The touch sensitive device can use the determined cadence to verify the user's signature.



FIG. 5 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry that includes a surrounding touch sensor according to various embodiments. In the example of FIG. 5, stylus 510 can include surrounding touch sensor 515, which can encircle the stylus body at the stylus handheld position. The touch sensor 515 can sense a position of a hand holding the stylus and can transmit a touch measurement to the MCU for processing. The MCU can determine how the hand is holding the stylus and can transmit that determination to the touch sensitive device. The touch sensitive device can then perform some action based on the hand position. For example, the touch sensitive device can display or speak a message to the user that the hand position is correct, too low on the stylus, too light on the stylus and so on and how the hand position can be adjusted or changed.


Alternatively, the MCU can extract fingerprints from the touch measurement and can identify the user of the stylus. The MCU can transmit the user identification to the touch sensitive device. The touch sensitive device can then perform some action based on the identification. For example, the touch sensitive device can authenticate the user of the stylus and can accept stylus input if the user is authenticated and deny stylus input if the user is not. The touch sensitive device can also display or speak the authentication result to the user.



FIG. 6 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry that includes a rotation sensor according to various embodiments. In the example of FIG. 6, stylus 610 can include rotation sensor 616. The rotation sensor 616 can sense a rotation of a portion of the stylus. Here the stylus body can be in two parts separated by a break 672, so that the two parts can rotate independently. The rotation sensor 616 can sense when one or both parts rotate away from a default position and can transmit a rotation measurement to the MCU for processing. The MCU can determine a stylus mode or setting based on the rotation measurement and can transmit that determination to the touch sensitive device. The touch sensitive device can then perform some action based on the stylus mode or setting. For example, the touch sensitive device can know from the stylus that the stylus is in a color mode and can display any stylus inputs in color, or that the stylus is in an erase mode and can erase any displayed information where the stylus touches, and so on.
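
For illustration, a rotation measurement could be translated into a mode selection by snapping it to the nearest detent position, as sketched below; the detent angles and mode names are hypothetical.

    def mode_from_rotation(rotation_deg):
        """Translate barrel rotation into a stylus mode by snapping to the nearest detent."""
        detents = {0: "draw", 90: "color", 180: "erase", 270: "highlight"}  # hypothetical
        nearest = min(detents,
                      key=lambda d: abs(((rotation_deg - d) + 180) % 360 - 180))
        return detents[nearest]

    if __name__ == "__main__":
        print(mode_from_rotation(175))  # erase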


In some embodiments, the MCU can power up the stylus upon receipt of a rotation indication from the rotation sensor 616. The MCU can then send a determination that the stylus is powered up to the touch sensitive device. Similar actions can be done for a power down of the stylus. Alternatively, the MCU can send the stylus's condition as rotated to the touch sensitive device and the touch sensitive device can then send a signal back to the stylus to either power up or power down.



FIG. 7 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry that includes a pushbutton according to various embodiments. In the example of FIG. 7, stylus 710 can include pushbutton 717. The pushbutton 717 can sense a push thereon and can transmit the push indication to the MCU for processing. The MCU can determine a stylus setting change based on the push indication and can transmit that determination to the touch sensitive device. The touch sensitive device can then perform some action based on the stylus setting change. For example, the touch sensitive device can change the color of stylus writing displayed at the panel when the pushbutton is pushed.


In some embodiments, the MCU can power up the stylus upon receipt of a push indication from the pushbutton 717. The MCU can then send a determination that the stylus is powered up to the touch sensitive device. Similar actions can be done for a power down of the stylus. Alternatively, the MCU can send the stylus's push indication to the touch sensitive device and the touch sensitive device can then send a signal back to the stylus to either power up or power down.



FIG. 8 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry that includes a camera according to various embodiments. In the example of FIG. 8, stylus 810 can include camera 852. The camera 852 can capture images and/or video and transmit the captured images and/or video to the MCU for processing. The MCU can extract relevant information from the captured images and/or video about the stylus's location, environment, motion, orientation, and the like or about the touch sensitive device's displayed information, position, motion, orientation, and the like. The MCU can transmit the extracted information to the touch sensitive device, which can then perform some action based on that information. For example, the touch sensitive device can confirm its orientation and location based on information sent from the stylus.


The camera 852 can be mounted in a replaceable tip with a unique identification code that can be transmitted to the MCU to identify the camera, as described previously. The camera identification code can indicate to the MCU that the incoming information includes captured images and/or video for processing.



FIG. 9 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry that includes a light emitter according to various embodiments. In the example of FIG. 9, stylus 910 can include light emitter 962. The light emitter 962 can emit light onto a proximate surface and can transmit an indication that light is being emitted to the MCU for processing. The MCU can determine that the stylus is acting as a light source and can transmit that determination to the touch sensitive device, which can then perform some action based on that determination. For example, suppose the touch sensitive device has optical sensors. The determination that the stylus is emitting light can be used by the touch sensitive device to detect the emission and perform some action based on the location relative to the panel, the duration, etc., of the detected emission. Alternatively, the touch sensitive device can know that the stylus is currently acting as a light source (e.g., a flashlight) and await the stylus's return to being an input device for the panel.


The light emitter 962 can be mounted in a replaceable tip with a unique identification code that can be transmitted to the MCU to identify the light emitter. The light emitter identification code can indicate to the MCU that the incoming information includes a light indication for processing.



FIG. 10 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry including contact sensors according to various embodiments. In the example of FIG. 10, stylus 1010 can include removable stylus cap 1071 with contact 1018 embedded therein. The stylus body can have corresponding contact 1018 embedded therein. When the cap 1071 covers the stylus tip, the contacts 1018 can be engaged and can transmit an indication to the MCU for processing. The MCU can determine that the cap covers the stylus and can transmit that determination to the touch sensitive device. The touch sensitive device can then perform some action based on that determination. For example, the touch sensitive device can know that the stylus is not in use and can await an indication that the stylus cap has been removed so as to receive stylus input.


In some embodiments, the MCU can power up the stylus upon receipt of an indication from the contacts 1018 that the stylus cap has been removed. Conversely, the MCU can power down the stylus upon receipt of an indication from the contacts 1018 that the stylus cap has been placed on the stylus. The MCU can send a determination that the stylus is powered either up or down to the touch sensitive device. Alternatively, the MCU can send the stylus's cap condition to the touch sensitive device and the touch sensitive device can send a signal back to the stylus to either power up or power down.


In some embodiments, the contacts 1018 can act in combination with the motion/orientation sensor 214 of FIG. 2 to provide contact and motion measurements to the MCU. The MCU can then use the measurements to determine whether the stylus cap covers the stylus and whether the stylus is moving. If the cap has been off the stylus for a particular amount of time and the stylus has not been moving, the MCU can determine that the stylus is not in use and can power down the stylus. Conversely, if the stylus is powered down and begins to move with the cap off, the MCU can power up the stylus. The MCU can send a determination that the stylus is powered either up or down to the touch sensitive device. Alternatively, the MCU can send the stylus's cap and motion conditions to the touch sensitive device and the touch sensitive device can send a signal back to the stylus to either power up or power down.
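
The sketch below illustrates one way the cap and motion conditions could be combined into a power-management rule; the 60-second idle window and the state-machine details are assumptions, since the disclosure only requires "a particular amount of time".

    class IdlePowerManager:
        """Illustrative cap-plus-motion power logic for the stylus."""
        IDLE_TIMEOUT_S = 60.0  # assumed idle window

        def __init__(self):
            self.powered_up = True
            self.last_motion_t = 0.0

        def on_sample(self, t, cap_on, moving):
            """Update the power state from a timestamped cap/motion sample."""
            if moving:
                self.last_motion_t = t
            if cap_on:
                self.powered_up = False      # cap placed on stylus: power down
            elif moving and not self.powered_up:
                self.powered_up = True       # cap off and moving again: power up
            elif self.powered_up and t - self.last_motion_t > self.IDLE_TIMEOUT_S:
                self.powered_up = False      # cap off but idle too long: power down
            return self.powered_up

    if __name__ == "__main__":
        pm = IdlePowerManager()
        print(pm.on_sample(0.0, cap_on=False, moving=True))    # True: in use
        print(pm.on_sample(90.0, cap_on=False, moving=False))  # False: idle timeout
        print(pm.on_sample(95.0, cap_on=False, moving=True))   # True: picked up again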



FIG. 11 illustrates a cross-sectional view of an exemplary intelligent brush-tipped stylus with sensing circuitry including bristle sensors according to various embodiments. In the example of FIG. 11, stylus 1110 can include a brush tip having multiple bristles 1183, each bristle having a bristle sensor 1119. FIG. 11 shows a plan view of the bristle sensors 1119. The bristle sensors 1119 can detect force applied to a surface by the bristles 1183 and the deflection of the bristles on the surface. The bristle sensors 1119 can transmit force and deflection measurements to the MCU for processing. The MCU can determine that the stylus bristles are applying force to a surface and/or the amounts of force applied and that the stylus bristles are deflected on the surface and/or the amounts of deflection. The MCU can transmit the determination to the touch sensitive device, which can then perform some action based on the determination. For example, the touch sensitive device can display a pattern corresponding to the deflection pattern, where the thickness or darkness of the pattern corresponds to the amounts of force applied.


In some embodiments, the bristle sensors 1119 can act in combination with the motion/orientation sensor 214 of FIG. 2 to provide force, deflection, and motion measurements. The MCU can then determine the brushing motion of the stylus. The touch sensitive device can use the determined brushing motion to fill in a displayed area, for example.


The brush can be mounted in a replaceable tip with a unique identification code that can be transmitted to the MCU to identify the brush. The brush identification code can indicate to the MCU that the incoming information includes bristle force and deflection measurements for processing.



FIG. 12 illustrates a cross-sectional view of an exemplary intelligent stylus with sensing circuitry including a color sensor according to various embodiments. In the example of FIG. 12, stylus 1210 can include color sensor 1294. The color sensor 1294 can detect a color at a proximate surface and transmit the detected color measurement to the MCU for processing. In some embodiments, the color sensor 1294 can be a colorimeter. The MCU can determine the color components from the measurement and transmit the determined components to the touch sensitive device, which can perform some action based on the components. For example, the touch sensitive device can display the detected color using the transmitted components. The touch sensitive device can also change the font colors or some other display feature to that of the transmitted components. The touch sensitive device can additionally store the color information for later use.
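
As an illustrative sketch, raw colorimeter channel readings could be converted to 8-bit color components as follows; the 10-bit raw scale and the simple linear conversion are assumptions (a real colorimeter would typically also need calibration and white balance).

    def color_components(raw_red, raw_green, raw_blue, full_scale=1023):
        """Convert raw colorimeter channel readings into 8-bit RGB components (illustrative)."""
        scale = 255.0 / full_scale
        return tuple(round(max(0, min(c, full_scale)) * scale)
                     for c in (raw_red, raw_green, raw_blue))

    if __name__ == "__main__":
        print(color_components(1023, 512, 0))  # (255, 128, 0): an orange surface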


The color sensor 1294 can be mounted in a replaceable tip with a unique identification code that can be transmitted to the MCU to identify the sensor. The sensor identification code can indicate to the MCU that the incoming information includes color measurements for processing.



FIG. 13 illustrates a cross-sectional view of an exemplary intelligent lead-tipped stylus with sensing circuitry according to various embodiments. In the example of FIG. 13, stylus 1310 can include lead tip 1385 for writing on real paper. The lead tip 1385 can act in combination with motion/orientation sensor 1314 (same as sensor 214 of FIG. 2) to provide motion measurements of the lead tip 1385 as it writes on the real paper. The motion measurements can be transmitted to the MCU for processing. The MCU can determine that the stylus is writing on real paper, based on the lead tip's identification code, and can transfer the determination along with the writing motions to the touch sensitive device, which can perform some action based on the measurements. For example, the touch sensitive device can reproduce the writing from the real paper electronically on the touch sensitive device. In some embodiments, the motion/orientation sensor 1314 can also provide orientation measurements of the lead tip 1385 as it writes on the real paper and transmit the orientation measurements to the MCU for processing. The MCU can determine a thickness of the writing based on the orientation and can transfer that determined thickness to the touch sensitive device. The touch sensitive device can then reproduce the writing from the real paper, including the thickness of the writing strokes, electronically on the touch sensitive device.


The lead can be mounted in a replaceable tip with a unique identification code that can be transmitted to the MCU to identify the lead tip. The lead tip identification code can indicate to the MCU that the incoming information includes writing for processing.


As described previously, the intelligent stylus can include multiple sensors that can be used alone or in various combinations to determine the stylus's condition. The stylus sensing circuitry can be scalable, such that the stylus can use none of the sensors and act as simply a touch input device for a minimal touch sensitive device or the stylus can use one or more of the sensors and act as an intelligent device for a maximal touch sensitive device having logic to interpret and process all the various combinations of stylus sensor information. FIG. 14 illustrates exemplary scalable stylus sensing circuitry within a stylus according to various embodiments. In the example of FIG. 14, stylus 1410 can include multiplexer (MUX) 1498 (or any suitable switch) coupled to MCU 1422 and multiple sensors 1411 for selecting which sensor(s) to enable and transmit a measurement to the MCU for processing. The MCU 1422 can send control signal 1421 to the MUX 1498 to indicate which sensors 1411 to select. The selection can be made according to the capabilities of the touch sensitive device in communication with the stylus via transmitter 1432. The selection can also be made according to the purpose of the stylus.
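
A simplified sketch of such selective sensor enabling follows; the sensor names and the selection interface are hypothetical, and the stand-in lambdas simply take the place of real sensor reads.

    class SensorMux:
        """Illustrative selection of which sensors feed the MCU, per a control signal."""

        def __init__(self, sensors):
            self.sensors = sensors   # name -> callable returning a reading
            self.enabled = set()

        def select(self, names):
            """Control signal: enable only the named sensors."""
            self.enabled = set(names) & set(self.sensors)

        def read(self):
            """Return readings from the currently enabled sensors only."""
            return {name: self.sensors[name]() for name in self.enabled}

    if __name__ == "__main__":
        mux = SensorMux({
            "contact": lambda: 0.2,
            "pressure": lambda: 1.4,
            "rotation": lambda: 90.0,
        })
        mux.select(["contact", "pressure"])  # e.g., a device that has no use for rotation data
        print(mux.read())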



FIG. 15 illustrates an exemplary method for detecting a condition of an intelligent stylus according to various embodiments. In the example of FIG. 15, one or more sensors can selectively make a stylus measurement (1510). A stylus condition can be determined from the sensor measurements (1520). Examples of a stylus condition can include the stylus being in use, the stylus having a particular orientation, the stylus moving, the stylus making a brushing motion, the stylus being held, the stylus being used by a particular user, and so on. The determined condition can cause an action therefrom (1530). In some embodiments, the condition can be transmitted to a touch sensitive device, for example, causing the touch sensitive device to perform an action. The touch sensitive device can then display some information, update the display with new information, execute a function, or perform some other appropriate action based on the condition. Alternatively, the condition can be transmitted to a component of the stylus, such as the power supply, for example, causing the stylus component to perform an action. The stylus can then switch between low and full power or perform some other appropriate action based on the condition.



FIG. 16 illustrates another exemplary method for detecting a condition of an intelligent stylus according to various embodiments. In the example of FIG. 16, one or more stylus measurements can be provided (1610). The stylus measurements can be provided by the stylus itself and transmitted to a touch sensitive device for processing. One or more sensors can selectively make measurements corresponding to the stylus measurements (1620). The measurements can be made by the touch sensitive device, for example, in communication with the stylus. The measurements can be made coincident with the stylus measurements and based on the same operation(s) by the stylus and/or the touch sensitive device. The two sets of measurements can be correlated (1630). The correlation can include comparing the two sets of measurements and determining how close the two sets are. If close, the measurements can be considered correct. If not close, the measurements can be considered in error and can be discarded. If the correlation is favorable (1640), a stylus condition can be determined from the measurements (1650). The fact that the correlation is favorable can confirm that the subsequently determined stylus condition is valid. Examples of a stylus condition are described above regarding FIG. 15. An action can be performed based on the determined stylus condition (1660). The action can be performed by the touch sensitive device or by the stylus, for example. Example actions are described above regarding FIG. 15. In some embodiments, as part of the performed action, a signal can be transmitted from the touch sensitive device to the stylus to cause the stylus to perform an action based on the determined stylus condition.
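
For illustration only, the correlation step could compare coincident measurement pairs against a tolerance, as in the sketch below; the tolerance value and the pairwise comparison rule are assumptions made for the example.

    def correlate(stylus_values, device_values, tolerance=0.1):
        """Return True when coincident stylus and device measurements agree (illustrative)."""
        if len(stylus_values) != len(device_values):
            return False
        return all(abs(s - d) <= tolerance
                   for s, d in zip(stylus_values, device_values))

    if __name__ == "__main__":
        stylus = [0.50, 0.52, 0.49]       # e.g., tip-to-surface distances reported by the stylus
        device = [0.48, 0.53, 0.50]       # corresponding hover estimates from the panel
        print(correlate(stylus, device))  # True: the measurements are considered correct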


In alternate embodiments, a stylus can use its stylus measurements to determine its condition and then provide the determined condition. A touch sensitive device can use its measurements to also determine the stylus condition. The touch sensitive device can then correlate the provided stylus condition with its determined stylus condition to confirm the condition. If the condition is confirmed, the touch sensitive device can then perform an associated action. The stylus can also perform an associated action.



FIG. 17 illustrates an exemplary computing system that can use an intelligent stylus according to various embodiments. In the example of FIG. 17, computing system 1700 can include touch controller 1706. The touch controller 1706 can be a single application specific integrated circuit (ASIC) that can include one or more processor subsystems 1702, which can include one or more main processors, such as ARM968 processors or other processors with similar functionality and capabilities. However, in other embodiments, the processor functionality can be implemented instead by dedicated logic, such as a state machine. The processor subsystems 1702 can also include peripherals (not shown) such as random access memory (RAM) or other types of memory or storage, watchdog timers and the like. The touch controller 1706 can also include receive section 1707 for receiving signals, such as touch (or sense) signals 1703 of one or more sense channels (not shown), other signals from other sensors such as sensor 1711, etc. The touch controller 1706 can also include demodulation section 1709 such as a multistage vector demodulation engine, panel scan logic 1710, and transmit section 1714 for transmitting stimulation signals 1716 to touch panel 1724 to drive the panel. The scan logic 1710 can access RAM 1712, autonomously read data from the sense channels, and provide control for the sense channels. In addition, the scan logic 1710 can control the transmit section 1714 to generate the stimulation signals 1716 at various frequencies and phases that can be selectively applied to rows of the touch panel 1724.


The touch controller 1706 can also include charge pump 1715, which can be used to generate the supply voltage for the transmit section 1714. The stimulation signals 1716 can have amplitudes higher than the maximum voltage by cascading two charge store devices, e.g., capacitors, together to form the charge pump 1715. Therefore, the stimulus voltage can be higher (e.g., 6V) than the voltage level a single capacitor can handle (e.g., 3.6 V). Although FIG. 17 shows the charge pump 1715 separate from the transmit section 1714, the charge pump can be part of the transmit section.


Computing system 1700 can include host processor 1728 for receiving outputs from the processor subsystems 1702 and performing actions based on the outputs that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device coupled to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. The host processor 1728 can also perform additional functions that may not be related to touch processing, and can be connected to program storage 1732 and display device 1730 such as an LCD for providing a UI to a user of the device. Display device 1730 together with touch panel 1724, when located partially or entirely under the touch panel, can form a touch screen.


Touch panel 1724 can include a capacitive sensing medium having drive lines and sense lines. It should be noted that the term “lines” can sometimes be used herein to mean simply conductive pathways, as one skilled in the art can readily understand, and is not limited to structures that can be strictly linear, but can include pathways that change direction, and can include pathways of different size, shape, materials, etc. Drive lines can be driven by stimulation signals 1716 and resulting touch signals 1703 generated in sense lines can be transmitted to receive section 1707 in touch controller 1706. In this way, drive lines and sense lines can be part of the touch and hover sensing circuitry that can interact to form capacitive sensing nodes, which can be thought of as touch picture elements (touch pixels), such as touch pixels 1726. This way of understanding can be particularly useful when touch panel 1724 can be viewed as capturing an “image” of touch. In other words, after touch controller 1706 has determined whether a touch or hover has been detected at each touch pixel in the touch panel, the pattern of touch pixels in the touch panel at which a touch or hover occurred can be thought of as an “image” of touch (e.g. a pattern of fingers touching or hovering over the touch panel).


An intelligent stylus according to various embodiments can touch or hover over the touch panel 1724 to provide touch input information. The intelligent stylus can transmit additional information about the stylus condition to the processor subsystem 1702 or to the host processor 1728 for processing. The processor subsystem 1702 or the host processor 1728 can include logic to interpret and process the additional information from the intelligent stylus.


Note that one or more of the functions described above can be performed, for example, by firmware stored in memory (e.g., one of the peripherals) and executed by the processor subsystem 1702, or stored in the program storage 1732 and executed by the host processor 1728. The firmware can also be stored and/or transported within any non-transitory computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “non-transitory computer readable storage medium” can be any non-transitory medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like.


The firmware can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “transport medium” can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.


It is to be understood that the touch panel, as described in FIG. 17, can sense touch and hover according to various embodiments. In addition, the touch panel described herein can be either single- or multi-touch.



FIG. 18 illustrates an exemplary mobile telephone 1800 that can include touch panel 1824, display device 1836, and other computing system blocks for use with an intelligent stylus according to various embodiments.



FIG. 19 illustrates an exemplary digital media player 1900 that can include touch panel 1924, display device 1936, and other computing system blocks for use with an intelligent stylus according to various embodiments.



FIG. 20 illustrates an exemplary personal computer 2000 that can include touch pad 2024, display 2036, and other computing system blocks for use with an intelligent stylus according to various embodiments.


The mobile telephone, media player, and personal computer of FIGS. 18 through 20 can improve touch and hover sensing and improve device capabilities by utilizing a stylus according to various embodiments.


Although embodiments have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the various embodiments as defined by the appended claims.

Claims
  • 1. An input device comprising: a camera configured to capture one or more images; a motion sensor; an orientation sensor; and a controller configured to: receive the one or more captured images, the motion data, and the orientation data; detect that the input device is not proximate to a touch screen of an associated host device; and in response to detecting that the input device is not proximate to the touch screen of the associated host device, transmit information associated with one or more of the captured one or more images, the motion data, and the orientation data so as to cause a writing action at the associated host device.
  • 2. The input device of claim 1, further comprising an identification sensor configured to detect a code associated with the input device, wherein the transmitter is further configured to transmit the code so as to cause an authentication of the input device.
  • 3. The input device of claim 1, further comprising a light emitter configured to emit light, and transmit to the controller an indication that the light emitter emits light.
  • 4. The input device of claim 1, wherein the one or more captured images comprise one of video footage and one or more still images.
  • 5. A method of operating an input device, the method comprising: capturing, with a camera of the input device, one or more images; capturing, with a motion sensor of the input device, motion data; capturing, with an orientation sensor of the input device, orientation data; receiving the one or more captured images, the motion data, and the orientation data at a controller; detecting that the input device is not proximate to a touch screen of an associated host device; and in response to detecting that the input device is not proximate to the touch screen of the associated host device, transmitting information associated with one or more of the captured one or more images, the motion data, and the orientation data so as to cause a writing action at the associated host device.
  • 6. The method of claim 5, wherein the input device further comprises an identification sensor configured to detect a code associated with the input device, and the transmitter is further configured to transmit the code so as to cause an authentication of the input device.
  • 7. The method of claim 5, further comprising: emitting light with a light emitter; andtransmitting to the controller an indication that the light emitter emits light.
  • 8. The method of claim 5, wherein the one or more captured images comprise one of video footage and one or more still images.
  • 9. A non-transitory computer readable storage medium storing executable program instructions, which, when executed by an input device with a controller, cause the input device to: capture, with a camera of the input device, one or more images; capture, with a motion sensor of the input device, motion data; capture, with an orientation sensor of the input device, orientation data; receive the one or more captured images, the motion data, and the orientation data at the controller; detect that the input device is not proximate to a touch screen of an associated host device; and in response to detecting that the input device is not proximate to the touch screen of the associated host device, transmit information associated with one or more of the captured one or more images, the motion data, and the orientation data so as to cause a writing action at the associated host device.
  • 10. The non-transitory computer readable storage medium of claim 9, wherein the input device further comprises an identification sensor configured to detect a code associated with the input device, and the transmitter is further configured to transmit the code so as to cause an authentication of the input device.
  • 11. The non-transitory computer readable storage medium of claim 9, wherein the program instructions further cause the input device to: emit light with a light emitter; andtransmit to the controller an indication that the light emitter emits light.
  • 12. The non-transitory computer readable storage medium of claim 9, wherein the one or more captured images comprise one of video footage and one or more still images.
  • 13. The input device of claim 1, wherein detecting that the input device is not proximate to the touch screen of the associated host device comprises detecting that a tip of the input device is proximate to a piece of paper.
  • 14. The input device of claim 1, wherein detecting that the input device is not proximate to the touch screen of the associated host device comprises extracting one or more of the input device's location, the input device's environment, the input device's motion, the input device's orientation, the host device's displayed information, the host device's position, the host device's motion, and the host device's orientation from the one or more captured images.
  • 15. The method of claim 5, wherein detecting that the input device is not proximate to the touch screen of the associated host device comprises detecting that a tip of the input device is proximate to a piece of paper.
  • 16. The method of claim 5, wherein detecting that the input device is not proximate to the touch screen of the associated host device comprises extracting one or more of the input device's location, the input device's environment, the input device's motion, the input device's orientation, the host device's displayed information, the host device's position, the host device's motion, and the host device's orientation from the one or more captured images.
  • 17. The non-transitory computer readable storage medium of claim 9, wherein detecting that the input device is not proximate to the touch screen of the associated host device comprises detecting that a tip of the input device is proximate to a piece of paper.
  • 18. The non-transitory computer readable storage medium of claim 9, wherein detecting that the input device is not proximate to the touch screen of the associated host device comprises extracting one or more of the input device's location, the input device's environment, the input device's motion, the input device's orientation, the host device's displayed information, the host device's position, the host device's motion, and the host device's orientation from the one or more captured images.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/166,711, filed Jun. 22, 2011 and published on Dec. 27, 2012 as U.S. Publication No. 2012/0331546, the contents of which are incorporated by reference herein in their entirety for all intended purposes.

6476447 Yamazaki et al. Nov 2002 B1
6489631 Young et al. Dec 2002 B2
6495387 French Dec 2002 B2
6504530 Wilson et al. Jan 2003 B1
6518561 Miura Feb 2003 B1
6521109 Bartic et al. Feb 2003 B1
6529189 Colgan et al. Mar 2003 B1
6552745 Perner Apr 2003 B1
6597348 Yamazaki et al. Jul 2003 B1
6603867 Sugino et al. Aug 2003 B1
6642238 Hester, Jr. Nov 2003 B2
6646636 Popovich et al. Nov 2003 B1
6667740 Ely et al. Dec 2003 B2
6679702 Rau Jan 2004 B1
6681034 Russo Jan 2004 B1
6690156 Weiner et al. Feb 2004 B1
6690387 Zimmerman et al. Feb 2004 B2
6700144 Shimazaki et al. Mar 2004 B2
6720594 Rahn et al. Apr 2004 B2
6738031 Young et al. May 2004 B2
6738050 Comiskey et al. May 2004 B2
6741655 Chang et al. May 2004 B1
6762741 Weindorf Jul 2004 B2
6762752 Perski et al. Jul 2004 B2
6803906 Morrison et al. Oct 2004 B1
6815716 Sanson et al. Nov 2004 B2
6831710 den Boer Dec 2004 B2
6862022 Slupe Mar 2005 B2
6864882 Newton Mar 2005 B2
6879344 Nakamura et al. Apr 2005 B1
6879710 Hinoue et al. Apr 2005 B1
6888528 Rai et al. May 2005 B2
6947017 Gettemy Sep 2005 B1
6947102 den Boer et al. Sep 2005 B2
6956564 Williams Oct 2005 B1
6972753 Kimura et al. Dec 2005 B1
6995743 den Boer et al. Feb 2006 B2
7006080 Gettemy Feb 2006 B2
7009663 Abileah et al. Mar 2006 B2
7015833 Bodenmann et al. Mar 2006 B1
7015894 Morohoshi Mar 2006 B2
7023503 den Boer Apr 2006 B2
7053967 Abileah et al. May 2006 B2
7068254 Yamazaki et al. Jun 2006 B2
7075521 Yamamoto et al. Jul 2006 B2
7098894 Yang et al. Aug 2006 B2
7109465 Kok et al. Sep 2006 B2
7157649 Hill Jan 2007 B2
7164164 Nakamura et al. Jan 2007 B2
7176905 Baharav et al. Feb 2007 B2
7177026 Perlin Feb 2007 B2
7184009 Bergquist Feb 2007 B2
7184064 Zimmerman et al. Feb 2007 B2
7190461 Han et al. Mar 2007 B2
7205988 Nakamura et al. Apr 2007 B2
7208102 Aoki et al. Apr 2007 B2
7242049 Forbes et al. Jul 2007 B2
7250596 Reime Jul 2007 B2
7292229 Morag et al. Nov 2007 B2
7298367 Geaghan et al. Nov 2007 B2
7348946 Booth, Jr. et al. Mar 2008 B2
7372455 Perski et al. May 2008 B2
7408598 den Boer et al. Aug 2008 B2
7418117 Kim et al. Aug 2008 B2
7450105 Nakamura et al. Nov 2008 B2
7456812 Smith et al. Nov 2008 B2
7463297 Yoshida et al. Dec 2008 B2
7483005 Nakamura et al. Jan 2009 B2
7522149 Nakamura et al. Apr 2009 B2
7535468 Uy May 2009 B2
7536557 Murakami et al. May 2009 B2
7545371 Nakamura et al. Jun 2009 B2
7598949 Han Oct 2009 B2
7609862 Black Oct 2009 B2
7612767 Griffin et al. Nov 2009 B1
7629945 Baudisch Dec 2009 B2
7649524 Haim et al. Jan 2010 B2
7649527 Cho et al. Jan 2010 B2
7663607 Hotelling et al. Feb 2010 B2
7719515 Fujiwara et al. May 2010 B2
7786978 Lapstun et al. Aug 2010 B2
7843439 Perski et al. Nov 2010 B2
7848825 Wilson et al. Dec 2010 B2
7859519 Tulbert Dec 2010 B2
7868873 Palay et al. Jan 2011 B2
7902840 Zachut et al. Mar 2011 B2
7924272 den Boer et al. Apr 2011 B2
8031094 Hotelling et al. Oct 2011 B2
8059102 Rimon et al. Nov 2011 B2
8094128 Vu et al. Jan 2012 B2
8169421 Wright et al. May 2012 B2
8174273 Geaghan May 2012 B2
8228311 Perski et al. Jul 2012 B2
8232977 Zachut et al. Jul 2012 B2
8269511 Jordan Sep 2012 B2
8278571 Orsley Oct 2012 B2
8373677 Perski et al. Feb 2013 B2
8390588 Vu et al. Mar 2013 B2
8400427 Perski et al. Mar 2013 B2
8479122 Hotelling et al. Jul 2013 B2
8481872 Zachut Jul 2013 B2
8493331 Krah et al. Jul 2013 B2
8536471 Stern et al. Sep 2013 B2
8537126 Yousefpor et al. Sep 2013 B2
8552986 Wong et al. Oct 2013 B2
8581870 Bokma et al. Nov 2013 B2
8605045 Mamba et al. Dec 2013 B2
8659556 Wilson Feb 2014 B2
8698769 Coulson et al. Apr 2014 B2
8723825 Wright et al. May 2014 B2
8816985 Tate et al. Aug 2014 B1
8847899 Washburn et al. Sep 2014 B2
8928635 Harley et al. Jan 2015 B2
8933899 Shahparnia et al. Jan 2015 B2
9013429 Krekhovetskyy et al. Apr 2015 B1
9092086 Krah et al. Jul 2015 B2
9146414 Chang et al. Sep 2015 B2
9170681 Huang et al. Oct 2015 B2
9310923 Krah et al. Apr 2016 B2
9329703 Falkenburg May 2016 B2
9519361 Harley et al. Dec 2016 B2
9557845 Shahparnia Jan 2017 B2
9582105 Krah et al. Feb 2017 B2
9652090 Tan et al. May 2017 B2
20010000026 Skoog Mar 2001 A1
20010000676 Zhang et al. May 2001 A1
20010003711 Coyer Jun 2001 A1
20010044858 Rekimoto et al. Nov 2001 A1
20010046013 Noritake et al. Nov 2001 A1
20010052597 Young et al. Dec 2001 A1
20010055008 Young et al. Dec 2001 A1
20020027164 Mault et al. Mar 2002 A1
20020030581 Janiak et al. Mar 2002 A1
20020030768 Wu Mar 2002 A1
20020052192 Yamazaki et al. May 2002 A1
20020063518 Okamoto et al. May 2002 A1
20020067845 Griffis Jun 2002 A1
20020071074 Noritake et al. Jun 2002 A1
20020074171 Nakano et al. Jun 2002 A1
20020074549 Park et al. Jun 2002 A1
20020080123 Kennedy et al. Jun 2002 A1
20020080263 Krymski Jun 2002 A1
20020126240 Seiki et al. Sep 2002 A1
20020149571 Roberts Oct 2002 A1
20020175903 Fahraeus et al. Nov 2002 A1
20030020083 Hsiung et al. Jan 2003 A1
20030038778 Noguera Feb 2003 A1
20030103030 Wu Jun 2003 A1
20030103589 Nohara et al. Jun 2003 A1
20030117369 Spitzer et al. Jun 2003 A1
20030127672 Rahn et al. Jul 2003 A1
20030137494 Tulbert Jul 2003 A1
20030151569 Lee et al. Aug 2003 A1
20030156087 den Boer et al. Aug 2003 A1
20030156100 Gettemy Aug 2003 A1
20030156230 den Boer et al. Aug 2003 A1
20030174256 Kim et al. Sep 2003 A1
20030174870 Kim et al. Sep 2003 A1
20030179323 Abileah et al. Sep 2003 A1
20030183019 Chae Oct 2003 A1
20030197691 Fujiwara et al. Oct 2003 A1
20030205662 den Boer et al. Nov 2003 A1
20030218116 den Boer et al. Nov 2003 A1
20030231277 Zhang Dec 2003 A1
20030234759 Bergquist Dec 2003 A1
20040008189 Clapper et al. Jan 2004 A1
20040046900 den Boer et al. Mar 2004 A1
20040081205 Coulson Apr 2004 A1
20040095333 Morag et al. May 2004 A1
20040113877 Abileah et al. Jun 2004 A1
20040125430 Kasajima et al. Jul 2004 A1
20040140962 Wang et al. Jul 2004 A1
20040189587 Jung et al. Sep 2004 A1
20040191976 Udupa et al. Sep 2004 A1
20040252867 Lan et al. Dec 2004 A1
20050040393 Hong Feb 2005 A1
20050091297 Sato et al. Apr 2005 A1
20050110777 Geaghan et al. May 2005 A1
20050117079 Pak et al. Jun 2005 A1
20050134749 Abileah Jun 2005 A1
20050146517 Robrecht et al. Jul 2005 A1
20050173703 Lebrun Aug 2005 A1
20050179706 Childers Aug 2005 A1
20050200603 Casebolt et al. Sep 2005 A1
20050206764 Kobayashi et al. Sep 2005 A1
20050231656 den Boer et al. Oct 2005 A1
20050270590 Izumi et al. Dec 2005 A1
20050275616 Park et al. Dec 2005 A1
20050285985 den Boer et al. Dec 2005 A1
20060007224 Hayashi et al. Jan 2006 A1
20060007336 Yamaguchi Jan 2006 A1
20060010658 Bigley Jan 2006 A1
20060012580 Perski et al. Jan 2006 A1
20060034492 Siegel et al. Feb 2006 A1
20060120013 Dioro et al. Jun 2006 A1
20060125971 Abileah et al. Jun 2006 A1
20060159478 Kikuchi Jul 2006 A1
20060170658 Nakamura et al. Aug 2006 A1
20060176288 Pittel et al. Aug 2006 A1
20060187367 Abileah et al. Aug 2006 A1
20060197753 Hotelling Sep 2006 A1
20060202975 Chiang Sep 2006 A1
20060249763 Mochizuki et al. Nov 2006 A1
20060250381 Geaghan Nov 2006 A1
20060279690 Yu et al. Dec 2006 A1
20060284854 Cheng et al. Dec 2006 A1
20070030258 Pittel et al. Feb 2007 A1
20070062852 Zachut et al. Mar 2007 A1
20070109239 den Boer et al. May 2007 A1
20070109286 Nakamura et al. May 2007 A1
20070131991 Sugawa Jun 2007 A1
20070146349 Errico Jun 2007 A1
20070216905 Han et al. Sep 2007 A1
20070279346 den Boer et al. Dec 2007 A1
20070285405 Rehm Dec 2007 A1
20070291012 Chang Dec 2007 A1
20080012835 Rimon et al. Jan 2008 A1
20080012838 Rimon Jan 2008 A1
20080029691 Han Feb 2008 A1
20080046425 Perski Feb 2008 A1
20080048995 Abileah et al. Feb 2008 A1
20080049153 Abileah et al. Feb 2008 A1
20080049154 Abileah et al. Feb 2008 A1
20080055507 den Boer et al. Feb 2008 A1
20080055295 den Boer et al. Mar 2008 A1
20080055496 Abileah et al. Mar 2008 A1
20080055497 Abileah et al. Mar 2008 A1
20080055498 Abileah et al. Mar 2008 A1
20080055499 den Boer et al. Mar 2008 A1
20080062156 Abileah et al. Mar 2008 A1
20080062157 Abileah et al. Mar 2008 A1
20080062343 den Boer et al. Mar 2008 A1
20080066972 Abileah et al. Mar 2008 A1
20080084374 Abileah et al. Apr 2008 A1
20080111780 Abileah et al. May 2008 A1
20080128180 Perski et al. Jun 2008 A1
20080129909 den Boer et al. Jun 2008 A1
20080129913 den Boer et al. Jun 2008 A1
20080129914 den Boer et al. Jun 2008 A1
20080142280 Yamamoto et al. Jun 2008 A1
20080158165 Geaghan et al. Jul 2008 A1
20080158167 Hotelling et al. Jul 2008 A1
20080158172 Hotelling et al. Jul 2008 A1
20080158180 Krah et al. Jul 2008 A1
20080162997 Vu et al. Jul 2008 A1
20080165311 Abileah et al. Jul 2008 A1
20080170046 Rimon et al. Jul 2008 A1
20080238885 Zachut et al. Oct 2008 A1
20080278443 Schelling et al. Nov 2008 A1
20080284925 Han Nov 2008 A1
20080297487 Hotelling et al. Dec 2008 A1
20080309625 Krah et al. Dec 2008 A1
20080309628 Krah et al. Dec 2008 A1
20080309631 Westerman et al. Dec 2008 A1
20090000831 Miller et al. Jan 2009 A1
20090009483 Hotelling et al. Jan 2009 A1
20090027354 Perski et al. Jan 2009 A1
20090065269 Katsurahira Mar 2009 A1
20090066665 Lee Mar 2009 A1
20090078476 Rimon et al. Mar 2009 A1
20090095540 Zachut et al. Apr 2009 A1
20090128529 Izumi et al. May 2009 A1
20090135492 Kusuda et al. May 2009 A1
20090153152 Maharyta et al. Jun 2009 A1
20090153525 Lung Jun 2009 A1
20090167713 Edwards Jul 2009 A1
20090167728 Geaghan et al. Jul 2009 A1
20090184939 Wohlstadter et al. Jul 2009 A1
20090189867 Krah et al. Jul 2009 A1
20090225210 Sugawa Sep 2009 A1
20090251434 Rimon et al. Oct 2009 A1
20090262637 Badaye et al. Oct 2009 A1
20090273579 Zachut et al. Nov 2009 A1
20090322685 Lee Dec 2009 A1
20090322696 Yaakoby et al. Dec 2009 A1
20100001978 Lynch et al. Jan 2010 A1
20100006350 Elias Jan 2010 A1
20100013793 Abileah et al. Jan 2010 A1
20100013794 Abileah et al. Jan 2010 A1
20100013796 Abileah et al. Jan 2010 A1
20100020037 Narita et al. Jan 2010 A1
20100020044 Abileah et al. Jan 2010 A1
20100033766 Marggraff Feb 2010 A1
20100045904 Katoh et al. Feb 2010 A1
20100051356 Stern et al. Mar 2010 A1
20100053113 Wu Mar 2010 A1
20100059296 Abileah et al. Mar 2010 A9
20100060590 Wilson et al. Mar 2010 A1
20100066692 Noguchi et al. Mar 2010 A1
20100066693 Sato et al. Mar 2010 A1
20100073323 Geaghan Mar 2010 A1
20100085325 King-Smith et al. Apr 2010 A1
20100118237 Katoh et al. May 2010 A1
20100155153 Zachut Jun 2010 A1
20100160041 Grant et al. Jun 2010 A1
20100194692 Orr et al. Aug 2010 A1
20100252335 Orsley Oct 2010 A1
20100271332 Wu et al. Oct 2010 A1
20100289754 Sleeman et al. Nov 2010 A1
20100302419 den Boer et al. Dec 2010 A1
20100309171 Hsieh et al. Dec 2010 A1
20100315384 Hargreaves et al. Dec 2010 A1
20100315394 Katoh et al. Dec 2010 A1
20100321320 Hung et al. Dec 2010 A1
20100322484 Hama Dec 2010 A1
20100327882 Shahparnia et al. Dec 2010 A1
20100328249 Ningrat et al. Dec 2010 A1
20110001708 Sleeman Jan 2011 A1
20110007029 Ben-David Jan 2011 A1
20110043489 Yoshimoto Feb 2011 A1
20110063993 Wilson et al. Mar 2011 A1
20110084857 Marino et al. Apr 2011 A1
20110084937 Chang et al. Apr 2011 A1
20110090146 Katsurahira Apr 2011 A1
20110090181 Maridakis Apr 2011 A1
20110155479 Oda et al. Jun 2011 A1
20110157068 Parker et al. Jun 2011 A1
20110169771 Fujioka et al. Jul 2011 A1
20110175834 Han et al. Jul 2011 A1
20110216016 Rosener Sep 2011 A1
20110216032 Oda et al. Sep 2011 A1
20110254807 Perski et al. Oct 2011 A1
20110273398 Ho et al. Nov 2011 A1
20110304577 Brown et al. Dec 2011 A1
20110304592 Booth et al. Dec 2011 A1
20120013555 Takami et al. Jan 2012 A1
20120019488 McCarthy Jan 2012 A1
20120050207 Westhues et al. Mar 2012 A1
20120050216 Kremin et al. Mar 2012 A1
20120056822 Wilson et al. Mar 2012 A1
20120062497 Rebeschi et al. Mar 2012 A1
20120062500 Miller et al. Mar 2012 A1
20120068964 Wright et al. Mar 2012 A1
20120086664 Leto Apr 2012 A1
20120105357 Li et al. May 2012 A1
20120105361 Kremin et al. May 2012 A1
20120105362 Kremin et al. May 2012 A1
20120146958 Oda et al. Jun 2012 A1
20120154295 Hinckley et al. Jun 2012 A1
20120154340 Vuppu et al. Jun 2012 A1
20120182259 Han Jul 2012 A1
20120212421 Honji Aug 2012 A1
20120242603 Engelhardt et al. Sep 2012 A1
20120274580 Sobel et al. Nov 2012 A1
20120293464 Adhikari Nov 2012 A1
20120320000 Takatsuka Dec 2012 A1
20120327040 Simon Dec 2012 A1
20120327041 Harley Dec 2012 A1
20130027361 Perski et al. Jan 2013 A1
20130033461 Silverbrook Feb 2013 A1
20130069905 Krah et al. Mar 2013 A1
20130088465 Geller et al. Apr 2013 A1
20130106722 Shahparnia et al. May 2013 A1
20130113707 Perski et al. May 2013 A1
20130127757 Mann et al. May 2013 A1
20130141342 Bokma et al. Jun 2013 A1
20130155007 Huang et al. Jun 2013 A1
20130176273 Li et al. Jul 2013 A1
20130176274 Sobel et al. Jul 2013 A1
20130207938 Ryshtun et al. Aug 2013 A1
20130215049 Lee Aug 2013 A1
20130257793 Zeliff et al. Oct 2013 A1
20140028576 Shahparnia Jan 2014 A1
20140028607 Tan Jan 2014 A1
20140077827 Seguine Mar 2014 A1
20140132556 Huang May 2014 A1
20140146009 Huang May 2014 A1
20140168142 Sasselli et al. Jun 2014 A1
20140168143 Hotelling et al. Jun 2014 A1
20140253462 Hicks Sep 2014 A1
20140253469 Hicks et al. Sep 2014 A1
20140267075 Shahparnia et al. Sep 2014 A1
20140267184 Bathiche et al. Sep 2014 A1
20140375612 Hotelling et al. Dec 2014 A1
20150022485 Chen et al. Jan 2015 A1
20150035768 Shahparnia et al. Feb 2015 A1
20150035769 Shahparnia Feb 2015 A1
20150035797 Shahparnia Feb 2015 A1
20150103049 Harley et al. Apr 2015 A1
20150338950 Ningrat et al. Nov 2015 A1
20160162011 Verma Jun 2016 A1
20160162101 Pant et al. Jun 2016 A1
20160162102 Shahparnia Jun 2016 A1
20160179281 Krah et al. Jun 2016 A1
Foreign Referenced Citations (98)
Number Date Country
1243282 Feb 2000 CN
1278348 Dec 2000 CN
1518723 Aug 2004 CN
2013297220 Oct 2009 CN
101393488 Oct 2010 CN
201837984 May 2011 CN
036 02 796 Aug 1987 DE
197 20 925 Dec 1997 DE
0 306 596 Mar 1989 EP
0 366 913 May 1990 EP
0 384 509 Aug 1990 EP
0 426 362 May 1991 EP
0 426 469 May 1991 EP
0 464 908 Jan 1992 EP
0 488 455 Jun 1992 EP
0 490 683 Jun 1992 EP
0 491 436 Jun 1992 EP
0 509 589 Oct 1992 EP
0 545 709 Jun 1993 EP
0 572 009 Dec 1993 EP
0 572 182 Dec 1993 EP
0 587 236 Mar 1994 EP
0 601 837 Jun 1994 EP
0 618 527 Oct 1994 EP
0 633 542 Jan 1995 EP
0 762 319 Mar 1997 EP
0 762 319 Mar 1997 EP
0 770 971 May 1997 EP
0 962 881 Dec 1999 EP
1 022 675 Jul 2000 EP
1 128 170 Aug 2001 EP
1 884 863 Feb 2008 EP
2 040 149 Mar 2009 EP
2 172 834 Apr 2010 EP
2 221 659 Aug 2010 EP
2 660 689 Nov 2013 EP
55-074635 Jun 1980 JP
57-203129 Dec 1982 JP
60-179823 Sep 1985 JP
64-006927 Jan 1989 JP
64-040004 Feb 1989 JP
1-196620 Aug 1989 JP
2-182581 Jul 1990 JP
2-211421 Aug 1990 JP
5-019233 Jan 1993 JP
5-173707 Jul 1993 JP
05-243547 Sep 1993 JP
8-166849 Jun 1996 JP
9-001279 Jan 1997 JP
9-185457 Jul 1997 JP
9-231002 Sep 1997 JP
9-274537 Oct 1997 JP
10-027068 Jan 1998 JP
10-040004 Feb 1998 JP
10-133817 May 1998 JP
10-133819 May 1998 JP
10-186136 Jul 1998 JP
10-198515 Jul 1998 JP
11-110110 Apr 1999 JP
11-242562 Sep 1999 JP
2000-020241 Jan 2000 JP
2000-163031 Jun 2000 JP
2002-342033 Nov 2002 JP
2005-129948 May 2005 JP
2005-352490 Dec 2005 JP
2009-054141 Mar 2009 JP
10-2013-0028360 Mar 2013 KR
10-2013-0109207 Oct 2013 KR
200743986 Dec 2007 TW
200925944 Jun 2009 TW
201115414 May 2011 TW
201118682 Jun 2011 TW
201324242 Jun 2013 TW
201419103 May 2014 TW
201504874 Feb 2015 TW
WO-9740488 Oct 1997 WO
WO-9921160 Apr 1999 WO
WO-9922338 May 1999 WO
WO-0145283 Jun 2001 WO
WO-2006104214 Oct 2006 WO
WO-2007145346 Dec 2007 WO
WO-2007145347 Dec 2007 WO
WO-2008018201 Feb 2008 WO
WO-2008044368 Apr 2008 WO
WO-2008044369 Apr 2008 WO
WO-2008044370 Apr 2008 WO
WO-2008044371 Apr 2008 WO
WO-2008047677 Apr 2008 WO
WO-2009081810 Jul 2009 WO
WO-2011008533 Jan 2011 WO
WO-2012177567 Dec 2012 WO
WO-2012177569 Dec 2012 WO
WO-2012177569 Dec 2012 WO
WO-2012177571 Dec 2012 WO
WO-2012177573 Dec 2012 WO
WO-2014018233 Jan 2014 WO
WO-2014143430 Sep 2014 WO
WO-2015017196 Feb 2015 WO
Non-Patent Literature Citations (158)
Entry
Abileah, A. et al. (2004). “59.3: Integrated Optical Touch Panel in a 14.1″ AMLCD,” SID '04 Digest (Seattle) pp. 1544-1547.
Abileah, A. et al. (2006). “9.3: Optical Sensors Embedded within AMLCD Panel: Design and Applications,” ADEAC '06, SID (Atlanta) pp. 102-105.
Abileah, A. et al. (2007). “Optical Sensors Embedded within AMLCD Panel: Design and Applications,” Siggraph-07, San Diego, 5 pages.
Anonymous. (2002). “Biometric Smart Pen Project,” located at http://www.biometricsmartpen.de/ . . . , last visited Apr. 19, 2011, one page.
Bobrov, Y. et al. (2002). “5.2 Manufacturing of a Thin-Film LCD,” Optiva, Inc., San Francisco, CA. 4 pages.
Brown, C. et al. (2007). “7.2: A 2.6 inch VGA LCD with Optical Input Function using a 1-Transistor Active-Pixel Sensor,” ISSCC 2007 pp. 132-133, 592.
Den Boer, W. et al. (2003). “56.3: Active Matrix LCD with Integrated Optical Touch Screen,” SID'03 Digest (Baltimore) pp. 1-4.
Chinese Search Report dated Sep. 6, 2015, for CN Application No. CN 201280030349.9, with English translation, six pages.
Chinese Search Report dated Oct. 23, 2015, for CN Application No. CN 201280030351.6, with English translation, four pages.
Echtler, F. et al. (Jan. 2010). “An LED-based Multitouch Sensor for LCD Screens,” Cambridge, MA ACM four pages.
European Search Report dated May 2, 2016, for EP Application No. 15196245.3, filed Nov. 25, 2015, twelve pages.
Final Office Action dated Mar. 4, 2004, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 17 pages.
Final Office Action dated Jan. 21, 2005, for U.S. Appl. No. 10/329,217, filed Dec. 23, 2002, 13 pages.
Final Office Action dated Aug. 9, 2005, for U.S. Appl. No. 10/442,433, filed May 20, 2003, six pages.
Final Office Action dated Aug. 23, 2005, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 10 pages.
Final Office Action dated Dec. 13, 2005, for U.S. Appl. No. 10/371,413, filed Feb. 20, 2003, six pages.
Final Office Action dated May 23, 2007, for U.S. Appl. No. 11/137,753, filed May 25, 2005, 11 pages.
Final Office Action dated Oct. 18, 2007, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, six pages.
Final Office Action dated Oct. 31, 2007, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, nine pages.
Final Office Action dated Mar. 24, 2009, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, 10 pages.
Final Office Action dated Feb. 10, 2011, for U.S. Appl. No. 11/901,649, filed Sep. 18, 2007, 20 pages.
Final Office Action dated May 18, 2011, for U.S. Appl. No. 11/978,031, filed Oct. 25, 2007, 17 pages.
Final Office Action dated Jun. 15, 2011, for U.S. Appl. No. 11/595,071, filed Nov. 8, 2006, 9 pages.
Final Office Action dated Jun. 24, 2011, for U.S. Appl. No. 11/978,006, filed Oct. 25, 2007, 12 pages.
Final Office Action dated Jul. 5, 2011, for U.S. Appl. No. 11/977,279, filed Oct. 24, 2007, 12 pages.
Final Office Action dated Sep. 29, 2011, for U.S. Appl. No. 11/977,911, filed Oct. 26, 2007, 22 pages.
Final Office Action dated Oct. 11, 2012, for U.S. Appl. No. 12/566,455, filed Sep. 24, 2009, 8 pages.
Final Office Action dated Oct. 25, 2012, for U.S. Appl. No. 12/568,302, filed Sep. 28, 2009, 13 pages.
Final Office Action dated Oct. 25, 2012, for U.S. Appl. No. 12/568,316, filed Sep. 28, 2009, 15 pages.
Final Office Action dated Jul. 26, 2013, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, ten pages.
Final Office Action dated Oct. 31, 2013, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 13 pages.
Final Office Action dated Jan. 13, 2014, for U.S. Appl. No. 12/568,316, filed Sep. 28, 2009, 15 pages.
Final Office Action dated Apr. 28, 2014, for U.S. Appl. No. 13/652,007, filed Oct. 15, 2012, 16 pages.
Final Office Action dated Jul. 14, 2014, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 12 pages.
Final Office Action dated Dec. 2, 2014, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, ten pages.
Final Office Action dated Dec. 16, 2014, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, twelve pages.
Final Office Action dated Jan. 12, 2015, for U.S. Appl. No. 13/560,973, filed Jul. 27, 2012, six pages.
Final Office Action dated May 4, 2015, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 17 pages.
Final Office Action dated Aug. 20, 2015, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, six pages.
Final Office Action dated Feb. 1, 2016, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, 12 pages.
Final Office Action dated Feb. 3, 2016, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 15 pages.
Final Office Action dated Mar. 9, 2016, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, nine pages.
Final Office Action dated Jun. 3, 2016, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, eight pages.
Hong, S.J. et al. (2005). “Smart LCD Using a-Si Photo Sensor,” IMID'05 Digest pp. 280-283.
International Preliminary Report on Patentability and Written Opinion dated Oct. 8, 2004, for PCT Application No. PCT/US03/05300, filed Feb. 20, 2003, 15 pages.
International Preliminary Report on Patentability and Written Opinion dated Dec. 30, 2004, for PCT Application No. PCT/US02/25573, filed Aug. 12, 2002, 16 pages.
International Preliminary Report on Patentability and Written Opinion dated May 14, 2008, for PCT Application No. PCT/US06/43741, filed Nov. 10, 2006, four pages.
International Search Report dated Apr. 14, 2003, for PCT Application No. PCT/US02/25573, filed Aug. 12, 2002 two pages.
International Search Report dated Jun. 16, 2003, for PCT Application No. PCT/US03/05300, filed Feb. 20, 2003, two pages.
International Search Report dated Nov. 11, 2003, for PCT Application No. PCT/US03/03277, filed Feb. 4, 2003, three pages.
International Search Report dated Sep. 21, 2007, for PCT Application No. PCT/US06/43741, filed Nov. 10, 2006, one page.
International Search Report dated Oct. 17, 2012, for PCT Application No. PCT/US2012/043019, filed Jun. 18, 2012, five pages.
International Search Report dated Oct. 17, 2012, for PCT Application No. PCT/US2012/043023, filed Jun. 18, 2012, six pages.
International Search Report dated Jan. 16, 2013, for PCT Application No. PCT/US2012/043021, filed Jun. 18, 2012, six pages.
International Search Report dated Sep. 12, 2013, for PCT Application No. PCT/US2013/048977, filed Jul. 1, 2013, six pages.
International Search Report dated Apr. 23, 2014, for PCT Application No. PCT/US2014/013927, filed Jan. 30, 2014, four pages.
International Search Report dated Oct. 30, 2014, for PCT Application No. PCT/US2014/047658, four pages.
Kim, J.H. et al. (May 14, 2000). “24.1: Fingerprint Scanner Using a-Si: H TFT-Array,” SID '00 Digest pp. 353-355.
Kis, A. (2006). “Tactile Sensing and Analogic Algorithms,” Ph.D. Dissertation, Péter Pázmány Catholic University, Budapest, Hungary 122 pages.
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” Proceedings of CHI: ACM Conference on Human Factors in Computing Systems, pp. 21-25.
Non-Final Office Action dated Jun. 4, 2003, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 16 pages.
Non-Final Office Action dated May 21, 2004, for U.S. Appl. No. 10/329,217, filed Dec. 23, 2002, 13 pages.
Non-Final Office Action dated Sep. 21, 2004, for U.S. Appl. No. 10/442,433, filed May 20, 2003, six pages.
Non-Final Office Action dated Nov. 26, 2004, for U.S. Appl. No. 10/307,106, filed Nov. 27, 2002, eight pages.
Non-Final Office Action dated Dec. 10, 2004, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, nine pages.
Non-Final Office Action dated Jan. 21, 2005, for U.S. Appl. No. 10/347,149, filed Jan. 17, 2003, nine pages.
Non-Final Office Action dated Apr. 15, 2005, for U.S. Appl. No. 10/371,413, filed Feb. 20, 2003, four pages.
Non-Final Office Action dated Jun. 22, 2005, for U.S. Appl. No. 10/739,455, filed Dec. 17, 2003, 10 pages.
Non-Final Office Action dated Jul. 12, 2005, for U.S. Appl. No. 10/347,149, filed Jan. 17, 2003, four pages.
Non-Final Office Action dated Jan. 13, 2006, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, nine pages.
Non-Final Office Action dated May 12, 2006, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, seven pages.
Non-Final Office Action dated Aug. 28, 2006, for U.S. Appl. No. 10/371,413, filed Feb. 20, 2003, six pages.
Non-Final Office Action dated Jun. 28, 2007, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, 12 pages.
Non-Final Office Action dated Jun. 29, 2007, for U.S. Appl. No. 10/217,798, filed Aug. 12, 2002, 10 pages.
Non-Final Office Action dated Feb. 25, 2008, for U.S. Appl. No. 11/137,753, filed May 25, 2005, 15 pages.
Non-Final Office Action dated Jun. 24, 2008, for U.S. Appl. No. 11/351,098, filed Feb. 8, 2006, 11 pages.
Non-Final Office Action dated Jun. 25, 2009, for U.S. Appl. No. 11/980,029, filed Oct. 29, 2007, 9 pages.
Non-Final Office Action dated Nov. 23, 2009, for U.S. Appl. No. 11/407,545, filed Apr. 19, 2006, five pages.
Non-Final Office Action dated Jul. 29, 2010, for U.S. Appl. No. 11/901,649, filed Sep. 18, 2007, 20 pages.
Non-Final Office Action dated Oct. 13, 2010, for U.S. Appl. No. 11/978,006, filed Oct. 25, 2007, eight pages.
Non-Final Office Action dated Oct. 14, 2010, for U.S. Appl. No. 11/595,071, filed Nov. 8, 2006, seven pages.
Non-Final Office Action dated Nov. 26, 2010, for U.S. Appl. No. 11/977,279, filed Oct. 24, 2007, nine pages.
Non-Final Office Action dated Nov. 26, 2010, for U.S. Appl. No. 11/977,830, filed Oct. 26, 2007, seven pages.
Non-Final Office Action dated Dec. 13, 2010, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, eight pages.
Non-Final Office Action dated Feb. 1, 2011, for U.S. Appl. No. 11/978,031, filed Oct. 25, 2007, 18 pages.
Non-Final Office Action dated Apr. 29, 2011, for U.S. Appl. No. 11/977,911, filed Oct. 26, 2007, 19 pages.
Non-Final Office Action dated Jun. 21, 2011, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, 10 pages.
Non-Final Office Action dated Jun. 28, 2011, for U.S. Appl. No. 12/852,883, filed Aug. 8, 2010, 16 pages.
Non-Final Office Action dated Nov. 2, 2011, for U.S. Appl. No. 12/568,316, filed Sep. 28, 2009, 31 pages.
Non-Final Office Action dated Nov. 4, 2011, for U.S. Appl. No. 12/568,302, filed Sep. 28, 2009, 29 pages.
Non-Final Office Action dated Nov. 17, 2011, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, nine pages.
Non-Final Office Action dated Jan. 10, 2012, for U.S. Appl. No. 11/977,864, filed Oct. 26, 2007, six pages.
Non-Final Office Action dated Jan. 31, 2012, for U.S. Appl. No. 12/566,477, filed Sep. 24, 2009, 11 pages.
Non-Final Office Action dated Feb. 29, 2012, for U.S. Appl. No. 11/978,031, filed Oct. 25, 2007, 20 pages.
Non-Final Office Action dated Apr. 20, 2012, for U.S. Appl. No. 12/566,455, filed Sep. 24, 2009, eight pages.
Non-Final Office Action dated Jun. 5, 2012, for U.S. Appl. No. 11/595,071, filed Nov. 8, 2006, 14 pages.
Non-Final Office Action dated Jun. 19, 2012, for U.S. Appl. No. 11/977,864, filed Oct. 26, 2007, seven pages.
Non-Final Office Action dated Nov. 15, 2012, for U.S. Appl. No. 12/566,477, filed Sep. 24, 2009, 11 pages.
Non-Final Office Action dated Mar. 5, 2013, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, 14 pages.
Non-Final Office Action dated Mar. 29, 2013, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 12 pages.
Non-Final Office Action dated Jun. 17, 2013, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 8 pages.
Non-Final Office Action dated Sep. 18, 2013, for U.S. Appl. No. 13/652,007, filed Oct. 15, 2012, 16 pages.
Non-Final Office Action dated Dec. 16, 2013, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 12 pages.
Non-Final Office Action dated Feb. 27, 2014, for U.S. Appl. No. 11/977,279, filed Oct. 24, 2007, 11 pages.
Non-Final Office Action dated Mar. 14, 2014, for U.S. Appl. No. 11/977,339, filed Oct. 24, 2007, 10 pages.
Non-Final Office Action dated Apr. 24, 2014, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, nine pages.
Non-Final Office Action dated May 8, 2014, for U.S. Appl. No. 13/560,973, filed Jul. 27, 2012, six pages.
Non-Final Office Action dated Jun. 4, 2014, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, nine pages.
Non-Final Office Action dated Jun. 27, 2014, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 13 pages.
Non-Final Office Action dated Jan. 30, 2015, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, 12 pages.
Non-Final Office Action dated May 14, 2015, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, twelve pages.
Non-Final Office Action dated May 22, 2015, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, eight pages.
Non-Final Office Action dated Aug. 28, 2015, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, 11 pages.
Non-Final Office Action dated Sep. 24, 2015, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 14 pages.
Non-Final Office Action dated Dec. 4, 2015, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, 15 pages.
Non-Final Office Action dated Feb. 11, 2016, for U.S. Appl. No. 14/578,051, filed Dec. 19, 2014, nine pages.
Non-Final Office Action dated May 13, 2016, for U.S. Appl. No. 15/057,035, filed Feb. 29, 2016, six pages.
Non-Final Office Action dated May 17, 2016, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, sixteen pages.
Non-Final Office Action dated Jul. 1, 2016, for U.S. Appl. No. 14/333,457, filed Jul. 16, 2014, 27 pages.
Non-Final Office Action dated Jul. 28, 2016, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, twelve pages.
Notice of Allowance dated Feb. 3, 2014, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, nine pages.
Notice of Allowance dated May 12, 2014, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, nine pages.
Notice of Allowance dated Sep. 4, 2014, for U.S. Appl. No. 13/166,726, filed Jun. 22, 2011, nine pages.
Notice of Allowance dated Dec. 15, 2015, for U.S. Appl. No. 13/560,973, filed Jul. 27, 2012, nine pages.
Notice of Allowance dated Jan. 14, 2016, for U.S. Appl. No. 13/166,711, filed Jun. 22, 2011, five pages.
Notice of Allowance dated May 24, 2016, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, ten pages.
Notice of Allowance dated Aug. 10, 2016, for U.S. Appl. No. 14/578,051, filed Dec. 19, 2014, seven pages.
Notice of Allowance dated Sep. 9, 2016, for U.S. Appl. No. 13/560,958, filed Jul. 27, 2012, eight pages.
Notification of Reasons for Rejection dated Dec. 19, 2011, for JP Patent Application No. 2008-540205, with English Translation, six pages.
Pye, A. (Mar. 2001). “Top Touch-Screen Options,” located at http://www.web.archive.org/web/20010627162135.http://www.industrialtechnology.co.uk/2001/mar/touch.html, last visited Apr. 29, 2004, two pages.
Rossiter, J. et al. (2005). “A Novel Tactile Sensor Using a Matrix of LEDs Operating in Both Photoemitter and Photodetector Modes,” IEEE pp. 994-997.
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages.
Rubine, D.H. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660.
Search Report dated Jun. 12, 2014, for ROC (Taiwan) Patent Application No. 101122110, one page.
TW Search Report dated Jul. 7, 2014, for TW Patent Application No. 101122109, filed Jun. 20, 2012, one page.
TW Search Report dated Jul. 8, 2014, for TW Patent Application No. 101122107, filed Jun. 20, 2012, one page.
TW Search Report dated Nov. 20, 2015, for TW Patent Application No. 103126285, one page.
TW Search Report dated Jun. 23, 2016, for TW Patent Application No. 104135140, with English Translation, two pages.
U.S. Appl. No. 60/359,263 filed Feb. 20, 2002, by den Boer et al.
U.S. Appl. No. 60/383,040 filed May 23, 2002, by Abileah et al.
U.S. Appl. No. 60/736,708 filed Nov. 14, 2005, by den Boer et al.
U.S. Appl. No. 60/821,325 filed Aug. 3, 2006, by Abileah et al.
Westerman, W. (Spring 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 364 pages.
Yamaguchi, M. et al. (Jan. 1993). “Two-Dimensional Contact-Type Image Sensor Using Amorphous Silicon Photo-Transistor,” Jpn. J. Appl. Phys. 32(Part 1, No. 1B):458-461.
Notice of Allowance dated Feb. 14, 2017, for U.S. Appl. No. 13/560,963, filed Jul. 27, 2012, nine pages.
Non-Final Office Action dated Apr. 6, 2017, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, six pages.
Non-Final Office Action dated Jan. 11, 2017, for U.S. Appl. No. 14/869,982, filed Sep. 29, 2015, nine pages.
Non-Final Office Action dated Jan. 12, 2017, for U.S. Appl. No. 14/869,980, filed Sep. 29, 2015, ten pages.
Non-Final Office Action dated Jan. 23, 2017, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, sixteen pages.
Non-Final Office Action dated Oct. 20, 2016, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 16 pages.
Non-Final Office Action dated Nov. 25, 2016, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, eight pages.
Notice of Allowance dated Oct. 31, 2016, for U.S. Appl. No. 15/057,035, filed Feb. 29, 2016, ten pages.
Final Office Action dated May 31, 2017, for U.S. Appl. No. 13/166,699, filed Jun. 22, 2011, 16 pages.
Final Office Action dated Jun. 21, 2017, for U.S. Appl. No. 14/333,382, filed Jul. 16, 2014, 17 pages.
Final Office Action dated Aug. 7, 2017, for U.S. Appl. No. 14/869,980, filed Sep. 29, 2015, twelve pages.
Final Office Action dated Aug. 16, 2017, for U.S. Appl. No. 14/869,982, filed Sep. 29, 2015, ten pages.
Final Office Action dated Aug. 21, 2017, for U.S. Appl. No. 13/831,318, filed Mar. 14, 2013, nine pages.
Notice of Allowance dated Oct. 26, 2017, for U.S. Appl. No. 14/333,461, filed Jul. 16, 2014, seven pages.
Related Publications (1)
Number Date Country
20160357343 A1 Dec 2016 US
Continuations (1)
Number Date Country
Parent 13166711 Jun 2011 US
Child 15144615 US