The present application relates to and incorporates by reference, for all purposes, the following patent applications: U.S. patent application Ser. No. 12/490,067, filed Jun. 23, 2009, U.S. patent application Ser. No. 12/717,070, filed Mar. 3, 2009, and U.S. patent application Ser. No. 12/787,368, filed May 25, 2010. The present invention also incorporates by reference, for all purposes, the following patent applications related to magnetic field sensors: U.S. patent application Ser. No. 12/859,631, filed Aug. 19, 2010, U.S. Pat. App. No. 61/347,805, filed May 24, 2010, and U.S. Pat. App. No. 61/348,387, filed May 26, 2010.
The present application is also related to concurrently filed U.S. patent application Ser. No. 12/940,020, filed Nov. 4, 2010, U.S. patent application Ser. No. 12/940,023, filed Nov. 4, 2010, and U.S. patent application Ser. No. 12/940,026, filed Nov. 4, 2010, all of which are commonly owned and incorporated by reference for all purposes.
Embodiments of the present invention relate to touch screen devices. More specifically, the present invention relates to touch screen devices capable of sensing the force of a touch and methods of use thereof.
The use of touch screen devices and touch user interfaces is now quite commonplace for consumers: from the signature machine in the checkout aisle, to automatic teller machines at banks, to ticketing kiosks at airports, and the like. Touch screen capability is also now quite common in hand-held devices: from the Palm Pilot, to the Google Nexus One, to the Apple iPad, and the like.
Touch capability has typically been enabled for many touch screen devices through the incorporation and use of a resistive sensor network. These sensor networks can sense when a single finger of the user touches the display, or when the user uses a stylus to touch the display.
Drawbacks to touch screen devices incorporating resistive-based sensors, determined by the inventor, include that if a user inadvertently touches two locations on the touch screen at the same time, the location reported by the touch screen is often incorrect. Because such devices typically only support detecting one finger at a time, for example, if two fingers touch the screen, the reported touch location may be between the two fingers. Another drawback is that the user has to press down with some force on the touch screen before the touch screen can detect the user touch.
Newer capacitive-based touch screen displays are now more commonly used and address some of the shortcomings of a resistive-based sensor network. As an example, capacitive-based touch screens can sense the individual locations of fingers when the user touches the display with more than one finger. Accordingly, these devices are often termed “multi-touch” displays. As another example, capacitive-based touch screens do not require the user to press down upon the touch screen before the finger is sensed.
Drawbacks to the use of capacitive-based touch screens, determined by the inventor, include that even if a user inadvertently brushes her finger across the touch screen, that accidental swipe may still be sensed as a user input. This is particularly frustrating, for example, when a user is trying to touch-type using a virtual keyboard to input text. In such cases, as the user hovers her fingers over the home row of the virtual keyboard, often her little finger, middle finger, or the like may accidentally touch the surface of the display. These touches are then incorrectly sensed as presses of the virtual keys causing typographical errors.
Although many touch screen devices include automatic spelling/prediction software to attempt to reduce the effect of typographic errors, in many instances the predicted word is not the word the user wants. Accordingly, the user must constantly watch the touch screen display to monitor the automatic predictions and to select the correct word. These types of interruptions greatly interfere with the text-entry efficiency provided by the user's ability to touch-type.
Additional drawbacks of resistive- and capacitive-based touch screens, determined by the inventor, include that the sensed touches are typically binary in nature, i.e. either the finger is not touching or the finger is touching. These types of devices cannot sense the force with which a user touches the touch screen display. From a user's point of view, these touch screen devices also do not provide the user with any sensation of pressing a button or key, i.e. they provide no tactile feedback.
One type of touch screen display that provides the user with tactile feedback was used by Research In Motion (RIM) in the BlackBerry Storm series of devices. In these products, one or more micro sensors were placed under the capacitive-based touch screen display. In operation, when the user wanted to make an on-screen selection, the user would press the touch screen display. The touch screen display would then deflect (by about a millimeter) and cause one of the micro sensors to physically click or switch. The physical click would thus provide the user with tactile confirmation of the button press.
Drawbacks to such approaches, determined by the inventor, include that such devices were limited by the physical performance of the micro sensors. For example, a user could not type very quickly with such an approach because the user had to pause between key presses to wait until the micro sensors fully reset before she could press the next key. Further, if the user placed two or more fingers on the touch screen at the same time she depressed the touch screen (activating the micro sensor(s)), it would be unclear which touch screen location or finger the user intended.
The inventor of the present application has also noticed that with advances in graphical user interfaces, capacitive touch displays, high resolution displays, high contrast displays, and the like, much emphasis has been put upon the user interacting with the display. In contrast, previously, a number of physical buttons were provided upon devices such as a portable telephone, a PDA, or the like. When the user pressed the physical buttons, one of a number of actions occurred, such as launching an application, making a telephone call, taking a picture, or the like.
Drawbacks to having physical buttons on such devices included that they increased manufacturing and assembly costs, increased the number of components and the complexity of the devices, increased the number of potentially faulty components (e.g. broken switches, dust), and the like. Other drawbacks included that physical buttons are now often deemed undesirable because they detract from the aesthetics of such devices. In light of the above, recent popular devices have a reduced number of physical buttons.
Drawbacks to concentrating upon virtual (soft) buttons on a display, determined by the inventor, include that such devices greatly increase the requirement that the user view the display and interact with it to perform basic functions. As merely an example, with popular devices, it is now virtually impossible for a user to enter a telephone number without looking at the display. As another example, it is now common for a user to press a virtual button displayed on a display to initiate taking a photograph, whereas previously a physical button was provided.
From the above, it is desired to have a device with user input capability without the drawbacks described above.
Embodiments of the present invention relate to touch screen devices. More specifically, the present invention relates to touch screen devices capable of sensing the force of a touch and methods of use thereof.
Various embodiments of the present invention disclose a computer system such as a cell phone, internet access device, media player, or the like having a touch screen display and one or more physical sensors. In operation, when a user touches a location on the touch screen display, the function associated with the touched location is determined. The function may be the running of an application program, the selection of a function within an application program, or the like. In various embodiments, a type and/or magnitude of movement is determined by the one or more physical sensors also in response to the user touching the touch screen display. Based upon the type and/or magnitude of movement, or combinations of movements, an input parameter or value may be determined for use by the selected function. Next, the function is initiated and given the input parameter or value.
Other embodiments of the present invention disclose a computer system such as a tablet computer, a smart phone, cell phone, or the like also having a display (e.g. touch screen) and one or more physical sensors. In operation, when a user touches a location on the touch screen display, the function associated with the touched location is again determined. The function may be the running of an application program, the selection of a function within an application program, or the like. In various embodiments, a type and/or magnitude of movement is determined by the one or more physical sensors also in response to the user touching the touch screen display. The type and/or magnitude of movement is then compared to one or more thresholds for type and/or magnitude of movement. In various embodiments, if the threshold is not exceeded, the function is inhibited, and when the threshold is exceeded (e.g. enough physical impulse), the function is performed.
Other embodiments of the present invention disclose a computer system such as a tablet computer, a smart phone, cell phone, or the like also having a touch screen display and one or more physical sensors. In operation, when a user physically perturbs the computer system, the perturbation will cause the computer system to perform a user-desired action. The perturbation may be a change in physical position or angular orientation of the computer system, a change in air pressure, a change in sensed magnetic field, or the like. Merely as examples, a user tapping upon a case of the computer system (device), or upon a surface upon which the computer system is lying, may cause the computer system to take a picture; start or stop a timer; answer or disconnect a telephone call; invoke an application (e.g. knock-knocking on a computer system to invoke a VoIP application, a chat application, an IM, or the like); or the like. As other examples, a user positioning a credit card magnetic stripe near the computer system may invoke a payment application and/or may cause the computer system to sense the data encoded on the magnetic stripe; a sudden change in magnetic field may cause the computer system to shut down; a constant or sudden change in air pressure may invoke a pressure monitoring program (e.g. a scuba diving log application, a weather application) or may cause the computer system to disconnect wireless transceivers and enter an “airplane mode;” or the like.
According to one aspect of the invention, a computer implemented method for initiating capture of an image on a computer system, performed by the computer system that is programmed to perform the method is described. One technique includes determining by a physical sensor of the computer system, a change in physical state of the computer system, wherein the change in physical state is associated with a magnitude of change in physical state, and determining by the computer system, whether the magnitude of change in physical state by the physical sensor exceeds a threshold level. A process includes determining by the computer system, a plurality of parameters for a camera associated with the computer system, and initiating by the computer system, capture of one or more images using the camera when the magnitude of change in physical state by the physical sensor exceeds the threshold level.
According to another aspect of the invention, a computing device is disclosed. One device includes an image acquisition device configured to capture one or more images, and a memory configured to store the one or more images. A system includes a display configured to display the one or more images, and a physical sensor, wherein the physical sensor is configured to sense physical forces acting upon the physical sensor and configured to determine magnitudes of change in physical forces acting upon the physical sensor. An apparatus includes a processor coupled to the image acquisition device, to the display and to the physical sensor, wherein the processor is programmed to determine when a magnitude of change in physical forces exceeds a threshold level and wherein the processor is programmed to determine a plurality of parameters for an image acquisition device. In various embodiments, the processor is programmed to direct the image acquisition device to capture the one or more images when the magnitude of change in physical forces exceeds the threshold level, and the processor is programmed to direct the memory to store the one or more images.
Various additional objects, features and advantages of the present invention can be more fully appreciated with reference to the detailed description and accompanying drawings that follow.
In order to more fully understand the present invention, reference is made to the accompanying drawings. Understanding that these drawings are not to be considered limitations in the scope of the invention, the presently described embodiments and the presently understood best mode of the invention are described with additional detail through use of the accompanying drawings in which:
In various embodiments, computing device 100 may be a hand-held computing device (e.g. Apple iPad, Apple iTouch, Dell Mini slate, Lenovo Skylight/IdeaPad, Asus EEE series, Microsoft Courier, Notion Ink Genesis, Samsung Galaxy Tab), a portable telephone (e.g. Apple iPhone, Motorola Droid, Droid X, Google Nexus One, HTC Incredible/EVO 4G, Palm Pre series, Nokia N900), a portable computer (e.g. netbook, laptop), a media player (e.g. Microsoft Zune, Apple iPod), a reading device (e.g. Amazon Kindle, Barnes and Noble Nook), or the like.
Typically, computing device 100 may include one or more processors 110. Such processors 110 may also be termed application processors, and may include a processor core, a video/graphics core, and other cores. Processors 110 may be a processor from Apple (A4), Intel (Atom), NVidia (Tegra 2), Marvell (Armada), Qualcomm (Snapdragon), Samsung, TI (OMAP), or the like. In various embodiments, the processor core may be an Intel processor, an ARM Holdings processor such as the Cortex-A, -M, -R or ARM series processors, or the like. Further, in various embodiments, the video/graphics core may be an Imagination Technologies processor PowerVR-SGX, -MBX, -VGX graphics, an Nvidia graphics processor (e.g. GeForce), or the like. Other processing capability may include audio processors, interface controllers, and the like. It is contemplated that other existing and/or later-developed processors may be used in various embodiments of the present invention.
In various embodiments, memory 120 may include different types of memory (including memory controllers), such as flash memory (e.g. NOR, NAND), pseudo SRAM, DDR SDRAM, or the like. Memory 120 may be fixed within computing device 100 or removable (e.g. SD, SDHC, MMC, MINI SD, MICRO SD, CF, SIM). The above are examples of computer readable tangible media that may be used to store embodiments of the present invention, such as computer-executable software code (e.g. firmware, application programs), application data, operating system data or the like. It is contemplated that other existing and/or later-developed memory and memory technology may be used in various embodiments of the present invention.
In various embodiments, touch screen display 130 and driver 140 may be based upon a variety of later-developed or current touch screen technology including resistive displays, capacitive displays, optical sensor displays, electromagnetic resonance, or the like. Additionally, touch screen display 130 may include single touch or multiple-touch sensing capability. Any later-developed or conventional output display technology may be used for the output display, such as TFT-LCD, OLED, Plasma, trans-reflective (Pixel Qi), electronic ink (e.g. electrophoretic, electrowetting, interferometric modulating). In various embodiments, the resolution of such displays and the resolution of such touch sensors may be set based upon engineering or non-engineering factors (e.g. sales, marketing). In some embodiments of the present invention, a display output port, such as an HDMI-based port or DVI-based port may also be included.
In some embodiments of the present invention, image capture device 150 may include a sensor, driver, lens and the like. The sensor may be based upon any later-developed or conventional sensor technology, such as CMOS, CCD, or the like. In various embodiments of the present invention, image recognition software programs are provided to process the image data. For example, such software may provide functionality such as: facial recognition, head tracking, camera parameter control, or the like.
In various embodiments, audio input/output 160 may include conventional microphone(s)/speakers. In some embodiments of the present invention, three-wire or four-wire audio connector ports are included to enable the user to use an external audio device such as external speakers, headphones or combination headphone/microphones. In various embodiments, voice processing and/or recognition software may be provided to applications processor 110 to enable the user to operate computing device 100 by stating voice commands. Additionally, a speech engine may be provided in various embodiments to enable computing device 100 to provide audio status messages, audio response messages, or the like.
In various embodiments, wired interface 170 may be used to provide data transfers between computing device 100 and an external source, such as a computer, a remote server, a storage network, another computing device 100, or the like. Such data may include application data, operating system data, firmware, or the like. Embodiments may include any later-developed or conventional physical interface/protocol, such as: USB 2.0, 3.0, micro USB, mini USB, Firewire, Apple iPod connector, Ethernet, POTS, or the like. Additionally, software that enables communications over such networks is typically provided.
In various embodiments, a wireless interface 180 may also be provided to provide wireless data transfers between computing device 100 and external sources, such as computers, storage networks, headphones, microphones, cameras, or the like. As illustrated in
GPS receiving capability may also be included in various embodiments of the present invention; however, it is not required. As illustrated in
Additional wireless communications may be provided via RF interfaces 190 and drivers 200 in various embodiments. In various embodiments, RF interfaces 190 may support any future-developed or conventional radio frequency communications protocol, such as CDMA-based protocols (e.g. WCDMA), GSM-based protocols, HSUPA-based protocols, or the like. In the embodiments illustrated, driver 200 is illustrated as being distinct from applications processor 110. However, in some embodiments, this functionality is provided upon a single IC package, for example the Marvell PXA330 processor, and the like. It is contemplated that some embodiments of computing device 100 need not include the RF functionality provided by RF interface 190 and driver 200.
In various embodiments, any number of future-developed or current operating systems may be supported, such as iPhone OS (e.g. iOS), WindowsMobile (e.g. 7), Google Android (e.g. 2.2), Symbian, or the like. In various embodiments of the present invention, the operating system may be a multi-threaded multi-tasking operating system. Accordingly, inputs and/or outputs from and to touch screen display 130 and driver 140 and inputs and/or outputs to physical sensors 210 may be processed in parallel processing threads. In other embodiments, such events or outputs may be processed serially, or the like. Inputs and outputs from other functional blocks may also be processed in parallel or serially in other embodiments of the present invention, such as image acquisition device 150 and physical sensors 210.
In various embodiments of the present invention, physical sensors 210 are provided as part of a computing device 100, step 300. For example, physical sensors 210 developed by the assignee of the present patent application are provided to an assembly entity to form computing device 100. Computing device 100 is then assembled, step 310, and provided for the user, step 320. As described above, in various embodiments, computing device 100 may be a cell-phone, an internet access device, a tablet computer, a personal media player/viewer, or the like running an appropriate operating system.
In ordinary use of such a device, computing device 100 (via the operating system) may display any number of graphical user interfaces including user-selectable regions on touch screen display 130, step 330. These user-selectable regions may include radio buttons, sliders, selection buttons, text entry regions and the like. Further, these soft buttons may be associated with application software functions, operating system functions, data management functions, telephony functions, audio processing functions, image processing functions, or the like.
Subsequently, the user determines a function she wishes computing device 100 to perform after viewing the graphical user interface, step 340. In various embodiments, the user then touches or contacts a portion of touch screen display 130 corresponding to the user-selectable region, step 350.
Next, in various embodiments of the present invention, the following processes can be performed in parallel by different processing threads, serially by one or more processes, or independently in separate processing threads.
In
In various embodiments of the present invention, it is contemplated that when a user contacts her finger on touch screen display 130 in step 350, computing device 100 (physical sensors 210) will be physically perturbed, step 390. For example, when the user touches touch screen display 130, computing device 100 (physical sensors 210) will be subject to a force (e.g. a change in sensed physical state, a physical perturbation). In various embodiments, this physical change causes physical sensors 210 to sense a change in spatial location (sensed by an accelerometer), causes physical sensors 210 to sense a change in its tilt or orientation (sensed by a gyroscope), or the like. For sake of convenience,
Next, in various embodiments in response to the perturbations of the computing device 100/physical sensors 210, magnitudes and/or directions of the changes are determined in step 400. As described in the above-referenced patent applications, the CMOS foundry-compatible MEMS physical sensor embodiments of the present invention provide a higher level of sensitivity and lower level of noise for such measurements than is currently available.
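Merely as an illustration of how the determination in step 400 might be carried out in software, a minimal sketch is given below. The sample format and the baseline-subtraction approach are assumptions made only for this example and are not drawn from any particular sensor interface of the referenced applications.

```python
import math

def perturbation_magnitude_and_direction(samples):
    """Estimate the magnitude and dominant direction of a perturbation
    from a short window of accelerometer samples.

    `samples` is a list of (ax, ay, az) tuples in units of g; this
    format is a hypothetical placeholder for illustration only.
    """
    # Treat the first sample as the resting baseline (gravity included).
    bx, by, bz = samples[0]

    best_mag = 0.0
    best_dir = (0.0, 0.0, 0.0)
    for ax, ay, az in samples[1:]:
        # Change relative to the baseline, i.e. the perturbation itself.
        dx, dy, dz = ax - bx, ay - by, az - bz
        mag = math.sqrt(dx * dx + dy * dy + dz * dz)
        if mag > best_mag:
            best_mag = mag
            best_dir = (dx / mag, dy / mag, dz / mag)
    return best_mag, best_dir

# Example: a tap mostly along the -z axis (into the touch screen).
window = [(0.0, 0.0, 1.0), (0.01, 0.0, 0.93), (0.02, 0.01, 0.88), (0.0, 0.0, 1.0)]
print(perturbation_magnitude_and_direction(window))
```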
In various embodiments of the present invention, the process may then proceed to
In the example illustrated in
As various examples, the threshold may be an acceleration in a −z-direction (away from a touch screen display) of 0.1 g, an acceleration in a −z-direction of 0.05 g followed by an acceleration in the +z-direction of 0.03 g; an acceleration of 0.1 g in the −z-direction and accelerations of 0.03 g in the x and y directions; a tilt of 0.5 degrees in a first axis rotation at the same time as a tilt of 1 degree in a second axis of rotation; a tilt of 0.2 degrees in a first axis followed by a tilt of −0.3 degrees in the first axis; an increase in magnetic field by 10 gauss; an increase in atmospheric pressure of 10 mm Hg for 0.25 seconds; and the like. In light of the present patent disclosure, one of ordinary skill in the art will recognize many different thresholds based upon permutations of acceleration, tilts, magnetic fields, pressure, GPS coordinates, time, and the like, that are within the scope of embodiments of the present invention.
In various embodiments, if the threshold is exceeded, the function determined in step 380 is performed, step 420; if not, the process returns to step 330. Embodiments may be applied to any number of different functions, for example, a virtual telephone keypad. In typical situations, a user may inadvertently make a telephone call when the cell phone is in her pocket and she reaches for her keys: as her fingers brush against the virtual keypad, the telephone may interpret these touches as user selections for a telephone number to call. In various embodiments, inadvertent calls may be avoided by requiring that the physical sensors detect an acceleration (e.g. 0.1 g) primarily in the −z direction at about the same time the user touches the virtual keyboard keys. When the phone is in her pocket and her fingers brush or knock against the keypad, the physical sensors may detect, for example, an acceleration of 0.05 g in the −z direction, 0.02 g in the x direction and 0.05 g in the y direction, and the user touch may then be ignored. Accordingly, the execution of unintended user functions on a computing device may be reduced.
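A minimal sketch of this threshold gating is shown below, assuming a hypothetical helper that receives the change in acceleration sensed near the time of the touch; the 0.1 g and 0.03 g values mirror the examples above but are otherwise arbitrary.

```python
def accept_touch(accel_change, z_threshold_g=0.1, lateral_limit_g=0.03):
    """Decide whether a touch on the virtual keypad should be honored.

    `accel_change` is the (dx, dy, dz) change in acceleration, in g,
    sensed at about the time of the touch; the threshold values are
    illustrative only.
    """
    dx, dy, dz = accel_change
    # Require a sufficiently strong push primarily in the -z direction
    # (into the display) with only small lateral components.
    return dz <= -z_threshold_g and abs(dx) <= lateral_limit_g and abs(dy) <= lateral_limit_g

# A deliberate key press versus an accidental brush in a pocket.
print(accept_touch((0.00, 0.01, -0.12)))   # True: perform the keypad function
print(accept_touch((0.02, 0.05, -0.05)))   # False: ignore the touch
```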
In additional embodiments of the present invention, the process of
Similar to the embodiment illustrated in
In response to the value for the input parameter determined in step 440, the function may be performed using this value. Embodiments may be applied to any number of different functions, for example, a painting program. In such cases, a harder tap may be associated with a larger paint spot upon a canvas, a softer tap may be associated with a smaller spot upon a canvas, and the like. In other embodiments, other types of parameters may also be adjusted based upon sensed physical change, such as: position of graphic elements, brightness, contrast, gamma, sharpness, saturation, filter, and the like. As another example, a flick of a finger at a first velocity with a low impact may be associated with moving a series of images at a slower rate, and a flick of a finger at the first velocity with a higher impact may be associated with moving a series of images at a faster rate. In other embodiments, other types of parameters may also be adjusted, such as: rate of acceleration, rate of rotation, rate of zoom, rate of pan, and the like. As another example, the type or magnitude of sensed physical change may control a volume level, a microphone sensitivity level, a bass level, a treble level, or the like. Accordingly, the execution of user functions may have different input parameters or values based upon sensed physical changes.
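As a sketch of how a sensed magnitude might be mapped to an input parameter, the following illustrative function converts a tap magnitude into a brush radius for the painting example; the constants and the linear mapping are assumptions for illustration only.

```python
def paint_radius_from_tap(tap_magnitude_g, min_radius=2.0, max_radius=40.0,
                          min_tap_g=0.05, max_tap_g=0.5):
    """Map the sensed magnitude of a tap to a brush radius in pixels.

    The constants and the linear mapping are illustrative; any monotonic
    function of the sensed physical change could be used instead.
    """
    # Clamp the tap magnitude into the expected range, then interpolate
    # linearly between the smallest and largest radius.
    clamped = max(min_tap_g, min(max_tap_g, tap_magnitude_g))
    fraction = (clamped - min_tap_g) / (max_tap_g - min_tap_g)
    return min_radius + fraction * (max_radius - min_radius)

print(paint_radius_from_tap(0.07))  # soft tap -> small paint spot
print(paint_radius_from_tap(0.40))  # hard tap -> large paint spot
```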
In various embodiments of the present invention, physical sensors 210 are provided as part of a computing device 100, step 500. For example, physical sensors 210 developed by the assignee of the present patent application are provided to an assembly entity to form computing device 100. Computing device 100 is then assembled, step 510, and provided for the user. As described above, in various embodiments, computing device 100 may be a cell-phone, internet access device, a tablet computer, a personal media player/viewer, or the like running an appropriate operating system along with software applications. These steps may be performed at device manufacturing time whereas the following steps may be performed by a user of the device, or the like.
Next, a user may run or execute a software application upon computing device 100, step 520. In various embodiments, the software application may be an operating system, a program, or the like. In such software, a user input or triggering event is required to invoke a function on computing device 100. As merely an example, a function may be taking a picture, answering or terminating a phone call; initiating a VoIP application, chat program, IM, or the like; initiating a data logging program; or the like. In various embodiments, the user may be prompted to perturb computing device 100 to invoke the function. For example, an output audio message may prompt the user, such as, “tap the phone anywhere to take a picture;” a display image may prompt the user, such as a sequential display of lights in a drag strip “Christmas tree” sequence; and the like.
In various embodiments, computing device 100 is perturbed, step 530. In some examples, the user may directly perturb computing device 100, for example, the user may physically displace, accelerate, rotate and/or move computing device 100 itself (e.g. tapping on the interface device); the user may perturb computing device 100 indirectly (e.g. tapping on a table upon which the interface device is resting); or the like. In other examples, the user may indirectly cause the perturbation, for example, a computing device 100 and a magnetic source are moved towards or away from each other, the air pressure may decrease as the user flies in an airplane or as the weather changes, or the like.
In various embodiments, a type and magnitude of the perturbation are determined by the respective sensors, typically in parallel. For example, an acceleration in the x, y or z axis may be determined by x, y, and z axis accelerometers; a tilt, pan, or roll may be determined by x, y and z rotation sensors; a change in pressure may be determined by a pressure sensor; a change in magnetic field may be determined in the x, y and z axes by separate magnetic sensors; and the like. As discussed above, various embodiments of the present invention may be embodied as a three-axis, six-axis, nine-axis, ten-axis or the like MEMS device currently being developed by the assignee of the present patent application.
In response to the perturbations, computing device 100 determines whether the perturbation is of the type expected/required by the software application, step 540. For example, if computing device 100 is expecting an acceleration in the z-axis, a change in magnetic field may not be deemed to be the proper type of perturbation; if computing device 100 is expecting a change in GPS coordinates, a rotation may not be deemed to be the proper type of perturbation; or the like. In various embodiments, if the perturbation is of the desired type, the process continues in step 550; otherwise, the perturbation may be ignored.
In some embodiments of the present invention, the magnitudes of the perturbations may be compared to one or more thresholds, step 550. This step is similar to that described in step 410, above. More specifically, in various embodiments, it may be desirable that the magnitudes of the perturbations be sufficient to reduce the chance of accidental or unintended user input. For example, in one application, a user can knock upon a table to answer a call on a cell phone resting upon the table. In such an application, it may be desirable that a firm knock be sensed before the phone is answered; otherwise, the mere shuffling of papers may cause the call to be answered. As other examples, in some embodiments, a change in sensed magnetic field may be small enough to be considered merely noise, and thus such changes may be ignored; a change in sensed pressure differential may be too small to be considered a valid pressure differential; or the like.
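A compact sketch of how the type check of step 540, the threshold comparison of step 550, and the resulting function of step 560 might fit together is given below; the perturbation kind strings, the 0.3 g threshold, and the call-answering action are hypothetical placeholders.

```python
def handle_perturbation(kind, magnitude, expected_kind="acceleration",
                        threshold=0.3, action=lambda: print("answering call")):
    """Check the perturbation type, compare its magnitude to a threshold,
    and only then perform the function (here, answering a call).
    The 0.3 g threshold and the kind strings are assumed values."""
    if kind != expected_kind:
        return False          # wrong kind of perturbation: ignore it
    if magnitude < threshold:
        return False          # too weak, e.g. papers shuffling on the table
    action()                  # firm knock sensed: perform the function
    return True

handle_perturbation("acceleration", 0.45)   # answers the call
handle_perturbation("acceleration", 0.05)   # ignored
handle_perturbation("magnetic", 5.0)        # ignored
```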
As another example, in one application, a user taps on the surface of a hand-held device (e.g. edge, back plate, etc.) to have the hand-held device take a picture. In such an application, without such a threshold, as the user is fumbling with the hand-held device and moving it into a proper photographic position, the hand-held device may sense such changes in position, and the like, as the user command to take a picture. Accordingly, without a properly set threshold, pictures may be taken at the wrong times.
In various embodiments, if the magnitude of the perturbation exceeds the threshold, the desired function may be performed, step 560. In light of the present patent disclosure, one of ordinary skill in the art will recognize many different types of applications may be performed within embodiments of the present invention.
Merely by example, one application may be recording acceleration data in three-dimensions with respect to time. In such an example, the user may invoke the software application on the computing device; however, the actual recording of the data is initiated in step 560, only after a sufficient change in acceleration is sensed. Such an application may be useful for data logging purposes for a vehicle (e.g. a black box), may be useful for data logging for sports activities (e.g. monitoring movement of a golf club, fishing rod, racquet), may be useful for data logging of freight (e.g. monitoring how roughly freight is handled), or the like. In various examples, other types of perturbations other than the triggering perturbation may also be logged, in the embodiments above. For example, for data logging of sports activities, the rotation in three axes of a golf club may also be recorded in addition to the linear acceleration of the golf club, in three-dimensions.
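A minimal sketch of such a threshold-triggered data logger is shown below; the `read_accel` callable, the trigger level, and the sampling rate are hypothetical and chosen only to illustrate waiting for a sufficient acceleration before recording.

```python
import time

def log_acceleration(read_accel, trigger_g=2.0, duration_s=5.0, rate_hz=100):
    """Record three-axis acceleration versus time, beginning only once a
    sufficiently large acceleration is sensed.

    `read_accel` is a hypothetical callable returning (ax, ay, az) in g;
    the trigger level, duration and sampling rate are illustrative.
    """
    # Wait for the triggering perturbation.
    while True:
        ax, ay, az = read_accel()
        if (ax * ax + ay * ay + az * az) ** 0.5 >= trigger_g:
            break
        time.sleep(1.0 / rate_hz)

    # Log time-stamped samples for the requested duration.
    log, start = [], time.time()
    while time.time() - start < duration_s:
        log.append((time.time() - start, read_accel()))
        time.sleep(1.0 / rate_hz)
    return log

# Simulated usage: a stub sensor that reports a spike after a few reads.
readings = iter([(0.0, 0.0, 1.0)] * 3 + [(2.5, 0.0, 1.0)] * 1000)
trace = log_acceleration(lambda: next(readings), duration_s=0.05)
print(len(trace), "samples recorded")
```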
As another example, one application may be reading magnetic data stored on a magnetic storage medium, e.g. a magnetic stripe (e.g. a bank card, credit card), magnetic ink (e.g. currency, commercial or consumer paper, negotiable instruments), or the like. Representative examples include a hand-held device, such as a phone, an applications device (e.g. Apple iPad), or the like, including one or more magnetic field sensors, as disclosed in the above-mentioned patent application. In some embodiments of the present invention, the sensitivity of such magnetic sensors may range from approximately 0.8 mV/V/Oe to 1 mV/V/Oe to approximately 1.2 mV/V/Oe, or the like, and the field range of such magnetic sensors may be adjusted by gain and may be within the range from approximately +/−1 Oe, to +/−2 Oe, to +/−4 Oe, to +/−8 Oe, to +/−12 Oe, or the like. In such devices, a software application running on the computing device may be designed to read magnetic field data external to the device, using the included magnetic sensors. In various examples, the application may monitor the data from the magnetic sensors when the magnetic stripe of a credit card, or the like, is moved (e.g. slowly) over the device. In other embodiments, the device may be moved relative to the credit card, or the like.
In various embodiments, the magnetic sensors can separately read any of the three or more tracks recorded on typical credit card magnetic stripes, drivers licenses, or the like, depending upon the orientation of the magnetic stripes relative to the magnetic sensors. Such embodiments may include magnetic shielding to help isolate track data. In various embodiments, the encoded data stored on any or all of the tracks can be individually sensed or read. In various embodiments, the magnetic sensors may be configured such that the magnetic data on a credit card, or the like may be sensed on the rear portion of a device, e.g. through the casing; or the magnetic sensors may be configured to sense magnetic data on the front portion of a device, e.g. over the display. For the former example, a line, a black mark, a circle or the like may be a physical feature or a graphic feature provided on the rear portion of the device to help the user align the magnetic tracks to the magnetic sensors. Examples of the magnetic sensor being on the front portion of a device and the magnetic sensor being on the back portion of a device are illustrated below.
In
As illustrated in
In various embodiments, when magnetic sensor 620 senses magnetic data from magnetic storage media 650 positioned above magnetic sensor 620, the application program progresses to the next state.
As illustrated in
By doing this, magnetic storage media 650 is passed across magnetic sensor 620, and data stored on the appropriate track is read by magnetic sensor 620. In various embodiments, the read data is used by the application program. For example, in various embodiments, if magnetic storage media 650 is a credit card, the credit card number, name, and the like can be read; if magnetic storage media 650 is a license, a name, physical characteristics, address, and the like can be read; and the like. The application may then provide the data to a remote server for storage or for further processing, for example, providing the credit card number and expiration date to a web-based retailer, or the like; providing a driver's license number to law enforcement agencies; or the like.
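Merely to illustrate the flow of such a reading application, a highly simplified sketch follows; the `read_field` callable and the presence threshold are hypothetical, and the actual decoding of the captured waveform into track characters is omitted.

```python
def capture_swipe(read_field, presence_threshold_oe=0.5, idle_limit=200):
    """Collect raw magnetic-field samples while a card is swiped past
    the sensor.

    `read_field` is a hypothetical callable returning the field along
    the track direction in oersteds. Decoding the captured waveform into
    track characters is a separate step not shown here.
    """
    # Wait until the stripe is detected above the sensor.
    while abs(read_field()) < presence_threshold_oe:
        pass
    # Record until the field stays near background for a while.
    waveform, idle = [], 0
    while idle < idle_limit:
        sample = read_field()
        waveform.append(sample)
        idle = idle + 1 if abs(sample) < presence_threshold_oe else 0
    return waveform

# Illustrative usage with a scripted field sequence: background, swipe, background.
fields = iter([0.0] * 5 + [2.0, -1.5, 1.0, -2.0] + [0.0] * 300)
print(len(capture_swipe(lambda: next(fields))), "samples captured")
```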
In other embodiments of the present invention, a user may be instructed to place their finger, thumbnail, another credit card or the like along the display to aid in alignment of the magnetic sensors relative to the magnetic track, or the like. For example, the user may be instructed to place a finger of their other hand on location 690 to keep the top edge of magnetic storage media 650 properly aligned to magnetic sensor 620. As another example, the user may be instructed to place another credit card, or the like along line 695, and then use the edge of that credit card as a guide for moving their credit card across the face of the device.
In other embodiments of the present invention, the application may operate in a landscape orientation compared to the portrait orientation illustrated in
In
In operation, as illustrated in
In other embodiments of the present invention, a user may be instructed to place their fingers, thumbnail, another credit card or the like along the display to aid in alignment of the magnetic sensors relative to the magnetic track, or the like. For example, the user may be instructed to place fingers on locations 730 to keep the top edge of a magnetic storage media properly aligned to magnetic sensor 710. As another example, the user may be instructed to place another credit card, or the like, along line 740, and then use the edge of that credit card as a guide for moving their credit card across magnetic sensor 710. In various embodiments, an optical mark 720 may be provided to give the user an indication of the reading position of magnetic sensor 710.
In various embodiments, the sensed data may then be used as input to other applications; for example, the credit card number and other data may be provided for e-commerce applications. In other examples, the magnetic sensors can read magnetic data stored on identification cards, e.g. drivers' licenses, or the like, and provide such data to a security program or for security purposes. In other examples, the magnetic sensors can be used to monitor magnetic material attached to or implanted in a person's body for surveillance or monitoring purposes, or the like.
In still other embodiments, the magnetic sensors may be used to sense and track the localized presence of magnetizable materials (e.g. materials of higher magnetic permeability), such as iron, nickel, steel, another magnet or the like. This may be done by sensing localized perturbations in a global magnetic field (e.g. the Earth's magnetic field) due to the steel, magnet, or the like. In various embodiments, the tracking may be in two dimensions, e.g. along the display plane or surface of the device; or in three dimensions, e.g. along the display plane of the device, in addition to towards and away from the display plane of the device. As merely an example, the magnetic sensors may be used to track the position of the metal tip of a pen (having a steel ball, iron alloy ball, or the like). For example, a user may “sign” their name on a display of a device by moving the metal-tipped pen across the surface of the display. Although the device may not officially support an external stylus (e.g. iPad), the position of the metal or magnetic-tipped pen tip may be tracked by the on-board magnetic sensors and used. Further, with appropriate pre-calibration between the magnetic sensors and positions on the display of the device, the display may visually indicate or reflect the positions of the pen on the display using the sensed data. Accordingly, the display can track and display the path drawn by the user across the face of the display. In some examples, the user can then enter handwritten notes, sign their name (e.g. signature), or the like on the display by using an ordinary pen. In other embodiments, the user can interact with the user interface of their device with an ordinary metal-tipped pen to perform customary functions such as invoking applications, selecting data, surfing the web, dragging and dropping, or the like in a similar way as they may use their finger(s).
In still other embodiments, the magnetic sensors can track the location of a magnet or metal material (e.g. magnetizable material) in three dimensions relative to the device. In such embodiments, pre-calibration may be necessary to correlate the locations of the magnet or metal material in three-space. For example, the user may be initially instructed to position the magnet or metal material at certain x, y and z positions relative to the device, e.g. at lattice corners. In various embodiments, once the material is located at the specified locations, the magnetic field data sensed by the magnetic sensors is recorded to determine the calibration data. Subsequently, as the user moves the magnet or metal material in three-space, the magnetic sensor data is compared to the calibration data to determine the location of the magnet or metal material in x, y and z space. In one application, the magnet or metal material may first be positioned at a location on a sculpture; next, based upon the magnetic sensor data, the x, y and z coordinates of that location on the sculpture are determined; subsequently, this process is repeated for other locations on the sculpture. By doing so, in various embodiments, the three-dimensional shape or surface of the sculpture, or the like, may be effectively digitized.
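A minimal sketch of the calibration lookup is given below; the nearest-neighbor matching and the example calibration table are assumptions made for illustration, and a practical implementation would interpolate between calibration points.

```python
import math

def nearest_calibrated_position(reading, calibration):
    """Given a magnetic-sensor reading (bx, by, bz) and a calibration
    table mapping known (x, y, z) positions to readings recorded during
    the calibration pass, return the position whose recorded reading is
    closest. A practical system would interpolate between lattice points
    rather than snap to the nearest one.
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(calibration, key=lambda pos: dist(calibration[pos], reading))

# Illustrative calibration data recorded at four lattice corners.
calibration = {
    (0, 0, 0): (12.0, 0.5, -3.0),
    (0, 1, 0): (8.0, 4.0, -2.5),
    (1, 0, 0): (6.0, 0.8, -1.0),
    (1, 1, 1): (2.0, 2.0, 1.0),
}
print(nearest_calibrated_position((7.5, 3.8, -2.4), calibration))  # -> (0, 1, 0)
```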
As another example, one application may be logging of pressure for scuba diving or for flying. In such an example, a software application running on the computing device may be designed to read pressure data using the included air pressure sensors. In such an example, the application may monitor the air pressure sensor, and when the pressure changes at a sufficient rate (e.g. faster than the weather changing), the computing device may record the change in pressures with respect to time. In the case of a scuba diving application, the pressure may be correlated to diving depth versus time. Such data can then be used by the software application to notify the diver of decompression depths and durations, whether decompression is required, and the like. In the case of a flying application, the air pressure may be correlated to flying altitude. Such data can then be used to warn the user if there is a slow decompression leak in the cabin, to monitor the altitude versus time (e.g. black box), or the like.
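As an illustrative sketch of the diving-log example, the following assumes pressure samples in kilopascals and the rough rule that about 100 kPa above surface pressure corresponds to roughly 10 m of seawater; the trigger rate distinguishing a dive from weather drift is an assumed value.

```python
def depth_from_pressure(pressure_kpa, surface_kpa=101.3):
    """Convert absolute pressure to an approximate seawater depth in
    meters; roughly every 100 kPa above surface pressure corresponds to
    about 10 m of seawater. The constant is approximate."""
    return max(0.0, (pressure_kpa - surface_kpa) / 100.0 * 10.0)

def log_dive(samples, trigger_kpa_per_s=2.0):
    """Start logging depth versus time once the pressure changes faster
    than weather-driven drift. `samples` is a list of (t_seconds,
    pressure_kpa) pairs from the on-board pressure sensor."""
    log, started = [], False
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        rate = abs(p1 - p0) / (t1 - t0)
        if not started and rate >= trigger_kpa_per_s:
            started = True           # sufficiently fast change: begin the log
        if started:
            log.append((t1, depth_from_pressure(p1)))
    return log

print(log_dive([(0, 101.3), (1, 101.4), (2, 110.0), (3, 130.0), (4, 150.0)]))
```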
As another example, one application may be capturing one or more images with a camera in response to the user tapping upon the case (e.g. back, side, edge) of a device (e.g. phone). For example, while the user points the high resolution camera of their device at the target (e.g. themselves for a self portrait), the user taps the side of the camera. In various embodiments, such a method for initiating the capture of photographs or images is considered by the inventors to be a superior method for taking pictures compared to the user blindly pressing (mashing) software buttons with her fingers on a GUI on the display screen, which she cannot see. This sort of situation is commonly found on devices such as the iPhone 4, Droid X, and HTC Evo 4G, for example, when the user takes a high resolution (e.g. approximately 3 megapixels or greater) self portrait, when the user is taking a picture in bright sunlight, or the like. In various embodiments, in response to the command for initiating capture of photographic images (e.g. a sufficiently hard finger tap on the device case), the image may be taken after a short delay to enable the camera to become stable again. In various embodiments, the image may be captured once the device becomes stationary (as determined by accelerometers) after the finger tap; a short amount of time after the finger tap; or the like.
In various embodiments, the hand-held device may capture a series of images into temporary storage and determine one or more images to keep based upon optical parameters of the images (e.g. which image has the least amount of blur); based upon image parameters of the images (e.g. which image has the fastest shutter speed); or based upon physical parameters of the hand-held device (e.g. which image was taken at a time having the least amount of associated physical movement, based upon the accelerometers, gyroscopes, or the like). In other embodiments, the hand-held device may decide when to take a picture based upon acceleration of the device. For example, the movement of the hand-held device may be monitored, and when the movement/acceleration is very small, the camera of the hand-held device may be triggered.
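A minimal sketch of the tap-then-capture flow with frame selection is shown below; the `read_accel` and `take_photo` callables stand in for the sensor and camera drivers, and the stillness threshold, timeout and burst length are assumptions made for illustration.

```python
import time

def tap_triggered_capture(read_accel, take_photo, still_g=0.02,
                          settle_timeout_s=1.0, burst=3):
    """After a case tap has been detected, wait until the device is
    roughly stationary, capture a short burst, and keep the frame taken
    with the least associated movement.

    `read_accel` and `take_photo` are hypothetical callables standing in
    for the sensor and camera drivers."""
    # Wait for the device to settle after the tap, with a timeout.
    start = time.time()
    while time.time() - start < settle_timeout_s:
        ax, ay, az = read_accel()
        # Residual acceleration beyond 1 g of gravity indicates movement.
        if abs((ax * ax + ay * ay + az * az) ** 0.5 - 1.0) < still_g:
            break
        time.sleep(0.01)

    # Capture a burst and remember how much the device moved per frame.
    frames = []
    for _ in range(burst):
        motion = abs(sum(c * c for c in read_accel()) ** 0.5 - 1.0)
        frames.append((motion, take_photo()))
    return min(frames, key=lambda f: f[0])[1]

# Illustrative usage with stub drivers.
print(tap_triggered_capture(lambda: (0.0, 0.0, 1.0), lambda: "frame"))
```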
In various embodiments, the magnitude of the acceleration can be used to set camera parameters such as aperture, ISO rating, shutter speed, and the like. For example, if the magnitude of acceleration is considered large (for example, indicating an urgent, hurried photographic environment), the shutter speed may be increased, the ISO may be increased, the aperture may be decreased (increasing the depth of field), the number of photographs taken in a burst may be increased, or the like; if the magnitude of acceleration is considered small (for example, indicating a quiet, less hurried photographic environment), the volume of a shutter sound may be decreased, the ISO may be decreased, the aperture may be increased (decreasing the depth of field), or the like.
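The following sketch illustrates one possible mapping from sensed acceleration magnitude to camera settings; the specific shutter speeds, ISO values, f-numbers and burst counts are assumptions and not values specified by this disclosure.

```python
def camera_parameters(accel_magnitude_g):
    """Choose illustrative camera settings from the sensed acceleration.
    A large magnitude suggests a hurried shot: faster shutter, higher
    ISO, a smaller aperture (larger f-number, greater depth of field)
    and a longer burst. The numeric values are assumptions only."""
    if accel_magnitude_g > 0.5:          # hurried photographic environment
        return {"shutter_s": 1 / 1000, "iso": 800, "f_number": 8.0, "burst": 5}
    elif accel_magnitude_g > 0.1:        # ordinary hand-held shot
        return {"shutter_s": 1 / 250, "iso": 200, "f_number": 4.0, "burst": 2}
    else:                                # quiet, steady environment
        return {"shutter_s": 1 / 60, "iso": 100, "f_number": 2.8, "burst": 1}

print(camera_parameters(0.7))
```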
In other embodiments, the user tapping on the case, as described above, may be used to initiate other types of operations by the computer system. For example, a single tap on the back of the computer system may initiate a process for recording audio signals via a microphone, and a double tap on the back may pause or finish the recording of audio signals. As merely another example, a single tap may be used by a user to answer a telephone call, a double tap may be used to mute and unmute a telephone call, and a triple tap may be used by the user to hang up the telephone call. A tap near the top of the computer system device may increase the audio playback volume, and a tap near the bottom of the device may decrease the playback volume, or the like.
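A simple sketch of classifying tap counts into actions follows; the 0.6 second grouping window and the particular action mapping are illustrative assumptions.

```python
def classify_taps(tap_times, window_s=0.6):
    """Count how many taps occurred within a short grouping window and
    map the count to an action, as in the single/double/triple-tap
    examples above. `tap_times` are timestamps (seconds) of detected
    case taps."""
    if not tap_times:
        return None
    count = 1
    for prev, cur in zip(tap_times, tap_times[1:]):
        count = count + 1 if cur - prev <= window_s else 1
    actions = {1: "answer call", 2: "mute/unmute call", 3: "hang up call"}
    return actions.get(count, "ignore")

print(classify_taps([10.00]))                # answer call
print(classify_taps([10.00, 10.30]))         # mute/unmute call
print(classify_taps([10.00, 10.30, 10.55]))  # hang up call
```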
Further embodiments can be envisioned to one of ordinary skill in the art after reading this disclosure. In other embodiments, combinations or sub-combinations of the above disclosed invention can be advantageously made. The block diagrams of the architecture and flow charts are grouped for ease of understanding. However it should be understood that combinations of blocks, additions of new blocks, re-arrangement of blocks, and the like are contemplated in alternative embodiments of the present invention.
The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that various modifications and changes may be made thereunto without departing from the broader spirit and scope of the invention as set forth in the claims.