Camera touchpad

Information

  • Patent Number
    11,818,458
  • Date Filed
    Thursday, July 22, 2021
  • Date Issued
    Tuesday, November 14, 2023
Abstract
A system and method is disclosed for enabling user friendly interaction with a camera system. Specifically, the inventive system and method has several aspects to improve the interaction with a camera system, including voice recognition, gaze tracking, touch sensitive inputs and others. The voice recognition unit is operable for, among other things, receiving multiple different voice commands, recognizing the vocal commands, associating the different voice commands to one camera command and controlling at least some aspect of the digital camera operation in response to these voice commands. The gaze tracking unit is operable for, among other things, determining the location on the viewfinder image that the user is gazing upon. One aspect of the touch sensitive inputs provides that the touch sensitive pad is mouse-like and is operable for, among other things, receiving user touch inputs to control at least some aspect of the camera operation. Another aspect of the disclosed invention provides for gesture recognition to be used to interface with and control the camera system.
Description
BACKGROUND OF THE INVENTION

Digitally-based and film-based cameras abound and are extremely flexible and convenient. One use for a camera is in the taking of self portraits. Typically, the user frames the shot and places the camera in a mode whereby, when the shutter button is depressed, the camera waits a predetermined time so that the user may incorporate himself back into the shot before the camera actually takes the picture. This is cumbersome and leads to nontrivial problems. Sometimes the predetermined delay time is not long enough. Other times, it may be too long. For participants who are in place and ready to have their picture taken, especially children, waiting with a smile on their face for the picture to be snapped by the camera can seem endless even if it is just a few seconds long. Additionally, many who might like to be included in a shot find themselves unable to be because they have to take the picture and it is simply too much trouble to set up for a shutter-delayed photograph.


Voice recognition techniques are well known in the art and have been applied to cameras; see, for example, U.S. Pat. Nos. 4,951,079, 6,021,278 and 6,101,338, which are herein incorporated by reference. It is currently possible to have fairly large vocabularies of uttered words recognized by an electronic device. Speech recognition devices can be of a type whereby they are trained to recognize a specific person's vocalizations, so-called speaker-dependent recognition, or can be of a type which recognizes spoken words without regard to who speaks them, so-called speaker-independent recognition. Prior art voice operated cameras have several defects remedied or improved upon by various aspects of the present invention more fully disclosed below. One such problem is that in self portrait mode, the camera may snap the picture while the user is uttering the command. Another defect is that the microphone coupled to the voice recognition unit is usually mounted on the back of the camera. This placement is non-optimal when the user is in front of the camera, as when taking a self portrait. Still another problem with prior art voice activated cameras is that they associate one vocalization or utterance to one camera operation. Thus, the user must remember which command word is to be spoken for which camera operation. This is overly constraining, unnatural, and significantly reduces the utility of adding voice recognition to the camera.


One prior art implementation of voice recognition allows for menu driven prompts to help guide the user through the task of remembering which command to speak for which camera function. This method, however, requires that the user be looking at the camera's dedicated LCD display for the menu. One aspect of the present invention provides for the menus to be displayed in the electronic viewfinder of the camera and be manipulated with both voice and gaze. Another aspect of the present invention incorporates touchpad technology, which is typically used in laptop computers and is well known in the art, as the camera input device for at least some functions.


SUMMARY OF THE INVENTION

A self-contained camera system, according to various aspects of the present invention, includes voice recognition wherein multiple different vocalizations can be recognized and wherein some such recognized vocalizations can be associated with the same camera command. Another aspect of the invention provides for multiple microphones disposed on or in the camera system body and operable so that the user can be anywhere around the camera system and be heard by the camera system equally well. According to other aspects of the present invention, the camera system viewfinder includes gaze tracking ability and, in exemplary preferred embodiments, gaze tracking is used alone or in combination with other aspects of the invention to, for example, manipulate menus, improve picture taking speed, or improve the auto focus capability of the camera. Other aspects of the present invention, such as the addition of touchpad technology and gesture recognition, provide for an improved and more natural user interface to the camera system.


Thus, it is an object of the invention to provide an improved self-portrait mode for a camera system. It is further an object of the invention to provide an improved user interface for a camera system. It is yet a further object of the invention to make a camera system more user friendly with a more natural and intuitive user interface. It is still a further object of the invention to broaden the capabilities of the camera system. It is further an object of the invention to more easily allow a user to compose a shot to be taken by the camera system. It is still further an object of the invention to improve image quality of pictures taken by the camera system. It is yet another object of the invention to improve the speed of picture taking by the camera system.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is an exemplary perspective view of the rear (back) of the camera system according to various aspects of the present invention.



FIG. 1B is an exemplary perspective view of the front of the camera system according to various aspects of the present invention.



FIG. 2 is a functional representation of automatic microphone selection circuitry that may be used in various aspects of the present invention.



FIG. 3 shows an exemplary functional block diagram of an inventive camera system implementing various aspects of the present invention.



FIG. 4 shows an exemplary embodiment of a wink detector according to various aspects of the present invention.



FIG. 5A shows an exemplary touchpad overlay with cutouts according to various aspects of the present invention.



FIG. 5B shows an exemplary touchpad overlay with cutouts according to various aspects of the present invention.



FIG. 5C shows an exemplary touchpad overlay with cutouts according to various aspects of the present invention.





DESCRIPTION OF PREFERRED EXEMPLARY EMBODIMENTS

One aspect of the present invention solves several of the problems of the prior art voice recognition cameras in that this aspect provides for more than one microphone to be the source to the recognition unit. With reference to FIG. 1, this aspect of the present invention provides for at least two microphones to be used, one microphone, 10b, placed on the back of the camera and one microphone, 10a, placed on the front, either of which can receive voice commands. In a first preferred embodiment of this aspect of the invention, a detection device determines which microphone is to be used as the input to the recognition unit based upon the strength of the voice signal or sound level received by each of the microphones. In another preferred embodiment, the outputs of the microphones are combined as the input to the voice recognition unit. In still another embodiment, the user can select which microphone is used as the input to the voice recognition unit, for example, by a switch or by selection through a camera menu.


Automatic microphone selection is preferred and, with reference to FIG. 2, microphones 10a and 10b are each amplified by amplifiers 20 and 22 respectively. Diode 24, capacitor 28 and resistor 32 form a simple energy detector and filter for microphone 10a. The output of this detector/filter is applied to one side of a comparator, 36. Similarly, diode 26, capacitor 30, and resistor 34 form the other energy detector associated with microphone 10b. The output of this filter/detector combination is also applied to comparator 36. Thus, the output of this comparator selects which amplified microphone output is passed to the voice recognition unit through multiplexer 38 based on which amplified microphone output contains the greatest energy.
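
The selection logic of FIG. 2 can also be expressed in software terms. The following sketch is not taken from the disclosure; the function names and the short-window assumption are illustrative. It simply routes whichever microphone channel carries more energy to the voice recognition unit.

```python
import numpy as np

def select_microphone(front_samples: np.ndarray, back_samples: np.ndarray) -> str:
    """Pick which microphone to route to the voice recognition unit.

    Mirrors the comparator of FIG. 2: each channel's energy over a short
    window of audio samples is measured and the louder channel wins.
    """
    front_energy = float(np.sum(front_samples ** 2))
    back_energy = float(np.sum(back_samples ** 2))
    return "front" if front_energy > back_energy else "back"

# A loud front signal and a quiet back signal select the front microphone.
front = np.array([0.5, -0.6, 0.4])
back = np.array([0.05, -0.04, 0.03])
assert select_microphone(front, back) == "front"
```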


In yet another novel embodiment of this aspect of the invention, the multiple microphones are preferably associated with multiple voice recognition units or, alternatively, with different voice recognition algorithms well known in the art. The outputs of these multiple voice recognition units or different voice recognition algorithms are then coupled to the camera controller (FIG. 3 element 40). The camera controller preferably selects one of these outputs as being the camera controller's voice recognition input. Alternatively, the camera controller accepts the outputs of all the voice recognition units or algorithms and preferably uses a voting scheme to determine the most likely recognized command. This would obviously improve recognition rates, and this aspect of the invention is contemplated to have utility beyond camera systems including, by way of example and not limitation, consumer computer devices such as PCs and laptops; portable electronic devices such as cell phones, PDAs, IPODs, etc.; entertainment devices such as TVs, video recorders, etc.; and other areas.


To illustrate this embodiment using the example of the camera system having microphones on its frontside and backside given above, each of these microphones is coupled to a voice recognition unit. When an utterance is received, each voice recognition unit recognizes the utterance. The camera controller then selects which voice recognition unit's recognition to accept. This is preferably based on the energy received by each microphone, using circuitry similar to that of FIG. 2. Alternatively, the selection of which voice recognition unit to use would be a static selection. Additionally, both recognizers' recognitions would be considered by the camera controller, with conflicting results resolved by voting or by using ancillary information (such as microphone energy content).
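
One way the voting described above could be realized is sketched below; it is an assumption-laden illustration, not the patented implementation. Each recognizer reports a command together with a weight (for example, its recognition confidence or the energy seen by its microphone), and the command with the largest total weight wins.

```python
from collections import defaultdict

def vote_on_command(results):
    """Resolve conflicting recognizer outputs by weighted voting.

    `results` is a list of (command, weight) pairs, one per recognizer;
    the weight stands in for confidence or microphone energy content.
    """
    totals = defaultdict(float)
    for command, weight in results:
        totals[command] += weight
    return max(totals, key=totals.get)

# Two recognizers agree on "snap"; their combined weight outvotes the outlier.
print(vote_on_command([("snap", 0.7), ("snap", 0.6), ("stop", 0.9)]))  # snap
```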


An embodiment using multiple algorithms preferably has one voice recognition algorithm associated with the frontside microphone and a different voice recognition algorithm associated with the backside microphone. Preferably, the voice recognition algorithm associated with the frontside microphone is adapted to recognize vocalizations uttered from afar (owing to this microphone probably being used in self-portraits), while the voice recognition algorithm associated with the backside microphone is optimal for closely uttered vocalizations. Selection of which algorithm is to be used as the camera controller input is preferably as above. Alternatively, as above, the selection would be by static selection, or both would be applied to the camera controller and a voting scheme used to resolve discrepancies. While the above example contemplates using different voice recognition algorithms, there is no reason this must be so. The same algorithms could also be used, in which case this example functions the same as multiple voice recognition units.


It is further contemplated in another aspect of the invention that the voice recognition subsystem be used in conjunction with the photograph storing hardware and software. In a preferred use of this aspect of the invention, the user utters names to be assigned to the photographs during storage and, later, utters them again to recall the stored images. Thus, according to this aspect of the present invention, a stored photograph can be recalled for display simply by uttering the associated name of the photograph. The name association is preferably by direct association, that is, the name is stored with the picture. In a second preferred embodiment, the photograph storage media contains a secondary file, managed by the camera system, which associates the given (i.e., uttered) name with the default file name assigned by the camera system's storage hardware and/or software to the photograph when the photograph is stored on the storage media. According to the second embodiment, when a photograph is to be vocally recalled for viewing, the camera system first recognizes the utterance (in this case, the name) which will be used to identify the picture to be recalled. The camera system then scans the association file for the name which was uttered and recognized. Next, the camera system determines the default name which was given to the photograph during storage and associated with the user-given name (which was uttered and recognized) in the association file. The camera system then recalls and displays the photograph by this associated default name.
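
The second embodiment's association file can be pictured as a simple lookup table. The sketch below is only an illustration of that idea; the file name, file format, and helper functions are assumptions, not details from the disclosure.

```python
import json
from pathlib import Path

ASSOCIATION_FILE = Path("associations.json")  # hypothetical file kept on the storage media

def store_association(spoken_name: str, default_file_name: str) -> None:
    """Record that a user-given (uttered) name refers to a default file name."""
    table = json.loads(ASSOCIATION_FILE.read_text()) if ASSOCIATION_FILE.exists() else {}
    table[spoken_name.lower()] = default_file_name
    ASSOCIATION_FILE.write_text(json.dumps(table))

def recall_photo(spoken_name: str) -> str:
    """Return the default file name associated with a recognized utterance."""
    table = json.loads(ASSOCIATION_FILE.read_text())
    return table[spoken_name.lower()]

store_association("john", "DSC00042.JPG")   # default name assigned when the photo was stored
print(recall_photo("john"))                 # DSC00042.JPG
```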


In another preferred embodiment, the voice recognition subsystem of the improved camera system recognizes at least some vocalized letters of the alphabet and/or numbers so that the user may assign names to pictures simply by spelling the name by vocalizing letters and/or numbers. Another aspect of the invention provides that stored photographs be categorized on the storage media through use of voice-recognized utterances being used to reference and/or create category labels and that, additionally, the recognizer subsystem preferably recognize key words for manipulating the stored pictures. For instance, according to this aspect of the invention, the inventive camera system would recognize the word “move” to mean that a picture is to be moved to or from a specific category. More specifically, “move, Christmas” would indicate that the currently referenced photograph is to be moved to the Christmas folder. An alternative example is “John move new year's”, indicating that the picture named “John” (either directly named or by association, depending on embodiment) be moved to the folder named “New Year's”. It is further contemplated that the folder names may be used for picture delineation as well. For instance, the picture “John” in the Christmas folder is not the same as the picture “John” in the Birthday folder, and the former may be referenced by “Christmas, John” while the latter is referenced by “Birthday, John”.


Another aspect of the present invention provides that the voice recognition camera system be capable of associating more than one vocal utterance or sound with a single command. The different utterances are contemplated to be different words, sounds or the same word under demonstrably different conditions. As an example, the voice recognition camera system of this aspect of the present invention allows the inventive camera system to understand, for example, any of “shoot”, “snap”, “cheese”, and a whistle to indicate to the camera system that a picture is to be taken. In another example, perhaps the phrase and word “watch the birdie” and “click” instruct the camera to take the picture. It is further envisioned that the user select command words from a predetermined list of the camera command words and that he then select which words correspond to which command. It is alternatively envisioned that the association of multiple recognizable words to camera commands may also be predetermined or preassigned. In another alternate embodiment, the inventive camera system allows the user to teach the camera system which words to recognize and also inform the camera system as to which recognized words to associate with which camera commands. There are obviously other embodiments for associating recognized vocalizations to camera commands and the foregoing embodiments are simply preferred examples.
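In data terms, the many-utterances-to-one-command association amounts to a many-to-one mapping, as in the minimal sketch below (the particular utterances and command names are illustrative).

```python
# Several recognized vocalizations all map to the same camera command.
UTTERANCE_TO_COMMAND = {
    "shoot": "TAKE_PICTURE",
    "snap": "TAKE_PICTURE",
    "cheese": "TAKE_PICTURE",
    "whistle": "TAKE_PICTURE",   # a recognized non-word sound
    "zoom in": "ZOOM_IN",
    "zoom out": "ZOOM_OUT",
}

def to_camera_command(recognized_utterance: str):
    """Map a recognized utterance to a camera command, or None if unknown."""
    return UTTERANCE_TO_COMMAND.get(recognized_utterance.lower())

print(to_camera_command("Cheese"))  # TAKE_PICTURE
```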


In another embodiment of this aspect of the present invention, the user has his uttered commands recognized under demonstrably different conditions and recognized as being different utterances. For instance, according to this aspect of the invention, the voice operated camera system operates so that it understands commands vocalized close to the camera (as if the user is taking the picture in traditional fashion with the camera back to his face) and significantly farther away (as if the user is taking a self portrait picture and is part of the shot and thus has to vocalize loudly to the front of the camera). For this illustration, in a preferred embodiment the user teaches the words to the camera under the different conditions anticipated. For example, the user would teach the camera system by speaking the word “snap” close to the camera and inform the camera that this is a picture taking command, and would then stand far from the camera and say “snap”, thus teaching another utterance, and instruct the camera that this is also a picture taking command. These two different utterances of the same word under different conditions would be stored and recognized as different utterances. This aspect of the invention contemplates that the words vocalized and/or taught need not be the same word and, as illustrated above, different words would also be considered different utterances as well.


Since voice recognition is not always 100 percent accurate, another aspect of the present invention contemplates that the camera system or a remote device, or both, preferably provide an indication that a voice command was or was not understood. Thus, using the self portrait example above, if the user vocalizes the command to take a picture but the camera system does not properly recognize the vocalization as being something it understands, the camera system would beep, or light an LED, etc., to indicate its misrecognition. Because of the relatively small number of anticipated camera commands and allowing for multiple vocalizations to command the same action, it is expected that the recognition rates will be quite high and fairly tolerant of extraneous noise without necessarily resorting to the use of a highly directional or closely coupled (to the user's mouth) microphone, though the use of such devices is within the scope of the invention.


It is anticipated that the user of the inventive camera system may be too far away from the camera system for the camera system to recognize and understand the user's vocalizations. Thus, another aspect of the invention provides that the camera is equipped with a small laser sensor (FIG. 1 element 18) or other optically sensitive device such that when a light of a given frequency or intensity or having a given pulse sequence encoded within it is sensed by the camera system equipped with the optically sensitive device, the camera system immediately, or shortly thereafter (to give the user time to put the light emitting device down or otherwise hide it, for example) takes a picture. The light emitting device is preferably a laser pointer or similar, stored within the camera housing when not needed so as to not be lost when not in use. Additionally, the light emitting device's power source would preferably be recharged by the camera system's power source when so stored. In another embodiment, it is also contemplated that the light emitting device may be housed in a remotely coupled display which is disclosed below. The light emitting device preferably includes further electronics to regulate the emitted light intensity or to encode a predetermined pulse sequence (on-off pulses for example) or otherwise onto the emitted light, all of which techniques are well known in the art, which the camera system of this aspect of the present invention would receive and recognize by methods well known in the art.
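Recognition of an encoded pulse sequence could be as simple as comparing detected pulse durations against a stored code, as in this rough sketch; the durations, tolerance, and function names are assumptions for illustration only.

```python
def matches_trigger_sequence(detected, expected, tolerance_ms=1.0):
    """Check whether a detected on/off light pattern matches the stored trigger code.

    `detected` and `expected` are lists of alternating on/off pulse durations in
    milliseconds; a per-pulse tolerance stands in for the decoding electronics
    mentioned in the text.
    """
    if len(detected) != len(expected):
        return False
    return all(abs(d - e) <= tolerance_ms for d, e in zip(detected, expected))

# The stored code is 3 ms on, 1 ms off, 3 ms on; a slightly noisy reading still matches.
print(matches_trigger_sequence([3.0, 1.2, 3.4], [3.0, 1.0, 3.0]))  # True
```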


Another aspect of the present invention provides for there being a predetermined delay introduced between recognizing a voice command and the camera actually implementing the command. This aspect of the invention allows time, for example, for the user to close his mouth or for others in a self-portrait shot to settle down quickly before the picture is actually taken. In a first preferred embodiment of this aspect of the invention, the delay is implemented unconditionally for at least the picture taking command. In a second preferred embodiment of this aspect of the invention, the delay introduced is dependent upon from where the command came relative to the camera system. For instance, if the camera system recognized the command as coming from the frontside microphone, delay is used, but if the command comes from the backside microphone, then no delay is implemented. The simple energy detection circuitry of FIG. 2, described above is easily adapted for this function. In an alternative embodiment, implementation of the delay is dependent upon the location of the microphone due to the orientation of the flip-up or swivel LCD display when the microphone is attached to the LCD display (FIG. 1, element 12c). For example, if the microphone in the display sub-housing is oriented forward relative to the camera body then delay is implemented, if the microphone is not oriented forward then no delay is introduced. Determining the orientation of this microphone relative to the camera body is known in the art and would typically be done with switches or other sensor devices. Another preferred embodiment of this aspect of the invention implements the delay for only certain commands, such as the command to take a picture. In yet another preferred embodiment, whether the delay is implemented at all is selectable by the user.
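The microphone-dependent delay can be summarized as a small dispatch policy. The sketch below assumes a two-second delay and a stub camera controller, neither of which is specified by the disclosure.

```python
import time

SELF_PORTRAIT_DELAY_S = 2.0  # illustrative; the disclosure leaves the exact delay open

class CameraStub:
    """Stand-in for the camera controller; real hardware would act here."""
    def dispatch(self, command: str) -> None:
        print(f"executing {command}")

def execute_command(command: str, source_microphone: str, camera: CameraStub) -> None:
    """Delay picture taking only when the command arrived at the front microphone."""
    if command == "TAKE_PICTURE" and source_microphone == "front":
        time.sleep(SELF_PORTRAIT_DELAY_S)  # let the speaker close his mouth and settle
    camera.dispatch(command)

execute_command("TAKE_PICTURE", "front", CameraStub())
```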


Another aspect of the present invention provides that the camera LCD display (FIG. 1, element 14) employs touch sensitive technology. This technology is well known in the computer art and can be any of resistive, capacitive, RF, etc., touch technologies. This aspect of the present invention allows the user to interact with menus, features and functions displayed on the LCD display directly rather than through ancillary buttons or cursor control. For those embodiments of touch technology requiring use of a stylus, it is further contemplated that the camera body house the stylus for easy access by the user.


According to another aspect of the present invention, it is envisioned that the current dedicated LCD display (FIG. 1, element 14) incorporated on a digital camera be made to be removable and be extendable from the camera by cable, wireless, optical, etc., interconnection with the camera. In one embodiment, this remote LCD would be wire-coupled to receive display information from the digital camera through a pluggable port. In another embodiment, the remote LCD would be wirelessly coupled to the digital camera through any of several technologies well understood in the art including, by way of example only, Bluetooth, WIFI (802.11 a/b/g/n), wireless USB, FM, optical, etc. In another embodiment of this aspect of the invention, the remotely coupled display would serve the dual purpose of being a remote input terminal to the camera system in addition to being a dedicated display for the camera system. Preferably, as mentioned earlier, the display is touch sensitive using any of the touch sensitive technologies well understood in the art, such as the resistive, capacitive, RF, etc., methods mentioned above. Touch commands input by the user would be coupled back to the camera system as needed. It is also contemplated that the remote display house the stylus if one is required.


In another preferred embodiment, the remotely coupled display has buttons on it to control the camera system. In another embodiment, the remotely coupled display contains the microphone for receiving the voice commands of the user, digitizing the received voice, analyzing and recognizing the vocalization locally and sending a command to the camera system. In another preferred embodiment, the remotely coupled display containing the microphone simply digitizes the vocalization received by the microphone and transmits the digitized vocalization to the camera system for recognition of the vocalization by the camera system itself. In all embodiments of the wireless remote display, it is preferred that the display contain its own power source, separate from the power source of the camera. It is also contemplated that the display's separate power source may be coupled to the camera's power source when the display is ‘docked’ to the camera so that both may share power sources or so that the camera's power source may recharge the display's power source.


According to another aspect of the present invention, the electronic view finder (EVF) typically used on modern digital cameras includes a gaze tracking capability which is well known in the art, see for example U.S. Pat. No. 6,758,563 to Levola which is herein incorporated by reference. In this aspect of the present invention, menus typically used for user interface to the camera are electronically superimposed in the image in the EVF. The gaze tracker subsystem is operable for determining the area or approximate location of the viewfinder image at which the user is gazing. Thus, by the user looking at different areas of the EVF image, the gaze tracker subsystem informs the camera system so that a mouse-like pointer or cursor is moved by the camera system to the area of the EVF image indicated by the gaze tracking device to be the area the user is viewing. Preferably, the user then speaks a command to indicate his selection of the item pointed to by the pointer image. Alternatively, the user may indicate through other methods that this is his selection, such as staring at a position in the image for a minimum predetermined time or pressing a button, etc. As an example, the EVF displays icons for flash, shutter speed, camera mode, etc (alone or superimposed on the normal viewfinder image.) By gazing at an icon, a small compositely rendered arrow, cursor, etc., in the EVF image is caused by the gaze tracker subsystem to move to point to the icon at which the user is determined to be gazing by the gaze tracking subsystem, for instance, the camera mode icon as an example here. Preferably, the user then utters a command which is recognized by the camera system as indicating his desire to select that icon, for example, “yes” or “open”.


Alternatively, the icon is selected by the user gazing at the icon for some predetermined amount of time. When the icon is selected by whatever method, the EVF image shows a drop down menu of available camera modes, for example, portrait, landscape, fireworks, etc. The user, preferably, then utters the proper command word from the list, or he may optionally gaze down the list at the mode he desires, whereupon the gaze tracker subsystem directs that the pointer or cursor in the EVF image move to the word and, preferably highlighting it, indicate that this is what the camera system thinks the user wants to do. The user, preferably, then utters a command indicating his acceptance or rejection of that mode in this example, such as ‘yes’ or ‘no’. If the command uttered indicates acceptance, the camera system implements the command; if the command indicates rejection of the selected command, the camera system preferably moves the pointer to a neighboring command. To leave a menu, the user may utter ‘end’ to return to the menu above or ‘home’ to indicate the home menu. Preferably, the user can also manipulate the pointer position by uttering commands such as “up”, “down”, “left” and “right” to indicate relative cursor movement. In this way, the user interacts with the camera in the most natural of ways, through sight and sound cooperatively. While the above example used the preferred combination of gaze and voice recognition, it is contemplated that gaze tracking be combined with other input methods such as pushing buttons (like a mouse click), or touch input disclosed below, or gesture recognition disclosed below, etc., as examples.
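
The cooperation of gaze and voice described above reduces, in outline, to locating the icon nearest the gaze point and acting on it when a selection word is recognized. The following sketch uses made-up icon positions and command words purely for illustration.

```python
# Icons and their (x, y) centers in the EVF image; coordinates are illustrative.
ICONS = {"flash": (20, 10), "shutter_speed": (60, 10), "mode": (100, 10)}

def icon_under_gaze(gaze_xy):
    """Return the icon nearest the reported gaze location (where the cursor sits)."""
    gx, gy = gaze_xy
    return min(ICONS, key=lambda name: (ICONS[name][0] - gx) ** 2 + (ICONS[name][1] - gy) ** 2)

def handle_utterance(utterance, gaze_xy):
    """Combine gaze and a recognized word: 'yes' or 'open' selects the gazed icon."""
    if utterance in ("yes", "open"):
        return f"selected {icon_under_gaze(gaze_xy)}"
    return "no action"

print(handle_utterance("open", (97, 12)))  # selected mode
```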


Another application of this aspect of the invention uses gaze tracking to assist the auto focus (AF) capability of the prior art camera. AF generally has two modes: one mode uses the entire image, center weighted, to determine focus; another mode allows different areas of the image to have greater weight in determining focus. In the second mode, the user typically pre-selects the area of the framed image that he wishes to be over-weighted by the AF capability. This is cumbersome in that the user must predict where he wants the weighting to be ahead of time; thus, this embodiment of this aspect of the invention provides that the gaze tracker subsystem inform the AF capability of the camera system as to the location of the image at which the user is gazing and that the AF capability use this information to weight this area of the image when determining focus. It is contemplated that the AF system may only provide for discrete areas of the image to be so weighted and, in this case, preferably, the AF capability selects the discrete area of the image closest to that being gazed upon.
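
As a rough sketch of how the gaze report might bias the existing AF weighting (the zone layout and boost factor are assumptions, not taken from the disclosure):

```python
def af_weights(zones, gaze_xy, gaze_boost=3.0):
    """Weight each discrete AF zone, boosting the one nearest the gaze point.

    `zones` maps a zone name to its center in normalized image coordinates;
    the returned weights could scale whatever per-zone focus metric the AF
    system already computes.
    """
    gx, gy = gaze_xy
    nearest = min(zones, key=lambda z: (zones[z][0] - gx) ** 2 + (zones[z][1] - gy) ** 2)
    return {zone: (gaze_boost if zone == nearest else 1.0) for zone in zones}

zones = {"left": (0.25, 0.5), "center": (0.5, 0.5), "right": (0.75, 0.5)}
print(af_weights(zones, (0.7, 0.5)))  # the right zone receives the boost
```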


Another embodiment of this aspect of the invention uses the gaze tracker to enable the flash of the camera system. Flash is commonly used to “fill” dimly lit photographic scenes, but sometimes this is not warranted. Other times, it is desired to have “fill” flash because the area of the scene desired is dark but the rest of the scene is quite bright (taking a picture in shade for example) and the camera does not automatically provide “fill” flash because the overall image is bright enough. Typically, the amount of “fill” flash the camera will give is determined by the camera measuring the brightness of the scene. The inventive camera system with gaze tracking is used to enhance the prior art method of determining the desire for and amount of “fill” flash in that the inventive camera system gives more weight, in determining the scene brightness, to the area of the scene indicated by the gaze tracker as being gazed upon.
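
The gaze-weighted metering idea can be illustrated with a simple weighted average; the regions and the weight given to the gazed-upon area are invented for the example.

```python
def gaze_weighted_brightness(region_brightness, gazed_region, gaze_weight=4.0):
    """Average scene brightness with extra weight on the region being gazed at."""
    total, weight_sum = 0.0, 0.0
    for region, brightness in region_brightness.items():
        w = gaze_weight if region == gazed_region else 1.0
        total += w * brightness
        weight_sum += w
    return total / weight_sum

# The scene is bright overall, but the gazed-upon subject sits in shade, so the
# weighted brightness drops and "fill" flash becomes more likely to be enabled.
print(gaze_weighted_brightness({"sky": 0.9, "subject": 0.2, "ground": 0.7}, "subject"))
```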


Another aspect of the present invention adds touchpad technology to the prior art camera system. Use of the word ‘touchpad’ throughout this disclosure should be construed to mean either the touchpad itself or the touchpad with any or all of a controller, software, associated touchpad electronics, etc. This touchpad technology is similar to the touchpad mouse pad used on laptop computers which is also well understood in the computer art. In a first preferred embodiment, the EVF (or LCD display) displays the menus as above and the user moves the cursor or mouse pointer around this image by use of his finger on the touchpad. This operation is virtually identical to that of the mouse in laptop computers and is well understood in the art. Preferably, the touch pad is mounted on the top of the camera at the location typically used for the shutter button (FIG. 1 element 12a). It is also preferred that the touchpad software implement ‘tapping’ recognition, also well known in the art, so that the user may operate the shutter button, make a selection, etc. simply by tapping the touchpad with his index finger, much the same way modern laptop driver software recognizes tapping of the touchpad as a click of the mouse button. It is also currently preferred that tapping recognition is used to make selections on the menus shown in the EVF, LCD display, or otherwise.


Another application of this aspect of the invention uses the touchpad to inform the camera system to zoom the lens simply by the user stroking his finger from front to back (for example, to zoom) or back to front over the touchpad (for example, to wide angle). For this aspect of the present invention, a preferred embodiment has the touchpad on the barrel of the lens. This is a most natural way to control zoom since the movement of the finger is a gesture with the user ‘pulling’ the object to be photographed closer (front to back stroke means zooming) or ‘pushing’ the object to be photographed away (back to front stroke means wide angle). According to another aspect of the invention, the touchpad replaces the shutter button functionality and the preferable location for this embodiment is top mounted. Preferably, the touchpad is tapped once to focus the camera and/or lock the AF and tapped a second time to trip the shutter. Alternatively, the inventive camera system simply senses the person's touch of the touchpad, auto focuses the camera and/or locks the focus or provides continually focusing while the person's touch is sensed and wherein a tap of the touchpad then trips the shutter. Preferably, the camera system enforces a maximum amount of time that the AF may be locked so that action photographs will not be badly focused. Automatically locking the AF settings for a maximum predetermined time after AF activation or continuously focus upon AF activation is also applicable to the prior art AF button activation method described below. While a computer-like touchpad was used to illustrate the above preferred embodiments of this aspect of the invention, the touch sensitive input device could be comprised of other structure, for instance, the aforementioned touch-sensitive LCD display. Also, throughout this disclosure, the word ‘continuous’ (and its variants, e.g., continually, etc.) should be construed to mean discretely continuous in addition to its analogue-world definition.
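The zoom gesture amounts to reading the direction of a finger stroke along the lens-barrel touchpad, roughly as sketched here; the coordinate convention and threshold are assumed for the example.

```python
def zoom_gesture(start_pos, end_pos, threshold=0.1):
    """Interpret a stroke along the lens-barrel touchpad as a zoom command.

    Positions are normalized 0..1 along the barrel with 0 at the front.
    A front-to-back stroke 'pulls' the subject closer (zoom in); back-to-front
    'pushes' it away (zoom out). The threshold rejects accidental touches.
    """
    delta = end_pos - start_pos
    if delta > threshold:
        return "ZOOM_IN"
    if delta < -threshold:
        return "ZOOM_OUT"
    return None

print(zoom_gesture(0.2, 0.8))  # ZOOM_IN
```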


In a second preferred embodiment of this aspect of the invention, the touchpad is placed on the back of the camera (FIG. 1 element 12b) and is operable for manipulating the cursor and menus shown on the LCD or EVF display. This provides a much more natural and computer-like interface to the camera system. It is also contemplated that either embodiment of this aspect of the invention may be coupled with voice recognition so that the user may interact with the camera by touchpad manipulation in combination with voice commands. Additionally, combined with gaze tracking, the user can interact with the camera through touch, voice, and gaze (i.e., sight) to manipulate menus, control the camera system, compose the shot, focus, zoom, enable/disable flash, select macro or panoramic camera modes, etc.


One of the most annoying properties of the modern digital camera is the shutter delay that occurs when a picture is taken. That is, the delay between the user depressing the shutter button and the camera actually taking the picture. This delay can be as much as one second on some modern digital cameras and is typically due to the camera focusing and then taking the picture after the shutter button is depressed. One solution to this implemented by prior art cameras is for the camera to sense when the shutter button is depressed half way, then focus and lock the AF settings of the camera while the shutter button remains half way depressed, so that when the user depresses the shutter button the rest of the way, the picture is taken almost instantaneously. This solution is more often than not misused or misunderstood by novice users or those who do not use their camera regularly and can also result in blurred action photographs. Thus, one aspect of the present invention provides that the viewfinder be coupled to a unit for detecting when the user's eye is viewing through the viewfinder. When viewfinder use is detected, the inventive camera system preferably enables the auto focus system to continually focus thus ensuring that the shot is focused when the camera system is commanded to take a picture. Preferably, the gaze tracker is used for this determination though this aspect of the invention may be implemented without gaze tracking.


In a preferred embodiment of this aspect of the invention without gaze tracking, the viewfinder is equipped with a small light emitting device and a light detection device, both well known in the art. With reference to FIG. 4, the light emitting device, 70, emits a frequency or frequencies of light, some of which is reflected from the eyeball when a user is viewing through the viewfinder, 74. The light detection device, 72, is operable for sensing this reflected light, and an amplifier (not shown) coupled to device 72 amplifies the signal from the light detection device, 72. Obviously, if there is no one viewing through the viewfinder, then there will be no reflected light from the eyeball and the amplifier output will be near ground; however, when a person peers into the viewfinder, light will be reflected from his eyeball and the output of the amplifier will be significantly larger. Thus, this system and method provides a way for detecting the use of the viewfinder by the user without providing gaze tracking ability. It is contemplated that this system and method be used with both EVF and optical (i.e., traditional) viewfinders and that viewport, 76, may be an LCD, optical lens, etc. Shroud 78, typically included on modern viewfinders, helps to improve viewfinder use detection by cutting down on extraneous light reaching device 72 when the user is viewing through the viewfinder. It should be noted that the location of elements 70 and 72 in FIG. 4 is exemplary only and other placements of these elements are within the scope of this aspect of the invention. While the above embodiment of this aspect of the invention relied on eyeball reflectivity, in an alternate embodiment it is contemplated that viewfinder use detection can be made with a light source and light detector juxtaposed wherein the eye interrupts the light between the two, thus indicating viewfinder use, or that the shroud be fitted with a touch sensor around its outer ring that would sense the person's contact with the shroud when the viewfinder is in use. Additionally, it is contemplated that embodiments of this aspect of the invention may employ filters or other structures to help minimize false viewfinder use detection due to sunlight or other light sources shining on detector 72 when a user is not viewing through the viewfinder.
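
In software, the detector of FIG. 4 reduces to comparing the amplified detector level against a no-eye baseline, as in this minimal sketch (the margin value is an assumption):

```python
def viewfinder_in_use(detector_level, ambient_baseline, margin=0.2):
    """Decide whether an eye is at the viewfinder from the light detector reading.

    `detector_level` is the amplified output of detector 72 and
    `ambient_baseline` a reading taken with no eye present; the margin is a
    guard band against stray light reaching the detector.
    """
    return detector_level > ambient_baseline + margin

print(viewfinder_in_use(0.15, 0.1))  # False: output near the baseline, no eye present
print(viewfinder_in_use(0.65, 0.1))  # True: reflection from an eyeball raises the output
```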


Another aspect of the present invention is to employ a wink-detector as part of the viewfinder of the camera. Preferably, the gaze tracker is modified for this purpose. Alternatively, the previously disclosed viewfinder use detector may also be employed. All that is required is to additionally detect the abrupt change in reflected light from the eye that would be caused by the eyelid wink. The wink-detector is contemplated to be used for shutter trip and/or AF activation or lock, among other things. It is contemplated that it be used in the aforementioned application wherein the menus of the camera are displayed on the EVF. In this case, the wink detector preferably acts as a user selection detector device in that the user may select an item pointed to by the gaze tracker pointer, or that is otherwise highlighted by the gaze tracker, simply by winking. It is contemplated that the detected wink would preferably function in the camera system similarly to a left mouse click on a computer system when dealing with menus and icons. In this way, the camera system with wink detector of this aspect of the present invention becomes an optical gesture-recognizing camera wherein the gesture is optically received and electronically recognized (gesture recognition is also contemplated to be used in the touchpad software as described above).


In an enhancement of this aspect of the invention, the wink detector subsystem discriminates between a wink and a blink, preferably by determining the amount of time taken by the wink or blink. If the amount of time taken for the gesture (blinking or winking) is below a certain threshold, the gesture is considered a blink and disregarded.
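
A minimal sketch of that duration test follows; the threshold value is illustrative, since the disclosure only calls for "a certain threshold".

```python
BLINK_THRESHOLD_S = 0.3  # assumed value separating involuntary blinks from deliberate winks

def classify_eye_closure(duration_s: float) -> str:
    """Classify an eye closure by its duration.

    Short closures are treated as blinks and disregarded; longer closures are
    taken as intentional winks and forwarded as a selection or shutter gesture.
    """
    return "blink" if duration_s < BLINK_THRESHOLD_S else "wink"

print(classify_eye_closure(0.1))  # blink (ignored)
print(classify_eye_closure(0.6))  # wink (treated like a mouse click)
```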


Once a user of a camera has taken pictures, typically he will wish to print or otherwise develop the pictures for viewing, framing, etc. Another aspect of the present invention provides for simpler photo offloading from the modern digital camera when a set of predetermined conditions, such as day, time, number of pictures to offload, etc., are met. The camera system preferably includes the ability for the user to indicate to the camera which pictures to offload so that the camera offloads only those pictures that are so indicated by the user. In a first preferred embodiment of this aspect of the invention, the camera system is internally equipped with wireless interface technology coupled to the camera controller for interfacing directly to a photo printer or other photo rendering device. Currently preferred is WIFI (i.e., IEEE 802.11 a/b/g/n), with alternatives being Bluetooth or wireless USB, all of which are known in the art. By connecting via WIFI, the inventive camera system can preferably access other devices on the LAN associated with the WIFI for the storing of pictures onto a computer, network drive, etc. In addition, preferably, devices on the network can access the camera system and the pictures within it directly and also access camera settings, upload new software or updates to the camera system, etc. Since one of the big complaints with wireless technology for small devices is the often-obtrusive antenna, it is greatly preferred for this aspect of the invention that the wireless hardware, including the antenna, be completely contained within the body of the camera system.


In a second preferred embodiment of this aspect of the invention, the inventive camera system is equipped with software and hardware coupled to the camera controller allowing independent communication with a computer network for the primary purpose of communicating its pictures over the internet. Currently preferred is WIFI, which is typically connected by LAN, routers, etc. to the internet and which usually allows WIFI-equipped devices to independently connect to the internet (FIG. 3, element 46c). Alternatively, the invention contemplates the use of wired LAN, cellular data networks, etc. as the interconnection technology (FIG. 3, element 46b) used by the inventive camera system. The inventive camera system is further preferably equipped with a microbrowser that runs on the inventive camera system's camera controller, which is preferably a microprocessor. It is contemplated that some embodiments may not require a microbrowser (see enhancement below). Design and operation of microbrowser-equipped electronic devices for use with the internet is well known in the art and need not be discussed further. The camera system LCD display serves the purpose of displaying internet webpages when the user is navigating the internet in addition to its function as the camera display. So equipped, the inventive camera system can now independently upload its pictures to any of the internet-based photo printing services, such as those provided by Walmart.com, Walgreens.com, Kodak.com, etc., without the need for first storing the photos to a computer system and then connecting the computer system to the internet to upload the pictures. Use of these internet services for printing photos is preferred by many over use of a home photo printer because of the convenience, ease, availability, quality and lower per-picture printing costs. Providing the novel combination of a high photo-quality camera system with direct access to the internet according to this aspect of the present invention will further improve the utility of the camera system and these services.


In an enhancement to the above-disclosed embodiments of this aspect of the invention, the inventive camera system is operable for being instructed to automatically initiate a connection to the internet, LAN, printer, etc. whenever the predetermined conditions are met and it is in range of the network connection, (e.g., WIFI, Bluetooth, wireless USB, wired LAN, etc). Once the transmittal of the pictures is complete, the inventive camera system preferably terminates the connection. Additionally, the inventive camera system is preferably operable so that the automatic connection is made only at certain times of the day or weekends, etc., so as to confine picture transmission to periods of low network usage or periods of cheaper network access, etc. Also, it is currently preferred that the user be queried to allow the automatic connection though this is obviously not required and the connection can be made completely autonomously. Thus, in the first embodiment above, the inventive camera system automatically sends its pictures to a printer or other device on the LAN for printing or for remotely storing the pictures in the inventive camera system, whenever the inventive camera system is in range of the LAN network connection and connection can be made. In the second embodiment above, the inventive camera system automatically connects to the internet preferably via WIFI, although cellular network, etc. connection is also contemplated, when it has a predetermined number of pictures and can so connect, and will send the pictures to virtually any internet destination without user intervention. For example, the inventive camera system can be instructed to automatically send the pictures to an email account, internet picture hosting site (FIG. 3, element 46d), web-based photo printing site, the user's internet-connected home computer (when he is on vacation, for instance), etc. In this way, valuable pictures are immediately backed-up and the need for reliance on expensive camera storage media like flash cards, SD, etc. is greatly reduced.
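The "predetermined conditions" gating the automatic connection can be pictured as a simple predicate, as sketched below; the picture-count threshold and the off-peak window are invented stand-ins for user-configurable settings.

```python
from datetime import time as clock_time

def should_auto_upload(now, pictures_pending, network_in_range,
                       min_pictures=10,
                       window=(clock_time(22, 0), clock_time(6, 0))):
    """Decide whether to initiate the automatic connection described above."""
    in_window = now >= window[0] or now <= window[1]  # overnight window wraps midnight
    return network_in_range and pictures_pending >= min_pictures and in_window

# Late at night, with enough pictures queued and the network in range, connect.
print(should_auto_upload(clock_time(23, 30), 42, True))  # True
```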


Many prior art digital cameras can now record images continuously at 30 frames per second (i.e., take movies) along with sound. Thus, a prior art camera having an internet connection capability as herein taught combined with well known and straightforward editing methods enables inventive on-camera movie composition. According to this aspect of the invention, the inventive camera records a series of images, (e.g., a movie) and then the user downloads an MP3 file (i.e., a sound file) from a network (e.g., internet) source to be associated with the movie taken so that when the movie is played, the MP3 file also plays. Alternatively, the MP3 content is embedded in the movie, either as is, or re-encoded. Additionally, the user may download other movie material or still images via the network connection for insertion in the camera-recorded movie or for the replacement of certain individual camera-taken “frames” in the movie.



FIG. 3 shows an exemplary functional block diagram of the improved camera system according to various aspects of the present invention. The figure shows one possible exemplary embodiment contemplated, and the figure should not be used to limit the teaching of this disclosure to a certain implementation, embodiment, combination of aspects of the present invention, or otherwise.


Another aspect of the present invention provides that prior art features of the cell phone (FIG. 3, element 46a) are combined so that voice control of the camera in the cell phone can be accomplished. Many modern cell phones incorporating cameras also provide voice recognition-driven dialing. Therefore, the functionality necessary for recognizing vocalizations within a cellular communication device exists in the art but has not been applied to the cell phone camera. This aspect of the present invention couples the voice recognition unit of a cell phone to the camera control unit of the cell phone, either directly or via the cell phone controller, thus enabling voice control of the cell phone camera. Preferably, when recognizing a vocalization, the cell phone controller programming would also include the step of determining whether the recognized vocalization was for camera control or for dialing. Such determination would preferably be by reserving certain recognized keywords to be associated with camera functions (e.g., snap, shoot, etc.). Alternatively, the cell phone may be explicitly placed into camera mode so that it is known ahead of time that recognized utterances are for camera control.


Cell phones, being so light and having little inertia, are hard to hold steady, and the fact that the user must push a button on something so light makes it even harder to keep steady, particularly given the small size of the shutter button on some cell phones. This aspect of the present invention would make picture taking on cell phones simpler and more foolproof.


Another aspect of the invention provides that the prior art voice recognition unit of the cell phone be adapted to recognize at least some email addresses when spoken. Another aspect of this inventive adaptation is to adapt the cell phone voice recognizer to identify the letters of the alphabet along with certain key words, for example, “space”, “underscore”, “question mark”, etc., and numbers, so that pictures may be named when stored by spelling, for example. This aspect of the invention is contemplated to serve the dual purpose of being usable for text messaging or chat text input on the cell phone in addition to picture labeling.


Additionally, other aspects of the present invention taught for the improved camera system are applicable to the improved cell phone herein disclosed, particularly the aspect of the present invention associating multiple different utterances with a single command. The aspect of the invention allowing for automatic connection to a LAN or the internet is also contemplated for use with cell phone cameras. This aspect of the invention ameliorates the prior art storage space limitation which severely hampers the utility of the cell phone camera. Cellular service providers typically charge a fee for internet access or emailing, and so an automatic feature to connect to the net or send email for the purposes of transmitting pictures can improve revenue generation for these companies.


The embodiments herein disclosed for the various aspects of the present invention are exemplary and are meant to illustrate the currently preferred embodiments of the various aspects of the invention. The disclosed embodiments are not meant to be exhaustive or to limit application of the various aspects of the invention to those embodiments so disclosed. There are other embodiments of the various aspects of the present invention that are within the scope of the invention. Additionally, not all aspects of the invention need to be practiced together; it is contemplated that subsets of the disclosed aspects of the present invention may be practiced in an embodiment and still be within the scope of the present invention. For instance, one contemplated embodiment combines a touch sensitive shutter button with a viewfinder use detector so that focusing is accomplished only when both the shutter button is touched and viewfinder use is detected. Another embodiment contemplated is to use the viewfinder use detector to automatically turn the EVF on and the LCD display off when viewfinder use is detected, instead of the prior art method of pressing a button which typically toggles which of the two is on and which is off. Still another contemplated embodiment applies the touch gesture recognition typically used with the computer-like touchpad technology to a touch sensitive display, such as the touch sensitive LCD of the camera and other devices herein disclosed that utilize an LCD display. Combining various aspects of the invention herein disclosed, such as voice recognition, touch input, gaze tracking, etc., for camera control provides much more natural and human interfacing to the camera system for the control of camera menus, camera features, camera options, camera settings, commanding picture taking, enabling flash, etc.


Another alternative embodiment for the disclosed aspects of the present invention is to use the disclosed touchpad, with or without supporting input gesture recognition, with cellular phones, other cellular devices, Apple Computer Inc.'s Ipod MP3 player, etc., with the computer-like touchpad replacing some or all of the buttons on such devices. Touch input with or without touch-based gesture recognition would be an ideal replacement for Apple's Ipod click wheel interface. The touch pad would preferably be made round (alternatively, it would be rectangular with the housing of the device providing a round aperture to the touchpad device) and, simply by skimming a finger over or touching the touchpad at the appropriate places on the touch pad, the Ipod would be commanded to perform the proper function such as raising or lowering the volume, fast forwarding, slowing down replay, changing the selection, etc. This type of round touchpad is also contemplated for use on cell phones to simulate the old-fashioned rotary dial action or placement of digits. The user touches the pad at the appropriate place around the circumference of the touch pad to select digits and enter them and then makes a dialing motion (stroking a thumb or finger around the circumference of the touchpad) to begin the call, or touches the center of the pad to begin the call. Round pattern dialing is easily done with the thumb when the phone is being single-handedly held. With reference to FIG. 5, in another embodiment, the touchpad, 94, is further contemplated to be fitted with a solid overlay having two or more cutouts over its surface (the solid overlay with cutouts is preferably part of the cell phone or other device's housing; alternatively, the solid overlay, 90, with cutouts, 92, is applied to the touchpad surface separately) that only allows for certain areas of the touchpad to actually be touched, to assist the user in assuring that only certain well-defined areas of the touchpad are touched. This greatly reduces the software detection requirements for the touchpad interface software since now the software need only detect when a certain defined area is touched, assign a specific function to that touched area and report that to the device controller. That is, the cutout areas would essentially be soft keys but without there being a plurality of different keys; instead, there are simply different soft key locations on the same touchpad, delineated physically so that certain other areas of the touchpad simply cannot be touched. It is further contemplated that, in many instances, the cutouts can be made large enough so that finger-stroke gestures can still be made and discerned. Because of the nature of modern mouse-like touchpad technology and how it works, the firmness of a person's touch that actually registers as a touch can also be provided for by software, and this feature is also contemplated for use herein. Additionally, the touchpad, covered by a solid overlay with cutouts, would be recessed below the upper surface of the overlay (by as much as desired), helping to minimize false touches. This would be a much cheaper input-gathering structure and would replace some or all of the many buttons and the joystick-like controller of the cell phone, Ipod, camera, etc.
It is contemplated that a few generic touchpad shapes and sizes could be manufactured and serve a host of input functions, replacing literally tons of buttons and switches, since now the solid overlay with cutouts on top of the touchpad defines the areas that can be touched or gestured (see exemplary drawings of FIG. 5(b) and FIG. 5(c)), and touchpad software, well understood in the art, defines what meaning is ascribed to these touched locations and gestures and what degree of firmness of touch is required to actually register the touch. Tapping and gesture (i.e., a finger stroke) recognition would further extend this new input-gathering device capability but is not required. This new input-gathering device can be used to replace all or some of the buttons or joystick-like controllers on cell phones, portable electronic devices, cordless phones, mp3 players, PDAs, cameras, calculators, point of sale terminals, computers, computer monitors, game controllers, radios, stereos, TVs, DVD players, set-top boxes, remote controls, automobile interfaces, appliances, household light and appliance switches, etc. Additionally, use of an overlay with cutouts is not absolutely necessary to practicing the above teachings. Similar functionality can be accomplished by simply embedding, embossing, or surface applying area-delineating markings, preferably with labels, to the touchpad itself and allowing software to accept only those touches that occur in these defined areas and to give the labeled meaning to these areas when so touched. However, use of an overlay with cutouts is currently greatly preferred because of the tactile delineation of areas it provides.
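
Software support for the overlay-with-cutouts idea can be quite small: each cutout maps to one region and one function, and a touch is honored only if it is firm enough and falls inside a region. The regions, labels, and pressure figure below are illustrative assumptions.

```python
# Each cutout exposes a rectangular region of the touchpad (x0, y0, x1, y1),
# in normalized coordinates, and is bound to one device function.
CUTOUTS = {
    "volume_up": (0.0, 0.0, 0.3, 0.3),
    "volume_down": (0.0, 0.7, 0.3, 1.0),
    "select": (0.35, 0.35, 0.65, 0.65),
}

def function_for_touch(x, y, pressure, min_pressure=0.2):
    """Map a touch to the function of the cutout it falls in, if firm enough.

    The pressure check mirrors the software-configurable firmness mentioned
    above; touches outside every cutout are simply ignored.
    """
    if pressure < min_pressure:
        return None
    for name, (x0, y0, x1, y1) in CUTOUTS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

print(function_for_touch(0.5, 0.5, pressure=0.6))  # select
```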


Returning to the Ipod example, because of the large memory currently available with the Ipod, it is also contemplated that a digital camera, similar to a cell phone's camera, be embedded in the Ipod and coupled to the Ipod controller, and that this inventive Ipod be operable for taking pictures and storing them in the Ipod's memory. Another alternative embodiment for the disclosed aspects of the present invention is to use the viewfinder use detector, gaze tracker, and/or the disclosed internet connectability, herein described, in a video camera. As with the camera system disclosure, the viewfinder use detector can be used to enable or disable various aspects of the video camera system, such as turning the LCD display off when viewfinder use is detected. Gaze tracking is contemplated to be used to assist the video camera's focusing or to guide and select menu items. Internet connectability is contemplated to be used to download sound or image files for editing, or to upload recorded video for editing or for remote storage of the video images.
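As a simple illustration of the viewfinder use detector gating the LCD display in the video camera embodiment, a sketch follows; the sensor callback and display interface names are placeholders assumed for the example, not an actual camera API.

```python
# Illustrative sketch: turn the rear LCD off while the viewfinder is in use.
# The LCD interface and the boolean sensor report are assumed placeholders.

class VideoCameraPowerManager:
    def __init__(self, lcd):
        self.lcd = lcd

    def on_viewfinder_state(self, eye_detected: bool) -> None:
        # When the user's eye is at the viewfinder, the rear LCD is redundant:
        # turn it off to save power, and restore it when viewfinder use ends.
        if eye_detected:
            self.lcd.power_off()
        else:
            self.lcd.power_on()


class FakeLCD:
    def power_on(self):  print("LCD on")
    def power_off(self): print("LCD off")


mgr = VideoCameraPowerManager(FakeLCD())
mgr.on_viewfinder_state(eye_detected=True)    # -> "LCD off"
mgr.on_viewfinder_state(eye_detected=False)   # -> "LCD on"
```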


It is further contemplated that certain aspects of the presently disclosed invention have application beyond those disclosed herein. For instance, various voice recognition aspects of the present invention, such as the use of a plurality of microphones, the association of multiple different vocal utterances with the same command, or the delayed implementation of a command corresponding to a recognized vocalization, are contemplated to have utility for many of the devices herein referenced and are anticipated to be incorporated therein. As an example, automatically connecting to the internet when a set of predetermined rules or conditions (such as time, date, status of equipment, etc.) is met would be useful for downloading or uploading information, such as music or video, from or to the internet for processing, storage, transmission to another party, etc. Those skilled in the art will undoubtedly see various combinations and alternative embodiments of the various aspects of the present invention herein taught but which will still be within the spirit and scope of the invention.
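The rule-driven automatic internet connection mentioned above might, purely as an illustrative sketch, be expressed as follows; the particular rules (off-peak hours, battery level, free storage) and the connect() callback are assumptions chosen for the example.

```python
# Illustrative sketch: connect automatically only when a set of predetermined
# conditions is met. The specific rules and thresholds are assumed values.
import datetime


def should_connect(now: datetime.datetime, battery_level: float, storage_free_mb: int) -> bool:
    """Connect only during an off-peak window, with enough battery, and when
    recorded material is piling up (free storage running low)."""
    off_peak = 1 <= now.hour < 5
    enough_battery = battery_level >= 0.5
    upload_needed = storage_free_mb < 512
    return off_peak and enough_battery and upload_needed


def maybe_auto_connect(connect, now=None, battery_level=1.0, storage_free_mb=4096):
    now = now or datetime.datetime.now()
    if should_connect(now, battery_level, storage_free_mb):
        connect()   # e.g., upload recorded video for remote storage or editing


maybe_auto_connect(lambda: print("connecting..."),
                   now=datetime.datetime(2023, 1, 1, 2, 30),
                   battery_level=0.8,
                   storage_free_mb=256)       # prints "connecting..."
```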

Claims
  • 1. A camera comprising a touchpad fitted with a solid overlay, the solid overlay having at least two cutouts on its surface, such that (1) the at least two cutouts provide defined areas for touch sensing on the touch pad and (2) each defined area for touch sensing corresponds to a defined camera function for a controller of the camera.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of pending application Ser. No. 16/663,742, filed Oct. 25, 2019, which is a continuation of application Ser. No. 14/614,515, filed Feb. 5, 2015 (abandoned), which claims the benefit of application Ser. No. 14/539,687 (now issued U.S. Pat. No. 9,485,403), filed Nov. 12, 2014, which claims the benefit of application Ser. No. 14/495,976 (now issued U.S. Pat. No. 8,917,982), filed Sep. 25, 2014, which claims the benefit of application Ser. No. 14/453,511 (now issued U.S. Pat. No. 8,923,692), filed Aug. 6, 2014, which claims the benefit of application Ser. No. 14/315,544 (now issued U.S. Pat. No. 8,897,634), filed Jun. 26, 2014, which claims the benefit of application Ser. No. 14/203,129 (now issued U.S. Pat. No. 8,818,182), filed Mar. 10, 2014, which claims the benefit of application Ser. No. 13/717,681 (now issued U.S. Pat. No. 8,831,418), filed Dec. 17, 2012, which claims the benefit of application Ser. No. 13/087,650 (now issued U.S. Pat. No. 8,467,672), filed Apr. 15, 2011, which claims the benefit of application Ser. No. 12/710,066 (now issued U.S. Pat. No. 7,933,508), filed Feb. 22, 2010, which claims the benefit of application Ser. No. 11/163,391 (now issued U.S. Pat. No. 7,697,827), filed Oct. 17, 2005, all of which are herein incorporated by reference. Reference is also made to related application Ser. No. 14/199,855 (now issued U.S. Pat. No. 8,824,879), filed Mar. 6, 2014, related application Ser. No. 14/950,338 (now issued U.S. Pat. No. 10,257,401), filed Nov. 24, 2015, related application Ser. No. 14/950,370 (now issued U.S. Pat. No. 10,063,761), filed Nov. 24, 2015, and related application Ser. No. 15/188,736 (now issued U.S. Pat. No. 9,936,116), filed Jun. 21, 2016.

US Referenced Citations (1491)
Number Name Date Kind
2950971 George Aug 1960 A
3403223 Derk Sep 1968 A
3439598 Weitzner et al. Apr 1969 A
3483324 Gorike Dec 1969 A
3639920 Griffin et al. Feb 1972 A
3751602 Breeden Aug 1973 A
3755625 Maston Aug 1973 A
3770892 Clapper Nov 1973 A
3777222 Harris Dec 1973 A
3793489 Sank Feb 1974 A
3814856 Dugan Jun 1974 A
3877790 Robinson Apr 1975 A
3973081 Hutchins Aug 1976 A
3994283 Farley Nov 1976 A
4003063 Takahashi et al. Jan 1977 A
4021828 Iura et al. May 1977 A
4081623 Vogeley Mar 1978 A
4082873 Williams Apr 1978 A
4087630 Browning et al. May 1978 A
4090032 Schrader May 1978 A
D248669 Ramsey Jul 1978 S
4099025 Kahn Jul 1978 A
4158750 Sakoe et al. Jun 1979 A
4192590 Kitaura Mar 1980 A
4195641 Joines et al. Apr 1980 A
4207959 Youdin et al. Jun 1980 A
4209244 Sahara et al. Jun 1980 A
4219260 Date et al. Aug 1980 A
4221927 Dankman et al. Sep 1980 A
4222644 Tano et al. Sep 1980 A
4222658 Mandel Sep 1980 A
4227177 Moshier Oct 1980 A
4237339 Bunting et al. Dec 1980 A
4270852 Suzuki et al. Jun 1981 A
4270853 Hatada et al. Jun 1981 A
4270854 Stemme et al. Jun 1981 A
4285559 Koch Aug 1981 A
4288078 Lugo Sep 1981 A
4290685 Ban Sep 1981 A
4308425 Momose et al. Dec 1981 A
4334740 Wray Jun 1982 A
4340800 Ueda et al. Jul 1982 A
4344682 Hattori Aug 1982 A
4354059 Ishigaki et al. Oct 1982 A
4386834 Toolan Jun 1983 A
4389109 Taniguchi et al. Jun 1983 A
4393271 Fujinami et al. Jul 1983 A
4399327 Yamamoto et al. Aug 1983 A
4434507 Thomas Feb 1984 A
4443077 Tanikawa Apr 1984 A
4450545 Noso et al. May 1984 A
4472742 Hasegawa et al. Sep 1984 A
4485484 Flanagan Nov 1984 A
4489442 Anderson et al. Dec 1984 A
4501012 Kishi et al. Feb 1985 A
4503528 Nojiri et al. Mar 1985 A
4506378 Noso et al. Mar 1985 A
4520576 Molen Jun 1985 A
4531818 Bally Jul 1985 A
4538295 Noso et al. Aug 1985 A
4538894 Shirane Sep 1985 A
4542969 Omura Sep 1985 A
4550343 Nakatani Oct 1985 A
4557271 Stoller et al. Dec 1985 A
4563780 Pollack Jan 1986 A
4567606 Vensko et al. Jan 1986 A
4595990 Garwin Jun 1986 A
4597098 Noso et al. Jun 1986 A
4613911 Ohta Sep 1986 A
4627620 Yang Dec 1986 A
4630910 Ross et al. Dec 1986 A
4635286 Bui et al. Jan 1987 A
4641292 Tunnell et al. Feb 1987 A
4642717 Matsuda et al. Feb 1987 A
4645458 Williams Feb 1987 A
4648052 Friedman et al. Mar 1987 A
4658425 Julstrom Apr 1987 A
4679924 Wamsley Jul 1987 A
4695953 Blair et al. Sep 1987 A
4702475 Elstein et al. Oct 1987 A
4711543 Blair et al. Dec 1987 A
4717364 Furukawa Jan 1988 A
4742369 Ishii et al. May 1988 A
4742548 Sessler et al. May 1988 A
4746213 Knapp May 1988 A
4751642 Silva et al. Jun 1988 A
4757388 Someya et al. Jul 1988 A
4761641 Schreiber Aug 1988 A
4764817 Blazek et al. Aug 1988 A
4776016 Hansen Oct 1988 A
4780906 Rajasekaran et al. Oct 1988 A
4783803 Baker et al. Nov 1988 A
4794934 Motoyama et al. Jan 1989 A
4796997 Svetkoff et al. Jan 1989 A
4797927 Schaire Jan 1989 A
4807051 Ogura Feb 1989 A
4807273 Haendle Feb 1989 A
4809065 Harris et al. Feb 1989 A
4809332 Jongman et al. Feb 1989 A
4817158 Picheny Mar 1989 A
4817950 Goo Apr 1989 A
4827520 Zeinstra May 1989 A
4833713 Muroi et al. May 1989 A
4836670 Hutchinson Jun 1989 A
4837817 Maemori Jun 1989 A
4843568 Krueger et al. Jun 1989 A
4862278 Dann et al. Aug 1989 A
4866470 Arai et al. Sep 1989 A
D305648 Edington Jan 1990 S
4893183 Nayar Jan 1990 A
4895231 Yamaguchi et al. Jan 1990 A
4901362 Terzian Feb 1990 A
4905029 Kelley Feb 1990 A
4925189 Braeunig May 1990 A
4950069 Hutchinson Aug 1990 A
4951079 Hoshino et al. Aug 1990 A
4953029 Morimoto et al. Aug 1990 A
4953222 Roberts Aug 1990 A
4961211 Tsugane et al. Oct 1990 A
4965626 Robison et al. Oct 1990 A
4965775 Elko et al. Oct 1990 A
4973149 Hutchinson Nov 1990 A
4977419 Wash et al. Dec 1990 A
4980918 Bahl et al. Dec 1990 A
4983996 Kinoshita Jan 1991 A
4989253 Liang et al. Jan 1991 A
5005041 Suda et al. Apr 1991 A
5023635 Nealon Jun 1991 A
5025283 Robison Jun 1991 A
5027149 Hoshino et al. Jun 1991 A
5048091 Sato et al. Sep 1991 A
5062010 Saito Oct 1991 A
5069732 Levine Dec 1991 A
5070355 Inoue et al. Dec 1991 A
5074683 Tarn et al. Dec 1991 A
5086385 Launey et al. Feb 1992 A
5097278 Tamamura et al. Mar 1992 A
5099262 Tanaka et al. Mar 1992 A
5101444 Wilson et al. Mar 1992 A
5111410 Nakayama et al. May 1992 A
5121426 Baumhauer, Jr. et al. Jun 1992 A
5127055 Larkey Jun 1992 A
5128700 Inoue et al. Jul 1992 A
5128705 Someya et al. Jul 1992 A
5134680 Schempp Jul 1992 A
5146249 Hoda et al. Sep 1992 A
5148154 MacKay et al. Sep 1992 A
5160952 Iwashita et al. Nov 1992 A
5164831 Kuchta et al. Nov 1992 A
5184295 Mann Feb 1993 A
5193117 Ono et al. Mar 1993 A
5204709 Sato Apr 1993 A
5208453 Hostetler May 1993 A
5210560 Labaziewicz May 1993 A
5210566 Nishida May 1993 A
5212647 Raney et al. May 1993 A
5229754 Aoki et al. Jul 1993 A
5229756 Kosugi et al. Jul 1993 A
5230023 Nakano Jul 1993 A
5239337 Takagi et al. Aug 1993 A
5239463 Blair et al. Aug 1993 A
5239464 Blair et al. Aug 1993 A
5241619 Schwartz et al. Aug 1993 A
5245372 Aoshima Sep 1993 A
5245381 Takagi et al. Sep 1993 A
5253008 Konishi et al. Oct 1993 A
5274862 Palmer Jan 1994 A
5288078 Capper et al. Feb 1994 A
5295491 Gevins Mar 1994 A
5297210 Julstrom Mar 1994 A
5303148 Mattson et al. Apr 1994 A
5303373 Harootian Apr 1994 A
5313542 Castonguay May 1994 A
5320538 Baum Jun 1994 A
5331149 Spitzer et al. Jul 1994 A
5335011 Addeo et al. Aug 1994 A
5335041 Fox Aug 1994 A
5335072 Tanaka et al. Aug 1994 A
5335313 Douglas Aug 1994 A
5345281 Taboada et al. Sep 1994 A
5345538 Narayannan et al. Sep 1994 A
5347306 Nitta Sep 1994 A
5363481 Tilt Nov 1994 A
5365302 Kodama Nov 1994 A
5366379 Yang et al. Nov 1994 A
5367315 Pan Nov 1994 A
5372147 Lathrop et al. Dec 1994 A
5373341 SanGregory Dec 1994 A
5385519 Hsu et al. Jan 1995 A
5386494 White Jan 1995 A
5404189 Labaziewicz et al. Apr 1995 A
5404397 Janse et al. Apr 1995 A
5405152 Katanics et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5423554 Davis Jun 1995 A
5425129 Garman et al. Jun 1995 A
5426510 Meredith Jun 1995 A
5426745 Baji et al. Jun 1995 A
5427113 Hiroshi et al. Jun 1995 A
5446512 Mogamiya Aug 1995 A
5452397 Ittycheriah et al. Sep 1995 A
5454043 Freeman Sep 1995 A
5459511 Uehara et al. Oct 1995 A
5461453 Watanabe et al. Oct 1995 A
5465317 Epstein Nov 1995 A
5469740 French et al. Nov 1995 A
5471542 Ragland Nov 1995 A
5475792 Stanford et al. Dec 1995 A
5475798 Handlos Dec 1995 A
5477264 Sarbadhikari et al. Dec 1995 A
5481622 Gerhardt et al. Jan 1996 A
5486892 Suzuki et al. Jan 1996 A
5495576 Ritchey Feb 1996 A
5508663 Konno Apr 1996 A
5508774 Klees Apr 1996 A
5510981 Berger et al. Apr 1996 A
5511256 Capaldi Apr 1996 A
5513298 Stanford et al. Apr 1996 A
5515130 Tsukahara et al. May 1996 A
5516105 Eisenbrey et al. May 1996 A
5517021 Kaufman May 1996 A
5519809 Husseiny et al. May 1996 A
5521635 Mitsuhashi et al. May 1996 A
5524637 Erickson Jun 1996 A
5534917 MacDougall Jul 1996 A
5541400 Hagiwara et al. Jul 1996 A
5541656 Kare et al. Jul 1996 A
5544654 Murphy et al. Aug 1996 A
5546145 Bernardi et al. Aug 1996 A
5548335 Mitsuhashi et al. Aug 1996 A
5550380 Sugawara et al. Aug 1996 A
5550628 Kawabata Aug 1996 A
5557358 Mukai et al. Sep 1996 A
5561737 Bowen Oct 1996 A
5563988 Maes et al. Oct 1996 A
5566272 Brems et al. Oct 1996 A
5570151 Terunuma et al. Oct 1996 A
5573506 Vasko Nov 1996 A
5577981 Jarvik Nov 1996 A
5579037 Tahara et al. Nov 1996 A
5579046 Mitsuhashi et al. Nov 1996 A
5579080 Irie et al. Nov 1996 A
5580249 Jacobsen et al. Dec 1996 A
5581323 Suzuki et al. Dec 1996 A
5581485 Van Aken Dec 1996 A
5581655 Cohen et al. Dec 1996 A
5594469 Freeman et al. Jan 1997 A
5597309 Riess Jan 1997 A
5600399 Yamada et al. Feb 1997 A
5602458 Dowe Feb 1997 A
5603127 Veal Feb 1997 A
5606390 Arai et al. Feb 1997 A
5609938 Shields Mar 1997 A
5614763 Womack Mar 1997 A
5615296 Stanford et al. Mar 1997 A
5616078 Oh Apr 1997 A
5617312 Iura et al. Apr 1997 A
5633678 Parulski et al. May 1997 A
5634141 Akashi et al. May 1997 A
5637849 Wang et al. Jun 1997 A
5638300 Johnson Jun 1997 A
5640612 Owashi Jun 1997 A
5641288 Zaenglein Jun 1997 A
5644642 Kirschbaum Jul 1997 A
5647025 Frost et al. Jul 1997 A
5655172 Omi et al. Aug 1997 A
5664021 Chu et al. Sep 1997 A
5664133 Malamud et al. Sep 1997 A
5664243 Okada et al. Sep 1997 A
5666215 Fredlund et al. Sep 1997 A
5666566 Gu et al. Sep 1997 A
5668928 Groner Sep 1997 A
5670992 Iizuka et al. Sep 1997 A
5672840 Sage et al. Sep 1997 A
5673327 Julstrom Sep 1997 A
5675633 Kopp et al. Oct 1997 A
5677834 Mooneyham Oct 1997 A
5680709 Stone Oct 1997 A
5682030 Kubon Oct 1997 A
5682196 Freeman Oct 1997 A
5682229 Wangler Oct 1997 A
5689619 Smyth Nov 1997 A
5690582 Ulrich et al. Nov 1997 A
5703367 Hashimoto et al. Dec 1997 A
5704837 Iwasaki et al. Jan 1998 A
5706049 Moghadam et al. Jan 1998 A
5708863 Satoh et al. Jan 1998 A
5710866 Alleva et al. Jan 1998 A
5715334 Peters Feb 1998 A
5715548 Weismiller et al. Feb 1998 A
5715834 Bergamasco et al. Feb 1998 A
5721783 Anderson Feb 1998 A
5724619 Hamada et al. Mar 1998 A
5729289 Etoh Mar 1998 A
5729659 Potter Mar 1998 A
5734425 Takizawa et al. Mar 1998 A
D393808 Lindsey et al. Apr 1998 S
5737491 Allen et al. Apr 1998 A
5740484 Miyazaki et al. Apr 1998 A
5742233 Hoffman et al. Apr 1998 A
5745717 Vayda et al. Apr 1998 A
5745810 Masushima Apr 1998 A
5748992 Tsukahara et al. May 1998 A
5749000 Narisawa May 1998 A
5749324 Moore May 1998 A
5751260 Miller et al. May 1998 A
5752094 Tsutsumi et al. May 1998 A
5757428 Takei May 1998 A
5760917 Sheridan Jun 1998 A
5765045 Takagi et al. Jun 1998 A
5771414 Bowen Jun 1998 A
5771511 Kummer et al. Jun 1998 A
5774754 Ootsuka Jun 1998 A
5774851 Miyashiba et al. Jun 1998 A
5779483 Cho Jul 1998 A
5788688 Bauer et al. Aug 1998 A
5797046 Nagano et al. Aug 1998 A
5797122 Spies Aug 1998 A
5805251 Ozawa Sep 1998 A
5809591 Capaldi et al. Sep 1998 A
5812978 Nolan Sep 1998 A
5815750 Ishiguro Sep 1998 A
5819183 Voroba et al. Oct 1998 A
5828376 Solimene et al. Oct 1998 A
5829782 Breed et al. Nov 1998 A
5832077 Ciurpita Nov 1998 A
5832440 Woodbridge et al. Nov 1998 A
5841950 Wang et al. Nov 1998 A
5844599 Hildin Dec 1998 A
5848146 Slattery Dec 1998 A
5850058 Tano et al. Dec 1998 A
5850211 Tognazzini Dec 1998 A
5850218 LaJoie et al. Dec 1998 A
5855000 Waibel et al. Dec 1998 A
5867817 Catalo et al. Feb 1999 A
5870709 Bernstein Feb 1999 A
5871589 Hedge Feb 1999 A
5874947 Lin Feb 1999 A
5875108 Hoffberg et al. Feb 1999 A
5877772 Nomura et al. Mar 1999 A
5877803 Wee et al. Mar 1999 A
5877809 Omata et al. Mar 1999 A
5878922 Boring Mar 1999 A
5884265 Squitteri et al. Mar 1999 A
5884350 Kurze Mar 1999 A
5893037 Reele et al. Apr 1999 A
5897228 Schrock Apr 1999 A
5897232 Stephenson et al. Apr 1999 A
5898779 Squilla et al. Apr 1999 A
5903864 Gadbois et al. May 1999 A
5903870 Kaufman May 1999 A
5907723 Inoue May 1999 A
5911687 Sato et al. Jun 1999 A
5913080 Yamada et al. Jun 1999 A
5913727 Ahdoot Jun 1999 A
5917921 Sasaki Jun 1999 A
5920350 Keirsbilck Jul 1999 A
5923908 Schrock et al. Jul 1999 A
5926655 Irie et al. Jul 1999 A
5930533 Yamamoto Jul 1999 A
5930746 Ting Jul 1999 A
5933125 Fernie et al. Aug 1999 A
5940121 McIntyre et al. Aug 1999 A
5943516 Uchiyama et al. Aug 1999 A
5959667 Maeng Sep 1999 A
5970258 Suda et al. Oct 1999 A
5970457 Brant et al. Oct 1999 A
5980124 Bernardi et al. Nov 1999 A
5980256 Carmein Nov 1999 A
5982555 Melville et al. Nov 1999 A
5983186 Miyazawa et al. Nov 1999 A
5989157 Walton Nov 1999 A
5991385 Dunn et al. Nov 1999 A
5991720 Galler et al. Nov 1999 A
5991726 Immarco et al. Nov 1999 A
5995649 Marugame Nov 1999 A
5995931 Bahl et al. Nov 1999 A
5995936 Brais et al. Nov 1999 A
6003004 Hershkovits et al. Dec 1999 A
6003991 Virre Dec 1999 A
6004061 Manico et al. Dec 1999 A
6005548 Latypov et al. Dec 1999 A
6005610 Pingali Dec 1999 A
6006126 Cosman Dec 1999 A
6006187 Tanenblatt Dec 1999 A
6009210 Kang Dec 1999 A
6012029 Cirino et al. Jan 2000 A
6012102 Shachar Jan 2000 A
6014524 Suzuki et al. Jan 2000 A
6016450 Crock Jan 2000 A
6021278 Bernardi et al. Feb 2000 A
6021418 Brandt et al. Feb 2000 A
6027216 Guyton et al. Feb 2000 A
6031526 Shipp Feb 2000 A
6040824 Maekawa et al. Mar 2000 A
6049766 Laroche Apr 2000 A
6050963 Johnson et al. Apr 2000 A
6054990 Tran Apr 2000 A
6054991 Crane et al. Apr 2000 A
6066075 Poulton May 2000 A
6067112 Wellner et al. May 2000 A
6070140 Tran May 2000 A
6072494 Nguyen Jun 2000 A
6073489 French et al. Jun 2000 A
6077085 Parry et al. Jun 2000 A
6077201 Cheng Jun 2000 A
6078886 Dragosh et al. Jun 2000 A
6081670 Madsen et al. Jun 2000 A
6085160 D'hoore et al. Jul 2000 A
6088669 Maes Jul 2000 A
6091334 Galiana Jul 2000 A
6098458 French et al. Aug 2000 A
6099473 Liu et al. Aug 2000 A
6100896 Strohecker et al. Aug 2000 A
6101115 Ross Aug 2000 A
6101258 Killion et al. Aug 2000 A
6101289 Kellner Aug 2000 A
6101338 Bernardi et al. Aug 2000 A
6104877 Smart et al. Aug 2000 A
6111580 Fukui et al. Aug 2000 A
6115482 Goldberg et al. Sep 2000 A
6115556 Reddington Sep 2000 A
6115668 Kaneko et al. Sep 2000 A
6118888 Chino et al. Sep 2000 A
6128003 Smith et al. Oct 2000 A
6128446 Schrock et al. Oct 2000 A
6130677 Kunz Oct 2000 A
6130741 Wen et al. Oct 2000 A
6134392 Gove Oct 2000 A
6137487 Mantha Oct 2000 A
6137887 Anderson Oct 2000 A
6138091 Haataja et al. Oct 2000 A
6141463 Cowell et al. Oct 2000 A
6144807 Smart et al. Nov 2000 A
6147678 Kumar et al. Nov 2000 A
6147711 Washio Nov 2000 A
6147744 Smart et al. Nov 2000 A
6148154 Ishimaru et al. Nov 2000 A
6152856 Studor et al. Nov 2000 A
6159100 Smith Dec 2000 A
6160540 Fishkin et al. Dec 2000 A
6161932 Goto et al. Dec 2000 A
6163652 Sato Dec 2000 A
6167469 Safai et al. Dec 2000 A
6169854 Hasegawa et al. Jan 2001 B1
6173059 Huang et al. Jan 2001 B1
6173066 Peurach et al. Jan 2001 B1
6181343 Lyons Jan 2001 B1
6181377 Kobayashi Jan 2001 B1
6181883 Oswal Jan 2001 B1
6185371 Smart et al. Feb 2001 B1
6188777 Darrell et al. Feb 2001 B1
6192193 Smart et al. Feb 2001 B1
6192343 Morgan et al. Feb 2001 B1
6201931 Cipolla et al. Mar 2001 B1
6204877 Kiyokawa Mar 2001 B1
6215471 Deluca Apr 2001 B1
6215890 Matsuo et al. Apr 2001 B1
6215898 Woodfill et al. Apr 2001 B1
6222993 Smart et al. Apr 2001 B1
6224542 Chang et al. May 2001 B1
6226396 Marugame May 2001 B1
6229913 Nayar et al. May 2001 B1
6230138 Everhart May 2001 B1
6240251 Smart et al. May 2001 B1
6243076 Hatfield Jun 2001 B1
6243683 Peters et al. Jun 2001 B1
6244873 Hill et al. Jun 2001 B1
6249316 Anderson Jun 2001 B1
6253184 Ruppert Jun 2001 B1
6256060 Wakui Jul 2001 B1
6256400 Takata et al. Jul 2001 B1
6259436 Moon et al. Jul 2001 B1
6266635 Sneh Jul 2001 B1
6272287 Cipola et al. Aug 2001 B1
6275656 Cipola et al. Aug 2001 B1
6278973 Chung et al. Aug 2001 B1
6279946 Johnson et al. Aug 2001 B1
6282317 Luo et al. Aug 2001 B1
6283860 Lyons et al. Sep 2001 B1
6287252 Lugo Sep 2001 B1
6289112 Jain et al. Sep 2001 B1
6289140 Oliver Sep 2001 B1
6294993 Calaman Sep 2001 B1
6299308 Voronka et al. Oct 2001 B1
6304841 Berger et al. Oct 2001 B1
6308565 French et al. Oct 2001 B1
6311156 Ho Oct 2001 B1
6313864 Tabata et al. Nov 2001 B1
6316934 Amorai-Moriya et al. Nov 2001 B1
6317717 Lindsey et al. Nov 2001 B1
6321040 Wess et al. Nov 2001 B1
6323858 Gilbert et al. Nov 2001 B1
6324545 Morag Nov 2001 B1
6327423 Ejima et al. Dec 2001 B1
6339429 Schug Jan 2002 B1
6344875 Hashimoto et al. Feb 2002 B1
6345111 Fukui et al. Feb 2002 B1
6349001 Spitzer Feb 2002 B1
6351222 Henry et al. Feb 2002 B1
6351273 Lemelson et al. Feb 2002 B1
6359837 Tsukamoto Mar 2002 B1
6363160 Bradski et al. Mar 2002 B1
6366319 Bills Apr 2002 B1
6373961 Richardson et al. Apr 2002 B1
6377923 Hershkovits et al. Apr 2002 B1
6381316 Joyce et al. Apr 2002 B2
6381412 Ishito et al. Apr 2002 B1
6384819 Hunter May 2002 B1
6388681 Nozaki May 2002 B1
6388707 Suda May 2002 B1
6389395 Ringland May 2002 B1
6392249 Struye et al. May 2002 B1
6393216 Ootsuka et al. May 2002 B1
6394602 Morrison et al. May 2002 B1
6405939 Mazzenga et al. Jun 2002 B1
6406758 Bottari et al. Jun 2002 B1
6408138 Chang et al. Jun 2002 B1
6408301 Patton et al. Jun 2002 B1
6411744 Edwards Jun 2002 B1
6411925 Keiller Jun 2002 B1
6424843 Jyrki et al. Jul 2002 B1
6426740 Goto et al. Jul 2002 B1
6426761 Kanevsky et al. Jul 2002 B1
6430551 Thelen et al. Aug 2002 B1
6430997 French et al. Aug 2002 B1
6434255 Harakawa Aug 2002 B1
6434403 Ausems et al. Aug 2002 B1
6438323 DeCecca et al. Aug 2002 B1
6438520 Curt et al. Aug 2002 B1
6452348 Toyoda Sep 2002 B1
6452544 Hakala et al. Sep 2002 B1
6456788 Otani Sep 2002 B1
6456892 Dara-Abrams et al. Sep 2002 B1
6466688 Ramstack Oct 2002 B1
6476834 Doval et al. Nov 2002 B1
6496598 Harman Dec 2002 B1
6498628 Iwamura Dec 2002 B2
6499016 Anderson Dec 2002 B1
6503195 Keller et al. Jan 2003 B1
6504552 Phillips Jan 2003 B2
6510414 Chaves Jan 2003 B1
6526352 Breed et al. Feb 2003 B1
6529802 Kawakita et al. Mar 2003 B1
6531999 Trajkovic Mar 2003 B1
6535694 Engle et al. Mar 2003 B2
6538697 Honda et al. Mar 2003 B1
6539931 Trajkovic et al. Apr 2003 B2
6549586 Gustafsson et al. Apr 2003 B2
6549629 Finn et al. Apr 2003 B2
6556240 Oka et al. Apr 2003 B2
6556784 Onuki Apr 2003 B2
6560027 Meine May 2003 B2
6563532 Strub et al. May 2003 B1
6570555 Prevost et al. May 2003 B1
6584221 Moghaddam et al. Jun 2003 B1
6591239 McCall Jul 2003 B1
6593956 Potts et al. Jul 2003 B1
6594629 Basu et al. Jul 2003 B1
6603858 Raicevich et al. Aug 2003 B1
6606280 Knittel Aug 2003 B1
6608615 Martins Aug 2003 B1
6611456 Kushnarenko Aug 2003 B2
6611661 Buck Aug 2003 B2
6629642 Swartz et al. Oct 2003 B1
6633231 Okamoto et al. Oct 2003 B1
6633294 Rosenthal et al. Oct 2003 B1
6636259 Anderson Oct 2003 B1
6637883 Tengshe et al. Oct 2003 B1
6640202 Dietz et al. Oct 2003 B1
6654721 Handelman Nov 2003 B2
6658389 Alpdemir Dec 2003 B1
6658572 Craig Dec 2003 B1
6661918 Gordon et al. Dec 2003 B1
6674964 Irie Jan 2004 B2
6675075 Engelsberg et al. Jan 2004 B1
6678398 Wolters et al. Jan 2004 B2
6681031 Cohen et al. Jan 2004 B2
6686844 Murase et al. Feb 2004 B2
6690374 Park et al. Feb 2004 B2
6691151 Cheyer et al. Feb 2004 B1
6704044 Foster et al. Mar 2004 B1
6704415 Katayama et al. Mar 2004 B1
6704422 Jensen Mar 2004 B1
6707475 Snyder Mar 2004 B1
6711536 Rees Mar 2004 B2
6714205 Miyashita et al. Mar 2004 B1
6714665 Hanna et al. Mar 2004 B1
6715003 Safai Mar 2004 B1
6717600 Dutta et al. Apr 2004 B2
6721001 Berstis Apr 2004 B1
6724873 Senna Da Silva Apr 2004 B2
6731799 Sun et al. May 2004 B1
6735562 Zhang et al. May 2004 B1
6738066 Nguyen May 2004 B1
6741266 Kamiwada et al. May 2004 B1
6746397 Lee et al. Jun 2004 B2
6750913 Noro et al. Jun 2004 B1
6754373 Cuetos et al. Jun 2004 B1
6757657 Kojima et al. Jun 2004 B1
6758563 Levola Jul 2004 B2
6763226 McZeal, Jr. Jul 2004 B1
6766036 Pryor Jul 2004 B1
6766176 Gupta et al. Jul 2004 B1
6771294 Antoniac et al. Aug 2004 B1
6788809 Grzeszczuk et al. Sep 2004 B1
6793128 Huffman Sep 2004 B2
6795558 Matsuo Sep 2004 B2
6795806 Lewis et al. Sep 2004 B1
6798890 Killion et al. Sep 2004 B2
6801637 Voronka et al. Oct 2004 B2
6802382 Hattori et al. Oct 2004 B2
6803887 Lauper et al. Oct 2004 B1
6804396 Higaki et al. Oct 2004 B2
6807529 Johnson et al. Oct 2004 B2
6809759 Chiang Oct 2004 B1
6812956 Ferren et al. Nov 2004 B2
6812968 Kermani Nov 2004 B1
6813439 Misumi et al. Nov 2004 B2
6813603 Groner et al. Nov 2004 B1
6813618 Loui et al. Nov 2004 B1
6817982 Fritz et al. Nov 2004 B2
6821034 Ohmura Nov 2004 B2
6825769 Colmenarez et al. Nov 2004 B2
6833867 Anderson Dec 2004 B1
6842175 Schmalstieg et al. Jan 2005 B1
6842670 Lin et al. Jan 2005 B2
6847336 Lemelson et al. Jan 2005 B1
6853401 Fujii et al. Feb 2005 B2
6853972 Friedrich et al. Feb 2005 B2
6856708 Aoki Feb 2005 B1
6867798 Wada et al. Mar 2005 B1
6873723 Aucsmith et al. Mar 2005 B1
6882734 Watson et al. Apr 2005 B2
6882971 Craner Apr 2005 B2
6900731 Kreiner et al. May 2005 B2
6911972 Brinjes Jun 2005 B2
6912499 Sabourin et al. Jun 2005 B1
6919927 Hyodo Jul 2005 B1
6920283 Goldstein Jul 2005 B2
6920654 Noguchi et al. Jul 2005 B2
6927694 Smith Aug 2005 B1
6934461 Strub et al. Aug 2005 B1
6934684 Alpdemir et al. Aug 2005 B2
6937742 Roberts et al. Aug 2005 B2
6940545 Ray et al. Sep 2005 B1
6947029 Akasaka et al. Sep 2005 B2
6948937 Tretiakoff et al. Sep 2005 B2
6950534 Cohen et al. Sep 2005 B2
6952525 Lee et al. Oct 2005 B2
6956616 Jung et al. Oct 2005 B2
6959095 Bakis et al. Oct 2005 B2
6964023 Maes et al. Nov 2005 B2
6965403 Endo Nov 2005 B2
6970185 Halverson Nov 2005 B2
6970824 Hinde et al. Nov 2005 B2
6971072 Stein Nov 2005 B1
6975991 Basson et al. Dec 2005 B2
6983245 Jimenez Felstrom et al. Jan 2006 B1
6990455 Vozick et al. Jan 2006 B2
6993482 Ahlenius Jan 2006 B2
6999066 Litwiller Feb 2006 B2
7003134 Cowell et al. Feb 2006 B1
7006764 Brost Feb 2006 B2
7010263 Patsiokas Mar 2006 B1
7015950 Pryor Mar 2006 B1
7016505 Nakadai et al. Mar 2006 B1
7016604 Stavely et al. Mar 2006 B2
7020290 Ribic Mar 2006 B1
7027094 Battles et al. Apr 2006 B2
7027565 Tateishi et al. Apr 2006 B2
7028269 Cohen et al. Apr 2006 B1
7031439 Baxter Apr 2006 B2
7031477 Mella et al. Apr 2006 B1
7032182 Prabhu et al. Apr 2006 B2
7036094 Cohen et al. Apr 2006 B1
7039676 Day et al. May 2006 B1
7042440 Pryor et al. May 2006 B2
7046232 Gomi et al. May 2006 B2
7046300 Iyengar et al. May 2006 B2
7046924 Miller et al. May 2006 B2
7050606 Paul et al. May 2006 B2
7053938 Sherry May 2006 B1
7058204 Hildreth et al. Jun 2006 B2
7058409 Hänninen et al. Jun 2006 B2
7060957 Lange et al. Jun 2006 B2
7062576 Ohmura et al. Jun 2006 B2
7075579 Whitby et al. Jul 2006 B2
7076293 Wang Jul 2006 B2
7080014 Bush et al. Jul 2006 B2
7082393 Lahr Jul 2006 B2
7084859 Pryor Aug 2006 B1
7085590 Bates et al. Aug 2006 B2
7091928 Rajasingham Aug 2006 B2
7092024 Ejima et al. Aug 2006 B2
7095901 Lee et al. Aug 2006 B2
7095907 Berkner et al. Aug 2006 B1
7099920 Kojima et al. Aug 2006 B1
7107378 Brewer et al. Sep 2006 B1
7110553 Julstrom et al. Sep 2006 B1
7110582 Hay Sep 2006 B1
7112841 Eldridge et al. Sep 2006 B2
7113201 Taylor et al. Sep 2006 B1
7113918 Ahmad et al. Sep 2006 B1
7114659 Harari et al. Oct 2006 B2
7117519 Anderson et al. Oct 2006 B1
7120586 Loui et al. Oct 2006 B2
7121946 Paul et al. Oct 2006 B2
7122798 Shigenaka et al. Oct 2006 B2
7127401 Miller Oct 2006 B2
7133031 Wang et al. Nov 2006 B2
7133608 Nagata et al. Nov 2006 B1
7133937 Leavitt Nov 2006 B2
7134078 Vaarala Nov 2006 B2
7142197 Wang et al. Nov 2006 B2
7142231 Chipchase et al. Nov 2006 B2
7142678 Falcon Nov 2006 B2
7149552 Lair Dec 2006 B2
7149688 Schalkwyk Dec 2006 B2
7149814 Neufeld et al. Dec 2006 B2
7156866 Riggs et al. Jan 2007 B1
7158123 Myers et al. Jan 2007 B2
7158175 Belz et al. Jan 2007 B2
7163151 Kiiskinen Jan 2007 B2
7164117 Breed Jan 2007 B2
7167201 Stavely et al. Jan 2007 B2
7168804 Velazquez Jan 2007 B2
7170492 Bell Jan 2007 B2
7173722 Lapstun et al. Feb 2007 B1
7184573 Malone et al. Feb 2007 B2
7187412 Silverstein Mar 2007 B1
7187764 Ruetschi Mar 2007 B2
7190825 Yoon Mar 2007 B2
7194412 Mays Mar 2007 B2
7202898 Braun et al. Apr 2007 B1
7206022 Miller et al. Apr 2007 B2
7209995 Pinto et al. Apr 2007 B2
7218311 Akins May 2007 B2
7219062 Colmenarez et al. May 2007 B2
7221805 Bachelder May 2007 B1
7222078 Abelow May 2007 B2
7227526 Hildreth et al. Jun 2007 B2
7227960 Kataoka Jun 2007 B2
7228275 Endo et al. Jun 2007 B1
7233345 Yoneda Jun 2007 B2
7245271 Burr et al. Jul 2007 B2
7247139 Yudkovitch et al. Jul 2007 B2
7248855 Joyce et al. Jul 2007 B2
7257831 Ozawa Aug 2007 B1
7259747 Bell Aug 2007 B2
7259785 Stavely et al. Aug 2007 B2
7263953 Sundararajan Sep 2007 B2
7271827 Nister Sep 2007 B2
7272562 Olorenshaw et al. Sep 2007 B2
7274808 Ang et al. Sep 2007 B2
7283854 Sato et al. Oct 2007 B2
7283983 Dooley et al. Oct 2007 B2
7286256 Herbert Oct 2007 B2
7287737 Rossi Oct 2007 B2
7295978 Schwartz et al. Nov 2007 B1
7299177 Broman et al. Nov 2007 B2
7301465 Tengshe et al. Nov 2007 B2
7305344 Glynn et al. Dec 2007 B2
7305535 Harari et al. Dec 2007 B2
7307653 Dutta Dec 2007 B2
7308112 Fujimura et al. Dec 2007 B2
7315323 Ito Jan 2008 B2
7317836 Fujimura et al. Jan 2008 B2
7319962 Goedeke et al. Jan 2008 B2
7321763 Tanaka et al. Jan 2008 B2
7321853 Asano Jan 2008 B2
7324649 Knapp et al. Jan 2008 B1
7324943 Rigazio et al. Jan 2008 B2
7327890 Fredlund Feb 2008 B2
7340766 Nagao Mar 2008 B2
7346176 Bernardi et al. Mar 2008 B1
7346374 Witkowski et al. Mar 2008 B2
7347551 Fergason et al. Mar 2008 B2
7348963 Bell Mar 2008 B2
7349722 Witkowski et al. Mar 2008 B2
7362490 Park Apr 2008 B2
7362966 Uchiyama Apr 2008 B2
7366540 Ansari et al. Apr 2008 B2
7367887 Watabe et al. May 2008 B2
7373389 Rosenbaum et al. May 2008 B2
7376290 Anderson et al. May 2008 B2
7379563 Shamaie May 2008 B2
7379566 Hildreth May 2008 B2
7385641 Ito Jun 2008 B2
7389591 Jaiswal et al. Jun 2008 B2
7394480 Song Jul 2008 B2
7394543 Crowther Jul 2008 B2
7400347 Krogmann et al. Jul 2008 B2
7403816 Ohkura Jul 2008 B2
7405754 Inoue Jul 2008 B2
7406408 Lackey et al. Jul 2008 B1
7408439 Wang et al. Aug 2008 B2
7415116 Rees Aug 2008 B1
7417683 Hirai Aug 2008 B2
7428000 Cutler et al. Sep 2008 B2
7428708 Okamoto et al. Sep 2008 B2
7430312 Gu Sep 2008 B2
7430503 Walker Sep 2008 B1
7436496 Kawahito Oct 2008 B2
7437488 Ito et al. Oct 2008 B2
7438414 Rosenberg Oct 2008 B2
7440013 Funakura Oct 2008 B2
7443419 Anderson et al. Oct 2008 B2
7443447 Shirakawa Oct 2008 B2
7444068 Obrador Oct 2008 B2
7444340 Padgett Oct 2008 B2
7446368 Eldridge et al. Nov 2008 B2
7447320 Bryson et al. Nov 2008 B2
7447635 Konopka et al. Nov 2008 B1
7448751 Kiderman et al. Nov 2008 B2
7452275 Kuraishi Nov 2008 B2
7453605 Parulski et al. Nov 2008 B2
7455412 Rottcher Nov 2008 B2
7461094 Morris et al. Dec 2008 B2
7463304 Murray Dec 2008 B2
7468744 Edwards et al. Dec 2008 B2
7471317 Seki Dec 2008 B2
7477207 Estep Jan 2009 B2
7479943 Lunsford Jan 2009 B1
7483057 Grosvenor et al. Jan 2009 B2
7483061 Fredlund et al. Jan 2009 B2
7489812 Fox et al. Feb 2009 B2
7492116 Oleynikov et al. Feb 2009 B2
7493312 Liu et al. Feb 2009 B2
7493559 Wolff et al. Feb 2009 B1
7499642 Nakaya Mar 2009 B2
7502731 Emonts et al. Mar 2009 B2
7503065 Packingham et al. Mar 2009 B1
7505056 Kurzweil et al. Mar 2009 B2
7511741 Son Mar 2009 B2
7515193 Honda Apr 2009 B2
7515825 Takashi Apr 2009 B2
7518631 Hershey et al. Apr 2009 B2
7518641 Mashitani et al. Apr 2009 B2
7522065 Falcon Apr 2009 B2
7526314 Kennedy Apr 2009 B2
7526735 Fischer et al. Apr 2009 B2
7528846 Zhang et al. May 2009 B2
7529772 Singh May 2009 B2
7536032 Bell May 2009 B2
7539353 Kawada May 2009 B2
7548255 Adams et al. Jun 2009 B2
7551354 Horsten et al. Jun 2009 B2
7557850 Abe Jul 2009 B2
7560701 Oggier et al. Jul 2009 B2
7561143 Milekic Jul 2009 B1
7561201 Hong Jul 2009 B2
7561741 Lee et al. Jul 2009 B2
7570884 Nonaka Aug 2009 B2
7574020 Shamaie Aug 2009 B2
7576727 Bell Aug 2009 B2
7580570 Manu et al. Aug 2009 B2
7583316 Miyashita et al. Sep 2009 B2
7583441 Taki Sep 2009 B2
7587318 Seshadri Sep 2009 B2
7590262 Fujimura et al. Sep 2009 B2
7593552 Higaki et al. Sep 2009 B2
7593854 Belrose Sep 2009 B2
7598942 Underkoffler et al. Oct 2009 B2
7600201 Endler et al. Oct 2009 B2
7607509 Schmiz et al. Oct 2009 B2
7612766 Shintome Nov 2009 B2
7617108 Matsubara et al. Nov 2009 B2
7619660 Grosvenor Nov 2009 B2
7620202 Fujimura et al. Nov 2009 B2
7620432 Willins et al. Nov 2009 B2
7629400 Hyman Dec 2009 B2
7630878 Fingscheidt et al. Dec 2009 B2
7643985 Horvitz Jan 2010 B2
7646193 Yoshio et al. Jan 2010 B2
7656426 Yamaya Feb 2010 B2
7657062 Pilu Feb 2010 B2
7672512 Cohen et al. Mar 2010 B2
7680287 Amada et al. Mar 2010 B2
7684016 Schaefer Mar 2010 B1
7684592 Paul et al. Mar 2010 B2
7684982 Taneda Mar 2010 B2
7685521 Ueda et al. Mar 2010 B1
7689404 Khasin Mar 2010 B2
7693720 Kennewick et al. Apr 2010 B2
7694218 Masuda et al. Apr 2010 B2
7698125 Graehl et al. Apr 2010 B2
7702130 Ho et al. Apr 2010 B2
7702516 Fellenstein et al. Apr 2010 B2
7702821 Feinberg et al. Apr 2010 B2
7704135 Harrison, Jr. Apr 2010 B2
7706553 Brown Apr 2010 B2
7707035 McCune Apr 2010 B2
7710391 Bell et al. May 2010 B2
7714880 Johnson May 2010 B2
7716050 Gillick et al. May 2010 B2
7742073 Brodsky et al. Jun 2010 B1
7760191 Cohen et al. Jul 2010 B2
7761297 Lee Jul 2010 B2
7764290 Fredlund et al. Jul 2010 B2
7764320 Salvato Jul 2010 B1
7772796 Farritor et al. Aug 2010 B2
7778438 Malone Aug 2010 B2
7782365 Levien et al. Aug 2010 B2
7783022 Jay et al. Aug 2010 B1
7783063 Pocino et al. Aug 2010 B2
7809197 Fedorovskaya et al. Oct 2010 B2
7809570 Kennewick et al. Oct 2010 B2
7813597 Cohen et al. Oct 2010 B2
7815507 Parrott et al. Oct 2010 B2
7821541 Delean Oct 2010 B2
7822613 Matsubara et al. Oct 2010 B2
7843495 Aas et al. Nov 2010 B2
7848535 Akino Dec 2010 B2
7849475 Covell et al. Dec 2010 B2
7853050 Wang et al. Dec 2010 B2
7864937 Bathurst et al. Jan 2011 B2
7869578 Evans et al. Jan 2011 B2
7869636 Korotkov Jan 2011 B2
7872675 Levien et al. Jan 2011 B2
7876334 Bychkov et al. Jan 2011 B2
7876357 Jung et al. Jan 2011 B2
7884849 Yin et al. Feb 2011 B2
7890862 Kompe et al. Feb 2011 B2
7896869 DiSilvestro et al. Mar 2011 B2
7898563 Park Mar 2011 B2
7904023 Viitamäki Mar 2011 B2
7907199 Seki et al. Mar 2011 B2
7907638 Norhammar et al. Mar 2011 B2
7908629 Lewis Mar 2011 B2
7916849 Bathurst et al. Mar 2011 B2
7917367 Cristo et al. Mar 2011 B2
7920102 Breed Apr 2011 B2
7920169 Jung et al. Apr 2011 B2
7940299 Geng May 2011 B2
7940897 Khor et al. May 2011 B2
7942816 Satoh et al. May 2011 B2
7949529 Weider et al. May 2011 B2
7957766 Gong et al. Jun 2011 B2
7960935 Farritor et al. Jun 2011 B2
7983917 Kennewick et al. Jul 2011 B2
7990413 Good Aug 2011 B2
8023998 Croome Sep 2011 B2
8031853 Bathurst et al. Oct 2011 B2
8035624 Bell et al. Oct 2011 B2
8036893 Reich Oct 2011 B2
8037229 Zer et al. Oct 2011 B2
8042044 Leeuwen Oct 2011 B2
8045050 Nogo et al. Oct 2011 B2
8046504 Feinberg et al. Oct 2011 B2
8046818 Ngo Oct 2011 B2
8059921 Frohlich et al. Nov 2011 B2
8064650 Webb Nov 2011 B2
8072470 Marks Dec 2011 B2
8073690 Nakadai et al. Dec 2011 B2
8085994 Kim Dec 2011 B2
8094212 Jelinek Jan 2012 B2
8102383 Cohen et al. Jan 2012 B2
8106066 Schumacher et al. Jan 2012 B2
8115868 Yang et al. Feb 2012 B2
8117623 Malasky et al. Feb 2012 B1
8125444 Noerager Feb 2012 B2
8140813 Ozceri et al. Mar 2012 B2
8165341 Rhoads Apr 2012 B2
8175883 Grant et al. May 2012 B2
8176515 Ahmad et al. May 2012 B2
8213633 Kobayashi et al. Jul 2012 B2
8214196 Yamada et al. Jul 2012 B2
8224776 Anderson et al. Jul 2012 B1
8226011 Merkli et al. Jul 2012 B2
8229252 Cohen et al. Jul 2012 B2
8232979 Cohen et al. Jul 2012 B2
8234106 Marcu et al. Jul 2012 B2
8237809 Mertens Aug 2012 B2
8238722 Bhadkamkar et al. Aug 2012 B2
8244542 Claudatos et al. Aug 2012 B2
8290313 Cohen et al. Oct 2012 B2
8296127 Marcu et al. Oct 2012 B2
8332224 Di Cristo et al. Dec 2012 B2
8339420 Hiraoka Dec 2012 B2
8341522 Jung et al. Dec 2012 B2
8345105 Fisher et al. Jan 2013 B2
8350683 DeLine et al. Jan 2013 B2
8350946 Jung et al. Jan 2013 B2
8381135 Hotelling et al. Feb 2013 B2
8384668 Barney et al. Feb 2013 B2
8386909 Lin Feb 2013 B2
8396242 Watanabe Mar 2013 B2
8407201 Wu et al. Mar 2013 B2
8429244 Naimark et al. Apr 2013 B2
8457614 Bernard et al. Jun 2013 B2
8460103 Mattice et al. Jun 2013 B2
8467672 Konicek Jun 2013 B2
8543906 Chidlovskii et al. Sep 2013 B2
8548794 Koehn Oct 2013 B2
8558921 Walker et al. Oct 2013 B2
8571851 Tickner et al. Oct 2013 B1
8582831 Miura Nov 2013 B2
8587514 Lundström Nov 2013 B2
8594341 Rothschild Nov 2013 B2
8599174 Cohen et al. Dec 2013 B2
8600669 Skarine Dec 2013 B2
8600728 Knight et al. Dec 2013 B2
8606383 Jung et al. Dec 2013 B2
8614760 Nobels Dec 2013 B2
8625880 Shillman et al. Jan 2014 B2
8631322 Isomura et al. Jan 2014 B2
8634575 Williams Jan 2014 B2
8640959 Cohen et al. Feb 2014 B2
8644525 Bathurst et al. Feb 2014 B2
8645325 Anderson et al. Feb 2014 B2
8661333 Matsuda et al. Feb 2014 B2
8666725 Och Mar 2014 B2
8668584 Wels Mar 2014 B2
8670632 Wilson Mar 2014 B2
8681225 Levien et al. Mar 2014 B2
8682005 Watson et al. Mar 2014 B2
8684839 Mattice et al. Apr 2014 B2
8687820 Truong et al. Apr 2014 B2
8699869 Kamimura Apr 2014 B2
8711188 Albrecht et al. Apr 2014 B2
8745541 Wilson et al. Jun 2014 B2
8750513 Renkis Jun 2014 B2
8761840 Dunko Jun 2014 B2
8768099 Derrenberger et al. Jul 2014 B2
8781191 Lang et al. Jul 2014 B2
8819596 Holopainen et al. Aug 2014 B2
8831951 Cohen Sep 2014 B2
8843950 Zhang Sep 2014 B2
8848987 Nölle et al. Sep 2014 B2
8886517 Soricut et al. Nov 2014 B2
8902320 Jung et al. Dec 2014 B2
8921473 Hyman Dec 2014 B1
8970725 Dekker et al. Mar 2015 B2
8988537 Jung et al. Mar 2015 B2
9001215 Jung et al. Apr 2015 B2
9041826 Jung et al. May 2015 B2
9082456 Jung et al. Jul 2015 B2
9098826 Jung et al. Aug 2015 B2
9098958 Joyce et al. Aug 2015 B2
9100742 Pearah et al. Aug 2015 B2
9124729 Jung et al. Sep 2015 B2
9152840 Puolitaival et al. Oct 2015 B2
9155373 Allen et al. Oct 2015 B2
9191611 Levien et al. Nov 2015 B2
9239677 Ordin Jan 2016 B2
9274598 Beymer et al. Mar 2016 B2
9325781 Jung et al. Apr 2016 B2
9342829 Zhou et al. May 2016 B2
9451200 Levien et al. Sep 2016 B2
9467642 Hiraide et al. Oct 2016 B2
9489671 Zhou et al. Nov 2016 B2
9489717 Jung et al. Nov 2016 B2
9600832 Zhou Mar 2017 B2
9621749 Jung et al. Apr 2017 B2
9646614 Bellegarda et al. May 2017 B2
9652032 Mitchell May 2017 B2
9652042 Oliver et al. May 2017 B2
9659212 Nguyen et al. May 2017 B2
9691388 Bodin et al. Jun 2017 B2
9704502 Malamud et al. Jul 2017 B2
9779750 Allen et al. Oct 2017 B2
9819490 Jung et al. Nov 2017 B2
9910341 Jung et al. Mar 2018 B2
9942420 Rao et al. Apr 2018 B2
9943372 Sholev et al. Apr 2018 B2
10003762 Jung et al. Jun 2018 B2
10039445 Torch Aug 2018 B1
10055046 Lengeling et al. Aug 2018 B2
10076705 Deshpande et al. Sep 2018 B2
10097756 Levien et al. Oct 2018 B2
10126828 Amento et al. Nov 2018 B2
10318871 Cheyer et al. Jun 2019 B2
10460346 Decre et al. Oct 2019 B2
10488950 Wilson Nov 2019 B2
10514816 Jung et al. Dec 2019 B2
10545645 Kim et al. Jan 2020 B2
10551930 Oliver Feb 2020 B2
10721066 Malone Jul 2020 B2
10915171 Shell et al. Feb 2021 B2
10966239 Lewis Mar 2021 B1
20010010543 Ward et al. Aug 2001 A1
20010012065 Ejima et al. Aug 2001 A1
20010012066 Parulski et al. Aug 2001 A1
20010014835 Gauthier et al. Aug 2001 A1
20010015751 Geng Aug 2001 A1
20010019359 Parulski et al. Sep 2001 A1
20010020777 Johnson et al. Sep 2001 A1
20010022618 Ward et al. Sep 2001 A1
20010028474 Parulski et al. Oct 2001 A1
20010030773 Matsuura et al. Oct 2001 A1
20010034783 Kitamura Oct 2001 A1
20010048774 Seki et al. Dec 2001 A1
20010051874 Tsuji Dec 2001 A1
20010054183 Curreri Dec 2001 A1
20010056342 Piehn et al. Dec 2001 A1
20020005907 Alten Jan 2002 A1
20020007510 Mann Jan 2002 A1
20020008765 Ejima et al. Jan 2002 A1
20020013701 Oliver et al. Jan 2002 A1
20020015037 Moore et al. Feb 2002 A1
20020019584 Schulze et al. Feb 2002 A1
20020030831 Kinjo Mar 2002 A1
20020047905 Kinjo Apr 2002 A1
20020049589 Poirier Apr 2002 A1
20020051074 Kawaoka May 2002 A1
20020051638 Arakawa May 2002 A1
20020054030 Murphy May 2002 A1
20020054175 Miettinen et al. May 2002 A1
20020059215 Kotani et al. May 2002 A1
20020068600 Chihara et al. Jun 2002 A1
20020071277 Ashbrook et al. Jun 2002 A1
20020072918 White et al. Jun 2002 A1
20020076100 Luo Jun 2002 A1
20020080239 Fujii et al. Jun 2002 A1
20020080251 Moriwaki Jun 2002 A1
20020080257 Blank Jun 2002 A1
20020082844 Van Gestel Jun 2002 A1
20020087546 Slater et al. Jul 2002 A1
20020089543 Ostergaard et al. Jul 2002 A1
20020091511 Hellwig et al. Jul 2002 A1
20020097218 Gutta et al. Jul 2002 A1
20020101539 Yokota Aug 2002 A1
20020101568 Eberl et al. Aug 2002 A1
20020101619 Tsubaki et al. Aug 2002 A1
20020103651 Alexander et al. Aug 2002 A1
20020103813 Frigon Aug 2002 A1
20020105482 Lemelson et al. Aug 2002 A1
20020105575 Hinde Aug 2002 A1
20020106041 Chang et al. Aug 2002 A1
20020107694 Lerg Aug 2002 A1
20020116197 Erten Aug 2002 A1
20020120643 Iyengar et al. Aug 2002 A1
20020140803 Gutta et al. Oct 2002 A1
20020150869 Shpiro Oct 2002 A1
20020166557 Cooper Nov 2002 A1
20020178010 Weaver et al. Nov 2002 A1
20020188571 Pilgrim Dec 2002 A1
20020188693 Simpson et al. Dec 2002 A1
20020191076 Wada et al. Dec 2002 A1
20020194414 Bateman et al. Dec 2002 A1
20020196358 Kim Dec 2002 A1
20020196360 Miyadera Dec 2002 A1
20030001908 Cohen Jan 2003 A1
20030001949 Obata et al. Jan 2003 A1
20030004727 Keiller Jan 2003 A1
20030004728 Keiller Jan 2003 A1
20030009329 Stahl et al. Jan 2003 A1
20030009335 Schalkwyk et al. Jan 2003 A1
20030016856 Walker et al. Jan 2003 A1
20030018472 Hershkovits et al. Jan 2003 A1
20030023439 Ciurpita et al. Jan 2003 A1
20030030731 Colby Feb 2003 A1
20030032435 Asada et al. Feb 2003 A1
20030035084 Makino Feb 2003 A1
20030040910 Bruwer Feb 2003 A1
20030043271 Dantwala Mar 2003 A1
20030055653 Ishii et al. Mar 2003 A1
20030063208 Kazami Apr 2003 A1
20030075067 Welch et al. Apr 2003 A1
20030076312 Yokoyama Apr 2003 A1
20030076408 Dutta Apr 2003 A1
20030076980 Zhang et al. Apr 2003 A1
20030081738 Kohnle et al. May 2003 A1
20030083872 Kikinis May 2003 A1
20030090572 Belz et al. May 2003 A1
20030095154 Colmenarez May 2003 A1
20030101052 Chen et al. May 2003 A1
20030112267 Belrose Jun 2003 A1
20030114202 Suh et al. Jun 2003 A1
20030115167 Sharif et al. Jun 2003 A1
20030120183 Simmons Jun 2003 A1
20030122507 Gutta et al. Jul 2003 A1
20030122777 Grover Jul 2003 A1
20030132950 Surucu et al. Jul 2003 A1
20030133015 Jackel et al. Jul 2003 A1
20030133577 Yoshida Jul 2003 A1
20030142041 Barlow et al. Jul 2003 A1
20030142215 Ward et al. Jul 2003 A1
20030154078 Rees Aug 2003 A1
20030163289 Whelan et al. Aug 2003 A1
20030163313 Rees Aug 2003 A1
20030163324 Abbasi Aug 2003 A1
20030163325 Maase Aug 2003 A1
20030175010 Nomura et al. Sep 2003 A1
20030177012 Drennan Sep 2003 A1
20030179888 Burnett et al. Sep 2003 A1
20030182130 Sun et al. Sep 2003 A1
20030184651 Ohsawa et al. Oct 2003 A1
20030189642 Bean et al. Oct 2003 A1
20030200089 Nakagawa et al. Oct 2003 A1
20030202243 Boys et al. Oct 2003 A1
20030204403 Browning Oct 2003 A1
20030206491 Pacheco et al. Nov 2003 A1
20030210255 Hiraki Nov 2003 A1
20030214524 Oka Nov 2003 A1
20030215128 Thompson Nov 2003 A1
20030222892 Diamond et al. Dec 2003 A1
20030234878 Yang Dec 2003 A1
20040001588 Hairston Jan 2004 A1
20040003151 Bateman et al. Jan 2004 A1
20040003341 alSafadi et al. Jan 2004 A1
20040004737 Kahn et al. Jan 2004 A1
20040005915 Hunter Jan 2004 A1
20040008263 Sayers et al. Jan 2004 A1
20040015364 Sulc Jan 2004 A1
20040037450 Bradski Feb 2004 A1
20040040086 Eisenberg et al. Mar 2004 A1
20040041904 Lapalme et al. Mar 2004 A1
20040041921 Coates Mar 2004 A1
20040051804 Veturino et al. Mar 2004 A1
20040054358 Cox et al. Mar 2004 A1
20040054539 Simpson Mar 2004 A1
20040056870 Shimoyama et al. Mar 2004 A1
20040059573 Cheong Mar 2004 A1
20040061783 Choi et al. Apr 2004 A1
20040064834 Kuwata Apr 2004 A1
20040070670 Foster Apr 2004 A1
20040080624 Yuen Apr 2004 A1
20040082341 Stanforth Apr 2004 A1
20040085454 Liao May 2004 A1
20040087838 Galloway et al. May 2004 A1
20040095395 Kurtenbach May 2004 A1
20040100505 Cazier May 2004 A1
20040103111 Miller et al. May 2004 A1
20040109096 Anderson et al. Jun 2004 A1
20040109150 Igarashi Jun 2004 A1
20040119754 Bangalore et al. Jun 2004 A1
20040125220 Fukuda et al. Jul 2004 A1
20040139929 Nightlinger et al. Jul 2004 A1
20040140971 Yamazaki et al. Jul 2004 A1
20040143440 Prasad et al. Jul 2004 A1
20040145660 Kusaka Jul 2004 A1
20040160463 Battles et al. Aug 2004 A1
20040172419 Morris et al. Sep 2004 A1
20040189856 Tanaka Sep 2004 A1
20040190874 Lei et al. Sep 2004 A1
20040192421 Kawahara Sep 2004 A1
20040193326 Phillips et al. Sep 2004 A1
20040196399 Stavely Oct 2004 A1
20040196400 Battles et al. Oct 2004 A1
20040201681 Chen et al. Oct 2004 A1
20040201709 Mcintyre et al. Oct 2004 A1
20040201738 Moores et al. Oct 2004 A1
20040205655 Wu Oct 2004 A1
20040212713 Takemoto et al. Oct 2004 A1
20040215464 Nelson Oct 2004 A1
20040218045 Bodnar et al. Nov 2004 A1
20040233173 Bryant Nov 2004 A1
20040246272 Ramian Dec 2004 A1
20040246386 Thomas et al. Dec 2004 A1
20040256009 Valenzuela Dec 2004 A1
20040260554 Connell et al. Dec 2004 A1
20040264726 Gauger, Jr. et al. Dec 2004 A1
20040267521 Cutler et al. Dec 2004 A1
20050001024 Kusaka et al. Jan 2005 A1
20050001902 Brogan et al. Jan 2005 A1
20050007468 Stavely et al. Jan 2005 A1
20050007552 Fergason et al. Jan 2005 A1
20050014998 Korotkov Jan 2005 A1
20050015710 Williams Jan 2005 A1
20050030296 Stohrer et al. Feb 2005 A1
20050036034 Rea et al. Feb 2005 A1
20050047629 Farrell et al. Mar 2005 A1
20050048918 Frost et al. Mar 2005 A1
20050052548 Delaney Mar 2005 A1
20050052558 Hikeki et al. Mar 2005 A1
20050055479 Zer et al. Mar 2005 A1
20050055636 Graves Mar 2005 A1
20050060142 Visser et al. Mar 2005 A1
20050068171 Kelliher et al. Mar 2005 A1
20050086056 Yoda et al. Apr 2005 A1
20050090201 Lengies et al. Apr 2005 A1
20050093976 Valleriano et al. May 2005 A1
20050094019 Grosvenor et al. May 2005 A1
20050096034 Petermann May 2005 A1
20050096084 Pohja et al. May 2005 A1
20050097173 Johns et al. May 2005 A1
20050100224 Henry et al. May 2005 A1
20050102133 Rees May 2005 A1
20050102141 Chikuri May 2005 A1
20050102148 Rogitz May 2005 A1
20050102167 Kapoor May 2005 A1
20050104958 Egnal et al. May 2005 A1
20050110878 Dalton May 2005 A1
20050114131 Stoimenov et al. May 2005 A1
20050114357 Chengalvarayan et al. May 2005 A1
20050118990 Stephens Jun 2005 A1
20050119894 Cutler et al. Jun 2005 A1
20050122404 Liu Jun 2005 A1
20050128192 Heintzman et al. Jun 2005 A1
20050128311 Rees et al. Jun 2005 A1
20050130611 Lu et al. Jun 2005 A1
20050131685 Roth et al. Jun 2005 A1
20050134685 Egnal et al. Jun 2005 A1
20050134939 Ikeda et al. Jun 2005 A1
20050137786 Breed et al. Jun 2005 A1
20050146609 Creamer et al. Jul 2005 A1
20050146612 Ward et al. Jul 2005 A1
20050146620 Jour et al. Jul 2005 A1
20050146621 Tanaka et al. Jul 2005 A1
20050146746 Parulski et al. Jul 2005 A1
20050149334 Chen Jul 2005 A1
20050149336 Cooley Jul 2005 A1
20050149979 Creamer et al. Jul 2005 A1
20050159955 Oerder Jul 2005 A1
20050161510 Kiiskinen Jul 2005 A1
20050164148 Sinclair Jul 2005 A1
20050168579 Imamura Aug 2005 A1
20050171955 Hull et al. Aug 2005 A1
20050179811 Palatov Aug 2005 A1
20050181774 Miyata Aug 2005 A1
20050181806 Dowling et al. Aug 2005 A1
20050192808 Sugiyama Sep 2005 A1
20050195309 Kim et al. Sep 2005 A1
20050200478 Koch et al. Sep 2005 A1
20050200718 Lee Sep 2005 A1
20050202844 Jabri et al. Sep 2005 A1
20050203740 Chambers et al. Sep 2005 A1
20050212765 Ogino Sep 2005 A1
20050212817 Cannon et al. Sep 2005 A1
20050213147 Minatogawa Sep 2005 A1
20050216862 Shinohara et al. Sep 2005 A1
20050219396 Tella Oct 2005 A1
20050249023 Bodlaender Nov 2005 A1
20050254813 Brendzel Nov 2005 A1
20050259173 Nakajima et al. Nov 2005 A1
20050266839 Paul et al. Dec 2005 A1
20050267676 Nezu et al. Dec 2005 A1
20050271117 Grassl et al. Dec 2005 A1
20050273489 Pecht et al. Dec 2005 A1
20050275632 Pu et al. Dec 2005 A1
20060005629 Tokunaga et al. Jan 2006 A1
20060008256 Khedouri et al. Jan 2006 A1
20060013197 Anderson Jan 2006 A1
20060013446 Stephens Jan 2006 A1
20060017832 Kemppinen Jan 2006 A1
20060017833 Gong et al. Jan 2006 A1
20060030956 Kumar Feb 2006 A1
20060031126 Ma et al. Feb 2006 A1
20060035651 Arponen et al. Feb 2006 A1
20060036441 Hirota Feb 2006 A1
20060036947 Crenshaw et al. Feb 2006 A1
20060041632 Shah et al. Feb 2006 A1
20060044285 Ito et al. Mar 2006 A1
20060061544 Ho et al. Mar 2006 A1
20060061663 Park Mar 2006 A1
20060066744 Stavely et al. Mar 2006 A1
20060075344 Jung et al. Apr 2006 A1
20060078275 Oowa Apr 2006 A1
20060085187 Barquilla Apr 2006 A1
20060090132 Jung et al. Apr 2006 A1
20060092291 Bodie May 2006 A1
20060097993 Hietala et al. May 2006 A1
20060099995 Kim et al. May 2006 A1
20060101116 Rittman et al. May 2006 A1
20060101464 Dohrmann May 2006 A1
20060103627 Watanabe et al. May 2006 A1
20060103762 Ly et al. May 2006 A1
20060104454 Guitarte et al. May 2006 A1
20060109201 Lee et al. May 2006 A1
20060109242 Simpkins May 2006 A1
20060114337 Rothschild Jun 2006 A1
20060114338 Rothschild Jun 2006 A1
20060114514 Rothschild Jun 2006 A1
20060114516 Rothschild Jun 2006 A1
20060120712 Kim Jun 2006 A1
20060129908 Markel Jun 2006 A1
20060132431 Eliezer et al. Jun 2006 A1
20060132624 Yuyama Jun 2006 A1
20060136221 James et al. Jun 2006 A1
20060139459 Zhong Jun 2006 A1
20060140420 Machida Jun 2006 A1
20060142740 Sherman et al. Jun 2006 A1
20060143017 Sonoura et al. Jun 2006 A1
20060143607 Morris Jun 2006 A1
20060143684 Morris Jun 2006 A1
20060146009 Koivunen et al. Jul 2006 A1
20060155549 Miyazaki Jul 2006 A1
20060158426 Hagiwara Jul 2006 A1
20060166620 Sorensen Jul 2006 A1
20060170669 Garcia et al. Aug 2006 A1
20060176305 Arcas et al. Aug 2006 A1
20060182045 Anderson Aug 2006 A1
20060187212 Park et al. Aug 2006 A1
20060189349 Montulli et al. Aug 2006 A1
20060192775 Demaio et al. Aug 2006 A1
20060289348 Montulli et al. Aug 2006 A1
20060206331 Hennecke et al. Sep 2006 A1
20060208169 Breed Sep 2006 A1
20060209013 Fengels Sep 2006 A1
20060215035 Kulas Sep 2006 A1
20060215041 Kobayashi Sep 2006 A1
20060221197 Jung et al. Oct 2006 A1
20060222216 Harris et al. Oct 2006 A1
20060223503 Muhonen et al. Oct 2006 A1
20060232551 Matta Oct 2006 A1
20060238550 Page Oct 2006 A1
20060239672 Yost et al. Oct 2006 A1
20060250505 Gennetten et al. Nov 2006 A1
20060251338 Gokturk et al. Nov 2006 A1
20060251339 Gokturk et al. Nov 2006 A1
20060256082 Cho et al. Nov 2006 A1
20060256090 Huppi Nov 2006 A1
20060257827 Ellenson Nov 2006 A1
20060262192 Ejima Nov 2006 A1
20060266371 Vainshelboim et al. Nov 2006 A1
20060267927 Augustine et al. Nov 2006 A1
20060271612 Ritter et al. Nov 2006 A1
20060282472 Ng et al. Dec 2006 A1
20060282572 Steinberg et al. Dec 2006 A1
20060284969 Kim et al. Dec 2006 A1
20070003140 Morita et al. Jan 2007 A1
20070003168 Oliver Jan 2007 A1
20070013662 Fauth Jan 2007 A1
20070021068 Dewhurst Jan 2007 A1
20070030351 Blancoj et al. Feb 2007 A1
20070046641 Lim Mar 2007 A1
20070046694 Aizikowitz et al. Mar 2007 A1
20070050433 Kim Mar 2007 A1
20070057912 Cupal et al. Mar 2007 A1
20070058990 Weaver et al. Mar 2007 A1
20070063979 Tran Mar 2007 A1
20070067054 Danish Mar 2007 A1
20070067707 Travis et al. Mar 2007 A1
20070081090 Singh Apr 2007 A1
20070081744 Gokturk et al. Apr 2007 A1
20070085914 Lim Apr 2007 A1
20070086773 Hansson et al. Apr 2007 A1
20070088556 Andrew Apr 2007 A1
20070100632 Aubauer May 2007 A1
20070123251 McElvaney May 2007 A1
20070124694 Sluis et al. May 2007 A1
20070127575 Ho Jun 2007 A1
20070132413 Mays Jun 2007 A1
20070242269 Trainer Oct 2007 A1
20070262965 Hirai et al. Nov 2007 A1
20070273611 Torch Nov 2007 A1
20080019489 Lynn Jan 2008 A1
20080024594 Ritchey Jan 2008 A1
20080026838 Dunstan et al. Jan 2008 A1
20080082426 Gokturk et al. Apr 2008 A1
20080096587 Rubinstein Apr 2008 A1
20080163416 Go Jul 2008 A1
20080174547 Kanevsky et al. Jul 2008 A1
20080177640 Gokturk et al. Jul 2008 A1
20080215337 Greene et al. Sep 2008 A1
20080225001 Lefebure et al. Sep 2008 A1
20080229198 Jung et al. Sep 2008 A1
20080239085 Kruijtzer Oct 2008 A1
20080249777 Thelen et al. Oct 2008 A1
20080273764 Scholl Nov 2008 A1
20080285886 Allen Nov 2008 A1
20080288895 Hollemans et al. Nov 2008 A1
20080309761 Kienzle et al. Dec 2008 A1
20090015509 Gottwald et al. Jan 2009 A1
20090018419 Torch Jan 2009 A1
20090018432 He et al. Jan 2009 A1
20090018828 Nakadai et al. Jan 2009 A1
20090030552 Nakadai et al. Jan 2009 A1
20090043580 Mozer et al. Feb 2009 A1
20090067590 Bushey et al. Mar 2009 A1
20090092955 Hwang Apr 2009 A1
20090215503 Zhang et al. Aug 2009 A1
20090227283 Pylvanainen Sep 2009 A1
20090247245 Strawn et al. Oct 2009 A1
20090280873 Burson Nov 2009 A1
20090316006 Vau et al. Dec 2009 A1
20100063820 Seshadri Mar 2010 A1
20100205667 Anderson et al. Aug 2010 A1
20110043617 Vertegaal et al. Feb 2011 A1
20120206050 Spero Aug 2012 A1
20120308039 Kobayashi et al. Dec 2012 A1
20130010208 Chiang Jan 2013 A1
20130016120 Karmanenko et al. Jan 2013 A1
20130114943 Ejima et al. May 2013 A1
20130155309 Hill et al. Jun 2013 A1
20130158367 Pacione Jun 2013 A1
20130215014 Pryor Aug 2013 A1
20130257709 Raffle Oct 2013 A1
20140104197 Khosravy et al. Apr 2014 A1
20140206479 Marty et al. Jul 2014 A1
20140282196 Zhao et al. Sep 2014 A1
20140347363 Kaburlasos Nov 2014 A1
20150029322 Ragland et al. Jan 2015 A1
20150070262 Karmarkar et al. Mar 2015 A1
20150312397 Chiang Oct 2015 A1
20160218884 Ebrom et al. Jul 2016 A1
20170161720 Xing et al. Jun 2017 A1
20190058847 Mayer et al. Feb 2019 A1
20200408965 Karam Dec 2020 A1
Foreign Referenced Citations (360)
Number Date Country
709833 Sep 1999 AU
2004221365 Feb 2011 AU
2498505 Aug 2006 CA
2423142 Mar 2013 CA
2409562 Dec 2000 CN
1338863 Mar 2002 CN
1391690 Jan 2003 CN
1394299 Jan 2003 CN
1412687 Apr 2003 CN
2591682 Dec 2003 CN
1507268 Jun 2004 CN
2717364 Aug 2005 CN
1954292 Apr 2007 CN
100345085 Oct 2007 CN
101262813 Sep 2008 CN
100454388 Jan 2009 CN
100542848 Sep 2009 CN
3102208 Dec 1981 DE
3219242 Jan 1983 DE
3238853 May 1983 DE
4022511 Jan 1992 DE
29510157 Aug 1995 DE
19529571 Feb 1997 DE
19856798 Dec 1999 DE
19829568 Jan 2000 DE
10022321 Nov 2001 DE
10313019 Feb 2005 DE
102004038965 Mar 2005 DE
0078015 May 1983 EP
0078016 May 1983 EP
0094449 Nov 1983 EP
0300648 Jan 1989 EP
0342628 Nov 1989 EP
0350957 Jan 1990 EP
0376618 Jul 1990 EP
0407914 Jul 1990 EP
0387341 Sep 1990 EP
0317758 Feb 1993 EP
0547357 Jun 1993 EP
0583061 Feb 1994 EP
0588161 Mar 1994 EP
0589622 Mar 1994 EP
0620941 Oct 1994 EP
0699940 Mar 1996 EP
0699941 Mar 1996 EP
0714586 Jun 1996 EP
0729266 Aug 1996 EP
0739121 Oct 1996 EP
0742679 Nov 1996 EP
0765079 Mar 1997 EP
0776130 May 1997 EP
0841655 May 1998 EP
0847003 Jun 1998 EP
0876035 Nov 1998 EP
0900424 Mar 1999 EP
0839349 Sep 1999 EP
0944019 Sep 1999 EP
0948198 Oct 1999 EP
0970583 Jan 2000 EP
0977080 Feb 2000 EP
0986230 Mar 2000 EP
0991260 Apr 2000 EP
0840920 May 2000 EP
0999518 May 2000 EP
1014338 Jun 2000 EP
1020847 Jul 2000 EP
1024658 Aug 2000 EP
1054391 Nov 2000 EP
1058876 Dec 2000 EP
1064783 Jan 2001 EP
1071277 Jan 2001 EP
1113416 Jul 2001 EP
1143724 Oct 2001 EP
1148703 Oct 2001 EP
1180903 Feb 2002 EP
1159670 Sep 2002 EP
1075760 Nov 2002 EP
1271095 Jan 2003 EP
1271346 Jan 2003 EP
1293927 Mar 2003 EP
1062800 Apr 2003 EP
1066717 May 2003 EP
1315146 May 2003 EP
1186162 Jul 2003 EP
1344445 Sep 2003 EP
1351544 Oct 2003 EP
1377041 Jan 2004 EP
1391806 Feb 2004 EP
1400814 Mar 2004 EP
1404105 Mar 2004 EP
1404108 Mar 2004 EP
1406133 Apr 2004 EP
1455529 Sep 2004 EP
1465420 Oct 2004 EP
1471455 Oct 2004 EP
1472679 Nov 2004 EP
1475968 Nov 2004 EP
1491980 Dec 2004 EP
0890156 Jan 2005 EP
1503581 Feb 2005 EP
1552698 Jul 2005 EP
1558028 Jul 2005 EP
1596362 Nov 2005 EP
1604350 Dec 2005 EP
1613061 Jan 2006 EP
1621017 Feb 2006 EP
1622349 Feb 2006 EP
1626574 Feb 2006 EP
1661122 May 2006 EP
1662362 May 2006 EP
1045586 Aug 2006 EP
1690410 Aug 2006 EP
1696363 Aug 2006 EP
1704710 Sep 2006 EP
1284080 Nov 2006 EP
1721452 Nov 2006 EP
1751741 Feb 2007 EP
1755441 Feb 2007 EP
1538821 Aug 2007 EP
1082671 Mar 2008 EP
1027627 Feb 2009 EP
2096405 Sep 2009 EP
2264895 Dec 2010 EP
1693827 Mar 2011 EP
1314151 May 2011 EP
2325722 May 2011 EP
0899650 Jun 2011 EP
1938573 Aug 2011 EP
1130906 Sep 2011 EP
1569076 Jan 2012 EP
2261778 Feb 2012 EP
1371233 Apr 2012 EP
1634432 Mar 2013 EP
2650759 Oct 2013 EP
2945154 Nov 2015 EP
2770400 Sep 2016 EP
1078818 Nov 2017 EP
1671480 May 2019 EP
2998781 Dec 2019 EP
2368347 Nov 2011 ES
2382694 Jun 2012 ES
2533513 Mar 1984 FR
2800571 May 2001 FR
2832016 May 2003 FR
2066620 Jul 1981 GB
2242989 Oct 1991 GB
2300742 Nov 1996 GB
2329800 Mar 1999 GB
2351817 Aug 1999 GB
2380556 Apr 2003 GB
2401752 Nov 2004 GB
2405948 Mar 2005 GB
2406455 Mar 2005 GB
2420251 May 2006 GB
2424055 Sep 2006 GB
2424730 Oct 2006 GB
2430332 Mar 2007 GB
S54107343 Aug 1979 JP
56012632 Feb 1981 JP
S5612632 Feb 1981 JP
58080631 May 1983 JP
S5880631 May 1983 JP
58137828 Aug 1983 JP
60205433 Oct 1985 JP
S60205433 Oct 1985 JP
S62189898 Aug 1987 JP
S6382197 Apr 1988 JP
1056428 Mar 1989 JP
S6456428 Mar 1989 JP
1191838 Aug 1989 JP
1191840 Aug 1989 JP
H01191838 Aug 1989 JP
H01191839 Aug 1989 JP
H01191840 Aug 1989 JP
H01193722 Aug 1989 JP
H0270195 Mar 1990 JP
H02153415 Jun 1990 JP
H02206975 Aug 1990 JP
64-56428 Sep 1990 JP
2230225 Sep 1990 JP
H02230225 Sep 1990 JP
H03180690 Aug 1991 JP
H04175073 Jun 1992 JP
H04-316035 Nov 1992 JP
H06321011 Nov 1994 JP
H07-84302 Mar 1995 JP
H07-84311 Mar 1995 JP
H0755755 Mar 1995 JP
H0772792 Mar 1995 JP
H08139980 May 1995 JP
H07333716 Dec 1995 JP
H09-186954 Jul 1997 JP
H1024785 Jan 1998 JP
H1031551 Feb 1998 JP
H1056428 Feb 1998 JP
H10117212 May 1998 JP
H10199422 Jul 1998 JP
H10269022 Oct 1998 JP
H11109498 Apr 1999 JP
H11143487 May 1999 JP
H11198745 Jul 1999 JP
H11-212726 Aug 1999 JP
H11511301 Sep 1999 JP
H11-355617 Dec 1999 JP
2000020677 Jan 2000 JP
2000-083186 Mar 2000 JP
2000101898 Apr 2000 JP
2000-163193 Jun 2000 JP
2000-221582 Aug 2000 JP
2000-231151 Aug 2000 JP
2000214525 Aug 2000 JP
2000227633 Aug 2000 JP
2000231142 Aug 2000 JP
2000235216 Aug 2000 JP
2000-285413 Oct 2000 JP
2000284794 Oct 2000 JP
2000347277 Dec 2000 JP
3124275 Jan 2001 JP
2001005485 Jan 2001 JP
2001027897 Jan 2001 JP
2001056796 Feb 2001 JP
2001305642 Feb 2001 JP
2001109878 Apr 2001 JP
3180690 Jun 2001 JP
2001266254 Sep 2001 JP
2001518828 Oct 2001 JP
2001320610 Nov 2001 JP
2002010369 Jan 2002 JP
2002-040545 Feb 2002 JP
2002049327 Feb 2002 JP
2002057764 Feb 2002 JP
2002135376 May 2002 JP
2002158953 May 2002 JP
2002183579 Jun 2002 JP
2002189723 Jul 2002 JP
2002-218092 Aug 2002 JP
2002252806 Sep 2002 JP
2002311990 Oct 2002 JP
2002345756 Dec 2002 JP
2002358162 Dec 2002 JP
2003010521 Jan 2003 JP
2003506148 Feb 2003 JP
2003066419 Mar 2003 JP
2003069884 Mar 2003 JP
2003075905 Mar 2003 JP
2003169291 Jun 2003 JP
2003281028 Oct 2003 JP
2003284050 Oct 2003 JP
2003309748 Oct 2003 JP
2003324649 Nov 2003 JP
2004504077 Feb 2004 JP
2004120526 Apr 2004 JP
2004140641 May 2004 JP
2004180181 Jun 2004 JP
2004221908 Aug 2004 JP
2004303000 Oct 2004 JP
2004333738 Nov 2004 JP
2004334590 Nov 2004 JP
2005004410 Jan 2005 JP
2005024792 Jan 2005 JP
2005027002 Jan 2005 JP
2005033454 Feb 2005 JP
2005-134819 May 2005 JP
2005148151 Jun 2005 JP
2005-181365 Jul 2005 JP
2005527256 Sep 2005 JP
2005333582 Dec 2005 JP
2006031499 Feb 2006 JP
2006039953 Feb 2006 JP
2006121671 May 2006 JP
2006145918 Jun 2006 JP
2006155452 Jun 2006 JP
2006515694 Jun 2006 JP
2006184589 Jul 2006 JP
2006287749 Oct 2006 JP
3915291 May 2007 JP
2009504081 Jan 2009 JP
2009291657 Dec 2009 JP
2011086315 Apr 2011 JP
2012179370 Sep 2012 JP
19990036555 May 1999 KR
19990054524 Jul 1999 KR
20010111127 Dec 2001 KR
20040054225 Jun 2004 KR
20040065987 Jul 2004 KR
20040075419 Aug 2004 KR
20040075420 Aug 2004 KR
20040076916 Sep 2004 KR
20040100995 Dec 2004 KR
20050011277 Jan 2005 KR
20050083364 Aug 2005 KR
20050089371 Sep 2005 KR
20050090265 Sep 2005 KR
20060034453 Apr 2006 KR
20070000023 Jan 2007 KR
100700537 Mar 2007 KR
100795450 Jan 2008 KR
100896245 May 2009 KR
100978689 Aug 2010 KR
2143841 Jan 2000 RU
2220057 Dec 2003 RU
200520512 Jun 2005 TW
WO1989003519 Apr 1989 WO
WO1995001757 Jan 1995 WO
WO1996003741 Feb 1996 WO
WO1996009587 Mar 1996 WO
WO1997024905 Jul 1997 WO
WO1997049340 Dec 1997 WO
WO1998012685 Mar 1998 WO
WO1999003253 Jan 1999 WO
WO1999021122 Apr 1999 WO
WO1999021165 Apr 1999 WO
WO9936826 Jul 1999 WO
WO1999057937 Nov 1999 WO
WO9965381 Dec 1999 WO
WO2000003348 Jan 2000 WO
WO2000065873 Nov 2000 WO
WO2000075766 Dec 2000 WO
WO2002008860 Jan 2001 WO
WO2001011896 Feb 2001 WO
WO2001026092 Apr 2001 WO
WO2001060029 Aug 2001 WO
WO200109012 Nov 2001 WO
WO2001091107 Nov 2001 WO
WO2001099096 Dec 2001 WO
WO2002012966 Feb 2002 WO
WO2002021274 Mar 2002 WO
WO2002027535 Apr 2002 WO
WO2002029640 Apr 2002 WO
WO2002054309 Jul 2002 WO
WO2002102072 Dec 2002 WO
WO2003003185 Jan 2003 WO
WO2003071391 Aug 2003 WO
WO2003093879 Nov 2003 WO
WO2004001576 Dec 2003 WO
WO2004005141 Jan 2004 WO
WO2004032014 Apr 2004 WO
WO2004051392 Jun 2004 WO
WO2004052035 Jun 2004 WO
WO2004057451 Jul 2004 WO
WO2004078536 Sep 2004 WO
WO2004105523 Dec 2004 WO
WO2005018219 Feb 2005 WO
WO2005026940 Mar 2005 WO
WO2005050308 Jun 2005 WO
WO2005058705 Jun 2005 WO
WO2005062591 Jul 2005 WO
WO2005061249 Jul 2005 WO
WO2005107407 Nov 2005 WO
WO2006003588 Jan 2006 WO
WO2006003591 Jan 2006 WO
WO2006006108 Jan 2006 WO
WO2006036069 Apr 2006 WO
WO2006062966 Jun 2006 WO
WO2006068123 Jun 2006 WO
WO2006086863 Aug 2006 WO
WO2006093003 Sep 2006 WO
WO2006103437 Oct 2006 WO
WO2006110765 Oct 2006 WO
WO2007034392 Mar 2007 WO
Non-Patent Literature Citations (156)
Adams, Russ, "Sourcebook of Automatic Identification and Data Collection," Van Nostrand Reinhold, New York, Dec. 31, 1990.
Bernardi, Bryan D., "Speech Recognition Camera with a Prompting Display," The Journal of the Acoustical Society of America, vol. 108, Issue 4, Oct. 2000, p. 1383.
Bernardi, Bryan D., “Speech Recognition Camera with a Prompting Display,” The Journal of the Acoustical Society of America, vol. 109, Issue 4, Apr. 2001, p. 1287.
Chapman, William D., "Prospectives in Voice Response from Computers."
Trost, R.L.A., "Film Slave," Elektor, vol. 2, No. 11, Nov. 1976, pp. 1135-1137.
Goode, Georgianna, et al., "Voice Controlled Stereographic Video Camera System," Proc. SPIE vol. 1083, p. 35, Three-Dimensional Visualization and Display Technologies; Scott S. Fisher; Woodrow E. Robbins, Eds.
Harif, Shlomi, Recognizing non-verbal sound commands in an interactive computer controlled speech word recognition display system, Acoustical Society of America Journal, vol. 118, Issue 2, pp. 599-599 (2005).
Hermes operating system now also listens to “his British master's voice” (Nov. 1999).
Morgan, Scott Anthony, Speech command input recognition system for interactive computer display with term weighting means used in interpreting potential commands from relevant speech terms, The Journal of the Acoustical Society of America, vol. 110, Issue 4, Oct. 2001, p. 1723.
Panasonic VLG201CE-S Video Intercom System with Silver door station.
Philips, M.L., Adv. Resource Dev. Corp., Columbia, MD, "Voice Control of Remote Stereoscopic Systems," Southeastcon '90 Proceedings, IEEE, Apr. 1-4, 1990, pp. 594-598, vol. 2.
Reichenspurner, et al., "Use of the voice-controlled and computer-assisted surgical system ZEUS for endoscopic coronary artery bypass grafting," The Journal of Thoracic and Cardiovascular Surgery, Jul. 1999.
Robotics: the Future of Minimally Invasive Heart Surgery (May 2000).
ST Microelectronics TSH512 Hi-fi Stereo/mono Infrared Transmitter and Stereo Sub-carrier Generator (Oct. 2005).
Non-Final Office Action in U.S. Appl. No. 11/163,391, (dated Sep. 25, 2008).
Response to Non-Final Office Action in U.S. Appl. No. 11/163,391 (dated Jan. 9, 2009).
Non-Final Office Action in U.S. Appl. No. 11/163,391, (dated Apr. 22, 2009).
Response to Non-Final Office Action in U.S. Appl. No. 11/163,391 (Sep. 22, 2009).
Final Office Action in U.S. Appl. No. 11/163,391, (dated Dec. 18, 2009).
Response to Final Office Action in U.S. Appl. No. 11/163,391 (dated Jan. 11, 2010).
Non-Final Office Action in U.S. Appl. No. 12/710,066, (dated May 3, 2010).
Response to Non-Final Office Action in U.S. Appl. No. 12/710,066 (dated Aug. 3, 2010).
Final Office Action in U.S. Appl. No. 12/710,066, (dated Oct. 18, 2010).
Response to Final Office Action in U.S. Appl. No. 12/710,066 (dated Dec. 20, 2010).
Non-Final Office Action in U.S. Appl. No. 13/087,650, (dated Apr. 19, 2012).
Response to Non-Final Office Action in U.S. Appl. No. 13/087,650 (dated Jul. 19, 2012).
Non-Final Office Action in U.S. Appl. No. 13/717,681, (dated May 21, 2013).
Response to Non-Final Office Action in U.S. Appl. No. 13/717,681 (dated Nov. 15, 2013).
File History, U.S. Appl. No. 11/163,391 (now issued U.S. Pat. No. 7,697,827) to Konicek (filed Oct. 2005).
File History, U.S. Appl. No. 12/710,066 (now issued U.S. Pat. No. 7,933,508) to Konicek (filed Feb. 2010).
File History, U.S. Appl. No. 13/087,650 (now issued U.S. Pat. No. 8,467,672) to Konicek (filed Apr. 2011).
File History, U.S. Appl. No. 13/717,681 to Konicek (filed Dec. 2012).
Notice of Allowance in U.S. Appl. No. 13/717,681, (dated Jan. 24, 2014).
Request for Continued Examination in U.S. Appl. No. 13/717,681 (dated Mar. 14, 2014).
Non-Final Office Action in U.S. Appl. No. 13/717,681, (dated Apr. 3, 2014).
Non-Final Office Action in U.S. Appl. No. 14/199,855, (dated Apr. 24, 2014).
Response to Non-Final Office Action in U.S. Appl. No. 14/199,855, (dated May 21, 2014).
Non-Final Office Action in U.S. Appl. No. 14/203,129, (dated Apr. 25, 2014).
Response to Non-Final Office Action in U.S. Appl. No. 14/203,129, (dated Jun. 3, 2014).
File History, U.S. Appl. No. 14/199,855 to Konicek (filed Mar. 2014).
File History, U.S. Appl. No. 14/203,129 to Konicek (filed Mar. 2014).
Response to Non-Final Office Action in U.S. Appl. No. 13/717,681 (dated Jun. 30, 2014).
File History, U.S. Appl. No. 14/315,544 to Konicek (filed Jun. 2014).
Notice of Allowance in U.S. Appl. No. 13/717,681, (dated Aug. 4, 2014).
Notice of Allowance in U.S. Appl. No. 14/199,855, (dated Jul. 14, 2014).
Notice of Allowance in U.S. Appl. No. 14/203,129, (dated Jul. 14, 2014).
Notice of Allowance in U.S. Appl. No. 14/315,544, (dated Sep. 29, 2014).
Notice of Allowance in U.S. Appl. No. 14/453,511, (dated Oct. 20, 2014).
Notice of Allowance in U.S. Appl. No. 14/495,976, (dated Oct. 22, 2014).
RSC-164i Datasheet, “General Purpose Microcontroller Featuring Speech Recognition, Speaker Verification, and Speech Synthesis,” Sensory, Inc. (1996).
Non-Final Office Action in U.S. Appl. No. 14/539,687, (dated Apr. 17, 2015).
Machine Translation of JP2000214525 to Yoji (date unknown).
U.S. Appl. No. 60/718,155 to Feinberg et al. (filed Sep. 15, 2005).
Smart Commander Guide to Voice Recognition (date unknown).
Network Smart Capture Ver.1.2 (dated 1997).
Partial English Translation of Network Smart Capture Ver.1.2 (date unknown).
Smart Capture Smart Commander (date unknown).
Partial English Translation of Smart Capture Smart Commander (date unknown).
Final Office Action in U.S. Appl. No. 14/539,687, (dated Nov. 16, 2015).
Response to Final Office Action in U.S. Appl. No. 14/539,687 (dated Jan. 15, 2016).
Non-Final Office Action in U.S. Appl. No. 14/539,687, (dated Feb. 4, 2016).
Response to Non-Final Office Action in U.S. Appl. No. 14/539,687 (dated May 4, 2016).
Notice of Allowance in U.S. Appl. No. 14/539,687, (dated Jul. 15, 2016).
BMW Group—Voice Commands for BMW 5 Series & 6 Series MY2004 Equipped with CCC (date unknown).
Non-Final Office Action in U.S. Appl. No. 14/950,338 (dated Oct. 7, 2016).
Non-Final Office Action in U.S. Appl. No. 15/188,736 (dated Oct. 12, 2016).
Non-Final Office Action in U.S. Appl. No. 14/614,515 (dated Mar. 6, 2017).
Response to Non-Final Office Action in U.S. Appl. No. 14/950,338 (dated Apr. 7, 2017).
Declaration of Jeffrey C. Konicek Under Rule 1.132 in U.S. Appl. No. 14/950,338 (Apr. 7, 2017).
Response to Non-Final Office Action in U.S. Appl. No. 15/188,736 (dated Apr. 12, 2017).
Declaration of Jeffrey C. Konicek Under Rule 1.132 in U.S. Appl. No. 15/188,736 (Apr. 12, 2017).
Nokia 9500 Communicator User Guide (p. 38) (Copyright 2004-2005).
HP iPAQ rX3715 Quick Specs (Jul. 27, 2004).
HP iPAQ rX3715 Data Sheet (Copyright 2004).
Ricoh RDC-1700 Operation Manual (Copyright 2000).
Machine English Translation of JP 2005-181365 to Imamura et al.
Machine English Translation of JP H09-186954 to Yasuyuki, et al.
Machine English Translation of JP 2000-221582 to Yoji.
Machine English Translation of JP 2000-231151 to Yoji.
Machine English Translation of JP2000-083186 to Hiroshi.
Machine English Translation of JP 2002-218092 to Nobuaki.
Machine English Translation of JP 2000-285413 to Kenji et al.
Machine English Translation of JP H11-212726 to Hideyuki et al.
Machine English Translation of JP H11-355617 to Manbu.
Machine English Translation of JP 2005-134819 to Mineko et al.
Response to Non-Final Office Action in U.S. Appl. No. 14/614,515 (dated Sep. 6, 2017).
Final Office Action in U.S. Appl. No. 14/614,515, (dated Nov. 15, 2017).
RCE and Response to Final Office Action in U.S. Appl. No. 14/614,515 (dated Mar. 15, 2018).
Non-Final Office Action in U.S. Appl. No. 14/614,515, (dated May 10, 2018).
Response to Non-Final Office Action in U.S. Appl. No. 14/614,515 (dated Nov. 2, 2018).
Non-Final Office Action in U.S. Appl. No. 14/950,370, (dated Jun. 20, 2017).
Response to Non-Final Office Action in U.S. Appl. No. 14/950,370 (dated Dec. 20, 2017).
Supplemental Response and Amendment in U.S. Appl. No. 14/950,370 (dated Feb. 8, 2018).
Notice of Allowance in U.S. Appl. No. 14/950,370, (dated May 29, 2018).
Corrected Notice of Allowance in U.S. Appl. No. 14/950,370, (dated Jun. 12, 2018).
Interview Summary in U.S. Appl. No. 15/188,736, (dated May 9, 2017).
Interview Summary in U.S. Appl. No. 15/188,736, (dated Jun. 15, 2017).
Final Office Action in U.S. Appl. No. 15/188,736, (dated Jun. 19, 2017).
Response to Final Office Action in U.S. Appl. No. 15/188,736 (dated Dec. 11, 2017).
Interview Summary in U.S. Appl. No. 15/188,736, (dated Dec. 12, 2017).
Notice of Allowance in U.S. Appl. No. 15/188,736, (dated Jan. 19, 2018).
Final Office Action in U.S. Appl. No. 14/950,338, (dated Jun. 20, 2017).
Appeal Brief in U.S. Appl. No. 14/950,338 (filed Feb. 19, 2018).
Non-Final Office Action in U.S. Appl. No. 14/950,338, (dated May 3, 2018).
Response to Non-Final Office Action in U.S. Appl. No. 14/950,338 (dated Oct. 19, 2018).
Supplemental Amendment in U.S. Appl. No. 14/950,338 (dated Nov. 6, 2018).
Notice of Allowance in U.S. Appl. No. 14/950,338, (dated Jan. 31, 2019).
Supplemental Amendment in U.S. Appl. No. 14/950,370 (dated Feb. 8, 2018).
Final Office Action in U.S. Appl. No. 14/614,515, (dated Jan. 30, 2019).
RCE and Response to Final Office Action in U.S. Appl. No. 14/614,515 (dated Jul. 17, 2019).
Non-Final Office Action in U.S. Appl. No. 14/614,515, (dated Aug. 5, 2019).
Machine English Translation of JP H07-84302 to Kawamura.
Machine English Translation of JP H07-84311 to Kawamura.
Machine English Translation of JP H04-316035 to Yoshimura et al.
Machine English Translation of TW 200520512 to Liu et al.
Machine English Translation of KR2004/0065987 to Matsufune.
Apex Standards—Invalidity Analysis (date unknown) (last accessed Aug. 18, 2021).
Techson IP—Limestone Report, Report Generated: Apr. 21, 2021 (last accessed Aug. 18, 2021).
Amplified—AI Invalidity Report (date unknown) (last accessed Aug. 18, 2021).
Traindex—Prior Art Report for U.S. Pat. No. 7,697,827-B2 (date unknown) (last accessed Aug. 18, 2021).
1997 IEEE International Conference on Acoustics, Speech, and Signal Processing, vols. I-V. : 371-374 1997.
2004 IEEE International Conference on Multimedia and Expo (ICME), vols. 1-3. : 1579-1582 2004.
Advanced Microsystems for Automotive Applications 2000. : 181-203 2000.
Computational Linguistics. 23 (2): 269-311 Jun. 1997.
Computer Vision and Image Understanding. 73 (3): 428-440 Mar. 1999.
ETFA 2003: IEEE Conference On Emerging Technologies and Factory Automation, vol. 2, Proceedings. : 545-551 2003.
IEEE Conference on Intelligent Transportation Systems. : 397-402 1997.
IEEE Nonrigid and Articulated Motion Workshop, Proceedings. : 90-102 1997.
IEEE Transactions on Consumer Electronics. 51 (1): 240-244 Feb. 2005.
IEEE Transactions on Electron Devices. 44 (10): 1648-1652 Oct. 1997.
IEEE Transactions on Pattern Analysis and Machine Intelligence. 19 (7): 677-695 Jul. 1997.
Image Understanding Workshop, 1996 Proceedings, vols. I and II. : 805-811 1996.
Journal of Thoracic and Cardiovascular Surgery. 118 (1): 11-16, Jul. 1999.
Laser Radar Technology and Applications III. 3380: 270-278 1998.
Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, vols. 1-6. : 3737-3740 1998.
Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, vols. 1-6. : 665-668 1998.
Proceedings of the IEEE Intelligent Vehicles Symposium 2000. : 314-319 2000.
Speech Communication. 30 (1): 37-53 Jan. 2000.
Speech Communication. 44 (1-4): 97-112 Oct. 2004.
Non-Final Office Action in U.S. Appl. No. 16/663,742, (dated Feb. 1, 2021).
Response to Non-Final Office Action in U.S. Appl. No. 16/663,742, (dated Jun. 11, 2021).
Interview Summary in U.S. Appl. No. 16/663,742, (dated Jun. 16, 2021).
Notice of Allowance in U.S. Appl. No. 16/663,742, (dated Jul. 20, 2021).
GSM Arena specs-HP iPAQ h6325 (date unknown).
HP User Guide—HP iPAQ Pocket PC h6300 Series (2004).
Cingular and HP Launch HP iPAQ Pocket PC (Jun. 6, 2005).
Machine English translation of KR 20050083364 to Kwon.
English translation of KR 20050011277 to So.
Palm Zire Handheld Wikipedia Article (accessed Sep. 15, 2022).
Palm Z22 Wikipedia Article (accessed Sep. 15, 2022).
Palmone, Inc. “using your Treo650 smartphone” (2004).
Palm TX Wikipedia Article (accessed Sep. 15, 2022).
Machine English translation of JP 2004140641 to Takeda et al.
Palm Treo650 Wikipedia Article (accessed Sep. 15, 2022).
Leggitt, Bob “The History of Online Photo Sharing: Part 1” (https://twirpz.files.wordpress.com/2015/09/fuji-finepix-6800-and-pixology.jpg) (Sep. 26, 2015).
Kodak Gallery Wikipedia Article (date unknown).
Machine English translation of JPH11511301 to Anderson.
Related Publications (1)
Number Date Country
20210352205 A1 Nov 2021 US
Divisions (1)
Number Date Country
Parent 11163391 Oct 2005 US
Child 12710066 US
Continuations (10)
Number Date Country
Parent 16663742 Oct 2019 US
Child 17383397 US
Parent 14614515 Feb 2015 US
Child 16663742 US
Parent 14539687 Nov 2014 US
Child 14614515 US
Parent 14495976 Sep 2014 US
Child 14539687 US
Parent 14453511 Aug 2014 US
Child 14495976 US
Parent 14315544 Jun 2014 US
Child 14453511 US
Parent 14203129 Mar 2014 US
Child 14315544 US
Parent 13717681 Dec 2012 US
Child 14203129 US
Parent 13087650 Apr 2011 US
Child 13717681 US
Parent 12710066 Feb 2010 US
Child 13087650 US