This invention relates generally to portable or mobile computer terminals and more specifically to mobile terminals having speech functionality for executing and directing tasks using voice or speech.
Wearable, mobile and/or portable computer terminals are used for a wide variety of tasks. Such terminals allow the workers using them (“users”) to maintain mobility, while providing the worker with desirable computing and data-processing functions. Furthermore, such terminals often provide a communication link to a larger, more centralized computer system that directs the activities of the user and processes any collected data. One example of a specific use for a wearable/mobile/portable terminal is a product management system that involves product distribution and tracking as well as product inventory management.
Computerized product management systems with mobile terminals are used in various inventory/order-based industries, such as food and retail product distribution, manufacturing, and quality control, for example. An overall integrated product management system may utilize a central computer system that runs a program for product tracking and management and for order filling. A plurality of mobile terminals is employed by the users of the system to communicate (usually in a wireless fashion) with the central system for product handling. The users perform various manual tasks, such as product picking and placement, per instructions they receive through the terminals from the central system. The terminals also allow the users to interface with the computer system, such as to respond to inquiries or confirm the completion of certain tasks. Therefore, an overall integrated management system involves a combination of a central computer system for tracking and management, and the people who use and interface with that system as order fillers, pickers and other workers. The workers handle the manual aspects of the integrated system under the command and control of information transmitted from the central computer system to the wireless mobile terminals worn by the users.
To provide an interface between the central computer system and the workers, such mobile terminals and the central systems to which they are connected may be voice-driven or speech-driven; i.e., the system operates using human speech. Speech is synthesized and played to the user, via the mobile terminal, to direct the tasks of the user and collect data. The user then answers or asks questions, and the speech recognition capabilities of the mobile terminal convert the user's speech to a form suitable for use by the terminal and central system. Thereby, a bidirectional stream of information is exchanged over a wireless network between the wireless wearable terminals and the central computer system using speech.
Conventionally, mobile computer terminals having voice or speech capabilities utilize a headset device that is coupled to the mobile terminal. The terminal is worn on the body of a user, such as around the waist, and the headset connects to the terminal, such as with a cord or cable. The headset has a microphone for capturing the voice of the user for voice data entry and commands, and also includes one or more ear speakers for both confirming the spoken words of the user and also for playing voice instructions and other audio that are generated or synthesized by the terminal. Through the headset, the workers are able to receive voice instructions or questions about their tasks, ask and answer questions, report the progress of their tasks, and report working conditions, such as inventory shortages, for example. Therefore, in conventional mobile terminal systems, headsets are matched with respective terminals and worn by the user to operate in conjunction with the terminals.
An illustrative example of a set of worker tasks suitable for a wireless mobile terminal with speech capabilities may involve initially welcoming the worker to the computerized inventory management system and defining a particular task or order, for example, filling a load for a particular truck scheduled to depart from a warehouse. The worker may then answer with a particular area (e.g., freezer) that they will be working in for that order. The system then vocally directs the worker to a particular aisle and bin to pick a particular quantity of an item for the order. The worker then vocally confirms the location that they have gone to and vocally confirms the number of picked items, and/or various other information about the picked items. The system then directs the worker to the next items to be picked for the order, and this continues until the order is filled or otherwise completed. The system may then direct the worker to a loading dock or bay for a particular truck to receive the finished order. As may be appreciated, the specific communications exchanged between the wireless mobile terminal and the central computer system using speech can be task-specific and highly variable.
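The exchange described above can be illustrated with a minimal sketch. The `prompt()` and `listen()` helpers below are hypothetical stand-ins for the terminal's speech synthesis and speech recognition stages (stubbed with prints and canned replies); none of these names are actual Vocollect APIs.

```python
# Hypothetical sketch of the speech-directed picking dialog described above.
# prompt()/listen() stand in for speech synthesis and recognition; all names
# are illustrative, not drawn from any real system.

def prompt(text):
    """Synthesize `text` and play it to the worker (stubbed as print)."""
    print("SYSTEM:", text)

def listen(simulated_reply):
    """Recognize the worker's spoken reply (stubbed with a canned value)."""
    print("WORKER:", simulated_reply)
    return simulated_reply

def fill_order(picks):
    prompt("Welcome. Say your work area to begin the order.")
    area = listen("freezer")
    for item in picks:
        prompt(f"Go to aisle {item['aisle']}, bin {item['bin']}; "
               f"pick {item['qty']} of {item['sku']}.")
        listen(f"aisle {item['aisle']} bin {item['bin']}")   # location confirm
        item['picked'] = int(listen(str(item['qty'])))       # quantity confirm
    prompt("Order complete. Proceed to the loading dock.")
    return area, picks

area, result = fill_order([{"aisle": 4, "bin": 12, "qty": 3, "sku": "item A"}])
```

The loop mirrors the welcome, area selection, directed pick, confirmation, and dock-assignment steps of the example; a real system would drive these prompts from task data received over the wireless network.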
The mobile speech terminals provide a significant efficiency gain in the performance of the workers' tasks. Specifically, using such terminals, the work is done virtually hands-free, without equipment to juggle or paperwork to carry around. However, while existing speech systems provide hands-free operation, they also have various drawbacks associated with their configuration, and particularly with the headset and its interface with the mobile terminal.
One drawback of current systems is that the headset is attached to the terminal with a cord, which extends generally from the terminal (typically worn on a belt) to the head of the worker where the headset is located. As may be appreciated, the workers move rapidly around their work area and are often jumping on and off forklifts, pallet loaders, and other equipment. Therefore, there is a possibility for the cord to get caught on some object, such as a forklift. When this occurs, the cord tends to separate either from the headset or from the terminal, thus requiring repair or replacement. Generally, the cords are permanently attached to a headset, and each worker maintains their own headset (e.g., for individual responsibility and/or hygiene purposes). The cords are then plugged into the terminals; therefore, the separation will generally occur at the terminal socket.
Attempts have been made to appropriately handle a snagged cord and cord separation to prevent such an event from rendering the terminal inoperable and in need of repair and replacement. One suitable approach is illustrated in U.S. Pat. No. 6,910,911, which is commonly owned with the present application. However, the loose and dangling cord still remains somewhat of an issue with voice-enabled mobile terminals and their headsets.
Attempts have been made to eliminate the cords between the headset and mobile terminals by using wireless headsets. For example, such an approach is set forth in U.S. patent application Ser. No. 11/303,271 entitled Wireless Headset and Method for Robust Voice Data Communication, filed Dec. 16, 2005, which application is incorporated herein by reference in its entirety. However, such a system still requires a separate mobile terminal for use with the headset. As may be appreciated, multiple headsets and mobile terminals increase the number of units that must be purchased, maintained and tracked at a facility. In a large warehouse facility, this may be a significant task and may present a considerable cost in maintaining the equipment. Therefore, there is still a need to improve upon existing mobile terminal systems and particularly to improve upon such systems that are utilized in speech tasks or speech-enabled environments. One suitable solution is to incorporate the functionality of a speech terminal with a head-worn device. This eliminates the need for separate headsets and addresses the issues noted above. However, other issues have not been adequately addressed and thus there remains a need for a mobile head-worn terminal that is suitable for speech-directed applications.
Any solution to the above-noted issues must address wearability and control issues by providing a headset that is operable on both sides of the head without a significant positional shift in the layout of the terminal and its controls. Furthermore, since the headset terminal is worn for extended periods on the head, it must be comfortable for the user and readily positioned on either side of the head. Weight is also a consideration, as is complexity in the construction of the headset terminal. Because of the increased processing functions that are necessary in a speech-enabled headset terminal, the space usage, the circuit component layout, and necessary wiring must also be addressed in a suitably robust, yet aesthetically pleasing design. Loose or exposed wires or cables in a headset are unappealing and certainly undesirable in a work environment.
Power considerations are also an issue in a headset terminal, as the weight of a battery is no longer carried at the waist of a user. Any battery must be readily removable and replaceable without a complicated mounting assembly that adds complexity and weight to the overall headset design.
Furthermore, because of the increased functionality of a headset terminal, it must have the ability to operate wirelessly with an overall central system or other components.
Still further, in conventional headset/terminal assemblies, the users generally maintain their own headset for hygiene purposes and share the mobile terminals. Incorporating the terminal functionality into a headset eliminates the separate shared terminal, and thus there is a need to address the hygiene aspects of the work environment in a headset terminal, while allowing sharing of the headset terminal among various worker shifts.
Accordingly, there is a need, unmet by current communication systems and mobile terminals, to address the issues noted above. There is particularly an unmet need in the area of terminals for performing speech-directed work and other speech-directed tasks using synthesized speech and speech recognition.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with a general description of the invention given above and the Detailed Description given below, serve to explain the invention.
The present invention is directed to a unique headset configuration. One embodiment of the present invention is a speech-enabled mobile computer in the form of a wireless headset for handling speech-directed applications that require high mobility and high data transmission speed, such as warehousing, manufacturing, pharmaceutical, logging, and defense applications. The headset terminal of the present invention provides full speech functionality, is ultra-lightweight, i.e., less than 10 ounces, provides full shift operation on a single battery charge, and includes a modular architecture that allows the separation of the "personal" components of the wireless headset mobile computer, i.e., those that touch the user's head, ears, or mouth, from the non-personal, expensive electronics and, thereby, promotes good hygiene and low cost of ownership. The embodiment of the present invention provides the full speech functionality of the Vocollect Talkman®, T2®, or T5® terminals, which are sold by Vocollect of Pittsburgh, Pa., the owner of the present application.
The mobile headset of the invention also incorporates unique features in its controls, headband structure, battery configuration and microphone/speaker assembly that enhance the operation, comfort, durability, versatility and robustness of the headset. While one particular embodiment of the invention discussed herein is in the form of a fully speech-enabled mobile headset terminal, the various aspects of the headset design disclosed herein are equally applicable to a stand-alone headset that couples, by wire or wirelessly, with a separate, body-worn mobile computer terminal. The features of the invention, for example, are applicable to the wireless headset and system set forth in U.S. patent application Ser. No. 11/303,271, noted above. Furthermore, the aspects of the invention have applicability to headsets in general, wired or wireless, and not just to those used in conjunction with mobile terminals. Of course, the aspects of the invention have particular applicability to wireless headsets and mobile headset terminals.
Headset terminal 50 includes one or more printed circuit boards (PCBs) 10 that contain the electronic components of the headset terminal. For example, the PCB 10 might be located in the earcup assembly 52 of headset terminal 50 as shown in
For example, the headset terminal 50 may operate with the functionality of the system disclosed in U.S. patent application Ser. No. 11/247,291 entitled Integrated Wearable Terminal for Voice-Directed Work and RFID Identification/Verification, filed Oct. 11, 2005, which application is incorporated by reference herein in its entirety. To that end, the processor 12 may include the necessary speech recognition/synthesis circuitry for voice or speech applications, such as those applications that direct the work of a user. The headset terminal supports various operator languages, with a wide range of text-to-speech functionality. Terminal 50 is also configured with "record and playback" technology. Terminal 50 and processor 12 are configured, as appropriate, to be fully functional with existing Talkman™ software infrastructure components, including the Voice Console™, Voice Link™ and Voice Builder™ components available from Vocollect.
Wireless headset terminal 50 is a strong, lightweight computer terminal that is especially designed for use in industrial environments. The terminal may operate in environments from −30° C. to 50° C. The user wears headset terminal 50 on their head and, thus, retains full freedom of movement. There are no exposed wires or cords to get caught or snagged. Through speaker 28, the operator receives information or commands in a speech or voice format and responds directly to the commands by speaking into a microphone 26. All information is relayed, in real time or batch, to and from a central computer (not shown) through a wireless RF network (not shown), as is known in the art of speech-enabled systems.
Processor/CPU 12 is a general purpose processor for managing the overall operation of wireless headset terminal 50. Processor/CPU 12 may be, for example, a 600 MHz Intel® XScale™ processor, or other processor, indicative of currently available technology. The XScale™ processor combines the processor and memory in a small square device. Processor 12 is capable of handling various speech recognition algorithms and speech synthesis algorithms without the need for additional speech recognition technology, such as ASICs or DSP components. Processor 12, in one embodiment, thus includes speech recognition circuitry and speech synthesis circuitry for recognizing and synthesizing speech. Processor 12 also includes suitable software for providing speech applications, such as work applications to communicate activity information with a user by speech and also to collect data from the user about the activity also using speech. Such speech applications as used for worker direction are known and are available from Vocollect, Inc., Pittsburgh, Pa. Processor 12 is suitably electrically connected to the various components of the terminal as shown in
The audio input/output stage 14 receives an audio signal from microphone 26, which may be a standard boom-mounted, directional, noise-canceling microphone that is positioned near the user's mouth. Audio input/output stage 14 also provides a standard audio output circuit for driving speaker 28, which may be a standard audio speaker located in the earcup of wireless headset terminal 50 as shown in
WLAN radio component 18 is a standard WLAN radio that uses well-known wireless networking technology, such as WiFi, for example, that allows multiple devices to share a single high-speed connection for a WLAN. WLAN refers to any type of wireless local area network; radio 18 supports 802.11b, 802.11a, and 802.11g networks along with the full 802.11i wireless security suite. WLAN radio 18 is integrated into wireless headset terminal 50. Furthermore, WLAN radio 18 provides high bandwidth that is suitable for handling applications that require high data transmission speed, such as warehousing, manufacturing, pharmaceutical, logging, and defense applications. WLAN radio 18 may be used for transmitting data in real time or in batch form to/from the central computer 19 and for receiving work applications, tasks or assignments, for example.
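The real-time versus batch transfer modes mentioned above can be sketched as a simple queueing choice. This is an illustrative model only, not Vocollect's implementation; the class and field names are assumptions.

```python
# Illustrative sketch of real-time vs. batch task-data transfer: records are
# either transmitted as they occur or queued and flushed in one batch.
from collections import deque

class TaskUplink:
    def __init__(self, real_time=True):
        self.real_time = real_time
        self.queue = deque()
        self.sent = []            # stands in for the WLAN radio transmit path

    def record(self, task_data):
        if self.real_time:
            self.sent.append(task_data)      # transmit immediately
        else:
            self.queue.append(task_data)     # hold for batch transfer

    def flush(self):
        """Transmit all queued records, e.g. at the end of a shift."""
        while self.queue:
            self.sent.append(self.queue.popleft())

uplink = TaskUplink(real_time=False)
uplink.record({"sku": "item A", "picked": 3})
uplink.record({"sku": "item B", "picked": 1})
uplink.flush()
```

Batch mode trades immediacy for fewer radio transmissions, which matters when coverage is intermittent or power is constrained.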
User interface 20 provides control of the headset terminal and is coupled with suitable control components 64, such as control buttons as illustrated in
WPAN interface device 22 is a component that permits communication in a wireless personal area network (WPAN), such as Bluetooth, for example, which is a wireless network for interconnecting devices centered around a person's workspace, e.g., the headset terminal user's workspace. The WPAN interface device 22 allows terminal 50 to interface with any WPAN-compatible, body-worn wireless peripheral devices associated with the terminal user, such as Bluetooth devices.
Battery pack 30 is a lightweight, rechargeable power source that provides suitable power for running terminal 50 and its components. Battery pack 30, for example, may include one or more lithium-sulfur batteries that have suitable capacity to provide full shift operation of wireless headset terminal 50 on a single charge.
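Sizing a battery for "full shift operation on a single charge" reduces to a capacity calculation. The draw and shift-length figures below are assumptions for illustration only; the text does not specify them.

```python
# Back-of-the-envelope sizing for full-shift battery operation. The 1.5 W
# average draw and 8-hour shift are hypothetical values, not from the source.

def required_capacity_wh(avg_draw_w, shift_hours, margin=1.2):
    """Watt-hours needed for one shift, with a margin for aging and cold."""
    return avg_draw_w * shift_hours * margin

# e.g., a hypothetical 1.5 W average draw over an 8-hour shift:
needed = required_capacity_wh(1.5, 8)   # 14.4 Wh before rounding up
```

The margin also covers capacity loss at the low end of the terminal's operating temperature range.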
As noted,
Headset 50 includes an earcup structure or assembly 52 connected with an opposing power source/electronics structure or assembly 54. As may be appreciated, the earcup assembly 52 couples with the ear of a user while the power source/electronics assembly 54 sits on the opposite side of a user's head. Both structures 52, 54 are coupled together by a headband assembly 56 as discussed further hereinbelow. Headset 50 incorporates various features of the invention. In one embodiment of the invention, the headset 50 itself is a fully-operable, voice-enabled mobile computer terminal that includes the necessary electronics, as discussed above, to provide speech recognition and speech synthesis for various speech-directed applications. To that end, the electronics, which would be incorporated on a suitable printed circuit board 10, may be located in the earcup assembly 52 and/or the power source/electronics assembly 54. The earcup assembly 52 is adjustable as discussed further hereinbelow and shown in
The earcup assembly 52 includes a housing 58 which houses the various components of the earcup assembly, such as a speaker 28, and supports the boom assembly 62 that may include electronics 10, including any electronics which might be utilized to make the headset a mobile terminal for voice applications as shown in
The headband assembly 56 includes two transverse bands 74a, 74b which extend from side-to-side across a user's head to hold the earcup assembly 52 and power source/electronics assembly 54 on the user's head, in a somewhat typical headband fashion. The multiple transverse bands assure a secure fit on the user's head and may include cushions or pads 76, also made of foam or another suitable material for comfort and fit. A stabilizing strap 78 intersects the two transverse bands 74a, 74b and is coupled to each transverse band respectively with a clip 80 or other suitable fixation structure. The stabilizing strap 78 is free to slide through the clips for positioning between the transverse bands. The stabilizing strap 78 also extends partially along the back of the user's head and/or the forehead, as desired by the user, to provide additional stability to headset terminal 50. The strap may slide back and forth so that the headset terminal 50 may be worn on either side of the head. At the end of the stabilizing strap 78 are stop structures 82 and respective cushions 84. The stop structures limit the sliding of the stabilizing strap 78 through the clips 80, so the stabilizing strap cannot be slid past the endmost position. The cushions 84 provide suitable comfort for the user.
Stabilizing strap 78 provides a significant advantage in combination with the multiple transverse bands 74a, 74b. As may be appreciated, the headset terminal 50 may carry significant weight when utilized as a mobile, voice-enabled terminal with suitable processing electronics and a power source, such as a battery. The battery in particular, located in the power source/electronics assembly 54, is often heavy enough to cause a stability issue. The present invention, which utilizes multiple transverse bands 74a, 74b coupled with a stabilizing strap 78, provides the desired stability and comfort for the user. Furthermore, headset terminal 50 is utilized in environments wherein the user is moving very rapidly through multiple tasks and is bending over and standing up quite often. Therefore, the increased stability of the headset provided by this aspect of the present invention is a desirable feature. The power source/electronics assembly 54, as illustrated in
The UP and DOWN buttons 102, 104 are coupled to user interface components 20 and may provide a way of moving through various control menus provided by software that is run by the headset terminal 50. For example, the buttons might provide UP/DOWN volume control or allow UP/DOWN scrolling through a menu. The buttons might also have the functionality of turning the headset terminal ON and OFF or providing any of a variety of different controls for the headset terminal. Accordingly, while the buttons 102, 104 of controls 64 are indicated as UP/DOWN buttons herein, that terminology is not limiting with respect to their functionality. Furthermore, while two buttons are illustrated in the Figures of this application, multiple other control buttons or controls might be utilized in accordance with the principles of the present invention.
In accordance with another aspect of the present invention, the buttons 102, 104 are positioned on opposite sides of the boom assembly rotation axis 106 as illustrated in
For example, as illustrated in
Along those lines, the stabilizing strap 78 as illustrated in
In one embodiment of the invention, an auxiliary microphone 27 might be utilized to reduce noise, to determine when the user speaks into the microphone 26 or for other purposes (see
Turning again to
Turning now to the boom assembly 62, one section of the boom housing 132 cooperates with another section 134 of the boom housing in a clamshell fashion to capture a printed circuit board 124 and an anchor structure 136 for the boom arm 108. A portion of the anchor structure is captured between the sides of the boom housing sections 132, 134. Controls 64 are appropriately and operationally coupled with the boom housing 132, 134 and printed circuit board 124 through a mounting bracket 65 as illustrated in
Printed circuit board 124 contains one or more of the components illustrated on PCB 10 in
The boom assembly housing, and particularly section 134 of the housing, rotatably interfaces with the retainer 126, which is secured with earcup housing 58. More specifically, the present invention provides a snap retaining arrangement which secures the rotating boom assembly 62 with adequate bearing surfaces in the earcup housing 58. The present invention does so without shoulder screws, washers, or other elements which have traditionally resided in or through valuable circuit board space. The boom assembly 62 readily snaps in place with housing 58 and rotates freely relative thereto as necessary for utilization of the headset terminal 50 on either the right side or the left side of the head. Furthermore, the rotating boom assembly provides adjustment of the microphone 66 with respect to the user's mouth.
More specifically, referring to
The unique snap fit provided by the invention eliminates the screws, washers and other fasteners engaging the circuit board 124. Thus the entire board may be used for electronic components. Therefore, a greater amount of the circuit board may be used for the processing circuitry 12, such as for voice processing in accordance with one aspect of the invention. The invention thus provides sufficient board space while keeping the headset terminal 50 small and lightweight. Component costs are further reduced, as are assembly costs and time. The boom assembly 62, housing 58 and other components might be made of a suitable, lightweight plastic.
Turning now to
Specifically, as illustrated in
Referring now to
In accordance with one aspect of the invention, the headset terminal 50 is configured so that the cable 160 articulates completely within the structures of the headset terminal and is hidden thereby. The effective length of the cable 160 may dynamically change while the headset fit is adjusted, due to the unique configuration of the saddle 72 and the sliding arm 68, which hide and guide the cable while providing protection and control of the cable dynamics. Referring to
When the headset terminal 50 is adjusted so that the earcup assembly is moved as shown by reference arrows 182 in
Referring again to
Referring to
Referring to
The snap retention ribs 213 and specifically the stop surfaces 211 are normal to the sliding plane of the latch as illustrated by reference arrow 230 in
In another aspect of the invention, the modular architecture of wireless headset terminal 50 allows the separation of the "personal" components of headset terminal 50, i.e., those that touch the user's head, ears, or mouth, from the non-personal, expensive electronics, since the headset is a unitary system with no separate body-worn terminal.
In single shift operations, the entire wireless headset terminal 50 is placed in the charger while not in use. In multi-shift operations, the personal components can be removed from the terminal 50 so the terminal might be reused. Referring to
In use, one typical operation of terminal 50 might be as follows. At the beginning of a shift, a user selects any available terminal at their workplace from a pool of terminals. The user then assembles their personal items to the earcup assembly and microphone boom assembly. In particular, the user might secure pad 60 to the earcup assembly. A fresh battery 92 might be installed and latched. The user may then install their microphone windscreen 29 onto microphone 26 of microphone boom assembly 62. Once all assembly is complete, the user places wireless headset terminal 50 on their head, such that earpad 60 is in contact with their ear, microphone 26 is positioned in close proximity to their mouth, and headpad 96 is in contact with their head. The user then activates terminal 50 with controls 64 of user interface 20 and, as a result, power is delivered from battery 92 to wireless headset terminal 50. Subsequently, program and product data may be loaded from a central system (not shown) into terminal 50 via the WLAN radio 18. Voice commands are processed by CPU 12 and the appropriate response is generated, which is directed digitally to audio input/output stage 14. Audio input/output stage 14 then converts the digital data to an analog audio signal, which is transmitted to speaker 28. The user hears the spoken words through speaker 28 and takes action accordingly. The user may then speak into microphone 26, which generates an analog audio signal that is then transmitted to audio input/output stage 14. Audio input/output stage 14 then converts the analog signal to digital data representing the audio sound received from microphone 26 and, subsequently, CPU 12 processes the information. During the operation of headset terminal 50, data within memory 16 or CPU 12 is updated continuously with a record of task data, under the control of CPU 12.
Furthermore, radio transmission occurs between WLAN radio 18 and a central computer (not shown) through a wireless RF network for transmitting or receiving task data in real time or batch form. When the user has completed their tasks, such as at the end of a shift, the user removes headset terminal 50 from their head and deactivates it with the controls 64.
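The round trip described above can be summarized in a short sketch: the CPU generates a digital prompt, the audio stage "converts" it for the speaker, the worker's reply returns through the microphone path, and the task record in memory is updated. The conversion steps are stubbed, and all names are illustrative assumptions.

```python
# Hedged sketch of the prompt/response round trip through the audio I/O stage.
# String joins/splits stand in for the D/A and A/D conversions.

def audio_out(digital_words):
    """Audio I/O stage, output path: digital data -> signal for the speaker."""
    return " ".join(digital_words)           # stand-in for D/A conversion

def audio_in(analog_signal):
    """Audio I/O stage, input path: mic signal -> digital data for the CPU."""
    return analog_signal.split()             # stand-in for A/D conversion

memory = {"task_log": []}                    # record of task data in memory

spoken = audio_out(["pick", "three", "of", "item", "A"])   # worker hears this
reply = audio_in("three picked")                           # worker's response
memory["task_log"].append({"prompt": spoken, "reply": reply})
```

In the actual terminal, the recognition step between `audio_in` and the task log is where the CPU's speech recognition circuitry interprets the digitized reply.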
In one embodiment, wireless headset terminal 50, in addition to the noted features above, provides the following features:
While the present invention has been illustrated by the description of the embodiments thereof, and while the embodiments have been described in considerable detail, it is not the intention of the applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details of representative apparatus and method, and illustrative examples shown and described. Accordingly, departures may be made from such details without departure from the spirit or scope of applicant's general inventive concept.
Number | Name | Date | Kind |
---|---|---|---|
1483315 | Saal | Feb 1924 | A |
D130619 | Tresie et al. | Dec 1941 | S |
D153112 | Braun et al. | Mar 1949 | S |
2506524 | Stuck | May 1950 | A |
2782423 | Wiegand et al. | Feb 1957 | A |
2958769 | Bounds | Nov 1960 | A |
3087028 | Louis | Apr 1963 | A |
D196654 | Van Den Berg | Oct 1963 | S |
3192326 | Chapman | Jun 1965 | A |
D206665 | Sanzone | Jan 1967 | S |
3327807 | Mullin | Jun 1967 | A |
D212863 | Roberts | Dec 1968 | S |
3568271 | Husserl | Mar 1971 | A |
3654406 | Reinthaler | Apr 1972 | A |
3682268 | Gorike | Aug 1972 | A |
3969796 | Hodsdon et al. | Jul 1976 | A |
3971900 | Foley | Jul 1976 | A |
3971901 | Foley | Jul 1976 | A |
3984885 | Yoshimura et al. | Oct 1976 | A |
4018599 | Hill et al. | Apr 1977 | A |
4020297 | Brodie | Apr 1977 | A |
4024368 | Shattuck | May 1977 | A |
4031295 | Rigazio | Jun 1977 | A |
4039765 | Tichy et al. | Aug 1977 | A |
4138598 | Cech | Feb 1979 | A |
4189788 | Schenke et al. | Feb 1980 | A |
4239936 | Sakoe | Dec 1980 | A |
RE30662 | Foley | Jun 1981 | E |
4302635 | Jacobsen et al. | Nov 1981 | A |
D265989 | Harris et al. | Aug 1982 | S |
4357488 | Knighton et al. | Nov 1982 | A |
D268675 | Hass | Apr 1983 | S |
4418248 | Mathis | Nov 1983 | A |
4471496 | Gardner, Jr. et al. | Sep 1984 | A |
4472607 | Houng | Sep 1984 | A |
4499593 | Antle | Feb 1985 | A |
D278805 | Bulgari | May 1985 | S |
4625083 | Poikela | Nov 1986 | A |
4634816 | O'Malley | Jan 1987 | A |
4672672 | Eggert et al. | Jun 1987 | A |
4672674 | Clough et al. | Jun 1987 | A |
4689822 | Houng | Aug 1987 | A |
D299129 | Wiegel | Dec 1988 | S |
4821318 | Wu | Apr 1989 | A |
D301145 | Besasie et al. | May 1989 | S |
4845650 | Meade et al. | Jul 1989 | A |
4875233 | Derhaag et al. | Oct 1989 | A |
4907266 | Chen | Mar 1990 | A |
4952024 | Gale | Aug 1990 | A |
D313092 | Nilsson | Dec 1990 | S |
5003589 | Chen | Mar 1991 | A |
5018599 | Dohi | May 1991 | A |
5023824 | Chadima, Jr. et al. | Jun 1991 | A |
D318670 | Taniguchi | Jul 1991 | S |
5028083 | Mischenko | Jul 1991 | A |
5056161 | Breen | Oct 1991 | A |
D321879 | Emmerling | Nov 1991 | S |
5113428 | Fitzgerald | May 1992 | A |
D326655 | Iribe | Jun 1992 | S |
5155659 | Kunert | Oct 1992 | A |
5177784 | Hu et al. | Jan 1993 | A |
5179736 | Scanlon | Jan 1993 | A |
D334043 | Taniguchi et al. | Mar 1993 | S |
5197332 | Shennib | Mar 1993 | A |
5202197 | Ansell et al. | Apr 1993 | A |
D337116 | Hattori | Jul 1993 | S |
5225293 | Mitchell et al. | Jul 1993 | A |
5251105 | Kobayashi et al. | Oct 1993 | A |
D341567 | Acker et al. | Nov 1993 | S |
5267181 | George | Nov 1993 | A |
5281957 | Schoolman | Jan 1994 | A |
D344494 | Cardenas | Feb 1994 | S |
D344522 | Taniguchi | Feb 1994 | S |
5305244 | Newman et al. | Apr 1994 | A |
5369857 | Sacherman et al. | Dec 1994 | A |
5371679 | Abe et al. | Dec 1994 | A |
5381473 | Andrea et al. | Jan 1995 | A |
5381486 | Ludeke | Jan 1995 | A |
5406037 | Nageno et al. | Apr 1995 | A |
5438626 | Neuman et al. | Aug 1995 | A |
5438698 | Burton et al. | Aug 1995 | A |
5446788 | Lucey et al. | Aug 1995 | A |
5469505 | Gattey et al. | Nov 1995 | A |
D365559 | Fathi | Dec 1995 | S |
5475791 | Schalk | Dec 1995 | A |
5479001 | Kumar | Dec 1995 | A |
D367256 | Tokunaga | Feb 1996 | S |
5491651 | Janik | Feb 1996 | A |
5501571 | Van Durrett et al. | Mar 1996 | A |
5515303 | Cargin, Jr. et al. | May 1996 | A |
5535437 | Karl et al. | Jul 1996 | A |
5553312 | Gattey et al. | Sep 1996 | A |
5555490 | Carroll | Sep 1996 | A |
5555554 | Hofer | Sep 1996 | A |
5563952 | Mercer | Oct 1996 | A |
5572401 | Carroll | Nov 1996 | A |
5572623 | Pastor | Nov 1996 | A |
5579400 | Ballein | Nov 1996 | A |
D376598 | Hayashi | Dec 1996 | S |
D377020 | Bungardt et al. | Dec 1996 | S |
5581492 | Janik | Dec 1996 | A |
5604050 | Brunette et al. | Feb 1997 | A |
5604813 | Evans et al. | Feb 1997 | A |
5607792 | Garcia et al. | Mar 1997 | A |
D380199 | Beruscha et al. | Jun 1997 | S |
5637417 | Engmark et al. | Jun 1997 | A |
D384072 | Ng | Sep 1997 | S |
5665485 | Kuwayama et al. | Sep 1997 | A |
5671037 | Ogasawara et al. | Sep 1997 | A |
5673325 | Andrea et al. | Sep 1997 | A |
5673364 | Bialik | Sep 1997 | A |
D385272 | Jensen | Oct 1997 | S |
5680465 | Boyden | Oct 1997 | A |
D385855 | Ronzani | Nov 1997 | S |
D387898 | Ronzani | Dec 1997 | S |
D390552 | Ronzani | Feb 1998 | S |
D391234 | Chacon et al. | Feb 1998 | S |
5716730 | Deguchi | Feb 1998 | A |
5719743 | Jenkins et al. | Feb 1998 | A |
5719744 | Jenkins et al. | Feb 1998 | A |
D394436 | Hall et al. | May 1998 | S |
5749072 | Mazurkiewicz et al. | May 1998 | A |
5757339 | Williams et al. | May 1998 | A |
5762512 | Trant et al. | Jun 1998 | A |
5766794 | Brunette et al. | Jun 1998 | A |
5774096 | Usuki et al. | Jun 1998 | A |
5774837 | Yeldener et al. | Jun 1998 | A |
5778026 | Zak | Jul 1998 | A |
5781644 | Chang | Jul 1998 | A |
5787166 | Ullman | Jul 1998 | A |
5787361 | Chen | Jul 1998 | A |
5787387 | Aguilar | Jul 1998 | A |
5787390 | Quinquis et al. | Jul 1998 | A |
5793865 | Leifer | Aug 1998 | A |
5793878 | Chang | Aug 1998 | A |
D398899 | Chaco | Sep 1998 | S |
D400848 | Clark et al. | Nov 1998 | S |
5832098 | Chen | Nov 1998 | A |
5841630 | Seto et al. | Nov 1998 | A |
5841859 | Chen | Nov 1998 | A |
D402651 | Depay et al. | Dec 1998 | S |
5844824 | Newman et al. | Dec 1998 | A |
5856038 | Mason | Jan 1999 | A |
5857148 | Weisshappel et al. | Jan 1999 | A |
5862241 | Nelson | Jan 1999 | A |
D406098 | Walter et al. | Feb 1999 | S |
5869204 | Kottke et al. | Feb 1999 | A |
5873070 | Bunte et al. | Feb 1999 | A |
5890074 | Rydbeck | Mar 1999 | A |
5890108 | Yeldener | Mar 1999 | A |
5895729 | Phelps, III et al. | Apr 1999 | A |
D409137 | Sumita et al. | May 1999 | S |
5905632 | Seto et al. | May 1999 | A |
D410466 | Mouri | Jun 1999 | S |
D410921 | Luchs et al. | Jun 1999 | S |
D411179 | Toyosato | Jun 1999 | S |
5931513 | Conti | Aug 1999 | A |
5933330 | Beutler et al. | Aug 1999 | A |
5935729 | Mareno et al. | Aug 1999 | A |
D413582 | Tompkins | Sep 1999 | S |
D414470 | Chacon et al. | Sep 1999 | S |
5991085 | Rallison et al. | Nov 1999 | A |
5999085 | Szwarc et al. | Dec 1999 | A |
6014619 | Wuppermann et al. | Jan 2000 | A |
6016347 | Magnasco et al. | Jan 2000 | A |
6021207 | Puthuff et al. | Feb 2000 | A |
D422962 | Shevlin et al. | Apr 2000 | S |
6051334 | Tsurumaru et al. | Apr 2000 | A |
D424035 | Steiner et al. | May 2000 | S |
6060193 | Remes et al. | May 2000 | A |
6061647 | Barrett | May 2000 | A |
6071640 | Robertson, Jr. et al. | Jun 2000 | A |
6075857 | Doss, Jr. | Jun 2000 | A |
6078825 | Hahn et al. | Jun 2000 | A |
6084556 | Zwern | Jul 2000 | A |
6085428 | Casby et al. | Jul 2000 | A |
6091546 | Spitzer | Jul 2000 | A |
D430158 | Bhatia et al. | Aug 2000 | S |
D430159 | Bhatia et al. | Aug 2000 | S |
6101260 | Jensen et al. | Aug 2000 | A |
6114625 | Hughes et al. | Sep 2000 | A |
6120932 | Slipy et al. | Sep 2000 | A |
D431562 | Bhatia et al. | Oct 2000 | S |
6127990 | Zwern | Oct 2000 | A |
6136467 | Phelps, III et al. | Oct 2000 | A |
6137868 | Leach | Oct 2000 | A |
6137879 | Papadopoulos et al. | Oct 2000 | A |
6154669 | Hunter | Nov 2000 | A |
D434762 | Ikenaga | Dec 2000 | S |
6157533 | Sallam et al. | Dec 2000 | A |
6160702 | Lee et al. | Dec 2000 | A |
6167413 | Daley, III | Dec 2000 | A |
D436104 | Bhatia et al. | Jan 2001 | S |
6171138 | Lefebvre et al. | Jan 2001 | B1 |
6179192 | Weinger et al. | Jan 2001 | B1 |
6188985 | Thrift et al. | Feb 2001 | B1 |
6190795 | Daley | Feb 2001 | B1 |
D440966 | Ronzani | Apr 2001 | S |
6225777 | Garcia et al. | May 2001 | B1 |
6226622 | Dabbiere | May 2001 | B1 |
6229694 | Kono | May 2001 | B1 |
6230029 | Hahn et al. | May 2001 | B1 |
6235420 | Ng | May 2001 | B1 |
6237051 | Collins | May 2001 | B1 |
D443870 | Carpenter et al. | Jun 2001 | S |
6252970 | Poon et al. | Jun 2001 | B1 |
6261715 | Nakamura et al. | Jul 2001 | B1 |
D449289 | Weikel et al. | Oct 2001 | S |
6302454 | Tsurumaru et al. | Oct 2001 | B1 |
6304430 | Laine et al. | Oct 2001 | B1 |
6304459 | Toyosato et al. | Oct 2001 | B1 |
6310888 | Hamlin | Oct 2001 | B1 |
6324053 | Kamijo | Nov 2001 | B1 |
D451903 | Amae et al. | Dec 2001 | S |
D451907 | Amae et al. | Dec 2001 | S |
6325507 | Jannard et al. | Dec 2001 | B1 |
6326543 | Lamp et al. | Dec 2001 | B1 |
6327152 | Saye | Dec 2001 | B1 |
6339706 | Tillgren et al. | Jan 2002 | B1 |
6339764 | Livesay et al. | Jan 2002 | B1 |
6349001 | Spitzer | Feb 2002 | B1 |
6353313 | Estep et al. | Mar 2002 | B1 |
6356635 | Lyman et al. | Mar 2002 | B1 |
6357534 | Buetow et al. | Mar 2002 | B1 |
6359603 | Zwern | Mar 2002 | B1 |
6359777 | Newman et al. | Mar 2002 | B1 |
6359995 | Ou | Mar 2002 | B1 |
6369952 | Rallison et al. | Apr 2002 | B1 |
6371535 | Wei et al. | Apr 2002 | B2 |
6373693 | Seto et al. | Apr 2002 | B1 |
6373942 | Braund | Apr 2002 | B1 |
6374126 | MacDonald, Jr. et al. | Apr 2002 | B1 |
6376942 | Burger et al. | Apr 2002 | B1 |
6377825 | Kennedy et al. | Apr 2002 | B1 |
D457133 | Yoneyama | May 2002 | S |
6384591 | Estep et al. | May 2002 | B1 |
6384982 | Spitzer | May 2002 | B1 |
6386107 | Rancourt | May 2002 | B1 |
6394278 | Reed | May 2002 | B1 |
6434251 | Jensen et al. | Aug 2002 | B1 |
6445175 | Estep et al. | Sep 2002 | B1 |
6446042 | Detlef et al. | Sep 2002 | B1 |
6453020 | Hughes et al. | Sep 2002 | B1 |
D463784 | Taylor et al. | Oct 2002 | S |
6466681 | Siska et al. | Oct 2002 | B1 |
D465208 | Lee et al. | Nov 2002 | S |
D465209 | Rath | Nov 2002 | S |
D466497 | Wikel et al. | Dec 2002 | S |
6496111 | Hosack | Dec 2002 | B1 |
D469080 | Kohli | Jan 2003 | S |
6511770 | Chang | Jan 2003 | B2 |
6562950 | Peretz et al. | May 2003 | B2 |
6581782 | Reed | Jun 2003 | B2 |
6600798 | Wuppermann et al. | Jul 2003 | B2 |
6615174 | Arslan et al. | Sep 2003 | B1 |
D482019 | Petersen et al. | Nov 2003 | S |
6660427 | Hukill et al. | Dec 2003 | B1 |
D487064 | Stekelenburg | Feb 2004 | S |
6697465 | Goss | Feb 2004 | B1 |
D488146 | Minto | Apr 2004 | S |
D488461 | Okada et al. | Apr 2004 | S |
6728325 | Hwang et al. | Apr 2004 | B1 |
D491917 | Asai | Jun 2004 | S |
D492295 | Glatt | Jun 2004 | S |
6745014 | Seibert et al. | Jun 2004 | B1 |
6754361 | Hall et al. | Jun 2004 | B1 |
6754632 | Kalinowski et al. | Jun 2004 | B1 |
6757651 | Vergin | Jun 2004 | B2 |
D494517 | Platto et al. | Aug 2004 | S |
6769762 | Saito et al. | Aug 2004 | B2 |
6769767 | Swab et al. | Aug 2004 | B2 |
6772114 | Sluijter et al. | Aug 2004 | B1 |
6778676 | Groth et al. | Aug 2004 | B2 |
6795805 | Bessette et al. | Sep 2004 | B1 |
D498231 | Jacobson et al. | Nov 2004 | S |
6826532 | Casby et al. | Nov 2004 | B1 |
6847336 | Lemelson et al. | Jan 2005 | B1 |
6885735 | Odinak et al. | Apr 2005 | B2 |
D506065 | Sugino et al. | Jun 2005 | S |
6909546 | Hirai | Jun 2005 | B2 |
D507523 | Resch et al. | Jul 2005 | S |
D512417 | Hirakawa et al. | Dec 2005 | S |
D512984 | Ham | Dec 2005 | S |
D512985 | Travers et al. | Dec 2005 | S |
7013018 | Bogeskov-Jensen | Mar 2006 | B2 |
D519497 | Komiyama | Apr 2006 | S |
7027774 | Kuon | Apr 2006 | B2 |
D521492 | Ham | May 2006 | S |
7046649 | Awater et al. | May 2006 | B2 |
7050598 | Ham | May 2006 | B1 |
7052799 | Zatezalo | May 2006 | B2 |
D524794 | Kim | Jul 2006 | S |
D525237 | Viduya | Jul 2006 | S |
7106877 | Linville | Sep 2006 | B1 |
7107057 | Arazi et al. | Sep 2006 | B2 |
7110800 | Nagayasu et al. | Sep 2006 | B2 |
D529447 | Greenfield | Oct 2006 | S |
D531586 | Poulet | Nov 2006 | S |
7136684 | Matsuura et al. | Nov 2006 | B2 |
D537438 | Hermansen | Feb 2007 | S |
7203651 | Baruch et al. | Apr 2007 | B2 |
D549694 | Viduya et al. | Aug 2007 | S |
D567218 | Viduya et al. | Apr 2008 | S |
D567219 | Viduya et al. | Apr 2008 | S |
D567799 | Viduya et al. | Apr 2008 | S |
D567806 | Viduya et al. | Apr 2008 | S |
7391863 | Viduya | Jun 2008 | B2 |
7519186 | Varma et al. | Apr 2009 | B2 |
7596489 | Kovesi et al. | Sep 2009 | B2 |
20010004310 | Kono | Jun 2001 | A1 |
20010017925 | Ceravolo | Aug 2001 | A1 |
20010017926 | Vicamini | Aug 2001 | A1 |
20010036058 | Jenks et al. | Nov 2001 | A1 |
20010036291 | Pallai | Nov 2001 | A1 |
20010046305 | Muranami et al. | Nov 2001 | A1 |
20010048586 | Itou et al. | Dec 2001 | A1 |
20020000470 | Lanzaro et al. | Jan 2002 | A1 |
20020003889 | Fischer | Jan 2002 | A1 |
20020009191 | Lucey et al. | Jan 2002 | A1 |
20020012832 | White et al. | Jan 2002 | A1 |
20020015008 | Kishida et al. | Feb 2002 | A1 |
20020016161 | Dellien et al. | Feb 2002 | A1 |
20020025455 | Yoneyama | Feb 2002 | A1 |
20020034683 | Takeshita et al. | Mar 2002 | A1 |
20020067825 | Baranowski et al. | Jun 2002 | A1 |
20020068610 | Anvekar et al. | Jun 2002 | A1 |
20020076060 | Hall et al. | Jun 2002 | A1 |
20020080987 | Almqvist | Jun 2002 | A1 |
20020085733 | Cottrell | Jul 2002 | A1 |
20020091518 | Baruch et al. | Jul 2002 | A1 |
20020091526 | Kiessling et al. | Jul 2002 | A1 |
20020110246 | Gosior et al. | Aug 2002 | A1 |
20020111197 | Fitzgerald | Aug 2002 | A1 |
20020131616 | Bronnikov et al. | Sep 2002 | A1 |
20020147016 | Arazi et al. | Oct 2002 | A1 |
20020147579 | Kushner | Oct 2002 | A1 |
20020152065 | Kopp | Oct 2002 | A1 |
20020159574 | Stogel | Oct 2002 | A1 |
20020194004 | Glinski et al. | Dec 2002 | A1 |
20020194005 | Lahr | Dec 2002 | A1 |
20030050786 | Jax et al. | Mar 2003 | A1 |
20030068061 | Huang | Apr 2003 | A1 |
20030095525 | Lavin et al. | May 2003 | A1 |
20030103413 | Jacobi, Jr. et al. | Jun 2003 | A1 |
20030118197 | Nagayasu et al. | Jun 2003 | A1 |
20030130852 | Tanaka et al. | Jul 2003 | A1 |
20030171921 | Manabe et al. | Sep 2003 | A1 |
20030182243 | Gerson et al. | Sep 2003 | A1 |
20030212480 | Lutter et al. | Nov 2003 | A1 |
20030217367 | Romano | Nov 2003 | A1 |
20040001588 | Hairston | Jan 2004 | A1 |
20040010407 | Kovesi et al. | Jan 2004 | A1 |
20040024586 | Andersen | Feb 2004 | A1 |
20040029610 | Ihira et al. | Feb 2004 | A1 |
20040046637 | Wesby Van Swaay | Mar 2004 | A1 |
20040049388 | Roth et al. | Mar 2004 | A1 |
20040063475 | Weng | Apr 2004 | A1 |
20040083095 | Ashley et al. | Apr 2004 | A1 |
20040091129 | Jensen et al. | May 2004 | A1 |
20040137969 | Nassimi | Jul 2004 | A1 |
20040138781 | Sacks et al. | Jul 2004 | A1 |
20040193411 | Hui et al. | Sep 2004 | A1 |
20040203521 | Nassimi | Oct 2004 | A1 |
20050040230 | Swartz et al. | Feb 2005 | A1 |
20050070337 | Byford et al. | Mar 2005 | A1 |
20050149414 | Schrodt et al. | Jul 2005 | A1 |
20050157903 | Bech | Jul 2005 | A1 |
20050232436 | Nagayasu et al. | Oct 2005 | A1 |
20050272401 | Zatezalo | Dec 2005 | A1 |
20070223766 | Davis | Sep 2007 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
2628259 | Dec 1977 | DE |
3604292 | Aug 1987 | DE |
0380290 | Aug 1990 | EP |
1018854 | Jul 2000 | EP |
Related Publications

Number | Date | Country
---|---|---
20070184881 A1 | Aug 2007 | US |