1. Field of the Invention
The present invention relates in general to communication systems such as voice paging systems, cellular telephone systems, text paging systems, voice mail systems, and conventional land line telephone systems and, more specifically, to methods, systems and apparatus for non-real-time audio and visual messaging between two messaging devices wherein a communication device receives visual messages from a visual message originator device and transmits audio messages (e.g., voice messages) to the visual message originator device for playback.
2. Discussion of the Background
A conventional mobile communication device (MCD), such as a pager or a mobile telephone, can receive various types of messages. There are numeric pagers which receive only numbers, such as a telephone number (e.g., 818-555-1212), and there are alpha/numeric pagers which can receive alpha/numeric messages (e.g., "Please call me at 929-555-1212"). Additionally, there are voice pagers which receive voice messages. In addition to receiving messages, pagers are often configured to transmit messages. An alpha/numeric pager may be configured to receive alpha/numeric messages from an Internet-based computer and to transmit alpha/numeric reply messages back to the Internet-based originator. Similarly, a voice pager that receives voice messages from a telephone-based originator can be configured to transmit a voice message to a voice mail system for retrieval by the originator via a telephone. In short, alpha/numeric pagers are configured to transmit alpha/numeric reply messages, and voice pagers are configured to transmit voice reply messages.
In the field of wireless messaging generally, numeric and alpha/numeric messaging is, by far, more bandwidth efficient than voice messaging. Radio spectrum is a dwindling resource, and wireless messaging companies are increasingly sensitive to their bandwidth efficiency. Better bandwidth utilization means more capacity on the system, and more capacity equates to more customers. Hence, the wireless messaging industry has migrated from voice paging to alpha/numeric paging in order to provide a bandwidth-efficient, robust and high-information-content messaging service for its so-called "road warrior" customers.
However, to date, there is no simple method for sending a message containing more than a small amount of information from an alpha/numeric pager. Generally, alpha/numeric pagers that have the ability to send messages are configured to transmit small, preprogrammed ("canned") text messages, such as "I will call you tonight," "yes," "no," or "I will reply later." Other alpha/numeric pagers have been configured with a miniature, QWERTY-type text keyboard. These miniature keyboards are difficult to type on because of their size, making input slow, inaccurate and frustrating. Additionally, even a miniature keyboard is still too large for a pocket-sized pager.
Audio pagers are generally larger than keyboardless alpha/numeric pagers, in part because of the large speaker required to reproduce a high-quality, audible audio message. An audio pager's battery life is also shorter than that of an alpha/numeric pager because of the time it takes to receive a message and the power required to process and play the message.
One solution to the above problem is to route a mobile-telephone-originated voice reply message to a computer. However, it is simply not convenient to send a voice message via a mobile telephone, configure a computer to receive the voice reply message, and then mentally correlate the sent and received messages (e.g., was the reply "sounds good to me" meant as an answer to the message "I will pay $140,000.00 for your home" or to "let's have lunch tomorrow at the fish place"?).
A need, therefore, exists to blend audio (e.g., voice) and visual (e.g., text and/or graphics) messaging in a manner that conserves valuable bandwidth and simplifies user input of messages in a mobile communication device. One attempt to that end is the so-called "smart phone." Smart phones are wireless mobile telephones that have added features, implying that they are smarter than the average telephone. These features may include numeric and alpha/numeric messaging, a personal digital assistant (PDA), computer functions, Internet access, and a miniature keyboard. Like an alpha/numeric pager, many mobile telephones today are capable of receiving alpha/numeric messages and of connecting to the Internet for sending alpha/numeric reply messages via a miniature keyboard. There is a need to simplify the input of reply messages into wireless devices for delivery of the reply messages to the originating device, wherein both messages may be correlated.
In an alpha/numeric pager messaging environment, text messages are routinely exchanged between a computer and a text pager (i.e. text in/text out). In a voice pager messaging environment, voice messages are routinely exchanged between a telephone configured voice mail system and a voice pager (i.e. voice in/voice out). In a mobile telephone messaging environment, as with an alpha/numeric pager, text messages are routinely exchanged in non real-time (i.e. text in/text out). However, mobile telephones also have the ability to make and receive real-time voice calls. More often than not, when a mobile telephone user receives a text message requiring some type of response, he or she will simply use the mobile telephone to place a real-time telephone call to the message originator. In many cases, the line is busy because the originator is either on the phone or connected to the Internet. In either case, the mobile caller is diverted to a voice mail system or answering device and is instructed to leave a message, which is then retrieved by the called person at a later time.
Many people prefer to communicate by messaging as opposed to real-time conversation in order to manage their time. Thus, there is a need for a device that can not only send non real-time text messages, but also receive non real-time voice messages (i.e. text out/voice in). At the wireless side of the messaging loop, there is a need to send non real-time voice messages from the same mobile device that receives non real-time text messages (i.e. text in/voice out).
It is widely accepted in the field of two-way paging that only a fraction of received messages generate a reply message. On the other hand, when text message reception capability is combined with a mobile telephone, the mobile telephone user will attempt to reply much more often via a real-time voice call. Mobile telephone companies call this process "call completion," and it is highly favored among mobile telephone companies because additional calling generates more revenue. There is a need to increase reply traffic in a wireless environment without decreasing the efficiency of text message delivery to a mobile communication device. There is also a need to simplify the input of messages in a mobile communication device, such as a pager or wireless mobile telephone. Finally, there is a need for a wireless messaging system that provides end-to-end audio and visual messaging, wherein only one device is required at each end of the messaging loop (e.g., a computer on one end and a mobile communication device on the other).
The present invention provides a communication system for integrating audio and visual messaging. The communication system includes a communication device for receiving visual messages and for transmitting voice messages to a recipient, and an integrated mail gateway for receiving from the communication device a voice message and addressing information. The integrated mail gateway is programmed to create an electronic mail (hereinafter “e-mail”) message comprising the voice message. The integrated mail gateway is also programmed to use the addressing information to address the e-mail message, and to send the addressed e-mail message to the recipient.
Preferably, the communication device is a wireless mobile communication device. However, this is not a requirement. The present invention is contemplated to be used with wireless as well as non-wireless communication devices.
In one embodiment, the communication system is used by a user of a communication device to send a voice message in reply to a received visual message. For example, consider the situation where a first person uses a messaging device with Internet e-mail capability to transmit a visual message to a second person. The communication system of the present invention enables the second person to easily transmit a voice reply message to the first person's messaging device. In one embodiment, the second person uses a communication device (such as a conventional telephone or mobile telephone having visual message reception capability) to establish a telephone call with an integrated mail gateway (IMG) that preferably has access to the visual message sent to the second person. After the telephone call is established, the second person uses the communication device to transmit, or the communication device automatically transmits, to the IMG addressing information associated with the visual message received from the first person. For example, a keypad on the communication device could be used to transmit the addressing information, or a processor in the communication device can be programmed to transmit the addressing information automatically. In one embodiment, after the IMG receives the addressing information, it prompts the second person to begin speaking a voice reply message for the first person after hearing a tone. The IMG records and stores the voice reply message. Optionally, the IMG formats the voice reply message into a conventional audio file format. The IMG then creates an e-mail message and includes the voice reply message in the e-mail. Optionally, the e-mail message includes at least part of the received visual message to which the voice message is a reply. The IMG uses the addressing information provided by the communication device to address the e-mail message. After the e-mail message has been addressed, it can be sent to the first person. Upon receiving the e-mail message, the first person's messaging device can play the voice reply message associated with the original visual message so that it is heard by the first person.
Further features and advantages of the present invention, as well as the structure and operation of various embodiments of the present invention, are described in detail below with reference to the accompanying drawings.
The accompanying drawings, which are incorporated herein and form part of the specification, illustrate various embodiments of the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
A first embodiment of the present invention is hereinafter described with reference to the accompanying diagram.
Preferably, messaging device 105 is directly connected to a data network 120 (such as the Internet or other like network) or connected to an internet service provider (ISP) 170(1) or 170(2) (such as America Online) that has a connection to data network 120. ISPs 170(1) and 170(2) each have a point of presence including data storage and retrieval equipment for enabling e-mail services and file transfer, as is well known to those skilled in the art of internet communications. Messaging device 105(3) is shown connecting to ISP 170(2) through PSTN 110. Although this is the most common way today for messaging devices to connect to an ISP, other connectivity options are available, such as satellite links and cable modems. Messaging device 105 is enabled for two-way e-mail and file transfer communications. Data network 120 is a communications network for transporting data. There are no limits to the type of data carried by data network 120. For example, the data can be simple text data or it can be voice or video.
An integrated message gateway (IMG) 150 is connected to a public switched telephone network (PSTN) 110 for automated storage and forwarding of audio messages. Operation of the IMG 150 will be described in detail hereinafter.
A Mobile Switching Office (MSO) 125 (also referred to as a base station) comprises all of the necessary equipment and software for enabling communications between mobile telephone 145 and PSTN 110 as well as data network 120. The MSO 125 and mobile telephone 145 are configured for voice communications and visual messaging. Other than the improvements to be described in detail hereinafter, the MSO 125 and mobile telephone 145 components are readily available and are well known to those skilled in the art of mobile telephone communications. The mobile telephone system described herein may be configured to operate using various format technologies well known in the art of mobile telephone systems, such as, for example, CDMA, TDMA, GSM, N-AMPS, etc.
A Pager Switching Office (PSO) 130 (also referred to as a base station) comprises all of the necessary equipment and software for enabling communications between a paging transceiver 140 and the PSTN 110 or the data network 120. The PSO 130 and paging transceiver 140 are configured for voice messaging from the paging transceiver 140 to the PSO 130 and for visual messaging from the PSO 130 to the paging transceiver 140. Visual messaging is well known to those skilled in the art of paging systems. Paging systems capable of transmitting visual messages may be purchased from Motorola, a U.S. manufacturer of paging equipment. Voice paging systems are also well known to those skilled in the art of paging systems. Motorola manufactures paging systems for transmitting voice messages to voice pagers. Readycom of Chapel Hill, N.C. produces a system for transmitting voice messages to cellular voice pagers and for transmitting voice messages from cellular voice pagers.
Other than the improvements to be described in detail hereinafter, the PSO 130 and paging transceiver 140 components are readily available and are well known to those skilled in the art of paging communications. The paging system described herein may be configured to operate using various format technologies well known to those skilled in the art of mobile telephone systems and paging systems, such as, for example, InFLEXion, pACT, TDMA, etc.
In operation, a user (not shown) enters an e-mail message via a keyboard (not shown) attached to messaging device 105. The e-mail message is addressed to one or more communication devices, such as paging transceiver 140 and/or a mobile telephone 145. The e-mail message is routed through the data network 120 to MSO 125 or PSO 130 for transmission to the designated communication device(s).
The process of sending an e-mail message from a messaging device 105 to a mobile communication device is well known to those skilled in the art of wireless visual messaging systems. However, the integration of audio and visual messaging presents novel features never before available to a wireless service company or end user. With the present invention, a cellular telephone company may now offer integrated (i.e., audio/visual) messaging services which are transported over the Internet, thereby achieving a substantial reduction in cost. Cellular telephone companies are offering e-mail type text messaging to mobile telephone subscribers today (through the Internet). In short, the cellular telephone company is already connected to the Internet. The present invention provides new opportunities for transporting non real-time voice messages over a network connection that would not be possible in real-time.
As illustrated above, a voice message may be routed from a mobile communication device 140, 145 to a messaging device 105 utilizing a variety of message routing designs. It is important to note that the voice message may be routed through one particular path while a visual message may be routed through a different path. For example, a text e-mail message may be routed from the messaging device 105(1) through the data network 120 to an e-mail server and short messaging service at the MSO 125. From the MSO 125, the visual e-mail text message is transmitted to the mobile telephone 145. The e-mail message is viewed by the user and the user speaks a voice reply message to be delivered back to the message originator at the messaging device 105(1). The voice message is then routed to the messaging device 105(1) through one of, or a variation of, the routes previously described. The system operator is given the option to choose a two-way messaging system for voice and visual messaging that utilizes the most efficient delivery path for routing messages depending on the type of message to be delivered or received (i.e. audio or visual). An MSO 125 or PSO 130 may now utilize the data network 120 for transporting voice messages.
Prior art systems currently exist for sending e-mail text messages from a computer over the Internet to a mobile communication device. The present invention enables a mobile communication device to send a voice reply message over the Internet (or other data network) to the user that originated the e-mail text message. This is a highly desirable feature. For example, consider a mobile device user who is driving a car and receives an e-mail message to which a reply is urgently required. Such a user is unable to safely use a keyboard to enter a text reply message, but can easily create a voice reply message while keeping his or her hands on the wheel.
The necessary software instructions and operating system for enabling mobile telephone 145 or paging transceiver 140 to receive visual messages are included and well known to those skilled in the art of mobile telephone and paging systems. Mobile telephones for two-way voice communications are commonly available today. Many of these mobile telephones receive and display visual messages such as text messages. Generally, this service is called Short Messaging Service or SMS.
One format for receiving SMS is known as Cellular Digital Packet Data or CDPD. There are many variations for text messaging in mobile telephones and there are many formats in which the text messages may be transmitted. A few mobile telephone types that receive text messaging are CDMA (Code Division Multiple Access), TDMA (Time Division Multiple Access), and GSM (Global System for Mobile Communications). There are also a plurality of languages and protocols for enabling a wireless mobile telephone to communicate over the Internet. A few of them are HDML (Handheld Device Markup Language), HDTP (Handheld Device Transport Protocol), TTML (Tagged Text Markup Language) and WAP (Wireless Application Protocol).
There are many prior art configurations for receiving visual messages by mobile telephone 145 or paging transceiver 140, and all such configurations are herein contemplated for use in combination with the novel features of the present invention. In short, it is widely known in the art of wireless paging systems and mobile telephone systems that paging transceivers and mobile telephones receive visual messages.
The process by which a visual message, such as text, is received is not critical. However, the novel system and method for processing the received message information in connection with transmitting a voice reply message will be described in detail hereinafter.
The antenna 351, antenna interface 352, receiver 353, transmitter 354, processing module 355 and user interface 356 are connected in a typical mobile telephone or paging transceiver configuration. A controller 357 and memory 358 have been included for processing of received visual messages, storage of visual messages, and processing of transmitted voice messages. The controller 357 comprises a conventional microprocessor of the type commonly used in mobile telephones and paging transceivers. The controller 357 also includes a memory manager, I/O ports, RAM and ROM memory and all necessary software instructions required to communicate with the processing module 355, user interface 356, and memory 358. The controller 357 connects to the processing module 355 for receiving and sending data, messages, and commands. The controller 357 is connected to memory 358 for storage and retrieval of messages and message data and to the user interface 356 for input and output interfacing with a user.
When a user wishes to send a voice message in reply to a received visual message, the user enables a voice reply mode via user interface 356. This causes the controller 357 to read the necessary addressing information from the processing module 355 and memory 358 for addressing and sending the voice reply message to the originator of the received message. Optionally, the user may enter addressing information associated with the received visual message manually using a keypad, as previously described. After enabling the voice reply mode, the user speaks into the microphone of the user interface 356, and a real-time voice message is transmitted to the IMG 150 for forwarding, in non real-time, to the visual message originator (e.g., messaging device 105). In an alternative embodiment of the present invention, a pre-recorded voice message is transmitted to the IMG 150 as will be described in detail hereinafter.
When a mobile communication device receives a visual message, the mobile communication device also receives and stores messaging data associated with the visual message. This messaging data may include, but is not limited to: a reply name, a reply address, a reply code, a reply type code, and reply subject matter.
The reply name is generally the name of the person who sent the visual message. The reply address may be an e-mail address such as "jsmith@hotmail.com" or an abbreviated address such as a sequence of letters and/or numbers that is associated with an e-mail address stored at the IMG 150. The reply address may also include an IMG 150 system identifier or e-mail system identifier or a telephone number to a particular IMG 150. The reply code is a code that corresponds to the original visual message stored at the MSO's 125 or PSO's 130 e-mail system or at the IMG 150. The reply type code indicates the type of voice message that is allowable by the system (e.g., pre-recorded, real-time, analog, digital, format, etc.). The type code may also indicate the allowable length or size of a voice message. The reply subject matter may contain the original text subject matter of the received visual message.
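By way of illustration only, the messaging data described above could be represented in software as a simple record. The sketch below (Python, with hypothetical field names and example values drawn from this description) is not a prescribed over-the-air encoding.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MessagingData:
    """Messaging data received with a visual message (hypothetical field names)."""
    reply_name: str                           # name of the person who sent the visual message
    reply_address: Optional[str] = None       # full e-mail address or abbreviated address
    reply_code: Optional[str] = None          # points to the original message stored at the MSO/PSO or IMG
    reply_type_code: Optional[str] = None     # allowable reply type (pre-recorded, real-time, etc.)
    reply_length_secs: Optional[int] = None   # allowable length of the voice reply, if any
    reply_subject: Optional[str] = None       # subject matter of the received visual message

# Example: a message that may be answered with a 30 second real-time voice reply.
incoming = MessagingData(reply_name="John Doe", reply_code="12",
                         reply_type_code="real-time", reply_length_secs=30,
                         reply_subject="Lunch tomorrow?")
print(incoming.reply_name, incoming.reply_code)
```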
A complete system according to the preferred embodiment of the present invention will now be described in connection with the mobile telephone 145 and the MSO 125 system.
In order to send a visual message from the messaging device 105 to the mobile telephone 145, a user of messaging device 105 creates an e-mail message using an e-mail program, such as one provided by AOL, Lotus, Netscape or Microsoft. The e-mail message is sent from the messaging device 105 to the MSO 125 via data network 120. An e-mail gateway (EMG) 115 is configured at the MSO 125 for receiving the e-mail message for delivery to the mobile telephone 145. Such a configuration is well known to those skilled in the art of mobile telephone systems.
After the e-mail message is received by the EMG 115, the MSO 125 transmits a signal to the mobile telephone 145. The signal includes the e-mail message and messaging data associated with the e-mail message. For this example, the signal consists of a reply name of "John Doe," a reply code, and the e-mail message.
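The exact signal layout is not reproduced here; purely as an illustration, the paired e-mail text and messaging data might be grouped as follows (Python, with a hypothetical message body, and with JSON used only as a stand-in for the actual air-interface encoding).

```python
import json

# Hypothetical grouping of the signal contents for this example; the real
# framing (e.g., an SMS payload) and the message text itself are assumptions.
signal = {
    "reply_name": "John Doe",
    "reply_code": "12",
    "message": "Please confirm the noon meeting.",
}

payload = json.dumps(signal).encode("utf-8")   # stand-in for the over-the-air encoding
print(len(payload), "bytes to transmit")
```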
The antenna 351 of mobile telephone 145 receives the signal sent from the MSO 125. The receiver 353 demodulates the signal to recover the e-mail message and message data contained in the signal. The processing module 355 stores the e-mail message and message data in memory and causes the user interface 356 to alert the user that a message has arrived.
Upon being alerted, the user may activate the user interface 356 to cause the processing module 355 to read the stored visual message and associated messaging data from the memory and display it on a display screen for viewing by the user. The time at which the message was sent or delivered may also be displayed on the display screen. Additionally, a reply indicator, which is triggered by the reply code and indicates to the user that a voice reply may be sent to the selected visual message, is displayed. After viewing the displayed information, the user decides to send a voice reply message to the originator by selecting a reply option provided by the user interface.
In step 406, the controller 357 searches an electronic address book (hereafter “address book”) stored in memory 358 for an address associated with the reply name that was included in the messaging data. If the controller 357 does not find such an address, then flow proceeds to step 410, otherwise flow proceeds to step 412. In step 410, the user is informed via a visual or audible indicator that there is no return address and the process ends.
It should be noted that the reply code search at step 404 was for a code that corresponds to the visual message stored at the MSO 125. If the reply code does not exist, the reply name included in the messaging data is used to search for a corresponding address. As previously described, the reply code might simply be the sequence “12”. Reply codes reduce air time because all of the necessary addressing information is stored at the MSO 125 and the reply code points to that data.
In step 412, the controller 357 of mobile telephone 145 attempts to establish communications with the IMG 150 at the MSO 125 by dialing the pre-programmed telephone number read from the memory 358 or obtained from the messaging data. This attempt is accomplished automatically by the controller 357. After step 412, flow proceeds to step 414, where a determination is made as to whether communication has been established. If communication is not established, flow proceeds to step 416; otherwise flow proceeds to step 418.
In step 416, the user is informed that a connection could not be established and the process ends. If this occurs, the user can simply attempt to send the reply message at a later time. In step 418, the mobile telephone transmits addressing data to the IMG 150, and the IMG 150 transmits acceptance or error codes back to the mobile telephone 145. The addressing data either includes the reply code or the address determined in step 406.
After step 418, flow proceeds to step 420, where the controller 357 checks for "invalid data" error messages sent from the IMG 150. If an invalid data error is detected, the controller 357 tags the invalid addressing data stored in the memory 358 as invalid and reads the memory 358 for valid messaging data in order to correct the problem. For example, if the mobile telephone 145 sends a reply code associated with a message that is no longer available to the IMG 150, the IMG 150 will send an error code to the mobile telephone 145 indicating that the message cannot be replied to using the current reply code.
When the controller 357 receives this error code, it reads the memory 358 to see if there is a name and address for the recipient stored in memory 358 (step 422). If a name and address are found (step 428), flow proceeds to step 418 and the process continues. If the controller 357 does not find valid addressing data, flow proceeds to step 430, where the user is informed of the problem via an indicator and the process ends.
If at step 420 it is determined that the addressing data is valid, flow proceeds to the message record process at step 424. At step 424, the IMG 150 prompts the mobile telephone 145 user to begin speaking a message for the designated recipient at the tone. The mobile telephone user may also be prompted to begin speaking via an LCD or any other means of indication. The user may also be informed of how much time is available for speaking a voice message, for example, "begin speaking a 30 second message for Mary Jane at the tone." However, a preferred embodiment provides a more efficient means for notifying a user of the amount of available record time. As previously discussed, a reply type code can be included in the messaging data transmitted with the visual message. The reply type code can include a length code indicating the amount of time allowable for a reply message. This feature can be very useful for a mobile telephone 145 user, in that the user is informed of the amount of time given to reply at the time the visual message is viewed. This advance notice gives the user time to compose a reply message in advance, as opposed to finding out two seconds before recording starts.
At step 424, the voice message is recorded by the IMG 150 in a manner consistent with conventional voice mail systems well known to those skilled in the art of voice mail systems. For example, the user may be given the option to review the message, delete the message, re-record the message, etc. When the user is finished recording, flow proceeds to step 426, where the IMG 150 notifies the mobile telephone 145 user that the voice message has been accepted and/or sent, and the process ends.
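The handset-side reply flow just described can be summarized in code. The following sketch (Python) mirrors the step numbers of the flow chart; the StubIMG object, the address book dictionary, and the helper method names are assumptions made purely for illustration.

```python
class StubIMG:
    """Minimal stand-in for the IMG 150, used only to make this sketch runnable."""
    def connect(self):
        return True                                          # steps 412/414: call established
    def send_addressing(self, addressing):
        return "ok" if addressing else "invalid data"        # response to steps 418/420
    def record_voice_message(self):
        print("(tone) ... user speaks the voice reply ...")  # step 424

def send_voice_reply(messaging_data, address_book, img):
    # Step 404: look for a reply code corresponding to the message stored at the MSO.
    addressing = {"reply_code": messaging_data.get("reply_code")}
    if not addressing["reply_code"]:
        # Step 406: no reply code, so search the address book by reply name.
        address = address_book.get(messaging_data.get("reply_name"))
        if not address:
            return "no return address"             # step 410: inform user, end
        addressing = {"address": address}

    if not img.connect():                          # steps 412/414: dial the IMG
        return "connection failed, try later"      # step 416

    retried = False
    while True:
        if img.send_addressing(addressing) != "invalid data":   # steps 418/420
            break
        if retried:
            return "addressing data unavailable"   # step 430
        # Step 422: the reply code is stale; fall back to a stored name/address.
        address = address_book.get(messaging_data.get("reply_name"))
        if not address:
            return "addressing data unavailable"   # step 430
        addressing = {"address": address}          # step 428: retry step 418
        retried = True

    img.record_voice_message()                     # step 424: record at the tone
    return "voice reply accepted"                  # step 426

print(send_voice_reply({"reply_name": "John Doe", "reply_code": "12"},
                       {"John Doe": "jdoe@example.com"}, StubIMG()))
```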
As illustrated by the above process, a voice message may be sent as a reply message to a visual message. A number of schemes may be used to send the voice reply message to a messaging device 105 so that the messaging device 105 user knows that the voice message is a voice reply to a particular visual message. In one scheme, the mobile telephone 145 simply transmits a reply code to the IMG 150. The reply code corresponds to a particular visual message available to the IMG 150. For example, the visual message could be stored within the IMG 150, MSO 125 or EMG 115. When the IMG 150 formats the voice reply message, the IMG 150 can include in the reply message the particular visual message associated with the reply code.
In another scheme, if the visual message is not available to the IMG 150, the mobile telephone 145 searches its memory 358 for a name or address. If found, the name or address, together with the received visual message or part of the received visual message, is transmitted along with the voice message to the IMG 150, which will then create an e-mail message containing the voice message and the received visual message or part thereof. The name or address sent to the IMG 150 is used by the IMG 150 to address the e-mail message.
In short, if the visual message is available to the IMG 150, and the messaging data transmitted to the mobile telephone 145 includes a reply code, then there is no need for the mobile telephone 145 to transmit to the IMG 150 anything other than the voice message and the reply code, thereby saving valuable bandwidth.
Voice messages that are not reply messages may also be initiated by the mobile telephone 145. The mobile telephone user may simply select a pre-stored name or address from a name and address book (also referred to as a "send message list") stored in memory 358. When the name appears on the display screen, the user simply uses the user interface to select a send message function, and the process starts at step 412.
Voice messages transmitted from the mobile telephone 145 may be analog or digital. If analog transmission is utilized, the IMG 150 converts the analog voice message to digital data representative of the voice message for storage and transmission to the messaging device 105. If the voice message is in digital format, the digital data representative of the voice message may be stored directly by the IMG 150.
Additionally, a voice message may be pre-recorded and stored in memory 358 for transmission to the IMG 150. In such a configuration, a digital signal processor and associated analog-to-digital converter may also be configured with the controller 357 in a conventional manner for recording voice messages and storing the voice messages as data in memory 358. If the stored voice message is to be transmitted in analog format, then a digital-to-analog converter may also be utilized for converting the voice message data stored in memory 358 back to analog. For a pre-recorded voice message configuration, step 424 is modified so that the pre-recorded voice message stored in memory 358 is transmitted to the IMG 150.
In connection with the paging transceiver 140, it should be noted that a telephone call is not normally initiated between a paging transceiver and the pager switching office (PSO) 130. In the case of paging transceiver 140 sending voice messages to the PSO 130, a pre-recorded voice message, as opposed to a real-time voice message, may be sent as previously described. However, at step 412, a telephone connection is not established. Step 412 would be changed to: page the PSO 130 utilizing a conventional voice paging protocol such as InFLEXion, pACT, etc. Step 414 would be changed to: did the PSO 130 respond to the selective call (i.e., page and handshake)? Finally, step 424 would be changed to: transmit the pre-recorded voice message using a paging protocol. Optionally, prior to transmission, the pre-recorded voice message can be compressed using conventional compression algorithms to decrease air time.
The mail server 530(2) is coupled to data network 120 and receives visual messages sent from messaging device 105. The ECS 531 retrieves visual messages from the mail server 530 and reformats the messages for wireless transmission. The ECS 531 sends the reformatted messages to the SMS 532, where they are queued and sent to the MSO 125 for transmission to a targeted mobile telephone 145. There are many formats and systems available for delivering e-mail messages to the MSO 125 or PSO 130 for transmission to a mobile telephone 145 or paging transceiver 140, respectively. These systems are well known to those skilled in the art of wireless messaging. Other than the improvement described hereinafter, these systems are readily available.
The VMS 510 is connected to the MSO 125 so that voice messages sent from mobile telephone 145 can be received and stored by the VMS 510. The VMS 510 can be directly connected to the MSO 125.
The VMS 510 is an enhanced voice mail system. Voice mail systems are readily available from manufacturers such as Centigram Communications Corporation in San Jose, Calif., Nortel Networks in Santa Clara, Calif. and AVT in Kirkland, Wash., to name a few. The VMG 520 is an enhanced voice mail gateway. Voice mail gateway systems are also readily available from the above manufacturers. Voice mail systems and voice mail gateways are generally software driven and adaptable to various messaging environments. These systems are easily networked for communication between different manufacturers' platforms. Additionally, these systems have evolved to the point that many have the capability of sharing messages and data between platforms. One such system manufacturer is Data Connection Limited in Enfield, England. Data Connection Limited manufactures voice mail systems, voice mail gateways and networking systems which communicate utilizing a protocol called "Voice Profile for Internet Mail" (VPIM). The VPIM protocol is a common messaging language for the transport of voice messages between platforms. VPIM additionally allows for the sending of voice messages from a computer or voice mail system to a voice mail system or computer via the Internet. Other than the improvements described hereinafter, voice mail systems, voice mail gateways, voice mail networks, mail servers, e-mail gateways, short messaging service systems, MSO systems, protocols for transmitting voice messages over the Internet and protocols for transmitting e-mail messages to a wireless transceiver are well known to those skilled in the art of these systems and are readily available.
After step 602, flow proceeds to step 604, where the VMS 510 accepts or rejects the call based on the subscriber identification data. If the call is rejected, flow proceeds to step 606, otherwise flow proceeds to step 608. In step 606 the call is terminated, an error message is transmitted to the mobile telephone 145, and flow returns to step 600.
In step 608, the VMS 510 receives data from the mobile telephone 145. The data may include a request code. The request code indicates the type of action requested. For example, a request code of “01” indicates that the current request is for sending a general message, and a request code “02” indicates that the current request is for sending a reply message. Request codes may also be used for forwarding, cc, bcc, etc. The data may also include addressing information such as an e-mail address, an abbreviated e-mail address, a name, subject matter, type, a reply code, a coded address, etc.
In a preferred embodiment, the present invention utilizes messaging data codes whenever possible in order to conserve transmission time. When a mobile telephone 145 user wishes to originate a message (as opposed to reply to a message), the user selects the name of the person to whom the message is to be sent from an address book stored in the telephone 145. When the user activates the send message command after selecting a recipient, only the message to be forwarded to the recipient, a request code, and a coded address normally needs to be transmitted to the IMG 150.
The request code directs the VMS 510 to perform an action, in this case, the example is to send a message. The coded address corresponds to all other information required to send the message to the recipient, such as the intended recipient's name, e-mail address, message type code, etc. To send a reply message, only the request code and a reply code need be sent together with the reply message.
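As an illustration of the request-code handling just described, the sketch below (Python) resolves a request using the "01" (send) and "02" (reply) codes mentioned above; the lookup tables, the coded address value and the function name are hypothetical.

```python
# Hypothetical lookup tables held at the IMG/VMS side.
ADDRESS_BOOK = {"A7": {"name": "Mary Jane", "email": "mjane@example.com"}}                   # coded address -> recipient
MESSAGE_ARCHIVE = {"12": {"reply_to": "jsmith@hotmail.com", "subject": "Lunch tomorrow?"}}   # reply code -> original message

def resolve_request(request_code, coded_address=None, reply_code=None):
    if request_code == "01":                     # send a general message
        entry = ADDRESS_BOOK.get(coded_address)
        return entry and {"to": entry["email"], "name": entry["name"]}
    if request_code == "02":                     # reply to a received visual message
        original = MESSAGE_ARCHIVE.get(reply_code)
        return original and {"to": original["reply_to"], "subject": original["subject"]}
    return None                                  # forwarding, cc, bcc, etc. omitted here

print(resolve_request("02", reply_code="12"))
print(resolve_request("01", coded_address="A7"))
```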
It should be noted that there are many methods contemplated that may be used for finding information stored at the MSO 125 or IMG 150. A mobile telephone 145 may, for example, transmit the name of the message recipient. The VMS 510 may utilize the received name for looking up the associated address stored in a database associated with the IMG 150 in order to reduce the amount of transmission time required by the mobile telephone 145. The mobile telephone 145 according to the preferred embodiment of the present invention does not transmit the actual addressing information if that information is otherwise available to the IMG 150 or associated interconnected systems via a coded address or the like.
It is beneficial for the mobile telephone to transmit a complete recipient name and address when the mobile telephone 145 may be roaming in another system area where the IMG 150 does not have a corresponding address book. If at step 616 the VMS 510 determines that coded data representing a recipient was received (i.e. coded address), flow proceeds to step 618. At step 618, the VMS 510 searches an address book stored in its database for a corresponding match. The VMS 510 may alternatively utilize an algorithm for converting the code to an address or name.
After step 618, flow proceeds to step 620, where the VMS 510 determines if the recipient address can be produced from the coded information or found in the VMS 510 address book. If at step 620 an address cannot be obtained, flow proceeds to step 622, where an error is transmitted to the mobile telephone 145, the connection is terminated and the process is returned. If an address is obtained at step 620, flow proceeds to step 626.
Step 626 is a voice recording process performed by the VMS 510. If a real-time voice message is to be received by the VMS 510, then voice prompts are returned to the mobile telephone 145 (e.g., “start recording at the tone”). The VMS 510 allows a caller to review, re-record, append, erase, etc., messages in a manner consistent with typical voice mail systems. If, on the other hand, a non-real-time message is to be received, then voice prompts are not returned. A record type indicator is transmitted from the mobile telephone 145 at the beginning of step 626 indicating the desired recording format (i.e., real-time vs. pre-recorded). A paging transceiver 140 may, for example, utilize the pre-recorded format while a mobile telephone 145 may utilize a real-time recording format.
At step 626, the voice message is received from the mobile telephone 145 and recorded by the VMS 510. Recording stops after an assigned amount of time or after the VMS 510 receives a stop record command from the mobile telephone 145. Recording may also be terminated by the VMS 510 responsive to a noisy communications connection. When the recording is complete, the VMS 510 sends an acknowledgment to the mobile telephone 145 and the communication is terminated (step 628). Flow then proceeds to step 630, where the VMS 510 transfers the voice message and the necessary addressing information to the VMG 520.
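A simple view of the recording step, with the three stop conditions noted above (the assigned record time, a stop-record command, or a connection judged too noisy), might look like the following; the frame source, the stop marker and the time limit are illustrative assumptions.

```python
import time

def record_voice_message(read_audio_frame, max_seconds=30):
    """Collect audio until the time limit, a stop command, or loss of usable audio."""
    frames, start = [], time.monotonic()
    while time.monotonic() - start < max_seconds:   # assigned amount of record time
        frame = read_audio_frame()                  # caller-supplied audio source
        if frame == b"STOP" or frame is None:       # stop-record command, or line too noisy
            break
        frames.append(frame)
    return b"".join(frames)

# Usage with a fake three-frame source standing in for the telephone audio path.
fake_frames = iter([b"\x01\x02", b"\x03\x04", b"STOP"])
audio = record_voice_message(lambda: next(fake_frames, None), max_seconds=5)
print(len(audio), "bytes recorded")
```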
The VMG 520 converts the voice message to a conventional audio file format suitable for transmission over the Internet 120 and reproduction by the destination messaging device 105. An example of a conventional audio file format is the “.wav” format developed by Microsoft. The VMG 520 also creates an e-mail message comprising the converted voice message and uses the addressing information to address the e-mail. The converted voice message can be stored in a file and attached to the e-mail message.
The addressed e-mail message is then forwarded to a mail server such as the mail server 530 (step 632). The VMG 520 preferably inserts the words “voice message” in the subject line of the e-mail and inserts instructions for playing the message in the body of the e-mail. It is preferable that the e-mail message be given a priority level equal to primary mail so that it will not be inadvertently filtered by a recipient's e-mail system that limits attachments or treats attachments as secondary mail. After the addressed e-mail message is delivered to the mail server 530, the process returns to step 600. The mail server 530 is responsible for sending the addressed e-mail message to the intended recipient.
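By way of example, an e-mail of the kind the VMG 520 might construct is sketched below using Python's standard e-mail library; the sender and recipient addresses, the file name and the priority header are assumptions, and the patent does not prescribe any particular mail library.

```python
from email.message import EmailMessage

def build_voice_email(voice_wav_bytes, to_addr, from_addr="img@example.com"):
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["From"] = from_addr
    msg["Subject"] = "voice message"              # subject line as suggested above
    msg["X-Priority"] = "1"                       # ask that it be treated as primary mail
    msg.set_content("A voice message is attached. Open the attached .wav file "
                    "with any audio player to hear it.")
    msg.add_attachment(voice_wav_bytes, maintype="audio", subtype="wav",
                       filename="voice-message.wav")
    return msg

# Placeholder bytes stand in for the converted ".wav" audio produced by the VMG.
email_msg = build_voice_email(b"RIFF....WAVEfmt ", "jsmith@hotmail.com")
print(email_msg["Subject"], "->", email_msg["To"])
```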
After step 640, flow proceeds to step 642, where the VMS 510 determines if the reply information corresponding to the received reply code is present in an archive file. If the reply information is not present, then flow proceeds to step 644, otherwise flow proceeds to step 658.
In step 644, the VMS 510 requests reply addressing information from the mobile telephone 145. Flow then proceeds to step 646, where the VMS 510 analyzes the information received from the mobile telephone 145. If the reply addressing information is complete, flow proceeds to the record process (step 658). If the reply information is coded, flow proceeds to step 648, where the address and/or name is calculated or found, as previously described. Flow then proceeds to step 650, where the VMS 510 makes a determination as to the validity of the received data. If no addressing data was received or if the data received was not valid, flow proceeds to step 652, where an error code is returned to the mobile telephone 145, communication is terminated, and the process returns to step 600. If at step 650 it is determined that the data is valid, flow proceeds to step 658.
At step 658, if the process flow is from step 642, the "reply to" name, address, subject matter and text are obtained from the e-mail archive at the mail server 530 via the VMG 520 and appended to the voice message by the VMS 510. The "mailed from" information, as previously described, is obtained by the VMS 510 when communication is established, by cross-indexing the subscriber identification data with the subscriber database associated with the VMS 510. At step 658, if the process flow is from step 650 or step 646, a "reply to" name, address and subject matter, including the "mailed from" information, is appended or attached to the voice message by the VMS 510.
It should be noted that the preferred method for receiving reply information from a mobile telephone 145 is to receive a reply code in order to conserve transmission time. The original message file stored in the archive at the mail server 530 provides all of the necessary information required to send a reply. The next preferred method for receiving reply information from the mobile telephone 145 is to send a coded address and subject matter. The coded address may then be correlated with the subscriber's mailing list stored at the VMS 510 in order to produce the name and/or address. The least favorable method is to receive from the mobile telephone 145 the complete name, complete address and subject matter.
During step 658, the voice message is recorded by the VMS 510 in the same manner as described with reference to step 626. After step 658, flow proceeds to step 660 where the call is terminated. The VMS 510 then sends to the VMG 520 the recorded voice message, addressing data, and a pointer to or the actual visual message to which the voice message is a reply (step 662). The VMG 520 converts the recorded voice message to an acceptable Internet and messaging device 105 format and packages the reply voice message with the original visual message for transmission. Flow proceeds to step 664, where the VMG 520 forwards the packaged addressed message (i.e. audio and visual) to the mail server 530 for transmission to the messaging device 105, as previously described, and the process returns to step 600.
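Purely as an illustration of how the reply voice message can be packaged together with the original visual message (steps 658 through 664), consider the following sketch; the archive entry fields and the message text are hypothetical.

```python
def package_reply(voice_wav_bytes, archive_entry):
    """Combine the recorded voice reply with the original visual message for delivery."""
    body = ("A voice reply to your message is attached.\n\n"
            "--- Original message ---\n" + archive_entry["text"])
    return {
        "to": archive_entry["reply_to"],
        "subject": "voice message re: " + archive_entry["subject"],
        "body": body,
        "attachment": ("voice-reply.wav", voice_wav_bytes),
    }

reply = package_reply(b"RIFF....WAVEfmt ",
                      {"reply_to": "jsmith@hotmail.com",
                       "subject": "Lunch tomorrow?",
                       "text": "Can we meet at noon at the fish place?"})
print(reply["subject"])
```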
As previously described, it is preferable that the mobile telephone 145 transmit coded address information representative of an actual address stored in the IMG 150. However, there may be times, such as when the mobile telephone is roaming in another IMG 150 system area, when it is necessary to transmit an un-coded name and address. In order to solve this problem, the mobile telephone 145, according to one embodiment, may store both the complete name and address and a coded address which corresponds to the complete address information stored at the IMG 150.
When the IMG 150 sends a visual message to the mobile telephone 145, complete addressing information is also sent if the coded address is not already stored in memory 358 of the mobile telephone 145. In other words, the IMG 150 need not transmit information to the mobile telephone 145, if the information is already stored there. When a visual message is transmitted to the mobile telephone 145, the IMG 150 first sends a coded address to the mobile telephone 145. The controller 357 then searches for corresponding complete addressing information stored in memory 358. If a match is found, the mobile telephone 145 indicates to the IMG 150 that a match was found. The IMG 150 then does not need to transmit the actual data. It is a simple matter for the controller 357 to insert the corresponding name and address in the proper place within the message indicating from whom the message was sent. If the complete address information including the name is not stored in the memory 358, the mobile telephone 145 indicates to the IMG 150 that a match was not found. The IMG 150 then transmits the complete information to the mobile telephone 145 for storage.
When the mobile telephone 145 receives a visual message from the IMG 150, the message is stored as previously described for viewing on an LCD type display. When a mobile telephone 145 user views the message, the user may elect to save the name and address. The name and address may be saved in memory 358 by selecting the “add to list” option on the user interface which causes the controller 357 to store the address information in the address book in memory 358 for addressing out-going voice messages.
The first time that the mobile telephone 145 transmits the saved address information to the IMG 150 (i.e. when sending a message), the IMG 150 will issue an associated coded address to the mobile telephone 145 for storage in memory 358. The mobile telephone 145 then adds the coded address to the entry in its address book associated with the stored address. The IMG 150 adds the address information and coded address to its address book. In this manner, the mobile telephone 145 need only transmit the complete address information one time. Thereafter, only the coded address need be transmitted.
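The coded-address bookkeeping described in the last few paragraphs can be pictured as follows; the data shapes, the code value "A7" and the function names are assumptions for illustration only.

```python
# Handset-side address book: the full address is kept alongside the coded
# address once the IMG has issued one (stored in memory 358).
handset_book = {"Mary Jane": {"address": "mjane@example.com", "code": None}}

def address_to_transmit(name):
    entry = handset_book[name]
    # First transmission: no code yet, so the complete address is sent one time.
    return entry["code"] if entry["code"] else entry["address"]

def on_code_issued_by_img(name, code):
    # The IMG returns a coded address after that first full transmission;
    # thereafter only the short code needs to be sent over the air.
    handset_book[name]["code"] = code

print(address_to_transmit("Mary Jane"))    # full address the first time
on_code_issued_by_img("Mary Jane", "A7")
print(address_to_transmit("Mary Jane"))    # only the coded address afterwards
```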
A mobile telephone subscriber may also add an address to the mobile telephone 145 address book using a messaging device 105. The subscriber may simply send an “address list message” to their own mobile telephone 145. An “address list message” is a visual message having a predetermined format and including a list of names and corresponding addresses. One example of an address list message is an e-mail message wherein the body of the e-mail includes a list of address book entries, wherein each entry includes a name and at least one corresponding address.
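Since the predetermined format of an "address list message" is not spelled out above, the example below assumes a simple one-entry-per-line, comma-separated body purely for illustration.

```python
# Hypothetical address list message body: one "name, address" entry per line.
body = """John Doe, jdoe@example.com
Mary Jane, mjane@example.com"""

def parse_address_list(text):
    entries = []
    for line in text.splitlines():
        if "," in line:
            name, address = (part.strip() for part in line.split(",", 1))
            entries.append({"name": name, "address": address})
    return entries

for entry in parse_address_list(body):
    print(entry["name"], "->", entry["address"])
```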
The integrated voice and visual messaging system described and illustrated herein is readily adaptable to a plurality of messaging formats, protocols, modulation schemes and system configurations. Voice messages may be transmitted from a mobile communication device such as a paging transceiver 140 or a mobile telephone 145 to an integrated message gateway for forwarding over the Internet or other type of network to a personal computer or other visual messaging device. Voice messages may be transmitted to the integrated mail gateway in analog or digital format. Additionally, voice messages may be pre-recorded at the mobile communication device for non real-time transmission or real-time voice messages may be transmitted to the integrated mail gateway for forwarding in non real-time. The system is adaptable to various wireless telephone systems and paging systems. The mobile communication device may be configured in a mobile telephone, pager, wireless PDA, or other wireless device which provides visual messaging and includes means for voice communications.
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
This application is a continuation of U.S. patent application Ser. No. 09/408,841 filed Sep. 30, 1999, which claims the benefit of U.S. Provisional Application Ser. No. 60/126,939, filed Mar. 29, 1999, and U.S. Provisional Application Ser. No. 60/155,055, filed Sep. 21, 1999. All of the above mentioned applications are hereby incorporated herein by this reference.
5812671 | Ross, Jr. | Sep 1998 | A |
5812795 | Horovitz et al. | Sep 1998 | A |
5812865 | Theimer et al. | Sep 1998 | A |
5815800 | Su et al. | Sep 1998 | A |
5818824 | Lu et al. | Oct 1998 | A |
5821874 | Parvulescu et al. | Oct 1998 | A |
5826187 | Core et al. | Oct 1998 | A |
5826191 | Krishnan | Oct 1998 | A |
5828882 | Hinckley | Oct 1998 | A |
5835089 | Skarbo et al. | Nov 1998 | A |
5838252 | Kikinis | Nov 1998 | A |
5841837 | Fuller et al. | Nov 1998 | A |
5845211 | Roach, Jr. | Dec 1998 | A |
5850594 | Cannon et al. | Dec 1998 | A |
5857020 | Peterson, Jr. | Jan 1999 | A |
5862325 | Reed et al. | Jan 1999 | A |
5864606 | Hanson et al. | Jan 1999 | A |
5867793 | Davis | Feb 1999 | A |
5870030 | DeLuca et al. | Feb 1999 | A |
5870454 | Dahlen | Feb 1999 | A |
5872779 | Vaudreuil | Feb 1999 | A |
5872847 | Boyle et al. | Feb 1999 | A |
5872926 | Levac et al. | Feb 1999 | A |
5872930 | Masters et al. | Feb 1999 | A |
5875436 | Kikinis | Feb 1999 | A |
5878230 | Weber et al. | Mar 1999 | A |
5878351 | Alanara et al. | Mar 1999 | A |
5884159 | Thro et al. | Mar 1999 | A |
5884160 | Kanazaki | Mar 1999 | A |
5887249 | Schmid | Mar 1999 | A |
5889852 | Rosecrans et al. | Mar 1999 | A |
5892909 | Grasso et al. | Apr 1999 | A |
5893032 | Maeda et al. | Apr 1999 | A |
5893091 | Hunt et al. | Apr 1999 | A |
5895471 | King et al. | Apr 1999 | A |
5903723 | Beck et al. | May 1999 | A |
5903845 | Buhrmann et al. | May 1999 | A |
5905495 | Tanaka et al. | May 1999 | A |
5905777 | Foladare et al. | May 1999 | A |
5905865 | Palmer et al. | May 1999 | A |
5907805 | Chotai | May 1999 | A |
5909491 | Luo | Jun 1999 | A |
5913032 | Schwartz et al. | Jun 1999 | A |
5918158 | LaPorta et al. | Jun 1999 | A |
5920826 | Metso et al. | Jul 1999 | A |
5924016 | Fuller et al. | Jul 1999 | A |
5928325 | Shaughnessy et al. | Jul 1999 | A |
5930250 | Klok et al. | Jul 1999 | A |
5930701 | Skog | Jul 1999 | A |
5933478 | Ozaki et al. | Aug 1999 | A |
5936547 | Lund | Aug 1999 | A |
5938725 | Hara | Aug 1999 | A |
5943397 | Gabin et al. | Aug 1999 | A |
5948059 | Woo et al. | Sep 1999 | A |
5951638 | Hoss et al. | Sep 1999 | A |
5958006 | Eggleston et al. | Sep 1999 | A |
5961590 | Mendez et al. | Oct 1999 | A |
5961620 | Trent et al. | Oct 1999 | A |
5963618 | Porter | Oct 1999 | A |
5966652 | Coad et al. | Oct 1999 | A |
5974447 | Cannon et al. | Oct 1999 | A |
5974449 | Chang et al. | Oct 1999 | A |
5978837 | Foladare et al. | Nov 1999 | A |
5988857 | Ozawa et al. | Nov 1999 | A |
5991615 | Coppinger et al. | Nov 1999 | A |
5995597 | Woltz et al. | Nov 1999 | A |
6009173 | Sumner | Dec 1999 | A |
6014559 | Amin | Jan 2000 | A |
6018654 | Valentine et al. | Jan 2000 | A |
6018657 | Kennedy et al. | Jan 2000 | A |
6018774 | Mayle et al. | Jan 2000 | A |
6021114 | Shaffer et al. | Feb 2000 | A |
6021190 | Fuller et al. | Feb 2000 | A |
6021433 | Payne et al. | Feb 2000 | A |
6026153 | Fuller et al. | Feb 2000 | A |
6026292 | Coppinger et al. | Feb 2000 | A |
6029065 | Shah | Feb 2000 | A |
6029171 | Smiga et al. | Feb 2000 | A |
6032039 | Kaplan | Feb 2000 | A |
6035104 | Zahariev | Mar 2000 | A |
6044247 | Taskett et al. | Mar 2000 | A |
6047053 | Miner et al. | Apr 2000 | A |
6047327 | Tso et al. | Apr 2000 | A |
6052442 | Cooper et al. | Apr 2000 | A |
6052595 | Schellinger et al. | Apr 2000 | A |
6058304 | Callaghan et al. | May 2000 | A |
6060997 | Taubenheim et al. | May 2000 | A |
6061570 | Janow | May 2000 | A |
6064342 | Sandhu et al. | May 2000 | A |
6064879 | Fujiwara et al. | May 2000 | A |
6070067 | Nguyen et al. | May 2000 | A |
6081703 | Hallqvist | Jun 2000 | A |
6087956 | Helferich | Jul 2000 | A |
6088127 | Pieterse | Jul 2000 | A |
6088717 | Reed et al. | Jul 2000 | A |
6091957 | Larkins et al. | Jul 2000 | A |
6094574 | Vance et al. | Jul 2000 | A |
6097941 | Helferich | Aug 2000 | A |
6115248 | Canova et al. | Sep 2000 | A |
6119167 | Boyle et al. | Sep 2000 | A |
6122484 | Fuller et al. | Sep 2000 | A |
6125281 | Wells et al. | Sep 2000 | A |
6128490 | Shaheen et al. | Oct 2000 | A |
6134325 | Vanstone et al. | Oct 2000 | A |
6134432 | Holmes et al. | Oct 2000 | A |
6138158 | Boyle et al. | Oct 2000 | A |
6144313 | Nakano | Nov 2000 | A |
6144671 | Perinpanathan et al. | Nov 2000 | A |
6145079 | Mitty | Nov 2000 | A |
6147314 | Han et al. | Nov 2000 | A |
6151443 | Gable et al. | Nov 2000 | A |
6151491 | Farris et al. | Nov 2000 | A |
6151507 | Laiho | Nov 2000 | A |
6169883 | Vimpari et al. | Jan 2001 | B1 |
6169911 | Wagner et al. | Jan 2001 | B1 |
6175859 | Mohler | Jan 2001 | B1 |
6178331 | Holmes et al. | Jan 2001 | B1 |
6185423 | Brown et al. | Feb 2001 | B1 |
6195564 | Rydbeck et al. | Feb 2001 | B1 |
6205330 | Winbladh | Mar 2001 | B1 |
6208839 | Davani | Mar 2001 | B1 |
6212550 | Segur | Apr 2001 | B1 |
6215858 | Bartholomew et al. | Apr 2001 | B1 |
6216165 | Woltz et al. | Apr 2001 | B1 |
6219694 | Lazaridis et al. | Apr 2001 | B1 |
6222857 | Kammer et al. | Apr 2001 | B1 |
6226495 | Neustein | May 2001 | B1 |
6226533 | Akahane | May 2001 | B1 |
6230133 | Bennett, III et al. | May 2001 | B1 |
6230188 | Marcus | May 2001 | B1 |
6233318 | Picard et al. | May 2001 | B1 |
6233430 | Helferich | May 2001 | B1 |
6236804 | Tozaki et al. | May 2001 | B1 |
6246871 | Ala-Laurila | Jun 2001 | B1 |
6252588 | Dawson | Jun 2001 | B1 |
6253061 | Helferich | Jun 2001 | B1 |
6259892 | Helferich | Jul 2001 | B1 |
6272532 | Feinleib | Aug 2001 | B1 |
6278862 | Henderson | Aug 2001 | B1 |
6282435 | Wagner et al. | Aug 2001 | B1 |
6285745 | Bartholomew et al. | Sep 2001 | B1 |
6285777 | Kanevsky et al. | Sep 2001 | B2 |
6288715 | Bain et al. | Sep 2001 | B1 |
6292668 | Alanara et al. | Sep 2001 | B1 |
6298231 | Heinz | Oct 2001 | B1 |
6301471 | Dahm et al. | Oct 2001 | B1 |
6301513 | Divon et al. | Oct 2001 | B1 |
6317085 | Sandhu et al. | Nov 2001 | B1 |
6317594 | Gossman et al. | Nov 2001 | B1 |
6320957 | Draganoff | Nov 2001 | B1 |
6321094 | Hayashi et al. | Nov 2001 | B1 |
6321267 | Donaldson | Nov 2001 | B1 |
6330244 | Swartz et al. | Dec 2001 | B1 |
6330308 | Cheston et al. | Dec 2001 | B1 |
6333919 | Gaffney | Dec 2001 | B2 |
6333973 | Smith et al. | Dec 2001 | B1 |
6343219 | Wada | Jan 2002 | B1 |
6344848 | Rowe et al. | Feb 2002 | B1 |
6351523 | Detlef | Feb 2002 | B1 |
6356939 | Dahl | Mar 2002 | B1 |
6361523 | Bierman | Mar 2002 | B1 |
6363082 | Kammer et al. | Mar 2002 | B1 |
RE37618 | Helferich | Apr 2002 | E |
6370389 | Isomursu et al. | Apr 2002 | B1 |
6373835 | Ng et al. | Apr 2002 | B1 |
6381650 | Peacock et al. | Apr 2002 | B1 |
6388877 | Canova et al. | May 2002 | B1 |
6389457 | Lazaridis et al. | May 2002 | B2 |
6389572 | Garrabrant et al. | May 2002 | B1 |
6397059 | Vance et al. | May 2002 | B1 |
6401113 | Lazaridis et al. | Jun 2002 | B2 |
6411827 | Minata | Jun 2002 | B1 |
6418305 | Neustein | Jul 2002 | B1 |
6418307 | Amin | Jul 2002 | B1 |
6421678 | Smiga et al. | Jul 2002 | B2 |
6422147 | Shann | Jul 2002 | B1 |
6424841 | Gustafsson | Jul 2002 | B1 |
6425087 | Osborn et al. | Jul 2002 | B1 |
6438585 | Mousseau et al. | Aug 2002 | B2 |
6441824 | Hertzfeld et al. | Aug 2002 | B2 |
6442243 | Valco et al. | Aug 2002 | B1 |
6442637 | Hawkins et al. | Aug 2002 | B1 |
6449344 | Goldfinger | Sep 2002 | B1 |
6457134 | Lemke et al. | Sep 2002 | B1 |
6459360 | Helferich | Oct 2002 | B1 |
6462646 | Helferich | Oct 2002 | B2 |
6463463 | Godfrey et al. | Oct 2002 | B1 |
6463464 | Lazaridis et al. | Oct 2002 | B1 |
6484027 | Mauney et al. | Nov 2002 | B1 |
6501834 | Milewski et al. | Dec 2002 | B1 |
6505237 | Beyda et al. | Jan 2003 | B2 |
6510453 | Apfel et al. | Jan 2003 | B1 |
6516202 | Hawkins et al. | Feb 2003 | B1 |
6522879 | Myer et al. | Feb 2003 | B2 |
6523124 | Lunsford et al. | Feb 2003 | B1 |
6526127 | Piotrowski et al. | Feb 2003 | B1 |
6539476 | Marianetti et al. | Mar 2003 | B1 |
6546083 | Chaves et al. | Apr 2003 | B1 |
6564249 | Shiigi | May 2003 | B2 |
6567179 | Sato et al. | May 2003 | B1 |
6580784 | Rodriguez et al. | Jun 2003 | B2 |
6580787 | Akhteruzzaman et al. | Jun 2003 | B1 |
6587681 | Sawai | Jul 2003 | B1 |
6587693 | Lumme et al. | Jul 2003 | B1 |
6590588 | Lincke et al. | Jul 2003 | B2 |
6597688 | Narasimhan et al. | Jul 2003 | B2 |
6597903 | Dahm et al. | Jul 2003 | B1 |
6611254 | Griffin et al. | Aug 2003 | B1 |
6622147 | Smiga et al. | Sep 2003 | B1 |
6625142 | Joffe et al. | Sep 2003 | B1 |
6625642 | Naylor et al. | Sep 2003 | B1 |
6636522 | Perinpanathan et al. | Oct 2003 | B1 |
6636733 | Helferich | Oct 2003 | B1 |
6658409 | Nomura et al. | Dec 2003 | B1 |
6662195 | Langseth et al. | Dec 2003 | B1 |
6665547 | Ehara | Dec 2003 | B1 |
6665803 | Lunsford et al. | Dec 2003 | B2 |
6671715 | Langseth et al. | Dec 2003 | B1 |
6687839 | Tate et al. | Feb 2004 | B1 |
6694316 | Langseth et al. | Feb 2004 | B1 |
6701378 | Gilhuly et al. | Mar 2004 | B1 |
6728530 | Heinonen et al. | Apr 2004 | B1 |
6741980 | Langseth et al. | May 2004 | B1 |
6744528 | Picoult et al. | Jun 2004 | B2 |
6744874 | Wu | Jun 2004 | B2 |
6751453 | Schemers et al. | Jun 2004 | B2 |
6760423 | Todd | Jul 2004 | B1 |
6766490 | Garrabrant et al. | Jul 2004 | B1 |
6771949 | Corliss | Aug 2004 | B1 |
6775264 | Kurganov | Aug 2004 | B1 |
6775689 | Raghunandan | Aug 2004 | B1 |
6779019 | Mousseau et al. | Aug 2004 | B1 |
6779022 | Horstmann et al. | Aug 2004 | B1 |
6788767 | Lambke | Sep 2004 | B2 |
6792112 | Campbell et al. | Sep 2004 | B1 |
6792544 | Hashem et al. | Sep 2004 | B2 |
6807277 | Doonan et al. | Oct 2004 | B1 |
6813489 | Wu et al. | Nov 2004 | B1 |
6816723 | Borland | Nov 2004 | B1 |
6823225 | Sass | Nov 2004 | B1 |
6826407 | Helferich | Nov 2004 | B1 |
6832130 | Pintsov et al. | Dec 2004 | B2 |
6868498 | Katsikas | Mar 2005 | B1 |
6869016 | Waxelbaum | Mar 2005 | B2 |
6871214 | Parsons et al. | Mar 2005 | B2 |
6880079 | Kefford et al. | Apr 2005 | B2 |
6882709 | Sherlock et al. | Apr 2005 | B1 |
6886096 | Appenseller et al. | Apr 2005 | B2 |
6892074 | Tarkiainen et al. | May 2005 | B2 |
6912285 | Jevans | Jun 2005 | B2 |
6912400 | Olsson et al. | Jun 2005 | B1 |
RE38787 | Sainton et al. | Aug 2005 | E |
6938065 | Jain | Aug 2005 | B2 |
6941349 | Godfrey et al. | Sep 2005 | B2 |
6944283 | Klein | Sep 2005 | B1 |
6950679 | Sugiyama et al. | Sep 2005 | B2 |
6952599 | Noda et al. | Oct 2005 | B2 |
6980792 | Iivonen et al. | Dec 2005 | B2 |
6983138 | Helferich | Jan 2006 | B1 |
6990587 | Willins et al. | Jan 2006 | B2 |
7003308 | Fuoss et al. | Feb 2006 | B1 |
7006459 | Kokot et al. | Feb 2006 | B2 |
7007239 | Hawkins et al. | Feb 2006 | B1 |
7013391 | Herle et al. | Mar 2006 | B2 |
7017181 | Spies et al. | Mar 2006 | B2 |
7020688 | Sykes, Jr. | Mar 2006 | B2 |
7023967 | Andersson et al. | Apr 2006 | B1 |
7031437 | Parsons et al. | Apr 2006 | B1 |
7054905 | Hanna et al. | May 2006 | B1 |
7058390 | Chikazawa | Jun 2006 | B2 |
7062536 | Fellenstein et al. | Jun 2006 | B2 |
7065189 | Wakabayashi | Jun 2006 | B2 |
7068993 | Rai et al. | Jun 2006 | B2 |
7072642 | Yabe et al. | Jul 2006 | B2 |
7076528 | Premutico | Jul 2006 | B2 |
7079006 | Abe | Jul 2006 | B1 |
7082469 | Gold et al. | Jul 2006 | B2 |
7082536 | Filipi-Martin | Jul 2006 | B2 |
7088990 | Isomursu et al. | Aug 2006 | B1 |
7092743 | Vegh | Aug 2006 | B2 |
7100048 | Czajkowski et al. | Aug 2006 | B1 |
7107246 | Wang | Sep 2006 | B2 |
7113601 | Ananda | Sep 2006 | B2 |
7113803 | Dehlin | Sep 2006 | B2 |
7113979 | Smith et al. | Sep 2006 | B1 |
7116762 | Bennett, III et al. | Oct 2006 | B2 |
7116997 | Byers et al. | Oct 2006 | B2 |
7133687 | El-Fishawy et al. | Nov 2006 | B1 |
7146009 | Andivahis et al. | Dec 2006 | B2 |
7149537 | Kupsh et al. | Dec 2006 | B1 |
7149893 | Leonard et al. | Dec 2006 | B1 |
7155241 | Helferich | Dec 2006 | B2 |
7188186 | Meyer et al. | Mar 2007 | B1 |
7218919 | Vaananen | May 2007 | B2 |
7233655 | Gailey et al. | Jun 2007 | B2 |
7239338 | Krisbergh et al. | Jul 2007 | B2 |
7251314 | Huang | Jul 2007 | B2 |
7254384 | Gailey et al. | Aug 2007 | B2 |
7277692 | Jones et al. | Oct 2007 | B1 |
7277716 | Helferich | Oct 2007 | B2 |
7280838 | Helferich | Oct 2007 | B2 |
7286817 | Provost | Oct 2007 | B2 |
7289797 | Kato | Oct 2007 | B2 |
7299036 | Sanding et al. | Nov 2007 | B2 |
7299046 | Ozugur et al. | Nov 2007 | B1 |
7317929 | El-Fishawy et al. | Jan 2008 | B1 |
7333817 | Tsuchiyama | Feb 2008 | B2 |
7353018 | Mauney et al. | Apr 2008 | B2 |
7403793 | Mauney et al. | Jul 2008 | B2 |
7433461 | Bauer | Oct 2008 | B1 |
20010005857 | Lazaridis et al. | Jun 2001 | A1 |
20010005860 | Lazaridis et al. | Jun 2001 | A1 |
20010005861 | Mousseau et al. | Jun 2001 | A1 |
20010013071 | Lazaridis et al. | Aug 2001 | A1 |
20010048737 | Goldberg et al. | Dec 2001 | A1 |
20010054115 | Ferguson et al. | Dec 2001 | A1 |
20020012323 | Petite | Jan 2002 | A1 |
20020023131 | Wu et al. | Feb 2002 | A1 |
20020029258 | Mousseau et al. | Mar 2002 | A1 |
20020032658 | Oki et al. | Mar 2002 | A1 |
20020035687 | Skantze | Mar 2002 | A1 |
20020038298 | Kusakabe et al. | Mar 2002 | A1 |
20020039419 | Akimoto et al. | Apr 2002 | A1 |
20020049818 | Gilhuly et al. | Apr 2002 | A1 |
20020052218 | Rhee | May 2002 | A1 |
20020065887 | Paik et al. | May 2002 | A1 |
20020065895 | Zhang et al. | May 2002 | A1 |
20020087645 | Ertugrul et al. | Jul 2002 | A1 |
20020091777 | Schwartz | Jul 2002 | A1 |
20020091782 | Benninghoff | Jul 2002 | A1 |
20020101998 | Wong et al. | Aug 2002 | A1 |
20020107928 | Chalon | Aug 2002 | A1 |
20020120696 | Mousseau et al. | Aug 2002 | A1 |
20020120788 | Wang et al. | Aug 2002 | A1 |
20020128036 | Yach et al. | Sep 2002 | A1 |
20020138735 | Felt et al. | Sep 2002 | A1 |
20020138759 | Dutta | Sep 2002 | A1 |
20020156691 | Hughes et al. | Oct 2002 | A1 |
20020178353 | Graham | Nov 2002 | A1 |
20020181701 | Lee | Dec 2002 | A1 |
20020194281 | McConnell et al. | Dec 2002 | A1 |
20020194285 | Mousseau et al. | Dec 2002 | A1 |
20030005066 | Lazaridis et al. | Jan 2003 | A1 |
20030009698 | Lindeman et al. | Jan 2003 | A1 |
20030037261 | Meffert et al. | Feb 2003 | A1 |
20030048905 | Gehring et al. | Mar 2003 | A1 |
20030050987 | Lazaridis et al. | Mar 2003 | A1 |
20030055902 | Amir et al. | Mar 2003 | A1 |
20030061511 | Fischer | Mar 2003 | A1 |
20030078058 | Vatanen et al. | Apr 2003 | A1 |
20030081621 | Godfrey et al. | May 2003 | A1 |
20030088633 | Chiu et al. | May 2003 | A1 |
20030097361 | Huang et al. | May 2003 | A1 |
20030115448 | Bouchard | Jun 2003 | A1 |
20030120733 | Forman | Jun 2003 | A1 |
20030126216 | Avila et al. | Jul 2003 | A1 |
20030126220 | Wanless | Jul 2003 | A1 |
20030142364 | Goldstone | Jul 2003 | A1 |
20030182575 | Korfanta | Sep 2003 | A1 |
20030187938 | Mousseau et al. | Oct 2003 | A1 |
20030191808 | Adler et al. | Oct 2003 | A1 |
20030194990 | Helferich | Oct 2003 | A1 |
20030204568 | Bhargava et al. | Oct 2003 | A1 |
20030220979 | Hejl | Nov 2003 | A1 |
20030222765 | Curbow et al. | Dec 2003 | A1 |
20030235307 | Miyamoto | Dec 2003 | A1 |
20030235308 | Boynton et al. | Dec 2003 | A1 |
20030237082 | Thurlow | Dec 2003 | A1 |
20040019780 | Waugh et al. | Jan 2004 | A1 |
20040021889 | McAfee et al. | Feb 2004 | A1 |
20040024824 | Ferguson et al. | Feb 2004 | A1 |
20040030906 | Marmigere et al. | Feb 2004 | A1 |
20040052340 | Joffe et al. | Mar 2004 | A1 |
20040059598 | Wellons et al. | Mar 2004 | A1 |
20040059914 | Karaoguz | Mar 2004 | A1 |
20040060056 | Wellons et al. | Mar 2004 | A1 |
20040073619 | Gilhuly et al. | Apr 2004 | A1 |
20040078488 | Patrick | Apr 2004 | A1 |
20040083271 | Tosey | Apr 2004 | A1 |
20040083365 | Renier et al. | Apr 2004 | A1 |
20040111478 | Gross et al. | Jun 2004 | A1 |
20040111480 | Yue | Jun 2004 | A1 |
20040116073 | Mauney et al. | Jun 2004 | A1 |
20040116119 | Lewis et al. | Jun 2004 | A1 |
20040122847 | Rodgers | Jun 2004 | A1 |
20040122905 | Smith et al. | Jun 2004 | A1 |
20040137884 | Engstrom et al. | Jul 2004 | A1 |
20040137955 | Engstrom et al. | Jul 2004 | A1 |
20040165727 | Moreh et al. | Aug 2004 | A1 |
20040185877 | Asthana et al. | Sep 2004 | A1 |
20040194116 | McKee et al. | Sep 2004 | A1 |
20040196978 | Godfrey et al. | Oct 2004 | A1 |
20040198348 | Gresham et al. | Oct 2004 | A1 |
20040199669 | Riggs et al. | Oct 2004 | A1 |
20040202327 | Little et al. | Oct 2004 | A1 |
20040203642 | Zatloukal et al. | Oct 2004 | A1 |
20040205106 | Adler et al. | Oct 2004 | A1 |
20040205248 | Little et al. | Oct 2004 | A1 |
20040205330 | Godfrey et al. | Oct 2004 | A1 |
20040208296 | Aboujaoude et al. | Oct 2004 | A1 |
20040212639 | Smoot et al. | Oct 2004 | A1 |
20040221014 | Tomkow | Nov 2004 | A1 |
20040221048 | Ogier | Nov 2004 | A1 |
20040230657 | Tomkow | Nov 2004 | A1 |
20040243677 | Curbow et al. | Dec 2004 | A1 |
20040243844 | Adkins | Dec 2004 | A1 |
20040243847 | Way | Dec 2004 | A1 |
20040249768 | Kontio et al. | Dec 2004 | A1 |
20040249895 | Way | Dec 2004 | A1 |
20040249899 | Shiigi | Dec 2004 | A1 |
20040252727 | Mousseau et al. | Dec 2004 | A1 |
20040264121 | Orriss | Dec 2004 | A1 |
20040266441 | Sinha et al. | Dec 2004 | A1 |
20040283844 | Adkins | Dec 2004 | |
20050003809 | Kato | Jan 2005 | A1 |
20050009502 | Little et al. | Jan 2005 | A1 |
20050015455 | Liu | Jan 2005 | A1 |
20050019634 | Legg | Jan 2005 | A1 |
20050025172 | Frankel | Feb 2005 | A1 |
20050025291 | Peled et al. | Feb 2005 | A1 |
20050025297 | Finnigan | Feb 2005 | A1 |
20050038863 | Onyon et al. | Feb 2005 | A1 |
20050044160 | McElligott | Feb 2005 | A1 |
20050055413 | Keohane et al. | Mar 2005 | A1 |
20050058124 | Helferich | Mar 2005 | A1 |
20050058260 | Lasensky et al. | Mar 2005 | A1 |
20050060720 | Mayer | Mar 2005 | A1 |
20050076109 | Mathew et al. | Apr 2005 | A1 |
20050091329 | Friskel | Apr 2005 | A1 |
20050099654 | Chen | May 2005 | A1 |
20050099998 | Semper | May 2005 | A1 |
20050102381 | Jiang et al. | May 2005 | A1 |
20050108336 | Naick et al. | May 2005 | A1 |
20050108359 | Hyder et al. | May 2005 | A1 |
20050114652 | Swedor et al. | May 2005 | A1 |
20050130631 | Maguire et al. | Jun 2005 | A1 |
20050132010 | Muller | Jun 2005 | A1 |
20050135681 | Schirmer | Jun 2005 | A1 |
20050137009 | Vetelainen | Jun 2005 | A1 |
20050138353 | Spies | Jun 2005 | A1 |
20050141718 | Yu et al. | Jun 2005 | A1 |
20050148356 | Ferguson et al. | Jul 2005 | A1 |
20050159107 | Mauney et al. | Jul 2005 | A1 |
20050163320 | Brown et al. | Jul 2005 | A1 |
20050165740 | Kerr et al. | Jul 2005 | A1 |
20050176451 | Helferich | Aug 2005 | A1 |
20050180576 | Jevans | Aug 2005 | A1 |
20050188024 | Singer | Aug 2005 | A1 |
20050188045 | Katsikas | Aug 2005 | A1 |
20050198143 | Moody et al. | Sep 2005 | A1 |
20050198170 | LeMay et al. | Sep 2005 | A1 |
20050198506 | Qi et al. | Sep 2005 | A1 |
20050210064 | Caldini et al. | Sep 2005 | A1 |
20050210106 | Cunningham | Sep 2005 | A1 |
20050210246 | Faure | Sep 2005 | A1 |
20050210394 | Crandall et al. | Sep 2005 | A1 |
20050216587 | John | Sep 2005 | A1 |
20050216735 | Huang | Sep 2005 | A1 |
20050229258 | Pigin | Oct 2005 | A1 |
20050251558 | Zaki | Nov 2005 | A1 |
20050257057 | Ivanov et al. | Nov 2005 | A1 |
20050265551 | Hara | Dec 2005 | A1 |
20050282525 | Adams et al. | Dec 2005 | A1 |
20060013368 | LaBaw | Jan 2006 | A1 |
20060019638 | Chiu et al. | Jan 2006 | A1 |
20060019639 | Adams et al. | Jan 2006 | A1 |
20060020667 | Wang et al. | Jan 2006 | A1 |
20060021038 | Brown et al. | Jan 2006 | A1 |
20060021066 | Clayton et al. | Jan 2006 | A1 |
20060026246 | Fukuhara et al. | Feb 2006 | A1 |
20060029191 | Miller et al. | Feb 2006 | A1 |
20060041505 | Enyart | Feb 2006 | A1 |
20060041625 | Chen et al. | Feb 2006 | A1 |
20060046720 | Toropainen et al. | Mar 2006 | A1 |
20060046757 | Hoover et al. | Mar 2006 | A1 |
20060047766 | Spadea | Mar 2006 | A1 |
20060053202 | Foo et al. | Mar 2006 | A1 |
20060068768 | Sanding et al. | Mar 2006 | A1 |
20060069737 | Gilhuly et al. | Mar 2006 | A1 |
20060072761 | Johnson et al. | Apr 2006 | A1 |
20060074706 | Gilham | Apr 2006 | A1 |
20060075027 | Zager et al. | Apr 2006 | A1 |
20060080384 | Robinson et al. | Apr 2006 | A1 |
20060090065 | Bush et al. | Apr 2006 | A1 |
20060095510 | Rouse et al. | May 2006 | A1 |
20060101119 | Qureshi et al. | May 2006 | A1 |
20060133585 | Daigle et al. | Jun 2006 | A1 |
20060135128 | Skoog | Jun 2006 | A1 |
20060155698 | Vayssiere | Jul 2006 | A1 |
20060155812 | Looman | Jul 2006 | A1 |
20060168065 | Martin | Jul 2006 | A1 |
20060168072 | Park | Jul 2006 | A1 |
20060177015 | Skakkebaek et al. | Aug 2006 | A1 |
20060182124 | Cole et al. | Aug 2006 | A1 |
20060187897 | Dabbs et al. | Aug 2006 | A1 |
20060190533 | Shannon et al. | Aug 2006 | A1 |
20060194572 | Fresonke et al. | Aug 2006 | A1 |
20060200528 | Pathiyal | Sep 2006 | A1 |
20060217112 | Mo | Sep 2006 | A1 |
20060218224 | Agrawal et al. | Sep 2006 | A1 |
20060218244 | Rasmussen et al. | Sep 2006 | A1 |
20060221916 | Taylor et al. | Oct 2006 | A1 |
20060224893 | Sales et al. | Oct 2006 | A1 |
20060230266 | Maes | Oct 2006 | A1 |
20060233370 | Jung et al. | Oct 2006 | A1 |
20060234680 | Doulton | Oct 2006 | A1 |
20060239424 | Walter | Oct 2006 | A1 |
20060240868 | Kaplan et al. | Oct 2006 | A1 |
20060247962 | Harvey et al. | Nov 2006 | A1 |
20060248148 | Timmins et al. | Nov 2006 | A1 |
20060259558 | Yen | Nov 2006 | A1 |
20060265660 | Hullot et al. | Nov 2006 | A1 |
20060270461 | Won et al. | Nov 2006 | A1 |
20060285533 | Divine et al. | Dec 2006 | A1 |
20060286990 | Juan et al. | Dec 2006 | A1 |
20070042747 | Sun | Feb 2007 | A1 |
20070117541 | Helferich | May 2007 | A1 |
20070162454 | D'Albora et al. | Jul 2007 | A1 |
20070265838 | Chopra et al. | Nov 2007 | A1 |
20080037582 | Wang | Feb 2008 | A1 |
20080039052 | Knowles | Feb 2008 | A1 |
20090191848 | Helferich | Jul 2009 | A1 |
Foreign Patent Documents
Number | Date | Country
---|---|---
631419 | Dec 1994 | EP |
0 695 071 | Jan 1996 | EP |
777394 | Jun 1997 | EP |
831664 | Sep 1997 | EP |
0 505 489 | Nov 1997 | EP |
0 624 993 | Dec 2003 | EP |
3-500955 | Feb 1991 | JP |
03232325 | Oct 1991 | JP |
6-70292 | Mar 1994 | JP |
6-261121 | Sep 1994 | JP |
6-276226 | Sep 1994 | JP |
06245254 | Sep 1994 | JP |
6-318899 | Nov 1994 | JP
06-326656 | Nov 1994 | JP |
7-503826 | Apr 1995 | JP |
7-245773 | Sep 1995 | JP |
8-019025 | Jan 1996 | JP |
8-97854 | Apr 1996 | JP |
8-163637 | Jun 1996 | JP |
8-228368 | Sep 1996 | JP
8-265245 | Oct 1996 | JP |
08336182 | Dec 1996 | JP |
9-146824 | Jun 1997 | JP |
9-200250 | Jul 1997 | JP |
2001-517891 | Oct 2001 | JP |
2000-513362 | Sep 2005 | JP |
00164369 | Sep 1998 | KR
8905009 | Jun 1989 | WO |
9708906 | Mar 1997 | WO |
9731488 | Aug 1997 | WO |
9732439 | Sep 1997 | WO
9858476 | Dec 1998 | WO |
9965256 | Dec 1999 | WO |
Related Publications
Number | Date | Country
---|---|---
20050058124 A1 | Mar 2005 | US |
Provisional Applications
Number | Date | Country
---|---|---
60126939 | Mar 1999 | US
60155055 | Sep 1999 | US
Related Parent Applications
 | Number | Date | Country
---|---|---|---
Parent | 09408841 | Sep 1999 | US
Child | 10958731 | | US