Printer with audio or video receiver, recorder, and real-time content-based processing logic

Information

  • Patent Grant
  • Patent Number
    8,077,341
  • Date Filed
    Tuesday, March 30, 2004
  • Date Issued
    Tuesday, December 13, 2011
Abstract
A system and method for monitoring events from a media stream and triggering an action in response to detected events. The action is preferably based on information relating to the event received by the system. The system can generate a paper document that reflects some aspects of the detected event, such as a summary describing the event. The system can also generate a network message (e.g., email or paging call) in response to the detected event. In other embodiments, the system stores multimedia in memory in response to the detected event. The system can also generate audio on a speaker or video on a video display system attached to the printer based on the detected event.
Description
BACKGROUND

1. Field of the Invention


The present invention relates generally to document printers and more particularly to systems and methods that can monitor an event and trigger an action in response.


2. Background of the Invention


Monitoring of a live video and/or audio feed is desirable in many situations. For example, a person may wish to monitor a live radio or television feed for weather-related events, such as Emergency Alert System (EAS) alerts issued by the Federal Communications Commission (FCC) to state and local broadcasters. A person may also wish to monitor user-defined events such as the appearance of a specified set of keywords in the closed caption of a TV broadcast, the appearance of a given image (e.g., the face of Jonathan Hull) in a video stream, or the occurrence of an audio event (e.g., a gun shot) in an audio stream.


The monitoring of such events typically requires the individual monitoring of live audio or video broadcasts, or the monitoring of recordings of the broadcasts. This can be both inefficient and tedious for the person performing the monitoring, and for a live broadcast, it requires the person to be present during the broadcast. Moreover, monitoring of a recorded broadcast may delay delivery of critical information.


What is needed is a system and method for monitoring media feeds for a specified event and alerting the user when the event occurs.


SUMMARY OF THE INVENTION

The present invention overcomes the deficiencies and limitations of the prior art by providing a system and method for a printer that can detect specified events from a media feed and trigger an action in response.


The action is preferably based on information relating to the event received by the system. In one embodiment, the system generates a paper document that reflects some aspects of the detected event such as a summary describing the event. In a second embodiment, the system generates a network message (e.g., email or paging call) in response to the detected event. In a third embodiment, the system stores multimedia in memory in response to the detected event. In a fourth embodiment, the system can generate audio on a speaker or video on a video display system attached to the printer based on the detected event. In all of these embodiments, the system performs an action (in addition to or instead of printing) in accordance with information relating to the event.


In certain embodiments, the system interacts with the user or the media source before the printer performs the action in accordance with information relating to the detected event.


The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention has other advantages and features which will be more readily apparent from the following detailed description of the invention and the appended claims, when taken in conjunction with the accompanying drawings, in which:



FIG. 1(a) is a block diagram showing a system usable in connection with the present invention.



FIG. 1(b) is a block diagram showing a system usable in connection with the present invention.



FIG. 1(c) is a block diagram showing a system usable in connection with the present invention.



FIG. 2 illustrates a printer with an embedded audio and video receiver and recorder, according to an embodiment of the present invention.



FIG. 3 is a block diagram showing a system with a printer having embedded components for detecting events, according to an embodiment of the present invention.



FIG. 4 shows an example of interactive communication with a printer in accordance with the present invention.



FIG. 5 is a flowchart corresponding to an embodiment of FIG. 3.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

A preferred embodiment of the present invention is now described with reference to the figures where like reference numbers indicate identical or functionally similar elements. Also in the figures, the leftmost digit(s) of each reference number typically correspond(s) to the figure in which the reference number is first used. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention.


Reference in the specification to “one embodiment,” “certain embodiments” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


The present invention also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.


The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.


Now referring to FIG. 1(a), there is shown a system that is usable in connection with the present invention. A media feed is sent from a media source 102 to a processing logic 106. As used herein, the term “media feed” refers to anything in the print stream sent to the printer, including both printing and non-printing data. In an embodiment, a media feed may comprise the closed caption of a television broadcast, a digital media file (such as MPEG movies, QuickTime video, MP3 audio, or WAV audio), or audio or video streams such as those obtained from a radio or television broadcast. In certain embodiments, the media source 102 may comprise a receiver for receiving a media feed and a recorder for recording the media feed. A receiver may be coupled to an antenna, a satellite dish, and/or a cable line for receiving radio, television, satellite broadcasts, and/or cable television broadcasts.


In FIG. 1(a), the media feed 105 is sent over a network 104, such as the Internet, an intranet, a wireless connection, a wide area network, or the like. The processing logic 106 receives the media feed 105 and performs an action based on an event that is triggered by the monitoring of the media feed. The processing logic 106 contains decision logic to monitor the media feed 105 based on a user-defined event or a pre-defined event. As used herein, an “event” refers to anything (e.g., sounds, images, text, etc.) the system is monitoring. Examples of events include tone sequences or digital data embedded in a broadcast signal that are indicative of National Weather Service (NWS) or Emergency Alert System (EAS) alerts, but could also include user-defined events such as the appearance of a specified set of keywords in the closed caption of a TV broadcast, the appearance of a given image (e.g., a face image of Jonathan Hull) in a video stream, or the occurrence of a sound (e.g., a gun shot) in an audio stream.
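By way of illustration only, the closed-caption keyword-monitoring case might be sketched as follows in Python; the function names and keyword list are hypothetical and not part of the disclosure:

```python
import re

def make_keyword_detector(keywords):
    """Return a predicate reporting which user-defined keyword, if any,
    appears in a chunk of closed-caption text."""
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(k) for k in keywords) + r")\b",
        re.IGNORECASE,
    )

    def detect(caption_text):
        match = pattern.search(caption_text)
        return match.group(1).lower() if match else None

    return detect

# Monitor captions for a user-defined set of keywords.
detect = make_keyword_detector(["earthquake", "tornado", "flood warning"])
print(detect("A tornado watch is in effect for the county"))
print(detect("Sunny skies expected all week"))
```

The detector returns the matched keyword (here `"tornado"`, then `None`), which the decision logic could use to select a response.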


The action performed by the processing logic 106 may be any of a number of actions, such as entering data in a database, sending a notification or confirmation, adding data to a web page, etc. One example of an action performed by the processing logic 106 is to send data to a component 108. The component 108 may comprise a cell phone, a pager, an email engine, a database, a speaker, a video display unit, a storage element, and the like. Thus, for example, an event detected by the processing logic 106 may trigger the processing logic 106 to generate audio on a speaker coupled to the processing logic 106 based on the event detected in the input audio or video streams. In another embodiment, the processing logic 106 might respond to an event detected in the input audio or video stream by populating a webpage located on a database with information relating to the event. In other embodiments, the processing logic 106 may generate a network message (e.g., an email or paging call). The email or webpage could contain a textual representation for the event and the time when it occurred or it could contain a document (e.g. an Adobe Acrobat file) that describes the event. In certain embodiments, the processing logic 106 could extract information (such as time of the event, a textual description of the event, or a graphic representation of the event) from the media feed 105 or from other sources located on the network 104. For example, the processing logic 106 may be programmed to gather information regarding an EAS alert from certain web pages located on the Internet that contain information on the EAS alert. In other situations, the processing logic may obtain information regarding a detected event from a database 112 located on the network 104.
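The routing of detected events to output components such as the component 108 can be sketched as a simple registry; the `Event` fields and handler names below are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str          # e.g. "EAS_ALERT", "KEYWORD", "FACE_MATCH"
    description: str
    timestamp: str

class ActionDispatcher:
    """Route a detected event to one or more registered output actions
    (print a document, send an email, store multimedia, play audio)."""

    def __init__(self):
        self._actions = {}

    def register(self, kind, action):
        self._actions.setdefault(kind, []).append(action)

    def dispatch(self, event):
        for action in self._actions.get(event.kind, []):
            action(event)

log = []
dispatcher = ActionDispatcher()
dispatcher.register("EAS_ALERT", lambda e: log.append(f"print: {e.description}"))
dispatcher.register("EAS_ALERT", lambda e: log.append(f"email: {e.description}"))
dispatcher.dispatch(Event("EAS_ALERT", "earthquake alert", "11:45"))
print(log)
```

Registering several actions for one event kind mirrors the embodiments above in which the same detected event triggers printing, a network message, and storage together.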


In certain embodiments, the processing logic 106 also generates a document in response to a detected event and causes a print engine 111 to print the document 110 describing or summarizing information associated with the detected event. Again, the information describing the event could be gathered from the media feed 105 or from a source on the network 104.


It will be understood that the actions performed by the processing logic 106 as described above are examples only. It will be understood that other responses are possible. For example, video could be generated on a video display (e.g., showing a television broadcast) in response to an event detected by the processing logic 106.



FIG. 1(
b) is a block diagram showing a system usable in connection with the present invention. In this example, the media source 102 is coupled to the processing logic 106 without a network connection. Similarly, the processing logic 106 is connected to the component 108 without a network connection. In this embodiment, the media source 102 and the processing logic 106 may reside on a single unit or multiple units.



FIG. 1(
c) is a block diagram showing a system usable in connection with the present invention. In this example, the processing logic 106 is connected to the component 108 using a network connection through the network 104. The network 104 can be any network, such as the Internet, an intranet, a wireless network connection, a wide area network, or the like.


It will be understood that the system configurations shown in FIGS. 1(a)-1(c) are examples only and are included to show some configurations usable with the present invention. It will be understood that other configurations are possible. For example, the connections between the media source 102 and the processing logic 106 and between the processing logic 106 and the component 108 can both be network connections.


Now referring to FIG. 2, there is shown an embodiment of a printer 200 wherein the audio and video receiver and recorder are embedded in the printer 200. In FIG. 2, the printer 200 comprises a media receiver 206, a media recorder 208, a processing logic 210, a print engine 212, a speaker 214, and a display 215 (touch screen and/or video display capable of displaying a media feed). The media receiver 206 comprises a radio, television, satellite, and/or cable receiver. The receiver obtains media broadcasts through various means, including an antenna 202, a satellite dish 204, and/or a cable line (not shown). A media feed from a media broadcast is recorded on the media recorder 208. The media recorder 208 may record audio or video feeds. The processing logic 210 monitors the media feed from the media recorder 208 for a pre-defined or user-defined event. When an event is detected, the processing logic 210 causes the print engine 212 to print a document 216 describing the event. In certain embodiments, the processing logic may also cause audio to be played on the speaker 214 in response to an event.


Printed document 216 illustrates an embodiment of the present invention wherein the system is used to detect NWS and/or EAS alerts. In this embodiment, the processing logic 210 contains tone sequence decoding logic to detect tone sequences or digital data embedded in the media that are indicative of an NWS or an EAS alert. The processing logic 210 generates a document indicating the date and time of the alert (in this example, an earthquake alert at 11:45) and a weather forecast. The processing logic 210 can extract information regarding the alert from the media feed or from other sources such as Internet web pages with information relating to the event, as described above. Moreover, as illustrated, the description of the event could contain textual representations as well as graphic representations. In this example, the processing logic 210 obtains key frames from the media feed and causes the print engine 212 to print a document 216 with bar codes linking the key frames to different segments of a video file that may be used to replay a recorded video describing the event. Closed caption text from the media feed may also be printed alongside the video key frames to describe the event. The processing logic 210 may also cause audio relating to the NWS or EAS alert to be played on the speaker 214 (e.g., tuning into a radio station with the weather alert). Live audio or video could also be played on the speaker 214 or the display 215, either as the result of an event being detected or in response to commands that were entered on the console or on the web interface.



FIG. 3 illustrates another embodiment of the present invention. The system 300 includes a printer 301, means for receiving media broadcasts 302 coupled to the printer 301, a network 316 coupled to the printer 301, and a printed document 324 generated by the printer 301. In this embodiment, the printer 301 contains an embedded media receiver 306, media recorder 307, tone sequence decoding logic 304, processing logic 318, print engine 320, storage 322, console 321, audio and video display systems 308 and 314, electronic output system 325, and a communication port 323 (including parallel, serial, USB, and network connections) that receives the page description data, allowing the printer 301 to function as a normal printer in the absence of any media feed, as illustrated in FIG. 3.


The means for receiving media broadcasts 302 is coupled to the media receiver 306. In an embodiment, the system 300 could receive media broadcasts via an antenna, a satellite dish, and/or a cable wire. Thus, the system 300 could receive one frequency or multiple frequencies simultaneously. In such an embodiment, the receiver 306 could receive television, radio, cable television, and/or satellite broadcasts. The media receiver 306 is coupled to the media recorder 307, which records the media broadcasts obtained from the receiver 306. In other embodiments, the media receiver 306 is coupled to the processing logic 318, thereby allowing media feeds from the media receiver 306 to be directly processed by the processing logic 318.


Live media feeds from the media recorder 307 are coupled to the decoding logic 304. The decoding logic 304 decodes tone sequences and digital data embedded in a broadcast signal, such as the EAS alerts issued by the FCC to broadcasters, NWS alerts, or Emergency Broadcast System (EBS) alerts. These tone sequences or embedded digital data may correspond to a myriad of emergency alerts, weather-related warnings, and other information issued by the FCC or other branches of the government. Those skilled in the art will recognize that the decoding logic 304 could be implemented using a digital signal processor (DSP) or a general-purpose processor. FIG. 3 illustrates the decoding logic 304 as a separate unit from the processing logic 318, with the decoding logic 304 coupled to the processing logic 318. It will be understood that the decoding logic 304 could also be embedded in the processing logic 318.
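For illustration, the digital bursts carried with NWS and EAS alerts follow the Specific Area Message Encoding (SAME) format. A simplified decoder for a SAME-style header string might look like the sketch below; real decoding logic operates on demodulated audio rather than text, and the sample header is made up:

```python
import re

# Simplified parser for a SAME-style EAS header of the form
# ZCZC-ORG-EEE-PSSCCC(-PSSCCC...)+TTTT-JJJHHMM-SENDER-
SAME_RE = re.compile(
    r"ZCZC-(?P<originator>\w{3})-(?P<event>\w{3})"
    r"-(?P<locations>[\d-]+)\+(?P<purge>\d{4})"
    r"-(?P<issued>\d{7})-(?P<sender>[\w/]+)-"
)

def decode_same(header):
    """Return the decoded alert fields, or None if the burst is not
    a recognizable SAME header."""
    m = SAME_RE.match(header)
    if not m:
        return None
    fields = m.groupdict()
    fields["locations"] = fields["locations"].split("-")
    return fields

alert = decode_same("ZCZC-WXR-TOR-029165-029095+0030-1051700-KEAX/NWS-")
print(alert["event"], alert["locations"])
```

Here `WXR` is the originator, `TOR` the event code, and the six-digit codes identify the affected counties; the processing logic 318 would use these fields to select and format a response.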


In an embodiment, the media recorder 307 provides a live media feed to the processing logic 318. If the media feed is in analog format, the analog-to-digital converter 310 can convert the analog signal to a digital format before feeding the media signal to the processing logic 318. In certain embodiments, the video feed can be sent to the processing logic via the network 316. As described above, the processing logic 318 monitors the media feed for a user-defined or pre-programmed event. Once an event is detected, the processing logic 318 may gather additional information about the event. For example, it may extract information about the event from a preprogrammed Internet website located on the network 316, or it may capture additional information from a pre-programmed video feed located on the network 316 or the storage 322, or it may extract information from the media feed itself. The processing logic 318 can then generate a summary of the event. The processing logic 318 could generate a document that summarizes the event and send it to the print engine 320 to produce a printed document 324. The processing logic could also generate a network message (e.g., an email or a paging call) over the network 316 in response to the detected event. The network message could contain information about the event. The processing logic 318 could also store the information about the event in the storage 322. In certain situations (such as an EAS weather alert), the processing logic 318 could respond by controlling switch 312 to allow broadcasting of the media feed of the event on speaker 308 and/or video display 314. For example, a radio announcement of an EAS weather alert could be played on speaker 308.
In another example, upon receipt of an EAS alert, the processing logic 318 could request the local NEXRAD satellite image from a specified web address and a web cam picture from a certain location, and construct an Adobe Acrobat file with a textual description of the event and the time it occurred. Those skilled in the art will recognize that other responses to a detected event could be generated.


In other embodiments, the printer 301 includes an electronic output system 325 that can be designed to produce an electronic output related to the multimedia data in any desired format. Because of the great variety of types and formats of electronic outputs, the electronic output system 325 may take any of a number of forms for producing an electronic output desired by the user. For example, the electronic output system 325 may comprise a removable media device with a media writer (e.g., a writeable DVD or CD, flash card, memory stick, and the like), an audio speaker, a video display system, a storage device, and the like. In particular implementations, the printer 301 may have only one or only a subset of the various components shown, and in addition it may have other types of components not shown.


In another embodiment, the printer 301 includes a communication interface 323 that allows the printer 301 to be communicatively coupled to another electronic device. Depending on the desired input, the interface 323 may allow the printer 301 to communicate with a wide variety of different peripheral devices that can provide the printer 301 multimedia data to print. Without intending to limit the types of devices, the interface 323 may allow the printer 301 to receive media data from peripheral devices such as computer systems, computer networks, digital cameras, cellular telephones, PDA devices, video cameras, media renderers (such as DVD and CD players), media receivers (such as televisions, satellite receivers, set-top boxes, radios, and the like), digital video recorders (such as a TiVO), a portable meeting recorder, external storage devices, video game systems, or any combination thereof. The connection type for the interface 323 can take a variety of forms based on the type of device that is intended to be connected to the printer 301 and the available standard connections for that type of device. For example, the interface 323 may comprise a port for connecting the device using a connection type such as USB, serial, FireWire, SCSI, IDE, RJ11, parallel port (e.g., bidirectional, Enhanced Parallel Port (EPP), Extended Capability Port (ECP), IEEE 1284 Standard parallel port), optical, composite video, component video, or S-video, or any other suitable connection type.


The printer 301 also contains a user interface console 321 that is coupled to the processing logic 318. In certain embodiments, the user interface console 321 allows the user to define events to be monitored by the processing logic 318, and it allows the user to program the processing logic 318 to respond to specific events in specific manners. For example, a user can use console 321 to program the processing logic 318 to monitor EAS events. The user could also program the processing logic to generate and print a document summarizing any detected EAS event. The user could program the processing logic 318 to extract information from a specified Internet website whenever a specific event is detected. The user could also program the processing logic 318 to trigger the speaker 308 to broadcast a preprogrammed radio station that has information about the EAS alert when an EAS event is detected. It will be understood that a user could program other events and responses. For example, the system 300 could be used to monitor the appearance of a specified set of keywords in the closed caption of a television broadcast, or the appearance of a given image in a video stream, or the occurrence of a specified sound in an audio stream. Moreover, those skilled in the art will also recognize that the system 300 may be designed to automatically detect certain events and provide certain responses without user interaction.


In certain embodiments, the printer 301 could also be controlled remotely via the network 316. For example, the printer 301 could be controlled by a web page located on the network 316 that is supplied when the user enters the Internet address for the printer 301 in a web browser. The user could enter descriptions of events that the processing logic 318 should monitor and the expected responses to the events. In another embodiment, the processing logic 318 can also run a web server using a database on the storage 322.


In other embodiments, the printer 301 could be controlled by a print dialog box that pops up when the user sends any documents to the printer 301. One of the options available to the user is to print a template document where that template shows what events are to be detected and the appearance of the document that is generated in response to those events.


Although system 300 depicts various elements (e.g., media receiver 306, media recorder 307, processing logic 318, decoding logic 304, print engine 320, console 321, speaker 308, etc.) as being embedded in the printer 301, it will be understood that in alternative embodiments these elements may reside as separate units. For example, the processing logic 318 and the tone sequence decoding logic 304 may be a single separate unit coupled to another unit containing the print engine 320, or the media receiver 306 and recorder 307 may be a single unit coupled to a unit containing the processing logic, or the speakers and video display may be separate units, and the like.


The printer 301 may include other embodiments of the electronic output system 325, the communication interface 323, and any number of embedded elements described in co-pending U.S. patent application entitled, “Printer Having Embedded Functionality for Printing Time-Based Media,” filed Mar. 30, 2004, which application is incorporated by reference in its entirety.


An advantage of the present invention is the ability to monitor a live media feed for certain events. For example, the system 300 can perform live monitoring of the content of radio or television broadcasts and generate printouts and storage of multimedia in response to detected events. In a preferred embodiment, the present invention as exemplified in system 300 can monitor events that occur when the user is not present. The instant generation of a paper output or a network message when events occur allows the user to pick up a print-out off the printer or review the network message at any later time without needing to press any other buttons or do anything else. Those events can be summarized very well with a paper document that is easy to browse. The utility of paper printouts may increase as more events occur since a paper representation can be easier to browse than an online representation. Also, the paper will be there if a power failure occurs after it is printed. As described above, the system can also generate electronic versions of the paper summaries and email them to users.



FIG. 5 illustrates a method that is usable in connection with the system 300. The printer 301 obtains a media feed from a media source, step 502. Depending on the event that is being monitored, the decoding logic 304 and/or the processing logic 318 will analyze the media feed for detection of the event, step 504. For example, if the system 300 is programmed to monitor EAS alerts, the decoding logic 304 will decode EAS event codes from the media feed, and the processing logic 318 will generate responses to detected EAS events. If the system 300 is programmed to monitor other events such as appearance recognition in a video feed, the processing logic 318 will analyze the feed for event detection. If an event is detected, the printer will perform actions in accordance with user-defined and/or preprogrammed data and commands, steps 506 and 510, as described above. If no event is detected, the printer 301 will continue to monitor the media feed unless there is no more media feed to monitor, step 508.
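The flow of FIG. 5 can be sketched as a simple loop; the `detect` and `respond` callables below stand in for the decoding logic 304 / processing logic 318 and are illustrative only:

```python
def monitor(feed, detect, respond):
    """Monitoring loop per FIG. 5: obtain the media feed (step 502),
    analyze each chunk for an event (step 504), perform the programmed
    actions when one is found (steps 506/510), and stop when the feed
    is exhausted (step 508)."""
    handled = []
    for chunk in feed:            # step 502: obtain media feed
        event = detect(chunk)     # step 504: analyze for an event
        if event is not None:     # steps 506/510: perform actions
            respond(event)
            handled.append(event)
    return handled                # step 508: no more feed to monitor

# Hypothetical feed: only chunks tagged "EAS:" count as events.
feed = ["static", "EAS: earthquake alert", "static"]
events = monitor(
    feed,
    detect=lambda c: c if c.startswith("EAS:") else None,
    respond=lambda e: print("printing summary for:", e),
)
print(events)
```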


Interactive Communication with a Printer



FIG. 4 shows an example of interactive communication with a printer in accordance with the present invention.


In general, conventional printer drivers in modern operating systems are not designed to facilitate interactive information gathering. Because the print job can be redirected to another printer, or because the printing protocol does not allow interactive sessions, the operating system does not encourage interaction with the user. Once initial printer settings are captured, further interactions are generally not allowed in conventional printers. One approach to this problem is to embed metadata into the print stream itself, as noted above. However, it is possible that the printer could need to ask the user for more information in response to computations made from the data supplied by the user. In addition, the printer might itself delegate some tasks to other application servers, which might in turn need more information from the user. So-called “Web services” or “grid computing” systems are examples of the sort of application server that the printer might trigger.


In order to allow this interaction without modifying the printer driver architecture of the underlying operating system, an extra mechanism, such as the one shown in FIG. 4, is constructed. A “UI Listener” program 454 listens on a network socket, accepts requests for information 408, interacts with a user to obtain such data, and then sends the data back to the requester.
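A minimal UI listener of this kind can be sketched with standard sockets; the single-request structure, JSON payloads, and prompt callback below are illustrative assumptions rather than the disclosed implementation:

```python
import json
import socket
import threading

def ui_listener(srv, prompt_user):
    """Single-request UI listener: accept a request-for-information,
    obtain the answer from the user, and send it back to the requester."""
    conn, _ = srv.accept()
    with conn:
        request = json.loads(conn.recv(4096).decode())
        answer = prompt_user(request["question"])   # e.g. show a dialog box
        conn.sendall(json.dumps({"reply": answer}).encode())

# Bind the listener to an ephemeral port (port 0 avoids collisions).
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=ui_listener, args=(srv, lambda q: "confirm-1234"))
t.start()

# The printer side: request a confirmation code from the user.
with socket.create_connection(("127.0.0.1", port)) as printer:
    printer.sendall(json.dumps({"question": "database access code?"}).encode())
    reply = json.loads(printer.recv(4096).decode())
t.join()
srv.close()
print(reply["reply"])
```

Note the inverted roles relative to a web browser: the listener is the server, and the printer (or a delegated application server) is the client that initiates the request.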


Once a print request 402 is sent by user 450 and notification 404 requested from the UI listener 454, the print job 406 is sent by application 452. Here, the print job 406 contains embedded information including the network address of the UI listener 454, authentication information, and the latest time that the client will be listening for requests.


If the printer requires additional information or confirmation, it sends a request 408, which is detected by the UI listener, which displays a dialog box to obtain input from the user 410. An example of such a request might be a request for a password or user confirmation code that the user must enter to access a database 458. The user's input is included in a reply 412 sent to the printer. If the reply does not satisfy the printer, it may ask for additional information (not shown). If the reply does satisfy the printer, it takes a next step. This step might be to perform an external action such as sending an email (not shown). The next step might also be sending a request for information 414 to an application server (such as a database) 458. In this example, application server 458 also sends a request for information 416, which is detected by the UI listener. The user is prompted 418 and his response is forwarded to the application server 420. In this example, a reply is then sent from the application server 458 to the printer 456. It will be understood that a particular embodiment may include either, both, or neither of requests 408 and 416 without departing from the spirit of the present invention.


A program such as that shown in FIG. 4 may have a fixed set of possible interactions, or may accept a flexible command syntax that allows the requester to display many different requests. An example of such a command syntax would be the standard web browser's ability to display HTML forms. These forms are generated by a remote server, and displayed by the browser, which then returns results to the server. In this embodiment, however, the UI listener is different from a browser in that a user does not generate the initial request to see a form. Instead, the remote machine generates this request. In the described embodiment, the UI listener is a server, not a client.


Because network transactions of this type are prone to many complex error conditions, a system of timeouts is necessary to assure robust operation. Normally, each message sent across a network either expects a reply or is a one-way message. Messages that expect replies generally have a timeout, a limited period of time during which it is acceptable for the reply to arrive. In this embodiment, the embedded metadata would include metadata about a UI listener that will accept requests for further information. Such metadata preferably includes at least a network address, port number, and a timeout period. It might also include authentication information, designed to prevent malicious attempts to elicit information from the user. Because the user cannot tell whether the request is coming from a printer, a delegated server, or a malicious agent, prudence suggests strong authentication by the UI listener. If the printer or a delegated application server wishes more information, it can use the above-noted information to request that the UI listener ask the user for the needed information.
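The timeout discipline described above can be sketched as a small helper that sends a request and then polls for a reply until the deadline passes. The function and exception names are hypothetical, standing in for whatever transport the printer and UI listener actually use.

```python
import time

class RequestTimeout(Exception):
    """Raised when the reply window from the embedded metadata elapses."""

def request_with_timeout(send, poll_reply, timeout_s, interval_s=0.01):
    """Send a request, then wait at most timeout_s for a reply.

    A sketch of the timeout rule stated above: a message expecting a
    reply has a limited period during which the reply may arrive.
    """
    send()
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        reply = poll_reply()
        if reply is not None:
            return reply          # reply arrived within the window
        time.sleep(interval_s)
    raise RequestTimeout(f"no reply within {timeout_s}s")
```

On timeout, the printer would fall back to a default action (e.g., printing without the optional information) rather than blocking the job indefinitely.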


Examples of Printers with Embedded Media Devices


Printer with Embedded National Weather Service Radio Alert Receiver


The printer 301 includes a radio receiver (e.g., 306) tuned to the National Weather Service frequency, as well as a tone decoding circuit (e.g., 304) that can recognize the tone signals used to indicate an alert message. This printer 301 can construct a log of all the alerts it hears and print that log. It can also apply speech recognition in an attempt to construct a printable representation for the alert message. It can also ring a bell and audibly play the alert message on a speaker attached to the printer, and it can send an email message or Internet paging message to the registered owner or owners of the printer.
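The dispatch step of this embodiment (log the alert, then notify the registered owners) might look like the following. Tone decoding and speech recognition are assumed to be done upstream by circuits 304/306; the function signature and owner registry are illustrative assumptions.

```python
import time

def handle_alert(tone_detected, transcript, log, notify, owners):
    """On a decoded alert tone, append a printable log entry and notify
    the registered owner(s) -- a sketch of the dispatch step only."""
    if not tone_detected:
        return None                          # nothing heard, nothing logged
    entry = {"time": time.strftime("%Y-%m-%d %H:%M:%S"),
             "message": transcript}          # printable representation
    log.append(entry)                        # feeds the printed alert log
    for owner in owners:
        notify(owner, transcript)            # email or Internet paging message
    return entry
```

The same skeleton serves the TV alert monitor below, with a frame grab added to each log entry.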


Printer with Embedded TV Emergency Alert System (EAS) Alert Monitor


The printer 301 includes a cable TV (or broadcast TV) receiver (e.g., 306) tuned to a “responsible” local station known to broadcast EAS alerts, as well as a tone decoding circuit (e.g., 304) that can recognize the tone signals used to indicate an alert message. This printer can construct a log of all the alerts it hears and print that log. It also does a frame grab and saves a copy of the screen image associated with the alert message, and it can apply speech recognition in an attempt to construct a printable representation for the alert message that includes the image grabbed from the TV signal. It can also ring a bell and audibly play the alert message on a speaker 308 attached to the printer. This obviates the need for the user to have the TV turned on to receive the alert. It can also send an email message or Internet paging message to the registered owner or owners of the printer.


Printer with Embedded Audio Recorder


The printer 301 is plugged into an audio source, and the audio is recorded onto an internal disk. The printer generates a summary of what is on the disk. Optionally, this is an Audio Paper document.


Printer with Embedded Video Recorder


The printer 301 is plugged into a video source, and the video is recorded onto an internal disk. The printer generates a summary of what is on the disk. Optionally, this is a Video Paper document.


Printer with Embedded Single-channel TV Receiver


The user can walk up to the printer 301 and dial in a TV channel (e.g., on user interface console 321). The current program appears on a small monitor (e.g., 314) on the printer. The user can stand there and watch TV and at any point choose to print a keyframe. This can also be controlled from a print dialog box.
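Printing a keyframe at the moment the user presses the button amounts to picking the most recent buffered frame. A minimal sketch, assuming the receiver keeps a list of `(timestamp, image)` pairs (the structure and names are ours):

```python
def keyframe_at(frames, press_time):
    """Return the most recent frame at or before the instant the user
    chose to print. `frames` is a list of (timestamp, image) pairs."""
    candidates = [(t, img) for t, img in frames if t <= press_time]
    if not candidates:
        raise ValueError("no frame buffered yet")
    return max(candidates)[1]    # latest timestamp wins

# Hypothetical frame buffer filled by the embedded TV receiver.
frames = [(0.0, "frame-a"), (1.0, "frame-b"), (2.0, "frame-c")]
```

Whether the press comes from the console 321 or a print dialog box, the selection logic is the same.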


Printer with Embedded Single-channel AM/FM/Shortwave Radio Receiver


The user can walk up to the printer 301, dial in a radio station (e.g., on user interface console 321), listen to what's being broadcast, and at any point choose to print a document that shows the time when the button was pushed as well as a waveform for the audio.
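The printed page described above combines a push timestamp with an audio waveform. A toy rendering, where the character bars and the amplitude threshold are illustrative stand-ins for a real raster waveform:

```python
def waveform_page(samples, press_time):
    """Compose the printed page: the time the button was pushed plus a
    crude character 'waveform' (samples are amplitudes in [-1, 1])."""
    bars = "".join("#" if abs(s) > 0.5 else "." for s in samples)
    return f"button pushed at t={press_time:.2f}s\n{bars}"

page = waveform_page([0.9, 0.1, -0.8], press_time=3.0)
```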


Printer with Embedded Single-channel Satellite TV Receiver


The user can walk up to the printer 301 and dial in a satellite TV channel (e.g., on user interface console 321). Note that the printer must also be connected to a satellite TV antenna (e.g., 302). The current program appears on a small monitor on the printer. The user can stand there and watch TV and at any point choose to print a key frame. This can also be controlled from a print dialog box.


Printer with Embedded Multi-channel TV Receiver


The user can watch more than one channel at the same time (as on picture-in-picture (PIP) TV sets) and choose to print key frames from any of the available sources.


Printer with Embedded Multi-channel AM/FM/Shortwave Radio Receiver


The user can listen to more than one channel at the same time, perhaps using a stereo speaker system (e.g., 308) on the printer 301, and selectively choose to print a document that shows a time stamp for when the button was pushed.


Printer with Embedded Multi-channel Satellite TV Receiver


The user can watch more than one satellite TV channel at the same time (as on picture-in-picture (PIP) TV sets) and choose to print key frames from any of the available sources. Note that the printer 301 must be plugged into more than one satellite TV antenna (e.g., 302).


Upon reading this disclosure, those skilled in the art will appreciate still additional alternative systems and methods consistent with the disclosed principles of the present invention for detecting specified events from a media feed and triggering an action in response. Thus, while particular embodiments and applications of the present invention have been illustrated and described, it is to be understood that the invention is not limited to the precise construction and components disclosed herein and that various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus of the present invention disclosed herein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims
  • 1. A printer for printing time-based media from a broadcast media feed, the printer comprising: a broadcast media receiver for receiving and outputting the broadcast media feed of time-based media;a content-based processing logic coupled to the broadcast media receiver for monitoring the broadcast media feed of time-based media to detect an occurrence of an event within the broadcast media feed, the content-based processing logic processing the broadcast media feed to generate an electronic representation and a printable representation of the broadcast media feed responsive to detecting the occurrence of the event, the printable representation and the electronic representation including a time of occurrence of the event, a graphical representation of the event and a barcode linking the graphical representation to a corresponding segment of a recorded media feed describing the event;a first output device in communication with the content-based processing logic to receive the electronic representation, the first output device automatically producing a corresponding electronic output from the received electronic representation of the broadcast media feed responsive to detecting the occurrence of the event; anda second output device in communication with the content-based processing logic to receive the printable representation, the second output device automatically producing a corresponding printed output from the received printable representation of the broadcast media feed responsive to the generation of the printable representation.
  • 2. The printer of claim 1, wherein the printed output is generated in a video paper format.
  • 3. The printer of claim 1, wherein the printed output is generated in an audio paper format.
  • 4. The printer of claim 1, wherein the electronic representation including the graphical representation of the event comprises an email message.
  • 5. The printer of claim 1, wherein the corresponding electronic output comprises at least one selected from the group consisting of a network message, audio related to the broadcast media feed, a modified web page comprising information related to the event, and video related to the broadcast media feed.
  • 6. The printer of claim 5, wherein the network message comprises an email message.
  • 7. The printer of claim 5, wherein the network message comprises a paging message.
  • 8. The printer of claim 1, wherein the content-based processing logic is user-programmable to indicate the event to be monitored.
  • 9. The printer of claim 1, wherein the content-based processing logic is user-programmable to indicate a response to be generated.
  • 10. The printer of claim 1, wherein the content-based processing logic extracts data from a web page responsive to detecting the occurrence of the event.
  • 11. The printer of claim 1, wherein the content-based processing logic extracts data from the broadcast media feed responsive to detecting the occurrence of the event.
  • 12. The printer of claim 11, wherein the content-based processing logic extracts closed caption text from the broadcast media feed.
  • 13. The printer of claim 11, wherein the content-based processing logic extracts key frames from a video feed.
  • 14. The printer of claim 1, further comprising the content-based processing logic broadcasting a video feed responsive to detecting the occurrence of the event.
  • 15. The printer of claim 1, further comprising the processing logic broadcasting an audio feed on a speaker responsive to detecting the occurrence of the event.
  • 16. The printer of claim 1, wherein the broadcast media feed comprises live media feed.
  • 17. The printer of claim 1, further comprising a media recorder for recording the broadcast media feed.
  • 18. The printer of claim 1, wherein the event comprises a coded signal embedded in the broadcast media feed.
  • 19. The printer of claim 18, wherein the coded signal corresponds to an emergency alert system (EAS) alert.
  • 20. The printer of claim 18, wherein the coded signal corresponds to a national weather service (NWS) alert.
  • 21. The printer of claim 18, wherein the coded signal corresponds to an emergency broadcast system (EBS) alert.
  • 22. The printer of claim 18, further comprising a decoder for decoding the coded signal.
  • 23. The printer of claim 18, wherein the coded signal comprises a digital data embedded in the broadcast media feed.
  • 24. The printer of claim 18, wherein the coded signal comprises a tone sequence embedded in the broadcast media feed.
  • 25. The printer of claim 1, wherein the event comprises an appearance of an image in the broadcast media feed.
  • 26. The printer of claim 1, wherein the broadcast media feed comprises an audio stream.
  • 27. The printer of claim 1, wherein the broadcast media feed comprises a video stream.
  • 28. The printer of claim 1, wherein the media receiver comprises a receiving means selected from a group of an antenna, a satellite dish, and a cable line.
  • 29. The printer of claim 1, wherein the media receiver is adapted to receive media signals at multiple frequencies simultaneously.
  • 30. The printer of claim 1, wherein the event comprises an occurrence of a specified sound in the broadcast media feed.
  • 31. A method for printing time-based media from a broadcast media feed, the method comprising: receiving the broadcast media feed of time-based media;monitoring the broadcast media feed of time-based media to detect an occurrence of an event within the broadcast media feed;processing the broadcast media feed to generate an electronic representation of the broadcast media feed and a printable representation of the broadcast media feed responsive to detecting the occurrence of the event, the printable representation and the electronic representation including a time of occurrence of the event, a graphical representation of the event and a barcode linking the graphical representation to a corresponding segment of a recorded media feed describing the event;responsive to detecting the occurrence of the event, automatically generating a corresponding electronic output from the electronic representation of the broadcast media feed; andresponsive to the generation of the printable representation, automatically generating a corresponding printed output from the printable representation of the broadcast media feed.
  • 32. The method of claim 31 further comprising generating an email message from the electronic representation of the broadcast media feed.
  • 33. The method of claim 31 further comprising generating a network message responsive to detecting the occurrence of the event.
  • 34. The method of claim 31 further comprising defining the event to be monitored.
  • 35. The method of claim 31 further comprising extracting data from a web page responsive to detecting the occurrence of the event.
  • 36. The method of claim 31, wherein processing the broadcast media feed comprises extracting closed caption text from the media feed.
  • 37. The method of claim 31, wherein processing the broadcast media feed comprises extracting key frames from a video feed.
  • 38. The method of claim 31 further comprising broadcasting a video feed responsive to detecting the occurrence of the event.
  • 39. The method of claim 31 further comprising broadcasting an audio feed responsive to detecting the occurrence of the event.
  • 40. The method of claim 31 further comprising decoding a coded signal in the broadcast media feed.
RELATED APPLICATIONS

This application claims priority from U.S. Provisional Patent Application entitled “Printer Including One or More Specialized Hardware Devices” filed on Sep. 25, 2003, having Ser. No. 60/506,303, and U.S. Provisional Patent Application entitled “Printer Including Interface and Specialized Information Processing Capabilities” filed on Sep. 25, 2003, having Ser. No. 60/506,302, each of which is incorporated by reference herein in its entirety. This application is also related to the following applications, each of which was filed on Mar. 30, 2004 and each of which is incorporated by reference herein in its entirety: application Ser. No. 10/814,931, entitled “Printer Having Embedded Functionality for Printing Time-Based Media,” application Ser. No. 10/814,700, entitled “Printer User Interface,” and application Ser. No. 10/814,932, entitled “Printer With Hardware and Software Interfaces for Media Devices.”

Related Publications (1)
Number Date Country
20050068567 A1 Mar 2005 US
Provisional Applications (2)
Number Date Country
60506303 Sep 2003 US
60506302 Sep 2003 US