A variety of systems are used for authoring multimedia presentations such as motion pictures, television shows, advertisements for television, presentations on digital versatile disks (DVDs), interactive hypermedia, and other presentations. Such authoring systems generally provide a user interface and a process through which multimedia data is captured and stored, and through which the multimedia presentation is created, reviewed and published for distribution. The user interface and process for authoring generally depend on the kind of presentation being created and what the system developer believes is intuitive and enables an author to work creatively, flexibly and quickly.
Some multimedia presentations are primarily nontemporal presentations. That is, any change in the presentation typically depends on user activity or other event, instead of the passage of time. Some nontemporal multimedia presentations may include temporal components. For example, a user may cause a video to be displayed that is related to a text document by selecting a hyperlink to the video in the document.
Other multimedia presentations are primarily temporal presentations incorporating audio and/or video material, and optionally other media related to the temporal media. Primarily temporal media presentations that are well known today include streaming media formats such as QuickTime, Real Media, Windows Media Technology and SMIL, and formats that encode data in the vertical blanking interval of a television signal, such as used by WebTV, ATVEF, and other similar formats.
A variety of authoring tools have been developed for different kinds of presentations. Tools for processing combined temporal and nontemporal media include those described in PCT Publication No. WO99/52045, corresponding to U.S. patent application Ser. No. 09/054,861, and PCT Publication No. WO96/31829, corresponding to U.S. patent application Ser. No. 08/417,974, and U.S. Pat. No. 5,659,793 and U.S. Pat. No. 5,428,731.
An authoring tool has a graphical user interface enabling interactive authoring of a multimedia presentation including temporal and nontemporal media. The graphical user interface enables specification of the temporal and spatial relationships among the media and playback of the presentation with the specified temporal and spatial relationships. The spatial and temporal relationships among the media may be changed independently of each other. The presentation may be viewed interactively under the control of the author during the authoring process without encoding the audio and video data into a streaming media data file for combination with the other media, simulating the behavior of a browser that would receive a streaming media data file. The multimedia presentation may include elements that initiate playback of the presentation from a specified point in time. After authoring of the presentation is completed, the authoring tool assists in encoding and transferring the presentation for distribution. Information about the distribution format and location may be stored as user-defined profiles. Communication with the distribution location may be tested, and the presentation and the distribution information may be audited prior to encoding and transfer to reduce errors. A presentation is encoded according to the defined temporal and spatial relationships and the distribution format and location information to produce an encoded presentation. The encoded presentation and any supporting media data are transferred to the distribution location, such as a server. A streaming media server may be used for streaming media, whereas other data may be stored on a conventional data server. Accounts may be provided on a streaming media server for authors to publish their presentations. The authoring tool may be associated with a service that uses the streaming media server. Such streaming media servers also may be a source of stock footage for use by authors. These various functions, and combinations thereof, of the authoring tool are each aspects of the present invention that may be embodied as a computer system, a computer program product or a computer-implemented process that provides these functions.
In one embodiment, the spatial relationship may be defined by a layout specification that indicates an association of one or more tracks of temporal media and one or more tracks of nontemporal media with a corresponding display location. If the temporal media is not visible, such as audio, the spatial relationship may be defined among the nontemporal media.
One kind of temporal relationship between nontemporal data and temporal media is provided by a table of contents track. The nontemporal media of elements associated with points in time in the table of contents track of a presentation is combined and displayed for the duration of the presentation. If a user selects one of the elements from the table of contents track, presentation of the temporal media data is initiated from the point in time associated with that element on the table of contents track.
It is also possible to associate a streaming media presentation with another streaming media presentation. For example, an event in one streaming media presentation may be used to initiate playback of another subsequent streaming media presentation. The two presentations may have different layout specifications. A document in a markup language may be created to include a hyperlink to each of the plurality of streaming media presentations.
In this description, all patent applications and published patent documents referred to herein are hereby incorporated by reference.
Referring to
There are many ways in which such multimedia presentations may be stored. For example, various streaming media formats, such as Real Media, Microsoft Windows Media Technology, QuickTime and SMIL, may be used. The temporal media also may be encoded in a television signal, with nontemporal media encoded in a vertical-blanking interval of the television signal, such as used by WebTV, ATVEF and other formats.
Creating such a multimedia presentation involves creating a temporal relationship between each element of nontemporal media and the temporal media. Such a relationship may be visualized using a timeline, an example of which is shown in
The timeline is a time-based representation of a composition. The horizontal dimension represents time, and the vertical dimension represents the tracks of the composition. Each track occupies a row in the timeline. The size of a displayed element in a graphical user interface is determined as a function of the duration of the segment it represents and a timeline scale. Each element in each track of the timeline has a position (determined by its start time within the presentation), a title, associated data and, optionally, a duration.
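To make these structures concrete, the following sketch (in TypeScript, with names that are illustrative assumptions rather than the authoring tool's actual identifiers) captures a track, its segments and the width computation:

interface Segment {
  startTime: number;    // seconds from the start of the presentation
  duration?: number;    // optional; events and index entries may have none
  title: string;
  source: string;       // data file name or uniform resource locator
}

interface Track {
  kind: "audio" | "video" | "title" | "event" | "toc";
  segments: Segment[];
}

// The size of a displayed element is a function of the duration of the
// segment it represents and the timeline scale (pixels per second).
function elementWidth(seg: Segment, pixelsPerSecond: number): number {
  return (seg.duration ?? 0) * pixelsPerSecond;
}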
An audio track 300 or a video track 302 is for placement of temporal media. Such tracks commonly are used in video editing applications, such as shown in PCT Publication No. WO98/05034, which corresponds to U.S. patent application Ser. Nos. 08/687,926 and 08/691,985. Similarly, a title track 306 commonly is used to create title effects for movies, such as scrolling credits. As such, titles commonly are considered temporal media because they have parameters that are animated over time and that are combined with video data. Each track supports defining a sequence of segments of media data. A segment references, either directly or indirectly, the media data for the segment.
In the timeline shown herein, event tracks 304 associate nontemporal media with a particular point in time, thus creating a temporal relationship with the temporal media in tracks 300, 302, and 306. Each event track is a list of events. Each event includes a time and references a data file or a uniform resource locator, either directly or indirectly, from which media data for the event may be received.
The table of contents track 308 associates a table of contents entry with a point in time. The table of contents may be used as an index to the temporal media. Each entry includes a time and associated content, typically text, entered by the author. As described in more detail below, the table of contents entries are combined into a single document for display. If a user selects an element in the table of contents as displayed, the presentation is displayed starting at the point in time corresponding to the selected element.
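The following sketch shows one way the entries might be combined into a single document for display; the seekTo function and the other names are assumptions for illustration only:

interface TocEntry {
  time: number;   // seconds into the presentation
  text: string;   // content entered by the author
}

// Combine all table of contents entries into one HTML document in which
// selecting an entry initiates playback from the associated point in time.
function buildTocDocument(entries: TocEntry[]): string {
  const items = entries
    .map(e => `<a href="javascript:seekTo(${e.time})">${e.text}</a><br>`)
    .join("\n");
  return `<html><body>\n${items}\n</body></html>`;
}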
The spatial relationship of the elements in the timeline as presented also may be specified by the author. In one simple example, a layout specification indicates a combination of frames of a display area, of which one or more frames is associated with one or more of the tracks in the timeline. Some tracks might not be associated with a display frame. Some frames might be associated directly with static media and not with a track. In general, a frame is associated with only one track and a track is associated with only one frame.
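A minimal sketch of such a layout specification, with assumed names, might be:

interface LayoutSpec {
  // Each named frame displays at most one track; a frame may instead
  // reference static media directly, and some tracks may be unmapped.
  frames: Record<string, { track?: string; staticSource?: string }>;
}

// Example: frames named as in the template below, each bound to a track
// or to static media.
const layout: LayoutSpec = {
  frames: {
    Frame_A: { track: "video" },
    Frame_B: { track: "event-1" },
    Frame_Top: { staticSource: "header.htm" },
  },
};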
The possible combinations and arrangements of the various tracks in a timeline are unlimited, and are not limited to visual media. As shown in the examples in
A graphical user interface, an example of which is described in connection with
In
An example template file follows:
<HTML>
<AVIDPUB tagtype="framemap" framename="Frame_A" feature="MOVIE" originalurl="static.htm">
<AVIDPUB tagtype="framemap" framename="Frame_B" feature="EVENTTRACK" featurenum="1">
<AVIDPUB tagtype="framemap" framename="Frame_C" feature="EVENTTRACK" featurenum="2">
<AVIDPUB tagtype="framemap" framename="Frame_D" feature="TOC" originalurl="static.htm">
<AVIDPUB tagtype="framemap" framename="Frame_E" feature="EVENTTRACK" featurenum="3">
<AVIDPUB tagtype="framemap" framename="Frame_Top" feature="STATICHTML" featurenum="0" originalurl="header.htm">
<FRAMESET cols="40%, 60%" bordercolor="blue" frameborder=yes framespacing=2>
<FRAMESET rows="70, 40%,*">
</FRAMESET>
<FRAMESET rows="33%, 34%,*">
</FRAMESET>
</FRAMESET>
</HTML>
The first few lines of this template include "<AVIDPUB>" HTML elements. These elements keep track of the mappings between frames and tracks. Following these elements, a frame set definition is provided using the "<FRAMESET>" element. Each frame has a source file name (SRC="filename") and a name (name="name") associated with it. Each <AVIDPUB> element maps a frame name to a "feature," which is the name of a type of track, and a feature number, indicating which of the tracks of that type is mapped to the frame.
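A sketch of how such mappings could be recovered from a template follows, using assumed names and a simple attribute scan rather than any particular HTML API:

interface FrameMapping {
  frameName: string;
  feature: string;      // track type, e.g. "MOVIE", "EVENTTRACK", "TOC"
  featureNum?: number;  // which track of that type maps to the frame
}

function parseFrameMap(template: string): FrameMapping[] {
  const mappings: FrameMapping[] = [];
  const tagPattern = /<AVIDPUB\s+([^>]*)>/g;
  let tag: RegExpExecArray | null;
  while ((tag = tagPattern.exec(template)) !== null) {
    const attrs: Record<string, string> = {};
    for (const m of tag[1].matchAll(/(\w+)="([^"]*)"/g)) {
      attrs[m[1]] = m[2];
    }
    if (attrs["tagtype"] === "framemap") {
      mappings.push({
        frameName: attrs["framename"],
        feature: attrs["feature"],
        featureNum: "featurenum" in attrs ? Number(attrs["featurenum"]) : undefined,
      });
    }
  }
  return mappings;
}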
A template may include other content and structure beyond that shown in the example. For example, a company may want all of its presentations to use the same logo in the same position. This consistency may be provided by adding a reference to the logo to the template.
By selecting the next button 510 in
In this and other processes described below in which an HTML file is read and accessed, an application programming interface provided by the Microsoft Corporation may be used to read and write data in HTML files.
Having now described examples of data structures for timelines and layout specifications, how they may be defined, and how they may be associated with each other, authoring and publishing of such presentations will now be described.
An example GUI for the editing GUI of
How entries in the index or table of contents track 702 and event tracks 712 through 716 are added or modified will now be described. A region 740 illustrates available multimedia data for insertion into events. Buttons 742, 744 and 746 enable different views of the information presented in region 740. Button 742 selects a mode in which the system displays a picture of the data. Button 744 selects a mode in which the system displays a detailed list including a small picture, filename, and timestamp of the data file or resource. Button 746 selects a mode in which the system displays only titles. Other modes are possible and the invention is not limited to these. The names displayed are for those files found in the currently active path in the file system used by the authoring tool or other resources available to the system. The list operation, for example, may involve a directory lookup performed by the computer on its file system. A user may select an indicated data file or resource and drag its icon to an event timeline either to create a new event, or to replace media in an existing event, or to add media to an existing event.
On the event timeline, an event 750 indicates a data file or other resource associated with a particular point in time. Event 752 indicates that no file or resource is associated with the event at this time. In response to a user selection of a point on an event track, a new event may be created, if one is not already there, or the selected event may be opened. Whether a new event is created, or an existing event is opened, the user may be presented with a properties dialog box to enable entry of information, such as a name for the event, or a file name or resource locator for the associated media, for storage into the event data structure. An event that is created may be empty, i.e., might not refer to any data file or resource.
The elements on the event track may be illustrated as having a width corresponding to the amount of time it would take to download the data file over a specified network connection. To achieve this kind of display, the number of bytes of a data file is divided by the byte-per-second rate of the network connection to determine a time value, in seconds, which is used to determine the width of the icon for the event to be displayed on the event track. Displaying the temporal width of an object provides information to the author about whether enough time is available at the location of distribution to download the data and to display the data at the desired time.
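As a worked example of this computation, a 96,000-byte image over a 28.8 kbps modem connection (about 3,600 bytes per second) would span roughly 26.7 seconds of timeline; the file size, connection rate and function name here are illustrative assumptions:

function eventIconWidth(
  fileBytes: number,
  bytesPerSecond: number,
  pixelsPerSecond: number,
): number {
  // Time, in seconds, to download the data file over the connection.
  const downloadSeconds = fileBytes / bytesPerSecond;
  // The icon's width on the event track reflects that download time.
  return downloadSeconds * pixelsPerSecond;
}

// 96000 bytes / 3600 bytes per second = ~26.7 seconds of timeline width.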
Similar to the events, a user may select an element on the table of contents track as indicated at 754. An item may be added by selecting a point on the table of contents track with a cursor control device. Upon selection, a dialog window is displayed through which the user may enter text for the selected element. Each of the elements in the table of contents track 702 is displayed in the frame 756 in the viewer 720.
To display the presentation to the author, for a given point in time of the presentation, the system determines which contents should be displayed. In the example shown in
Referring now to
There are several ways in which the table of contents may be constructed to allow actions on a table of contents entry to cause a change in the playback position in the video frame. One example is provided by the source code in Appendices I-III. In the table of contents page, a JavaScript function called "seekToEPMarker" takes either a marker number (for Windows Media Technology) or a time in milliseconds (for Real Media) and calls a function "seekToVideoMarker" of its parent frame in its frame set. This function call in turn calls the JavaScript function of the child frame of the table of contents' parent frame that includes the video player. That function receives the marker and the time in milliseconds and generates the appropriate commands to the media player to initiate playback of the streaming media from the designated position.
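A hedged reconstruction of these functions follows; the names seekToEPMarker and seekToVideoMarker come from the description above, but the bodies, the frame name, the seekPlayer helper and the player calls are assumptions, since the embedded player APIs varied by vendor and version:

// In the table of contents frame: delegate the seek request to the parent.
function seekToEPMarker(marker: number, timeMs: number): void {
  (window.parent as any).seekToVideoMarker(marker, timeMs);
}

// In the parent frame set: forward the request to the child frame that
// contains the video player.
function seekToVideoMarker(marker: number, timeMs: number): void {
  (window.frames as any)["Frame_A"].seekPlayer(marker, timeMs);
}

// In the player frame (hypothetical name seekPlayer): command the media
// player to initiate playback from the designated position.
function seekPlayer(marker: number, timeMs: number): void {
  const player = document.getElementById("player") as any;
  if (marker > 0) {
    player.CurrentMarker = marker;  // Windows Media style: seek by marker
  } else {
    player.SetPosition(timeMs);     // Real Media style: seek by milliseconds
  }
}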
Turning again to
A display manager, in one implementation, is described in more detail in connection with
Referring now to
How the display manager displays data given a specified time in the presentation will now be described in connection with
After initialization, each display manager acts as a “listener” process that responds to messages from other components, such as the clip manager and graphical user interface, to update the display. One kind of update is generated if display controls in the graphical user interface are manipulated. For example, a user may modify the position bar on either the timeline or the viewer to initiate display from a different point in time T. In response to such a change, the graphical user interface or the clip manager may issue a message requesting the display managers to update the display given a different time T. Similarly, during editing, changes to the timeline data structure at a given point in time T cause the clip manager to instruct the display managers to update the display with the new presentation information at that point in time T.
Playback may be implemented using the same display mechanism. During either forward or reverse playback at a continuous or user-controlled rate, a stream of instructions to update the display at different points in time T may be sent to the display managers. Each display manager updates its region of the display at each of the specified times T which it receives from the clip manager or graphical user interface.
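A minimal sketch of this listener protocol, with all identifiers assumed for illustration:

// Message requesting that the display be updated for presentation time T.
interface UpdateMessage {
  time: number;  // seconds
}

class DisplayManager {
  constructor(
    private frameName: string,
    // Returns the content this frame should show at time T, or null if
    // the currently displayed content is still correct.
    private contentAt: (t: number) => string | null,
  ) {}

  // Invoked by the clip manager or graphical user interface when the
  // position bar moves, an edit occurs at time T, or playback emits a
  // stream of times T.
  onUpdate(msg: UpdateMessage): void {
    const content = this.contentAt(msg.time);
    if (content !== null) {
      console.log(`[${this.frameName}] display: ${content}`);
    }
  }
}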
Although the table of contents generally is a single file without time dependency, during editing it may be modified, after which the display is updated. One implementation for modifying the table of contents display will now be described in connection with
How the presentation manager generates a new table of contents file is described in
In one implementation, the display manager for each frame also may permit display of a zoomed version of the frame. In this implementation, selection of a frame for zooming causes the graphical user interface to display the data for this frame in the full display region. For video and events tracks, the zoom instruction merely changes the image scaling performed on the image to be displayed. For the table of contents track, the zoomed version may be provided by a display that enables editing of the table of contents. Modifications to the entries in the table of contents in the zoomed interface are passed back to the clip manager to update the timeline data structures.
After editing of a presentation is completed, the presentation may be published to its desired distribution format. A variety of operations may be performed and assisted by the publishing component of this system to prepare a presentation for distribution. Operations that may be performed to publish a multimedia presentation will now be described in more detail in connection with
First, the author provides setup data, which is accepted 900 through a GUI, to define the distribution format and other information used to encode and transfer the presentation.
For example, the selected output format may be a streaming media format, such as RealG2, Windows Media Technology, QuickTime or SMIL. Other settings for the encoder may include the streaming data file type, the video width, the video height, a title, author, copyright and keyword data.
For transferring the presentation, various information may be used to specify characteristics of one or more servers to which the presentation will be sent and any account information for those servers. Transfer settings may include a transfer protocol, such as file transfer protocol (FTP) or a local or LAN connection, for sending the presentation data files to the server. The server name, a directory at the server in which the media files will be copied, and optionally a user name and password also may be provided. A default file name for the server, and the HTTP address or URL of the server from which a user will access the published presentation, also may be provided. The server information may be separate for both data files and streaming media files.
This encoding and transfer information may be stored by the transfer tool as a named profile for later retrieval for transferring other presentations. Such profile data may include, for example, the data defining settings for encoding, and the data defining settings for transfer of encoded data files.
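One possible shape for such a profile, sketched with assumed field names:

interface PublishProfile {
  name: string;                 // profile name for later retrieval
  encode: {
    format: "RealG2" | "WindowsMediaTechnology" | "QuickTime" | "SMIL";
    videoWidth: number;
    videoHeight: number;
    title?: string;
    author?: string;
    copyright?: string;
    keywords?: string[];
  };
  transfer: {
    protocol: "ftp" | "local" | "lan";
    server: string;             // may differ for data and streaming media files
    directory: string;
    userName?: string;
    password?: string;
    publishedUrl: string;       // HTTP address from which users access the presentation
  };
}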
When setting up each of the connections for transfer, the connection also may be tested to confirm its operation. This test process involves transferring a small file to the destination and confirming the ability of the system to read the file from the destination.
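A sketch of such a test follows, with hypothetical helper functions standing in for the actual transfer client:

// Write a small probe file to the destination, then confirm it can be
// read back; success indicates the connection is operational.
async function testConnection(
  put: (name: string, data: string) => Promise<void>,
  get: (name: string) => Promise<string>,
): Promise<boolean> {
  const probeName = "publish_probe.txt";       // hypothetical file name
  const probeData = `probe ${Date.now()}`;
  try {
    await put(probeName, probeData);
    return (await get(probeName)) === probeData;
  } catch {
    return false;
  }
}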
After setup, the presentation may be audited 901 to reduce the number of errors that might otherwise result during the encoding and/or transfer processes. Profile information, described above, the presentation, and other information may be reviewed for likely sources of errors. For example, titles and/or other effects may be checked to determine whether the title and/or effect has been rendered. The timeline data structure may be searched to identify the data files related to each event, segment, table of contents entry, etc., to determine whether any file is missing. The events in the timeline may be compared to the video, audio or other temporal data track to determine whether any events occur after the end of that track. The layout specification also may be compared to the timeline data structure to ensure that no events or other data have been defined on tracks that are not referred to in the layout specification. Results of these various tests on the layout and timeline data structures may be provided to the user. Information about the profile used for the transfer process also may be audited. For example, whether passwords are required on the target server, and other information about the accessibility of the target server, may be checked. The target directory also may be checked to ensure that no files in the native file format of the authoring tool are present in the target directory. Various other tests may be performed in an audit process and the invention is not limited thereto.
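A condensed sketch of some of these audit checks, reusing the Track and Segment shapes sketched earlier; the structure and names are assumptions:

interface AuditResult {
  check: string;
  detail: string;
}

function auditPresentation(
  tracks: Track[],
  tracksInLayout: Set<Track>,
  temporalEnd: number,                  // end time of the video/audio track
  fileExists: (source: string) => boolean,
): AuditResult[] {
  const problems: AuditResult[] = [];
  for (const track of tracks) {
    if (!tracksInLayout.has(track)) {
      problems.push({ check: "track missing from layout specification", detail: track.kind });
    }
    for (const seg of track.segments) {
      if (!fileExists(seg.source)) {
        problems.push({ check: "missing data file", detail: seg.source });
      }
      if (seg.startTime > temporalEnd) {
        problems.push({ check: "event after end of temporal media", detail: seg.title });
      }
    }
  }
  return problems;
}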
After optional auditing, the presentation is encoded 902 by transforming the timeline data structures into a format used by a standard encoder, such as provided for the Real Media Player or Windows Media Technology. Such encoding is described in more detail below in connection with
A graphical user interface for facilitating the publishing process described in
Referring to
More particularly, an API has functions that: 1) open the component, 2) optionally present the user with a dialog box interface to configure the component, 3) set settings of the component that control its behavior, 4) connect the component to a user-visible progress bar and to the source of the data, 5) initiate the component to start translating the data into the desired format, 6) write the desired format to a file, and 7) close the component when the process is complete. On the receiving side of the API, the system has code to respond to requests for data from the export or encode component. The export component generally accesses the time, track number, and file or URL specified by the user, which are obtained from the timeline data structure. To the extent that data interpretation or project-specific settings are used by the encoder, this information also may be made available through an API.
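The seven functions enumerated above might be expressed as an interface such as the following; the names and signatures are assumptions, not the actual API:

interface ExportComponent {
  open(): void;                                         // 1) open the component
  configure?(): void;                                   // 2) optional settings dialog
  setSettings(settings: Record<string, unknown>): void; // 3) behavior settings
  connect(progress: (percent: number) => void,
          source: TimelineSource): void;                // 4) progress bar and data source
  start(): void;                                        // 5) begin translating the data
  writeTo(fileName: string): void;                      // 6) write the desired format to a file
  close(): void;                                        // 7) close when complete
}

// The receiving side answers the component's requests for the time, track
// number, and file or URL, obtained from the timeline data structure.
interface TimelineSource {
  nextAsset(): { time: number; track: number; source: string } | null;
}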
The video and audio may be encoded 1100 separately using standard techniques. The table of contents and event tracks are then processed. In particular, a list of event assets is generated 1102. An event asset is defined by its filename, track, and time in the presentation. The frame set is then accessed 1104 to obtain a list of tracks and frame names. The items in the event tracks are then added 1106 to the streaming media file using the filename for the event and the frame name for the event, at the indicated time for the event. The filename for the event is its full path, including either a full URL for remote files or an indicator of the disk volume for files that are accessed locally or over a local area network (LAN). In step 1106, the filenames and frame names inserted into the streaming media file are those at the destination to which the media file is being transferred. Therefore, the encoding is dependent in part on the transfer parameters. The list created in step 1102 may be sorted or unsorted.
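A sketch of steps 1102 through 1106 follows, with an assumed writer interface standing in for the streaming media encoder:

interface EventAsset {
  filename: string;  // full path: a URL for remote files, or a volume path
  track: number;
  time: number;      // seconds into the presentation
}

function encodeEvents(
  assets: EventAsset[],                           // step 1102
  frameForTrack: Map<number, string>,             // from the frame set (step 1104)
  destinationUrl: (filename: string) => string,   // reflects the transfer parameters
  writer: { addEvent(time: number, url: string, frame: string): void },
): void {
  for (const asset of assets) {
    const frame = frameForTrack.get(asset.track);
    if (frame === undefined) continue;            // track not mapped to a frame
    // Step 1106: insert destination filenames and frame names, so the
    // encoding depends in part on where the presentation will be transferred.
    writer.addEvent(asset.time, destinationUrl(asset.filename), frame);
  }
}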
Using Real Media, the table of contents track does not affect the streaming media file. Using Windows Media Technology, however, marker codes are inserted for each table of contents entry, although no marker codes are inserted for events.
Referring to
The process of transferring data to the servers will now be described in connection with
A computer system with which the various elements of the system described above, either individually or in combination, may be implemented typically includes at least one main unit connected to both one or more output devices which store information, transmit information or display information to one or more users or machines, and one or more input devices which receive input from one or more users or machines. The main unit may include one or more processors connected to a memory system via one or more interconnection mechanisms. Any input device and output device also are connected to the processor and memory system via the interconnection mechanism.
The computer system may be a general purpose computer system which is programmable using a computer programming language. Computer programming languages suitable for implementing such a system include procedural programming languages, object-oriented programming languages, combinations of the two, or other languages. The computer system may also be specially programmed, special purpose hardware, or an application specific integrated circuit (ASIC).
In a general purpose computer system, the processor is typically a commercially available processor which executes a program called an operating system, which controls the execution of other computer programs and provides scheduling, debugging, input/output control, accounting, compilation, storage assignment, data management and memory management, communication control and related services. The processor and operating system define a computer platform for which application programs in other computer programming languages are written. The invention is not limited to any particular processor, operating system or programming language.
A memory system typically includes a computer readable and writeable nonvolatile recording medium in which signals are stored that define a program to be executed by the processor, or information stored on the medium to be processed by the program. Typically, in operation, the processor causes data to be read from the nonvolatile recording medium into another memory that allows for faster access to the information by the processor than does the disk. This memory is typically a volatile random access memory, such as a dynamic random access memory (DRAM) or static memory (SRAM). The processor generally manipulates the data within the integrated circuit memory and may copy the data to the disk after processing is completed. A variety of mechanisms are known for managing data movement between the disk and the integrated circuit memory element, and the invention is not limited thereto. The invention is not limited to a particular memory system.
Such a system may be implemented in software or hardware or firmware, or any combination thereof. The various elements of this system, either individually or in combination, may be implemented as a computer program product including a computer-readable medium on which instructions are stored for access and execution by a processor. Various steps of the process may be performed by a computer processor executing instructions stored on a computer-readable medium to perform functions by operating on input and generating output.
Additionally, the computer system may be a multiprocessor computer system or may include multiple computers connected over a computer network. Various possible configurations of computers in a network permit access to the system by multiple users using multiple instances of the programs even if they are dispersed geographically. Each program or step shown in the figures and the substeps or subparts shown in the figures may correspond to separate modules of a computer program, or may be separate computer programs. Such modules may be operable on one or more separate computers or other devices. The data produced by these components may be stored in a memory system or transmitted between computer systems or devices. The plurality of computers or devices may be interconnected by a communication network, such as a public switched telephone network or other circuit switched network, or a packet switched network such as an Internet protocol (IP) network. The network may be wired or wireless, and may be public or private.
A suitable platform for implementing software to provide such an authoring system includes a processor, operating system, a video capture device, a Creative Labs Sound Blaster or compatible sound card, CD-ROM drive, and 64 Megabytes of RAM minimum. For analog video capture, the video capture device may be the Osprey-100 PCI Video Capture Card or the Eskape MyCapture II USB Video Capture Device. The processor may be a 230 megahertz Pentium II or Pentium III processor, or Intel equivalent processor with MMX Technology, such as the AMD-K6-III, or Celeron Processor with 128K cache, and may be used with an operating system such as the Windows 98/98SE or Millennium operating systems. For digital video capture, the video capture device may be an IEEE 1394 Port (OHCI compliant or Sony ILink). The processor may be a 450 megahertz Pentium II or Pentium III processor, or Intel equivalent processor with MMX Technology, such as the AMD-K6-III, or Celeron processor with 128K cache.
Given an authoring tool such as described above, the use of multiple authoring tools by multiple authors for publishing data to a public or private computer network for access by other users will now be described in connection with
In addition to publishing presentations to the media server, an authoring tool may use the media server or data server as a source of content for presentations. As shown in
Having now described a few embodiments, it should be apparent to those skilled in the art that the foregoing is merely illustrative and not limiting, having been presented by way of example only. Numerous modifications and other embodiments are within the scope of the invention.
Number | Name | Date | Kind |
---|---|---|---|
4538188 | Barker et al. | Aug 1985 | A |
4685003 | Westland | Aug 1987 | A |
4746994 | Ettlinger | May 1988 | A |
5012334 | Etra | Apr 1991 | A |
5045940 | Peters et al. | Sep 1991 | A |
5097351 | Kramer | Mar 1992 | A |
5196933 | Henot | Mar 1993 | A |
5214528 | Akanabe et al. | May 1993 | A |
5237648 | Mills et al. | Aug 1993 | A |
5267351 | Reber et al. | Nov 1993 | A |
5274758 | Beitel et al. | Dec 1993 | A |
5307456 | MacKay | Apr 1994 | A |
5317732 | Gerlach, Jr. et al. | May 1994 | A |
5390138 | Milne et al. | Feb 1995 | A |
5404316 | Klinger et al. | Apr 1995 | A |
5428731 | Powers, III | Jun 1995 | A |
5442744 | Piech et al. | Aug 1995 | A |
5467288 | Fasciano et al. | Nov 1995 | A |
5481664 | Hiroya et al. | Jan 1996 | A |
5488433 | Washino et al. | Jan 1996 | A |
5489947 | Cooper | Feb 1996 | A |
5493568 | Sampat et al. | Feb 1996 | A |
5513306 | Mills et al. | Apr 1996 | A |
5515490 | Buchanan et al. | May 1996 | A |
5534942 | Beyers, Jr. et al. | Jul 1996 | A |
5537141 | Harper et al. | Jul 1996 | A |
5537157 | Washino et al. | Jul 1996 | A |
5539869 | Spoto et al. | Jul 1996 | A |
5541662 | Adams et al. | Jul 1996 | A |
5561457 | Cragun et al. | Oct 1996 | A |
5568275 | Norton et al. | Oct 1996 | A |
5577190 | Peters | Nov 1996 | A |
5584006 | Reber et al. | Dec 1996 | A |
5585858 | Harper et al. | Dec 1996 | A |
5592602 | Edmunds et al. | Jan 1997 | A |
5613057 | Caravel | Mar 1997 | A |
5617146 | Duffield et al. | Apr 1997 | A |
5619636 | Sweat et al. | Apr 1997 | A |
5623308 | Civanlar et al. | Apr 1997 | A |
5652714 | Peterson et al. | Jul 1997 | A |
5659790 | Kim et al. | Aug 1997 | A |
5659792 | Walmsley | Aug 1997 | A |
5659793 | Escobar et al. | Aug 1997 | A |
5664216 | Blumenau | Sep 1997 | A |
5669006 | Joskowicz et al. | Sep 1997 | A |
5680619 | Gudmundson et al. | Oct 1997 | A |
5682326 | Klingler et al. | Oct 1997 | A |
5684963 | Clement | Nov 1997 | A |
5712953 | Langs | Jan 1998 | A |
5717438 | Kim et al. | Feb 1998 | A |
5724605 | Wissner | Mar 1998 | A |
5742283 | Kim | Apr 1998 | A |
5752029 | Wissner | May 1998 | A |
5754851 | Wissner | May 1998 | A |
5760767 | Shore et al. | Jun 1998 | A |
5764275 | Lappington et al. | Jun 1998 | A |
5767846 | Nakamura et al. | Jun 1998 | A |
5781435 | Holroyd et al. | Jul 1998 | A |
5801685 | Miller et al. | Sep 1998 | A |
5822019 | Takeuchi | Oct 1998 | A |
5826102 | Escobar et al. | Oct 1998 | A |
5852435 | Vigneaux et al. | Dec 1998 | A |
5860073 | Ferrel et al. | Jan 1999 | A |
5861881 | Freeman et al. | Jan 1999 | A |
5878421 | Meyer et al. | Mar 1999 | A |
5889514 | Boezeman et al. | Mar 1999 | A |
5892506 | Hermanson | Apr 1999 | A |
5892507 | Moorby et al. | Apr 1999 | A |
5905841 | Peters et al. | May 1999 | A |
5907366 | Farmer et al. | May 1999 | A |
5910825 | Takeuchi | Jun 1999 | A |
5926613 | Schaffer | Jul 1999 | A |
5946445 | Peters et al. | Aug 1999 | A |
5969716 | Davis et al. | Oct 1999 | A |
5977962 | Chapman et al. | Nov 1999 | A |
5978648 | George et al. | Nov 1999 | A |
5982445 | Eyer et al. | Nov 1999 | A |
5995951 | Ferguson | Nov 1999 | A |
5999173 | Ubillos | Dec 1999 | A |
6016362 | Kato et al. | Jan 2000 | A |
6037932 | Feinleib | Mar 2000 | A |
6038573 | Parks | Mar 2000 | A |
6058236 | Peters et al. | May 2000 | A |
6061696 | Lee et al. | May 2000 | A |
6081262 | Gill et al. | Jun 2000 | A |
6091407 | Boetje et al. | Jul 2000 | A |
6092122 | Liu et al. | Jul 2000 | A |
6118444 | Garmon et al. | Sep 2000 | A |
6195497 | Nagasaka et al. | Feb 2001 | B1 |
6199082 | Ferrel et al. | Mar 2001 | B1 |
6201924 | Crane et al. | Mar 2001 | B1 |
6212527 | Gustman | Apr 2001 | B1 |
6230173 | Ferrel et al. | May 2001 | B1 |
6237025 | Ludwig et al. | May 2001 | B1 |
6243087 | Davis et al. | Jun 2001 | B1 |
6249280 | Garmon et al. | Jun 2001 | B1 |
6262723 | Matsuzawa et al. | Jul 2001 | B1 |
6262724 | Crowe et al. | Jul 2001 | B1 |
6292827 | Raz | Sep 2001 | B1 |
6330004 | Matsuzawa et al. | Dec 2001 | B1 |
6353461 | Shore et al. | Mar 2002 | B1 |
6400378 | Snook | Jun 2002 | B1 |
6404978 | Abe | Jun 2002 | B1 |
6411725 | Rhoads | Jun 2002 | B1 |
6426778 | Valdez, Jr. | Jul 2002 | B1 |
6430355 | Nagasawa | Aug 2002 | B1 |
6476828 | Burkett et al. | Nov 2002 | B1 |
6484199 | Eyal | Nov 2002 | B2 |
6489969 | Garmon et al. | Dec 2002 | B1 |
6515656 | Wittenburg et al. | Feb 2003 | B1 |
6544294 | Greenfield et al. | Apr 2003 | B1 |
6546405 | Gupta et al. | Apr 2003 | B2 |
6553142 | Peters | Apr 2003 | B2 |
6564263 | Bergman et al. | May 2003 | B1 |
6618547 | Peters et al. | Sep 2003 | B1 |
6665732 | Garofalakis et al. | Dec 2003 | B1 |
20020188628 | Cooper et al. | Dec 2002 | A1 |
20030018609 | Phillips et al. | Jan 2003 | A1 |
20040268224 | Balkus et al. | Dec 2004 | A1 |
20050120127 | Bradley et al. | Jun 2005 | A1 |
Number | Date | Country |
---|---|---|
0403118 | Dec 1990 | EP |
0469850 | Feb 1992 | EP |
0526064 | Feb 1993 | EP |
0564247 | Oct 1993 | EP |
0592250 | Apr 1994 | EP |
0596823 | May 1994 | EP |
0613145 | Aug 1994 | EP |
0689133 | Oct 1995 | EP |
0706124 | Apr 1996 | EP |
2336025 | Oct 1999 | GB |
WO8807719 | Oct 1988 | WO |
WO9321636 | Oct 1993 | WO |
WO9403897 | Feb 1994 | WO |
WO9429868 | Dec 1994 | WO |
WO9626600 | Aug 1996 | WO |
WO9631829 | Oct 1996 | WO |
WO9636007 | Nov 1996 | WO |
WO9712342 | Apr 1997 | WO |
WO9737497 | Oct 1997 | WO |
WO9741504 | Nov 1997 | WO |
WO9804984 | Feb 1998 | WO |
WO9952045 | Oct 1999 | WO |
WO0073875 | Dec 2000 | WO |