Method and apparatus for editing heterogeneous media objects in a digital imaging device

Information

  • Patent Grant
  • Patent Number
    8,127,232
  • Date Filed
    Friday, December 21, 2007
  • Date Issued
    Tuesday, February 28, 2012
Abstract
A method and apparatus for editing heterogeneous media objects in a digital imaging device having a display screen, where each one of the media objects has one or more media types associated therewith, such as a still image, a sequential image, video, audio, and text. The method aspect of the present invention begins by displaying a representation of each one of the media objects on the display screen to allow a user to randomly select a particular media object to edit. In response to a user pressing a key to edit a selected media object, one or more specialized edit screens is invoked for editing the media types associated with the selected media object. If the media object includes a still or a sequential image, then an image editing screen is invoked. If the media object includes a video clip, then a video editing screen is invoked. If the media object includes an audio clip, then an audio editing screen is invoked. And if the media object includes a text clip, then a text editing screen is invoked.
Description
FIELD OF THE INVENTION

The present invention relates generally to a digital imaging device and more particularly to a method and apparatus for creating, editing and presenting a multimedia presentation comprising heterogeneous media objects in the digital imaging device.


BACKGROUND OF THE INVENTION

The use of digital cameras is rapidly proliferating and they may one day overtake 35 mm SLRs in terms of worldwide sales. There are basically three types of digital cameras: digital still cameras, digital video cameras, and hybrid digital-video cameras.


Still digital cameras are used primarily for capturing high quality static photographs, and offer a less expensive alternative to digital video cameras. Still digital cameras are typically less expensive because they have far less processing power and memory capacity than digital video cameras.


Digital video cameras differ from digital still cameras in a number of respects. Digital video cameras are used to capture video at approximately thirty frames per second at the expense of image quality. Digital video cameras are more expensive than still cameras because of the extra hardware needed. The uncompressed digital video signals from all the low-resolution images require huge amounts of memory storage, and high-ratio real-time compression schemes, such as MPEG, are essential for providing digital video for today's computers. Until recently, most digital video recorders used digital magnetic tape as the primary storage media, which has the disadvantage of not allowing random access to the data.


Hybrid digital-video cameras, also referred to as multimedia recorders, are capable of capturing both still JPEG images and video clips, with or without sound. One such camera, the M2 Multimedia Recorder by Hitachi America, Ltd., Brisbane, Calif., stores the images on a PC card hard disk (PCMCIA Type III), which provides random access to the recorded video data.


All three types of cameras typically include a liquid-crystal display (LCD) or other type of display screen on the back of the camera. Through the use of the LCD, the digital cameras operate in one of two modes, record and play. In record mode, the display is used as a viewfinder in which the user may view an object or scene before taking a picture. In play mode, the display is used as a playback screen for allowing the user to review previously captured images and/or video. The camera may also be connected to a television for displaying the images on a larger screen.


Since digital cameras capture images and sound in digital format, their use for creation of multimedia presentations is ideal. However, despite their capability to record still images, audio, and video, today's digital cameras require the user to be very technologically proficient in order to create multimedia presentations.


For example, in order to create a multimedia presentation, the user first captures desired images and video with the camera, and then downloads the images to a personal computer or notebook computer. There, the user may import the images and video directly into a presentation program, such as Microsoft PowerPoint™. The user may also edit the images and video using any one of a number of image editing software applications. After the PowerPoint presentation has been created, the user must connect the PC or notebook to a projector to display the presentation. Finally, the user typically controls the play back of the presentation using a remote control.


Due to the limitations of today's digital cameras in terms of capabilities and features, the user is forced to learn how to operate a computer, image editing software, and a presentation program in order to effectively create and display the multimedia presentation. As the use of digital cameras becomes increasingly mainstream, however, the number of novice computer users will increase. Indeed, many users will not even own a computer at all. Therefore, many camera owners will be precluded from taking advantage of the multimedia capabilities provided by digital cameras.


What is needed is an improved method for creating, editing, and displaying a multimedia presentation using images and/or video from a digital imaging device. The present invention addresses such a need.


SUMMARY OF THE INVENTION

The present invention provides a method and apparatus for editing heterogeneous media objects in a digital imaging device having a display screen, where each one of the media objects has one or more media types associated therewith, such as a still image, a sequential image, video, audio, and text. The method aspect of the present invention begins by displaying a representation of each one of the media objects on the display screen to allow a user to randomly select a particular media object to edit. In response to a user pressing a key to edit a selected media object, one or more specialized edit screens is invoked for editing the media types associated with the selected media object. If the media object includes a still or a sequential image, then an image editing screen is invoked. If the media object includes a video clip, then a video editing screen is invoked. If the media object includes an audio clip, then an audio editing screen is invoked. And if the media object includes a text clip, then a text editing screen is invoked.


According to the present invention, each one of the specialized editing screens operates in a similar manner to ease use and operation of the digital imaging device and to facilitate creation of multimedia presentations on the digital imaging device, without the need to download the contents of the camera to a PC for editing.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram illustrating one preferred embodiment of a digital video camera (DVC) for use in accordance with the present invention.



FIGS. 2A and 2B are diagrams depicting an exemplary form factor design for the DVC.



FIG. 3 is a table listing example media types that may be captured and stored by the DVC.



FIGS. 4A-B are diagrams illustrating one preferred embodiment of the review mode screen.



FIG. 5 is a flowchart depicting the process of creating an ordered group of heterogeneous media objects in accordance with the present invention.



FIGS. 6-8 are diagrams illustrating examples of marking heterogeneous media objects.



FIGS. 9A-B are diagrams illustrating a slide show object implemented as a metadata file.



FIG. 10 is a diagram illustrating the DVC connected to an external projector, and alternatively to a television.



FIG. 11 is a diagram illustrating the components of the slide-show edit screen in accordance with the present invention.



FIG. 12 is a diagram illustrating the image editing screen.



FIG. 13 is a diagram illustrating the video editing screen.



FIGS. 14-17 are diagrams illustrating the process of editing a video on the DVC by creating and moving a video clip.



FIG. 18 is a diagram illustrating an audio editing screen for editing audio media types.



FIG. 19 is a diagram illustrating a text editing screen for editing text media types.



FIG. 20 is a diagram illustrating the mapping of the four-way control during slide show presentation.



FIG. 21 is a diagram illustrating the properties page of a media object.





DETAILED DESCRIPTION OF THE INVENTION

The present invention is a method and apparatus for creating and presenting a multimedia presentation comprising heterogeneous media objects stored in a digital imaging device. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Although the present invention will be described in the context of a digital video camera, various modifications to the preferred embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. That is, any digital imaging device used to store and display images and/or video could incorporate the features described hereinbelow, and that device would be within the spirit and scope of the present invention. Thus, the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein.


Referring now to FIG. 1, a block diagram of one preferred embodiment of a digital video camera (DVC) is shown for use in accordance with the present invention. The DVC 100 is preferably capable of capturing and displaying various types of image data including digital video and high-resolution still images.


The DVC 100 comprises an imaging device 110, a computer 112, and a hardware user interface 114. The imaging device 110 includes an image sensor (not shown), such as a charge-coupled device (CCD) or a CMOS sensor, for capturing frames of image data in Bayer format. The image frames are transferred from the imaging device 110 to the computer 112 for processing, storage, and display on the hardware user interface 114.


The computer 112 includes an image processing digital-signal-processor (DSP) 116, a video codec 132, an audio codec 132, a mass storage device 122, a CPU 124, a DRAM 126, an internal nonvolatile memory 128, a mixer, and a video control 132. The computer 112 also includes a power supply 134, a power manager 136, and a system bus 138 for connecting the main components of the computer 112.


The hardware user interface 114 for interacting with the user includes a display screen 140 for displaying the digital video and still images, an audio subsystem 142 for playing and recording audio, buttons and dials 146 for operating the DVC 100, and an optional status display 148.


The CPU 124 may include a conventional microprocessor device for controlling the overall operation of the camera. In the preferred embodiment, the CPU 124 is capable of concurrently running multiple software routines to control the various processes of the camera within a multithreaded environment. In a preferred embodiment, the CPU 124 runs an operating system that includes a menu-driven GUI. An example of such software is the Digita™ Operating Environment by FlashPoint Technology of San Jose, Calif. Although the CPU 124 is preferably a microprocessor, one or more DSPs (digital signal processors) or ASICs (application-specific integrated circuits) could also be used.


Non-volatile memory 128, which may typically comprise a conventional read-only memory or flash memory, stores a set of computer-readable program instructions that are executed by the CPU 124. Input/Output interface (I/O) 150 is an interface device allowing communications to and from computer 112. For example, I/O 150 permits an external host computer (not shown) to connect to and communicate with computer 112.


Dynamic Random-Access-Memory (DRAM) 126 is a contiguous block of dynamic memory that may be selectively allocated for various storage functions. DRAM 126 temporarily stores both raw and compressed image data and is also used by CPU 124 while executing the software routines used within computer 112. The raw image data received from imaging device 110 is temporarily stored in several input buffers (not shown) within DRAM 126. A frame buffer (not shown) is used to store still image and graphics data via the video control 132 and/or the mixer.


Power supply 134 supplies operating power to the various components of the camera. Power manager 136 communicates with power supply 134 and coordinates power management operations for the camera. In the preferred embodiment, power supply 134 provides operating power to a main power bus 152 and also to a secondary power bus 154. The main power bus 152 provides power to imaging device 110, I/O 150, non-volatile memory 128, and removable memory. The secondary power bus 154 provides power to power manager 136, CPU 124, and DRAM 126.


Power supply 134 is connected to main batteries and also to backup batteries 360. In the preferred embodiment, a camera user may also connect power supply 134 to an external power source. During normal operation of power supply 134, the main batteries (not shown) provide operating power to power supply 134, which then provides the operating power to the camera via both main power bus 152 and secondary power bus 154. During a power failure mode in which the main batteries have failed (when their output voltage has fallen below a minimum operational voltage level), the backup batteries provide operating power to power supply 134, which then provides the operating power only to the secondary power bus 154 of the camera.
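The power-failure behavior described above amounts to a simple rule: below the minimum operational voltage, only the secondary power bus remains powered. The following Python sketch is illustrative only; the threshold value and function names are hypothetical, not from the patent:

```python
# Illustrative sketch of the power-failure rule described above: when the
# main batteries fall below the minimum operational voltage, the backup
# batteries power only the secondary bus (power manager, CPU, DRAM).
MIN_OPERATIONAL_VOLTAGE = 1.0  # hypothetical threshold, not from the patent

def powered_buses(main_battery_voltage):
    if main_battery_voltage >= MIN_OPERATIONAL_VOLTAGE:
        return ["main", "secondary"]   # normal operation: both buses powered
    return ["secondary"]               # power-failure mode: secondary bus only
```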



FIGS. 2A and 2B are diagrams depicting an exemplary form factor design for the DVC 100, shown here as a clam-shell design having a rotatable imaging device 110. FIG. 2A is a top view of the DVC 100 in an opened position, while FIG. 2B is a top view of the DVC 100 in a closed position. FIG. 2A shows the display screen 140, a four-way navigation control 200, a mode dial 202, a display button 204, a set of programmable soft keys 206, a shutter button 208, a menu button 210, and an audio record button 212.


The mode dial 202 is used to select the operating modes for DVC 100, which include a capture mode (C) for recording video clips and for capturing images, a review mode (R) for quickly viewing the video clips and images on the display screen 140, and a play mode (P) for viewing full-sized images on the display screen 140.


When the DVC 100 is placed into capture mode and the display screen 140 is activated, the camera displays a “live view” of the scene viewed through the camera lens on the display screen 140 as a successive series of real-time frames. If the display screen 140 is not activated, then the user may view the scene through a conventional optical viewfinder (not shown).


Referring to FIGS. 1 and 2A, during live view, the imaging device 110 transfers raw image data to the image processing DSP 116 at 30 frames per second (fps), or 60 fields per second. The DSP 116 performs gamma correction and color conversion, and extracts exposure, focus, and white balance settings from the image data and converts the data into CCIR 601 streaming video. (CCIR 601 is an international standard for digital video designed to encompass both NTSC and PAL analog signals, providing an NTSC-equivalent resolution of 720×486 pixels at 30 fps. It requires 27 MB per second and uses three signals: one 13.5 MB/sec luminance (gray scale) signal and two 6.75 MB/sec chrominance (color) signals.)
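The data-rate figures quoted above can be checked with a short calculation (a sketch; the rates are those given in the description):

```python
# Verify that one luminance signal plus two chrominance signals account
# for the quoted 27 MB/sec CCIR 601 data rate.
luminance_rate = 13.5        # MB/sec, gray-scale signal
chrominance_rate = 6.75      # MB/sec, per color signal
total_rate = luminance_rate + 2 * chrominance_rate
print(total_rate)            # 27.0 MB/sec
```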


After processing, the streaming video from the DSP 116 is transferred to the mixer for the overlay of optional graphics and/or images onto the video. The graphics data from the frame buffer in DRAM 126 is transferred to the mixer in sync with the streaming video, where the mixer combines the graphics data with the video. After the streaming video and the graphics are combined, the video is displayed on the display screen 140 via the video control 132. A video out port is also provided to display the video on an external display device.


When the user initiates the video capture function to record the digital video, the streaming video output from the DSP 116 is also transferred to the video codec 132 for compression and storage. The video codec 132 performs MPEG-2 encoding on the streaming video during recording, and performs MPEG-2 decoding during playback. The video codec 132 may include local memory, such as 32 Mbits of SDRAM, for example, for MPEG-2 motion estimation between frames. Such video codecs are commercially available from Sony Electronics (CXD1922Q0) and Matsushita Electronics Corp.


As the video codec 132 compresses the digital video, the compressed video stream is transferred to a temporary buffer in DRAM 126. Simultaneously, audio is recorded by the audio subsystem 142 and transferred to the audio codec 132 for compression into a compressed audio format, such as MPEG Audio Layer 3 (MP3), which is a common Internet format. In an alternative embodiment, the audio could be compressed into AC-3 format, a well-known Dolby Digital audio recording technology that provides six surround-sound audio channels.


The CPU 124 mixes the compressed video and audio into a specified format, such as MPEG-2, for example. After the compressed MPEG-2 data is generated, the CPU 124 transfers the MPEG-2 data to the removable mass-storage device 122 for storage. In a preferred embodiment, the mass storage device 122 comprises a randomly accessible 3-inch recordable DVD drive from Toshiba/Panasonic, or a one-inch 340 MB MicroDrive™ from IBM, for example.


The video architecture inputs the video stream from the DSP 116 directly into the mixer, rather than first storing the video in memory and then inputting the video to the mixer, in order to save bus bandwidth. However, if sufficient bus bandwidth is provided (e.g., 100 MHz), the video stream could be first stored in memory.


Although the resolution of the display screen 140 may vary, the display screen 140 resolution is usually much less than the resolution of the image data that is produced by imaging device 110 when the user captures a still image at full resolution. Typically, the resolution of display screen 140 is ¼ the resolution of a full-resolution image. Since the display screen 140 is capable of displaying images at only ¼ resolution, the images generated during the live view process are also ¼ resolution.


As stated above, the DVC 100 is capable of capturing high-resolution still images in addition to video. When the user initiates the capture function to capture a still or sequential image, the imaging device 110 captures a frame of image data at a resolution set by the user. The DSP 116 performs image processing on the raw CCD data to convert the frame of data into YCC color format, typically YCC 2:2:2 format (YCC is an abbreviation for Luminance, Chrominance-red, and Chrominance-blue). Alternatively, the data could be converted into RGB format (Red, Green, Blue).


After the still image has been processed, the image is compressed, typically in JPEG format, and stored as an image file on the mass storage device 122. A JPEG engine (not shown) for compressing and decompressing the still images may be provided in the image processing DSP 116 or the video codec 132, provided as a separate unit, or implemented in software by the CPU 124.


After the image has been compressed and stored, live view resumes to allow the capture of another image. The user may continue to either capture still images, capture video, or switch to play or review mode to playback and view the previously stored video and images on the display screen 140. In a preferred embodiment, the DVC 100 is capable of capturing several different media types, as shown in FIG. 3.



FIG. 3 is a table listing example media types that may be captured and stored by the DVC 100. Also shown are the corresponding icons that are used to indicate the media type. The media types include a single still image, a time lapse or burst image, a panorama, a video segment, an audio clip, and a text file.


A still image is a high-quality, single image that may have a resolution of 1536×1024 pixels, for example. A time-lapse image is a series of images automatically captured by the DVC 100 at predefined time intervals for a defined duration (e.g., capturing a picture every five minutes for an hour). A burst image is similar to a time-lapse image, but instead of capturing images for a defined period of time, the DVC 100 captures as many images as possible in a brief time frame (e.g., a couple of seconds). A panorama image is an image comprising several overlapping images of a larger scene that have been stitched together. A burst image, a time-lapse image, and a panorama image are each objects that include multiple still images; therefore, they may be referred to as sequential images.


In addition to capturing different image-based media types, the DVC 100 can capture other media types, such as audio clips and text. The user can record a voice message to create a stand-alone audio clip, or the user may record a voice message and have it attached to an image to annotate the image. Audio clips may also be downloaded from an external source to add sound tracks to the captured objects.


A text media type is created by entering letters through the buttons on the user interface. The text, along with graphics, can be overlaid as a watermark on the images, or the text can be saved in a file to create a text-based media type.


In a preferred embodiment, one or more of the different media types can be combined to form a single media object. Since various combinations may be formed, such as a single image with sound, or a burst image with text, etc., the DVC 100 can be described as storing heterogeneous media objects, each comprising a particular combination of media types, such as images, video, sound, and text/graphics. Some types of media objects, such as a captured image or an annotated image, are formed automatically by the DVC 100, while others are formed manually by the user.
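The idea of a heterogeneous media object, a single object carrying several media types, can be sketched as a small data structure (hypothetical Python, with illustrative text tags standing in for the icons of FIG. 3):

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a media object bundles one or more media types,
# and its review-screen icons follow from the types it contains.
ICONS = {"still": "IMG", "burst": "BST", "video": "VID",
         "audio": "SND", "text": "TXT"}  # illustrative tags, not FIG. 3's icons

@dataclass
class MediaObject:
    name: str
    media_types: list = field(default_factory=list)

    def icons(self):
        return [ICONS[t] for t in self.media_types]

# An image annotated with a voice message combines two media types.
annotated = MediaObject("beach", ["still", "audio"])
```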


After media objects are created and stored, the user may view the media objects by switching the camera to play mode or review mode. In play mode, the camera 100 allows the user to view screen-sized images on the display screen 140 in the orientation in which the image was captured. Play mode also allows the user to hear recorded sound associated with a displayed image, to play back sequential groups of images (time-lapse, burst, and panorama images), and to view movies from the video.


In review mode, the DVC 100 enables the user to rapidly review the contents of the DVC. In addition, the media objects may be edited, sorted, printed, and transferred to an external source.


Referring now to FIG. 4A, a diagram illustrating one preferred embodiment of the review mode screen is shown. Moving the mode dial 202 (FIG. 2) to access the review mode enables the user to view all the media objects in the camera along with the specific media types associated with each of the objects.


The first embodiment of the review mode screen displays a series of object cells 300 that represent the media objects stored on the DVC 100, and a command bar 310. The display screen 140 is shown here as displaying nine object cells 300, although other numbers are also suitable.


The user may navigate through a series of displayed object cells 300 in the display screen 140 using the four-way navigation control 200. The object cell 300 currently selected by the four-way navigation control 200 is indicated by a highlighted area 302, which in this embodiment is shown as a selection rectangle. Other shapes or indications that an object cell 300 is the currently active object cell are also suitable.


Each object cell 300 includes an image area 304 and an icon/information area 306. In the case of a still image, the image area 304 of an object cell 300 displays a thumbnail of the media object, which in the case of an image-based media object is a small, low-resolution version of the image. In the case of sequential images and video segments, the image area 304 of an object cell 300 displays a representative thumbnail or frame from the image sequence or video, respectively, typically the first one.


The icon/information area 306 displays one or more graphical icons and/or text information indicating to the user what media types have been associated with the media object displayed in the image area 304. The icon/information area 306 may be placed in various positions relative to the image area 304. However, in a preferred embodiment, the icon/information area 306 is displayed on the right-hand side of each object cell 300, as shown.


Referring now to FIG. 4B a diagram illustrating a second preferred embodiment of the review mode screen is shown, where like components share like reference numerals. In the second preferred embodiment, the review mode screen includes a filmstrip 352, the icon/information area 306 for displaying the media type icons associated with the active media object 302, a large thumbnail 354 showing a larger view of the active media object 302, and the command bar 310.


In a preferred embodiment, the filmstrip 352 displays four thumbnail images 350 at a time, although other numbers are also suitable. The user may navigate through the series of displayed thumbnails 350 in the display screen 140 using the four-way navigation control 200 (FIG. 2A). When the user holds down the left/right buttons on the four-way control 200, the thumbnails 350 are scrolled off the display screen 140 and replaced by new thumbnails 350 representing other stored media objects, to provide for fast browsing of the camera contents. As the user presses the buttons on the four-way control 200 and the thumbnails 350 scroll across the display screen 140, the thumbnail 350 that is positioned over a notch in the selection arrow line 356 is considered the active media object 302. When there are more than four media objects in the camera, the selection arrow line 356 displays arrowheads to indicate that movement in that direction is possible with the left/right navigation buttons.
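The filmstrip behaves as a four-wide window sliding over the full list of stored media objects. A minimal sketch (hypothetical Python; names are illustrative):

```python
# Sketch of the filmstrip: only four thumbnails are visible at once, and
# scrolling with the left/right buttons slides the window over the full
# list of stored media objects.
def visible_thumbnails(media_objects, window_start, window_width=4):
    return media_objects[window_start:window_start + window_width]

objects = [f"obj{i}" for i in range(10)]
visible_thumbnails(objects, 3)   # shows obj3 through obj6
```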


When a thumbnail 350 becomes the active media object 302, the media type icons corresponding to that media object are automatically displayed in the icon/information area 306, along with the large thumbnail 354. Other information can also be displayed, such as the name or number of the media object, and the date and time the media object was captured or created, for example.


In both the first and second embodiments of the review screen layout, displaying icons and text information in the icon/information area 306 according to the present invention provides the user with an automatic method of identifying common groups of media objects. This also reduces the need for the user to switch to play mode to view the full-sized view of the object in order to recall the object's subject matter, which eliminates the need for decompressing the objects for display.


In a first aspect of the present invention, a method and apparatus is provided for creating and presenting a multimedia presentation from the heterogeneous group of media objects stored and displayed on the DVC 100. This is accomplished by navigating through several displays showing the heterogeneous media objects, selecting and marking the desired objects in the preferred order to create an ordered list of objects, and then saving the ordered list of objects as a slide show, thereby creating a new type of media object. After the slide show is created, the user may present the slide show, wherein each media object comprising the slide show is automatically played back in the sequence in which it was selected. The slide show may be played back on the display screen 140 and/or on an external television via the video out port.
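Because a slide show is simply the ordered list of marked media objects saved as a new media object, its creation can be sketched in a few lines (hypothetical Python; the metadata layout of FIGS. 9A-B is not reproduced here):

```python
# Sketch: saving the ordered list of marked media objects produces a new
# media object of type "slide show" that merely references the originals.
def create_slide_show(marked_objects):
    """marked_objects: media-object references, in the order they were marked."""
    return {"type": "slide_show", "slides": list(marked_objects)}

show = create_slide_show(["IMG0003", "VID0001", "IMG0007"])
# Playback walks show["slides"] in order, dispatching on each object's media types.
```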


In a second aspect of the present invention, each media object may be edited before or after incorporation into the slide show, where each media object is edited using different media-type editors designed to edit the media types associated with that particular object.


In a third aspect of the present invention, the user may specify parameters for the slide show so that the objects in the slide show are not displayed linearly, but are displayed in an order that is dependent upon user-defined events, thus creating an interactive slide show.


Each aspect of the present invention will now be explained in the sections below.


Slide Show Creation from Heterogeneous Media Objects


In a preferred embodiment, a slide show is generated by providing the DVC 100 with a marking and unmarking function within the user interface 114 that simultaneously provides for the selection and order of the heterogeneous media objects in the slide show.


Referring again to FIGS. 4A and 4B, in a preferred embodiment, the marking and unmarking function is implemented through the use of the soft keys 206a, 206b, and 206c displayed in the command bar 310, which are programmable, i.e., they may be assigned predefined functions. Hence, the name “soft” keys.


The function currently assigned to a respective soft key 206 is indicated by several soft key labels 308a, 308b, and 308c displayed in the command bar 310 on the display screen 140. In an alternative embodiment, the display screen 140 may be a touch-screen wherein each soft key 206 and corresponding label are implemented as distinct touch-sensitive areas in the command bar 310.


After a soft key label 308 has been displayed, the user may press the corresponding soft key 206 to have the function indicated by its label 308 applied to the current image. The functions assigned to the soft keys 206 may be changed in response to several different factors. The soft keys 206 may change automatically either in response to user actions or based on predetermined conditions existing in the camera, such as the current operating mode, the image type of the media object, and so on. The soft keys 206 may also be changed manually by the user by pressing the menu button 210. Providing programmable soft keys 206 increases the number of functions that may be performed by the camera, while both minimizing the number of buttons required on the user interface 114 and reducing the need to access hierarchical menus.
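The context-dependent behavior of the soft keys can be sketched as a lookup from camera state to label assignments (hypothetical Python; the states and label sets shown are illustrative):

```python
# Sketch: soft-key labels (and thus functions) are reassigned based on
# camera state, e.g. whether the highlighted media object is marked.
SOFT_KEY_LABELS = {
    ("review", "unmarked"): ["Mark", "Edit", "Save"],
    ("review", "marked"):   ["Unmark", "Edit", "Save"],
}

def soft_key_labels(mode, selection_state):
    # Returns the three labels shown in the command bar for this state.
    return SOFT_KEY_LABELS[(mode, selection_state)]
```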


In the first embodiment of the present invention, the soft keys 206 are “Mark”, “Edit”, and “Save”. Although not shown, other levels of soft key functions may be provided to increase the number of functions the user could apply to the media objects.


In general, the mark function indicated by soft key label 308a enables a user to create a temporary group of media objects. After a group of media objects is created, the user may then perform functions on the group besides transforming the temporary group into a permanent slide show, such as deleting or copying the group, for example.


To create an ordered group of images, the user navigates to a particular media object using the four-way control 200 and presses the “Mark” soft key 206a corresponding to the mark function indicated by soft key label 308a. In response, a mark number is displayed in the object cell 300 of the highlighted image 302, and the highlighted image 302 becomes a marked image. After an image is marked, the “Mark” soft key label 308a is updated to “Unmark”. The “Unmark” function allows the user to remove an image from the group, which removes the mark number from the object cell 300 of the highlighted image.


According to the present invention, a user may randomly create an ordered group of heterogeneous media objects using the four-way navigation control 200, and the programmable function keys 206, as shown in FIG. 5.



FIG. 5 is a flowchart depicting the process of creating an ordered group of heterogeneous media objects in accordance with the present invention.


The process begins when a user selects a media object by positioning the highlight area 302 over the object cell 300, or otherwise selects the object cell 300, using the four-way navigational control 200 in step 500. The user then presses the function key corresponding to the Mark soft key label 308a in step 502. After the “Mark” soft key 206a is depressed, the object cell 300 is updated to display the number of images that have been marked during the current sequence in step 504. The object cell 300 may also be updated to display an optional graphic, such as a dog-ear corner or a check mark, for example. After the object cell 300 has been updated, the “Mark” soft key in the command bar is updated to “Unmark” in step 506.


Next, the user decides whether to add more media objects to the temporary set of marked media objects in step 508. If the user decides to add more media objects, then the user selects the next media object using the four-way navigational control 200, and the “Unmark” soft key in the command bar is updated to “Mark” in step 510.


If the user decides not to add more media objects to the temporary group of marked media objects in step 508, then the user decides whether to remove any of the marked media objects from the group in step 512. If the user decides not to remove any of the marked media objects from the group, then the user may select a function, such as “Save” or “Delete” to apply to the group in step 514.


If the user decides to remove a marked media object from the group, then the group is dynamically modified as follows. The user first selects the media object to be removed by selecting the marked media object using the four-way navigational control 200 in step 516. The user then presses the function key corresponding to the “Unmark” soft key in step 518.


After the “Unmark” key is depressed, the object cells 300 for the remaining marked media objects may be renumbered. This is accomplished by determining whether the selected media object is the highest numbered media object in the marked group in step 522. If the selected media object is not the highest numbered media object in the marked group, then the marked media objects having a higher number are renumbered by subtracting one from the respective mark number and displaying the result in their object cells 300 in step 524. After the mark number is removed from the unmarked media object and the other mark numbers renumbered if required, the “Unmark” soft key in the command bar is updated to “Mark” in step 526. The user may then continue to modify the group by marking and/or unmarking other media objects accordingly.
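The renumbering in steps 522-524 amounts to deleting one entry from an ordered group and decrementing every higher mark number. The following Python sketch illustrates the idea; the dictionary representation of the marked group is an assumption, since the patent describes only the mark numbers shown in the object cells 300:

```python
def unmark(marks, obj_id):
    """Remove obj_id from the marked group and renumber higher marks.

    `marks` maps a media-object id to its 1-based mark number.
    """
    removed = marks.pop(obj_id)
    for oid, n in marks.items():
        if n > removed:
            marks[oid] = n - 1  # step 524: subtract one from each higher mark
    return marks
```

For example, unmarking the second of three marked objects leaves the remaining two objects numbered 1 and 2, matching the behavior described above.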


The process of grouping media objects in the digital camera will now be explained by way of a specific example with reference to FIGS. 4A, 4B, and 6-8.


Referring again to FIG. 4A, assume that the user wishes to create a slide show beginning with the selected media object 302. At this point, the soft keys displayed in the command bar are prompts to the user that the user may perform the displayed functions, such as “Mark”, on the highlighted media object. The mark function is then performed by the user pressing the Mark function key 206a.


Referring now to FIG. 6, a diagram illustrating the result of the user pressing the Mark function key is shown. The selected media object cell 302 is updated with the number “1”, which indicates that the media object is the first to be marked. FIG. 7 is a diagram showing the user marking another media object by selecting a second media object cell 322 and pressing the Mark function key. This causes the media object cell 322 to be updated with the number “2”. FIG. 8 is a diagram showing a third media object being selected and marked, as described above, in which case, the icon area of the media object 342 is updated with the number “3”.


Referring again to FIG. 5, while marking media objects, the method for removing media objects in the group (steps 512-524) also allows a user to dynamically reorder or re-sequence the media objects in the group. For example, assume the user has marked five media objects, labeled as “1”, “2”, “3”, “4”, “5”, and wants to make media object “3” the last media object in the group. This can be accomplished by unmarking media object “3”, which results in media objects “4”, and “5” being renumbered “3” and “4”, respectively. Thereafter, the user may mark the original media object “3”, which results in the media object being labeled with the number “5”.
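The reordering trick described above, unmarking an object so the higher marks shift down, then re-marking it at the end, can be sketched as follows. This is an illustrative model only; the group representation and function names are assumptions:

```python
def unmark(marks, obj_id):
    """Remove obj_id and renumber all higher marks down by one."""
    removed = marks.pop(obj_id)
    for oid, n in marks.items():
        if n > removed:
            marks[oid] = n - 1

def mark(marks, obj_id):
    """Assign obj_id the next mark number in the sequence."""
    marks[obj_id] = len(marks) + 1

# Five marked objects; move object "c" (mark 3) to the end of the group.
marks = {"a": 1, "b": 2, "c": 3, "d": 4, "e": 5}
unmark(marks, "c")   # "d" and "e" are renumbered 3 and 4
mark(marks, "c")     # "c" is re-marked with the number 5
```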


Referring again to FIG. 4, after the group has been created with the chosen media objects in the desired sequence, the user saves the ordered group to create a slide show media object. In a preferred embodiment, the slide show media object is created using the “Save” function shown in the command bar 310.


In one preferred embodiment, pressing the soft key 206c assigned the “Save” function creates a metadata file, which is a file containing data that describes other data.


Referring to FIG. 9A, a diagram illustrating a slide show object 360 implemented as an exemplary metadata file is shown. The metadata file includes a series of fields that acts as a play list when the file is read by identifying one or more of the following attributes for each media object:


a) A pointer to, or the address of, the media object;


b) An identification of each media object's associated media types; and


c) A duration of play.


Creating a metadata file that simply points to the real media objects saves storage space since the original media objects do not have to be duplicated.
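The play list could be serialized in any number of formats; the following is a hypothetical JSON encoding of the three attributes listed above. The field names and file paths are illustrative assumptions, not part of the patent:

```python
import json

# Each entry records the three play-list attributes: a pointer to the
# media object, its associated media types, and a duration of play.
slide_show = [
    {"path": "/DCIM/IMG0001.JPG", "media_types": ["still", "audio"], "duration_s": 3},
    {"path": "/DCIM/MOV0002.AVI", "media_types": ["video"], "duration_s": None},
]
metadata = json.dumps(slide_show, indent=2)
```

Because only pointers are stored, the metadata file stays small and the original media objects are never duplicated.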


In a second preferred embodiment, pressing the soft key 206c assigned the “Save” function (FIGS. 4A and 4B) creates a permanent group of media objects by copying all of the marked media objects either into a file, a folder, or a directory on the DVC's mass storage device 122. A dialog box or other type of prompt appears asking the user to name the new file, folder, or directory.


Referring to FIG. 9B, a diagram illustrating a slide show object 360′ implemented as a file directory is shown. A directory named “slide show” is created for the slide show 360′, where the name of the directory may be input by the user. After the directory is created, each marked media object is then copied to the directory as shown. Since the media objects are copied, the original media objects are left intact, and the new slide show object 360′ may be transferred to an external source.


After the slide show 360 has been created using any of the described embodiments, it is displayed as a new media object cell 300 on the display screen 140 along with an icon indicating that the media object is a slide show. Selecting the new slide show object cell 300 and pressing the display button 204 or switching to play mode causes each of the media objects included in the “slide show” to be individually played back on the display screen 140 in the sequence that they were marked without user intervention.


In the case of a slide show 360 created as a metadata file, the slide show is played by executing the metadata file, causing each media object listed to be fetched from memory and played in the order listed in the file. In the case of a slide show 360′ created as a standard file or directory, the slide show 360′ is played by displaying each media object in the order listed.


When the slide show is presented, each media object therein is played by playing each of the media types comprising the object. For example, a still image is played by displaying the image for a predefined time on the display screen 140 while playing any associated audio. Sequential images are played by displaying each still comprising the sequential image while playing any associated audio. Video segments are played as a conventional movie. A text-based object is played by displaying the text on the display screen 140. And a stand-alone audio clip is played by displaying a blank screen or the name of the clip while the audio is played through the speakers of the DVC 100.


According to the present invention, by connecting the DVC 100 to an external projector or television via the video out port, and playing the slide show 360, the camera can be used as a presentation device in place of a notebook computer, as shown in FIG. 10.



FIG. 10 is a diagram illustrating the DVC 100 connected to external projector 380, and alternatively to a large television 382. When the slide show 360 is played, the images, video and audio are automatically displayed directly on the large screen 384 or on the screen of the television 382 from the DVC 100. Thus, the present invention enables a novice user to show multimedia presentations without the need for downloading images and/or video to a computer for incorporation into presentation software to create a multimedia presentation.


Editing Media Objects


Referring again to FIG. 8, in a second aspect of the present invention, the DVC 100 is provided with an advanced feature that allows the user to edit the media objects either before or after incorporation into the slide show 360 using specialized media type editors. In one preferred embodiment, the user edits the slide show 360 by selecting the slide show object in either review or play mode, and then pressing the “Edit” soft key 206b. In response, a slide show edit screen appears displaying the thumbnail images of all the media objects in the slide show.


Referring now to FIG. 11, a diagram illustrating the components of the slide show edit screen is shown in accordance with the present invention. The slide show edit screen is based on the review screen layout of FIG. 4B, where like components share like reference numerals. The slide show edit screen 400 includes the filmstrip 352, a list page 402, and the command bar 310. The filmstrip 352 displays a scrollable series of thumbnails representing all the media objects in the slide show. The list page 402 displays a scrollable list of menu items that can be applied to the selected media object. And the command bar 310 displays several soft key functions 308.


In the implementation shown in FIG. 11, the user may move a target cursor to discrete cursor locations 404 within the screen 400, shown here as diamond shapes, using the four-way navigational control 200. The cursor is active at any given time in either the filmstrip 352 or the list page 402. The current target-cursor location is shown as a black diamond, and the element associated with the current cursor location is the target element. In a preferred embodiment, the soft key labels 308 displayed in the command bar 310 are only associated with the target element.


To edit the slide show, the user navigates to the media object of interest in the filmstrip 352 and presses the “Choose” function 308a to select the targeted media object. In response, the target cursor location in the now inactive filmstrip 352 changes to a white diamond to show that the selection of the selected media object 302 is persistent. At the same time, the black diamond cursor appears in the active list page 402.


When in the list page 402, the item associated with the current cursor location becomes the target item and the recipient of the functions in the command bar 310. While the list page 402 is active, the “Exit” function saves the state of the list page 402 and moves the target cursor back to the selected media object 302 in filmstrip 352. The “Help” function offers assistance with the target item.


From the list page 402, the user may choose the “Edit Object” item 406 for editing the selected media object 302, or choose the “Properties” item 408 to change the properties associated with the selected media object 302. Choosing the “Edit Object” item 406 invokes an edit screen for editing the selected media object's content, which means editing the media types associated with the selected media object. In a preferred embodiment, for editing still image and sequential image media types, an image editor appears to enable the user to change the appearance of the image(s). For video, a video editor appears to enable the user to edit and rearrange scenes. For audio, a sound editor appears to enable the user to edit the sound. And for text, such as a list of email addresses for example, a text editor appears to enable the user to modify the text.
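Choosing “Edit Object” effectively dispatches on the selected object's media types to the appropriate specialized editor. A minimal sketch of that dispatch, assuming a registry of editor callbacks (the registry and the object's dictionary shape are hypothetical):

```python
def edit_object(media_object, editors):
    """Invoke the specialized editor screen for each media type
    associated with the selected media object."""
    for media_type in media_object["media_types"]:
        editors[media_type](media_object)

# Hypothetical registry mapping media types to editor screens.
editors = {
    "still": lambda obj: print("image editing screen 420"),
    "sequential": lambda obj: print("image editing screen 420"),
    "video": lambda obj: print("video editing screen 430"),
    "audio": lambda obj: print("audio editing screen 450"),
    "text": lambda obj: print("text editing screen 460"),
}
```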


According to the present invention, all four editing screens operate similarly to the slide show editing screen 400 to ease the use and operation of the editing functions and facilitate the creation of multimedia presentations by non-computer savvy users.


Referring now to FIG. 12, a diagram illustrating the image editing screen 420 is shown. The image editing screen 420 displays the thumbnail image 422 of the selected media object in the filmstrip 352 along with a real time preview of the modified image 424. The user may select which editing function to apply to the selected media image 422 by moving the target cursor to the item in the list page 402 and pressing the “Choose” soft key 206a. In response, a menu or screen showing modifiable parameters for the selected item is displayed. When the parameters are changed, the results are applied to the selected image and displayed as the modified image 424. The user may then choose to keep or discard the changes.


Referring now to FIG. 13, a diagram illustrating the video editing screen is shown. The video editing screen 430 displays a movie graph 432 in the filmstrip 352 showing a pictorial representation of a video's duration, a position of a playback head 434, and cue locations 436 and 438 that mark significant moments in the video. The video's duration can be sized to fit the length of the movie graph 432 or scaled up and down via the “Zoom In” and “Zoom Out” soft key functions 308a and 308b. A preview pane 440 is provided to play back that portion of the video shown in the filmstrip 352.


The position of the playback head 434 is preferably located in the center of the movie graph 432 and marks the current frame. The movie scrolls forwards and backwards under the playback head 434. The cursor locations 436 (diamonds) on the left and right sides of the movie graph 432 control scrolling. The user may play back the video by navigating to the “Preview” item in the list page 402, causing that portion of the video to play in the preview pane 440.


The cues 438 displayed across the top of the movie graph 432 are associated with the visible video duration. The user may define clips within the video by marking begin and end frames with cues 438. After defining the clip, the user may copy, move, or delete the clip.



FIGS. 14-17 are diagrams illustrating the process of editing a video on the DVC 100 by creating and moving a clip.


Referring to FIG. 14, the process of creating a clip begins by defining and inserting a new cue by navigating to the “Cue” item in the list page 402 and pressing the “Insert” soft key 206a.



FIG. 15 shows that by default the inserted cue 442 is positioned along the movie graph 432 on the current frame marked by the playback head 434. When a cue is inserted, or otherwise targeted by the cursor, the command bar 310 is updated to enable the user to select, move, or delete the cue. Pressing the “Choose” soft key 206a marks the current cue position as the beginning frame of the video clip.


Referring now to FIG. 16, after defining the start of the clip, the user navigates left or right to another cue location 438, and presses the “Choose” soft key 206a again to define the end frame of the clip. The duration of the video between the two cues becomes a selected clip 444, as shown in FIG. 16. After the clip 444 is created, the command bar 310 is updated to enable the user to copy, move, or delete the clip. To move the clip 444, the user presses the “Move” soft key 206b.


Referring now to FIG. 17, in move mode, the user may drag the clip 444 left and right to the desired location in the video using the navigation control 200. The video will scroll if required. The user can choose to insert the clip 444 at its new location by pressing the “Insert” soft key 206a (which “offsets” the video content underneath it), or replace the video content with the clip content by pressing the “Replace” soft key 206a. If the user inserts the clip 444, all cues downstream are preferably offset by the duration of the clip. Once the clip 444 is dropped into its new position, the move mode is turned off, and the user may edit the clip, navigate to another clip, or navigate to the list page to perform other operations.


According to the video editing screen 430 of the present invention, novice users are provided with a way to edit digital video directly on the DVC. Thus the present invention eliminates the need for downloading the video to a PC and editing the video with some complex video editing package geared towards expert videophiles.


Referring now to FIG. 18, a diagram illustrating an audio editing screen for editing audio media types is shown. The audio editing screen 450 appears and operates like the video editing screen 430, except that a waveform 452 depicting the recorded audio is displayed in the filmstrip 352. The user may hear the audio by selecting the “Play” item in the list page 402, or insert cues as described above by selecting the “Cue” item.


Referring now to FIG. 19, a diagram illustrating a text editing screen for editing text media types is shown. The text editing screen 460 allows the user to edit text-based media objects. The text editing screen 460 uses the filmstrip 352 for displaying text that is to be edited, and includes a keyboard 462 in the list page 402, and an edit field 464.


To enter text, the user navigates to a desired character in the keyboard 462 and presses the “Type” soft key 206a, whereupon the letter appears in both the filmstrip 352 and the edit field 464. The user may edit a current word 466 by pressing the “up” button twice on the four-way navigational control 200 to enter the filmstrip 352. A cursor may be moved back and forth using the navigational control 200 to select a word 466, causing the word to appear in the edit field 464. The word may then be edited using the keyboard 462.


Modifying the Slide Show to Create an Interactive Presentation


Referring again to FIG. 11, after creating and/or editing the slide show, the slide show is ready to present. According to a third aspect of the present invention, the user may choose different presentation styles to apply to the slide show to create interactive presentations. In addition, the user may change the properties of media objects so that the objects in the slide show are not displayed linearly during playback, but rather are displayed in an order that is dependent upon user defined events.


In a preferred embodiment of the present invention, three presentation styles are provided. The first presentation style is to play back the media objects in the order that they were marked by the user during slide show creation. This is the default style. After creating the slide show, all the user need do is press the display button 204 and the slide show will present itself automatically.


The second presentation style is random access, where the play back order is controlled manually by the user using the four-way navigational control 200 (FIG. 2). According to the present invention, the functions of the four-way navigational control 200 are changed during slide show presentation.



FIG. 20 is a diagram illustrating the mapping of functions to the four-way control during slide show presentation. The function mapped to the right (or forward) button 200a is to display the next media object in the slide show when the button 200a is pressed. The function mapped to the left button 200b is to display the previous media object in the slide show when the button 200b is pressed. And the function mapped to either the up or down buttons 200c and 200d is to display a list of media objects in the slide show when either the up or down buttons 200c and 200d is pressed. Once the list is displayed, the user can scroll to a desired media object and select that media object to cause it to be displayed, thus providing random access to the objects in the slide show during presentation.
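The button mapping described above can be modeled as a small navigator object. This is an illustrative sketch only; the class and method names are assumptions:

```python
class SlideShowNavigator:
    """Random-access playback control mapped onto the four-way control."""

    def __init__(self, objects):
        self.objects = objects
        self.index = 0

    def right(self):
        """Right/forward button: display the next media object."""
        self.index = min(self.index + 1, len(self.objects) - 1)
        return self.objects[self.index]

    def left(self):
        """Left button: display the previous media object."""
        self.index = max(self.index - 1, 0)
        return self.objects[self.index]

    def up_or_down(self):
        """Up/down buttons: display a scrollable list of the objects."""
        return list(self.objects)

    def jump_to(self, obj):
        """Select an object from the list for random access."""
        self.index = self.objects.index(obj)
        return obj
```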


The third presentation style is branching, which allows the user to associate branches to a particular media object that indicate which media object in the slide show will be played after the current media object. During playback, the user controls whether or not the branch should be taken.


Referring again to FIG. 11, in a preferred embodiment, the user establishes the branch associations by navigating to a desired media object in the slide show and selecting the “Properties” item 408 from the list page 402. In response, a properties page is displayed.


Referring now to FIG. 21, a diagram illustrating the properties page of the current media object 482 is shown. The properties page 480 displays the thumbnail of the current media object 482 in the filmstrip 352. The list page 402 displays a scrollable list of user-defined properties associated with the current media object 482 that control how and when the media object is played back during the slide show presentation. The user chooses which property to change by moving the target cursor to the discrete cursor locations 404 using the four-way navigational control 200.


As shown, the first property the user may change is the media object's position in the slide show. This property allows the user to manually change the media object's order of play in the slide show. As an example, the number three indicates the current media object 482 is the third object that will be played during the presentation of the slide show.


The second property the user may change is the duration the media object will be played back before the next media object is played. In a preferred embodiment, three types of duration settings are provided. The first duration type is a predefined fixed duration, such as 3 seconds, for example. The second duration type is automatic and is used when the media object includes audio. The automatic setting causes the media object to be played for the duration of the associated audio. The third type of duration is random, where the user overrides the duration setting by manually playing the next media object using the navigation control during slide show presentation, as described with reference to FIG. 20.
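The three duration types can be resolved with a simple rule: fixed durations return a preset number of seconds, automatic durations follow the length of the associated audio, and random durations defer to manual advancement. A hedged sketch, with assumed field names:

```python
def resolve_duration(media_object):
    """Return the seconds to display a media object, or None when the
    user advances manually (the 'random' duration type)."""
    duration = media_object.get("duration", "fixed")
    if duration == "random":
        return None  # user advances with the navigation control
    if duration == "auto":
        return media_object["audio_length_s"]  # play for the audio's length
    return media_object.get("fixed_s", 3)  # predefined fixed duration
```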


As stated above, another property the user may change is branching, which causes the slide show to branch to predefined media objects during presentation. In a preferred embodiment, the user specifies which media objects may be branched to by associating the media objects to the soft keys 206. When the edited media object is subsequently played in the slide show, the soft key labels 308 display the names of the specified media objects that may be branched to. When the user presses one of the soft keys 206, the slide show jumps to the specified media object and the presentation continues.


The example of FIG. 21 shows that the user has associated media object #8 with the first soft key 206a, and has associated media object #20 with the second soft key 206b. After the user has defined all the properties, the user may exit the properties screen 480 and edit the other media objects or play the newly created interactive slide show presentation.


When the slide show is presented, and the media object 482 edited in FIG. 21 is played, the user will have the options of allowing the slide show to play in the defined order or change the order of playback. The order of playback may be changed by playing adjacent media objects using the navigational control, or by using the soft keys 206 to branch to the media objects displayed in the command bar 310.


In accordance with the present invention, the properties screen 480, the text editing screen 460, the audio editing screen 450, the video editing screen 430, and the image editing screen 420 have been provided with an integrated user interface so that all the screens operate similarly, thus making the advanced editing functions easy to learn by novice users. In addition, the variety of functions provided by the editing screens enable the user to edit the text, audio, video, and image media types all within a DVC.


In summary, a method and apparatus for creating and presenting a multimedia presentation comprising heterogeneous media objects in the digital imaging device has been disclosed. Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments and those variations would be within the spirit and scope of the present invention.


For example, the functions of creating the slide show, editing the heterogeneous media objects, and changing the properties of the heterogeneous media objects, may be included as part of the operating system, or be implemented as an application or applet that runs on top, or in place, of the operating system. In addition, the present invention may be implemented in other types of digital imaging devices, such as an electronic device for archiving images that displays the stored images on a television, for instance. In addition, software written according to the present invention may be stored on a computer-readable medium, such as a removable memory, or transmitted over a network, and loaded into the digital camera for execution. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.

Claims
  • 1. A method for editing a plurality of media objects in a hand-held image capture device having a display screen, the method comprising the steps of: a) displaying a representation of each one of the media objects on the display screen, each one of the media objects having a plurality of media types associated therewith, wherein the plurality of media types includes a still image and a sequential image;b) enabling a user to randomly select a particular media object to edit; andc) in response to the user pressing a key to edit the selected media object, invoking a plurality of specialized edit screens for editing the plurality of media types associated with the selected media object, wherein in each one of the plurality of the specialized edit screens, a representation of the selected media object's content, and items to be applied to the selected media object are displayed, whereby each one of the plurality of the specialized edit screens operates in a similar manner to ease use and operation of the hand-held image capture device and to facilitate creation of multimedia presentations on the hand-held image capture device.
  • 2. The method as recited in claim 1 wherein step (c) further comprises: providing at least one of the plurality of specialized edit screens with discrete cursor locations, which the user navigates among using a navigation control.
  • 3. The method as recited in claim 2 wherein step (c) further comprises: providing at least one of the plurality of specialized edit screens with real time preview of editing functions applied to the selected media object.
  • 4. The method as recited in claim 3 wherein step (b) further includes the steps of: i) displaying a plurality of thumbnail images on the display screen, wherein each thumbnail image represents one of the media objects; andii) providing an icon area on the display screen for displaying an indication of one of the plurality of media types associated with the selected media object.
  • 5. A method of editing media objects using a hand-held image capture device comprising: a) displaying a representation of each one of the media objects on a display screen of the hand-held image capture device, each one of the media objects having a plurality of media types associated therewith, wherein the plurality of media types includes a still image and a sequential image;b) allowing selection of one of the media objects on the display screen for editing of the selected media object; andc) invoking a plurality of specialized edit screens for editing the plurality of media types associated with the selected media object when the selected media object is selected, wherein in each one of the plurality of the specialized edit screens, a representation of content of the selected media object, and items to be applied to the selected media object are displayed, whereby each one of the plurality of the specialized edit screens operates in a similar manner to ease use and operation of the hand-held image capture device and to facilitate creation of multimedia presentations on the hand-held image capture device.
  • 6. The method as recited in claim 5 wherein step (c) further includes the step of: i) displaying in each one of the plurality of specialized editing screens, a representation of the selected media object's content, items to be applied to the selected media object, and at least one soft key function.
  • 7. The method as recited in claim 6 wherein step (c) further includes the step of: ii) providing at least one of the plurality of specialized editing screens with discrete cursor locations, which the user navigates among using a navigation control.
  • 8. The method as recited in claim 7 wherein step (c) further includes the step of: iii) providing at least one of the plurality of specialized editing screens with real time preview of editing functions applied to the selected media object.
  • 9. The method as recited in claim 8 wherein step (b) further includes the steps of: i) displaying a plurality of thumbnail images on the display screen, wherein each thumbnail image represents one of the media objects; andii) providing an icon area on the display screen for displaying an indication of one of the plurality of media types associated with the selected media object.
  • 10. A handheld image capture device comprising: a) means for displaying a representation of media objects on a display screen of the image capture device, each one of the media objects having a plurality of media types associated therewith, wherein the plurality of media types includes a still image and a sequential image;b) means for allowing selection of one of the media objects on the display screen for editing of the selected media object; andc) means for invoking a plurality of specialized edit screens for editing the plurality of media types associated with the selected media object when the selected media object is selected, wherein in each one of the plurality of the specialized edit screens, a representation of content of the selected media object, and items to be applied to the selected media object are displayed, whereby each one of the plurality of the specialized edit screens operates in a similar manner to ease use and operation of the image capture device and to facilitate creation of multimedia presentations on the image capture device.
  • 11. The handheld image capture device as recited in claim 10 wherein the each one of the plurality of specialized editing screens displays a representation of the selected media object's content, editing items to be applied to the selected media object, and at least one soft key function.
  • 12. The handheld image capture device as recited in claim 11 wherein at least one of the plurality of specialized editing screens includes discrete cursor locations, which the user navigates among using a navigation control.
  • 13. The handheld image capture device as recited in claim 12 wherein at least one of the plurality of specialized editing screens displays a real time preview of selected editing items applied to the selected media object.
  • 14. The handheld image capture device as recited in claim 13 further including a processing means, wherein the processing means displays thumbnail images on a display screen representing the media objects, and provides an icon area on the display screen for displaying an indication of one of the plurality of media types associated with the selected media object.
  • 15. The handheld image capture device as recited in claim 14 wherein each one of the selected media objects to edit is stored in a slide show media object.
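The dispatch the claims describe — a media object carrying several media types, with selection invoking one specialized edit screen per associated type — can be sketched as follows. This is an illustrative sketch only, not the patented implementation; all names (`MediaObject`, `EDIT_SCREENS`, `edit_screens_for`) are hypothetical.

```python
# Hypothetical sketch of the per-media-type edit screen dispatch described in
# the claims: a selected media object may have multiple media types, and each
# type maps to its specialized edit screen (image, video, audio, or text).
from dataclasses import dataclass, field

# Both still and sequential images share the image editing screen,
# as described in the abstract.
EDIT_SCREENS = {
    "still": "ImageEditScreen",
    "sequential": "ImageEditScreen",
    "video": "VideoEditScreen",
    "audio": "AudioEditScreen",
    "text": "TextEditScreen",
}

@dataclass
class MediaObject:
    name: str
    media_types: list = field(default_factory=list)

def edit_screens_for(obj):
    """Return the specialized edit screens to invoke, in order, without duplicates."""
    screens = []
    for media_type in obj.media_types:
        screen = EDIT_SCREENS.get(media_type)
        if screen and screen not in screens:
            screens.append(screen)
    return screens

clip = MediaObject("vacation", ["still", "audio"])
print(edit_screens_for(clip))  # ['ImageEditScreen', 'AudioEditScreen']
```

Under this sketch, a slide-show media object (claim 15) would simply hold a list of such `MediaObject` instances, each editable through the same dispatch.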
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 09/973,128, filed Oct. 9, 2001, now U.S. Pat. No. 7,337,403, which is a continuation of U.S. patent application Ser. No. 09/223,960, filed Dec. 31, 1998, now U.S. Pat. No. 6,317,141, issued Nov. 13, 2001. The disclosures of these applications are hereby incorporated by reference in their entireties. The present invention is related to the following U.S. patents: U.S. Pat. No. 5,903,309, entitled “Method and System For Displaying Images And Associated Multimedia Types In The Interface Of A Digital Camera,” issued May 11, 1999; U.S. Pat. No. 6,249,316, entitled “Method and System For Creating A Temporary Group Of Images On A Digital Camera,” issued Jun. 19, 2001; U.S. Pat. No. 6,683,649, entitled “Method And Apparatus For Creating A Multimedia Presentation From Heterogeneous Media Objects In A Digital Imaging Device,” issued Jan. 27, 2004; and U.S. Pat. No. 6,738,075, entitled “Method And Apparatus For Creating An Interactive Slide Show In A Digital Imaging Device,” issued May 18, 2004.

US Referenced Citations (849)
Number Name Date Kind
610861 Goodwin Sep 1898 A
725034 Brownell Apr 1903 A
2289555 Simons Jul 1942 A
2298382 Hutchison, Jr. et al. Oct 1942 A
3062102 Martin Nov 1962 A
RE25635 Nerwin et al. Sep 1964 E
3675549 Adair Jul 1972 A
3814227 Hurd, III et al. Jun 1974 A
3971065 Bayer Jul 1976 A
3991625 Preston Nov 1976 A
4011571 Okuzawa Mar 1977 A
4017680 Anderson et al. Apr 1977 A
4057830 Adcock Nov 1977 A
4081752 Sumi Mar 1978 A
4125111 Hudspeth et al. Nov 1978 A
4131919 Lloyd et al. Dec 1978 A
4158208 Dischert Jun 1979 A
4168488 Evans Sep 1979 A
4172327 Kuehn et al. Oct 1979 A
4183645 Ohmura Jan 1980 A
4195317 Stratton Mar 1980 A
4234890 Astle Nov 1980 A
4253756 Kurei Mar 1981 A
4267555 Boyd et al. May 1981 A
4306793 Date et al. Dec 1981 A
4325080 Satoh Apr 1982 A
4329029 Haskell May 1982 A
4337479 Tomimoto et al. Jun 1982 A
4347618 Kavouras et al. Aug 1982 A
4359222 Smith et al. Nov 1982 A
4364650 Terashita et al. Dec 1982 A
4403303 Howes et al. Sep 1983 A
4416282 Saulson Nov 1983 A
4423934 Lambeth et al. Jan 1984 A
4456931 Toyoda et al. Jun 1984 A
4466230 Osselaere Aug 1984 A
4470067 Mino Sep 1984 A
4471382 Toyoda et al. Sep 1984 A
4477164 Nakai et al. Oct 1984 A
4519692 Michalik May 1985 A
4531161 Murakoshi Jul 1985 A
4540276 Ost Sep 1985 A
4542377 Hagen et al. Sep 1985 A
4554638 Iida Nov 1985 A
4570158 Bleich et al. Feb 1986 A
4574319 Konishi Mar 1986 A
4601055 Kent Jul 1986 A
4603966 Brownstein Aug 1986 A
4623930 Oshima et al. Nov 1986 A
4641198 Ohta et al. Feb 1987 A
4674107 Urban et al. Jun 1987 A
4691253 Silver Sep 1987 A
4723169 Kaji Feb 1988 A
4736224 Watanabe Apr 1988 A
4739409 Baumeister Apr 1988 A
4772941 Noble Sep 1988 A
4774600 Baumeister Sep 1988 A
4797836 Witek et al. Jan 1989 A
4801793 Vaynshteyn Jan 1989 A
4806920 Sawada Feb 1989 A
4816855 Kitaura et al. Mar 1989 A
4823283 Diehm Apr 1989 A
4827347 Bell May 1989 A
4851897 Inuma Jul 1989 A
4853733 Watanabe et al. Aug 1989 A
4855831 Miyamoto Aug 1989 A
4866292 Takemoto et al. Sep 1989 A
4882683 Rupp et al. Nov 1989 A
4887161 Watanabe et al. Dec 1989 A
4888812 Dinan et al. Dec 1989 A
4893198 Little Jan 1990 A
4907089 Yamaguchi Mar 1990 A
4916435 Fuller Apr 1990 A
4931960 Morikawa Jun 1990 A
4935809 Hayashi et al. Jun 1990 A
4937676 Finelli et al. Jun 1990 A
4937685 Barker et al. Jun 1990 A
4942417 Miyazawa Jul 1990 A
4952920 Hayashi Aug 1990 A
4965675 Hori Oct 1990 A
4969647 Mical et al. Nov 1990 A
4972495 Blike et al. Nov 1990 A
4974151 Advani Nov 1990 A
4982291 Kurahashi Jan 1991 A
4992887 Aragaki Feb 1991 A
4996714 Desjardins et al. Feb 1991 A
5001697 Torres Mar 1991 A
5007027 Shimoi Apr 1991 A
5014193 Garner et al. May 1991 A
5016107 Sasson May 1991 A
5018017 Sasaki et al. May 1991 A
5020012 Stockberger May 1991 A
5021989 Fujisawa et al. Jun 1991 A
5027150 Inoue Jun 1991 A
5027227 Kita Jun 1991 A
5030944 Masimo et al. Jul 1991 A
5031329 Smallidge Jul 1991 A
5032918 Ota et al. Jul 1991 A
5032926 Imai et al. Jul 1991 A
5034804 Sasaki et al. Jul 1991 A
5038320 Heath et al. Aug 1991 A
5040068 Parulski Aug 1991 A
5040070 Higashitsutsumi Aug 1991 A
5043801 Watanabe Aug 1991 A
5043816 Nakano Aug 1991 A
5049916 O'Such et al. Sep 1991 A
5050098 Brown et al. Sep 1991 A
5057924 Yamada Oct 1991 A
5063600 Norwood Nov 1991 A
5065246 Takemoto et al. Nov 1991 A
5067029 Takahashi Nov 1991 A
5070406 Kinoshita Dec 1991 A
5073823 Yamada et al. Dec 1991 A
5077582 Kravette et al. Dec 1991 A
5083383 Heger Jan 1992 A
5093716 Kondo et al. Mar 1992 A
5099262 Tanaka et al. Mar 1992 A
5101225 Wash Mar 1992 A
5101364 Davenport Mar 1992 A
5106107 Justus Apr 1992 A
5122827 Saegusa et al. Jun 1992 A
5123088 Kasahara et al. Jun 1992 A
5124537 Chandler et al. Jun 1992 A
5124814 Takahashi et al. Jun 1992 A
5130812 Yamaoka Jul 1992 A
5133076 Hawkins et al. Jul 1992 A
5134390 Kishimoto et al. Jul 1992 A
5134431 Ishimura et al. Jul 1992 A
5134434 Inoue et al. Jul 1992 A
5138459 Roberts Aug 1992 A
5138460 Egawa Aug 1992 A
5140358 Tokunaga Aug 1992 A
5142319 Wakabayashi Aug 1992 A
5142680 Ottman et al. Aug 1992 A
5144358 Tsuru et al. Sep 1992 A
5144445 Higashitsutsumi Sep 1992 A
5146259 Kobayashi et al. Sep 1992 A
5146353 Isoguchi et al. Sep 1992 A
5153729 Saito Oct 1992 A
5153730 Nagasaki Oct 1992 A
5159364 Yanagisawa et al. Oct 1992 A
5161012 Choi Nov 1992 A
5161025 Nakao Nov 1992 A
5161026 Mabuchi et al. Nov 1992 A
5161535 Short Nov 1992 A
5164751 Weyer Nov 1992 A
5164831 Kuchta Nov 1992 A
5172103 Kita Dec 1992 A
5179653 Fuller Jan 1993 A
5184169 Nishitani Feb 1993 A
5185667 Zimmermann Feb 1993 A
5187517 Miyasaka Feb 1993 A
5187776 Yanker Feb 1993 A
5189404 Masimo et al. Feb 1993 A
5189408 Teicher Feb 1993 A
5189466 Yasukawa Feb 1993 A
5189490 Shetty Feb 1993 A
5193538 Ekwall Mar 1993 A
5194944 Uchiyama Mar 1993 A
5198851 Ogawa Mar 1993 A
5199101 Cusick et al. Mar 1993 A
5200818 Neta et al. Apr 1993 A
5202767 Dozier Apr 1993 A
5202844 Kamio et al. Apr 1993 A
5204916 Hamilton et al. Apr 1993 A
5218459 Parulski et al. Jun 1993 A
5218647 Blonstein et al. Jun 1993 A
5220420 Hoarty et al. Jun 1993 A
5220614 Crain Jun 1993 A
5223935 Tsuji Jun 1993 A
5224207 Filion et al. Jun 1993 A
5227835 Anagnostopoulos Jul 1993 A
5227889 Yoneyama et al. Jul 1993 A
5229856 Koshiishi Jul 1993 A
5231511 Kodama et al. Jul 1993 A
5231651 Ozaki Jul 1993 A
5237648 Mills Aug 1993 A
5237650 Priem et al. Aug 1993 A
5239419 Kim Aug 1993 A
5241334 Kobayashi et al. Aug 1993 A
5241659 Parulski et al. Aug 1993 A
5247321 Kazami Sep 1993 A
5247327 Suzuka Sep 1993 A
5247682 Kondou et al. Sep 1993 A
5247683 Holmes et al. Sep 1993 A
5253071 MacKay Oct 1993 A
5258795 Lucas Nov 1993 A
5260795 Sakai Nov 1993 A
5262863 Okada Nov 1993 A
5262867 Kojima Nov 1993 A
5262868 Kaneko et al. Nov 1993 A
5262869 Hong Nov 1993 A
5265238 Canova et al. Nov 1993 A
5270821 Samuels Dec 1993 A
5270831 Parulski et al. Dec 1993 A
5274458 Kondo et al. Dec 1993 A
5276563 Ogawa Jan 1994 A
5278604 Nakamura Jan 1994 A
5282187 Lee Jan 1994 A
5283560 Bartlett Feb 1994 A
5283792 Davies Feb 1994 A
5287192 Iizuka Feb 1994 A
5297051 Arakawa et al. Mar 1994 A
5298936 Akitake et al. Mar 1994 A
5301026 Lee Apr 1994 A
5302997 Cocca Apr 1994 A
5307318 Nemoto Apr 1994 A
5309243 Tsai May 1994 A
5311240 Wheeler May 1994 A
5329289 Sakamoto et al. Jul 1994 A
5331366 Tokunaga Jul 1994 A
5335072 Tanaka et al. Aug 1994 A
5339432 Crick Aug 1994 A
5341466 Perlin Aug 1994 A
5343246 Arai et al. Aug 1994 A
5343267 Kazumi Aug 1994 A
5343386 Barber Aug 1994 A
5343509 Dounies Aug 1994 A
5345552 Brown Sep 1994 A
5359427 Sato Oct 1994 A
5359728 Rusnack Oct 1994 A
5367318 Beaudin et al. Nov 1994 A
5373153 Cumberledge Dec 1994 A
5375160 Guidon et al. Dec 1994 A
5386111 Zimmerman Jan 1995 A
5386177 Uhm Jan 1995 A
5386552 Garney Jan 1995 A
5390026 Lim Feb 1995 A
5390314 Swanson Feb 1995 A
5392462 Komaki Feb 1995 A
5396343 Hanselman Mar 1995 A
5402170 Parulski et al. Mar 1995 A
5402171 Tagami et al. Mar 1995 A
5404316 Klingler et al. Apr 1995 A
5404505 Levinson Apr 1995 A
5408265 Sasaki Apr 1995 A
5414811 Parulski et al. May 1995 A
5416556 Suzuki et al. May 1995 A
5420635 Konishi et al. May 1995 A
5425137 Mohan et al. Jun 1995 A
5428733 Carr Jun 1995 A
5432720 Lucente et al. Jul 1995 A
5432871 Novik Jul 1995 A
5432900 Rhodes et al. Jul 1995 A
5434618 Hayashi et al. Jul 1995 A
5434958 Surma et al. Jul 1995 A
5434964 Moss Jul 1995 A
5434969 Heilveil et al. Jul 1995 A
5436657 Fukuoka Jul 1995 A
5436659 Vincent Jul 1995 A
5440401 Parulski et al. Aug 1995 A
5442465 Compton Aug 1995 A
5444482 Misawa et al. Aug 1995 A
5448372 Axman et al. Sep 1995 A
5452145 Wakui et al. Sep 1995 A
5459830 Ohba et al. Oct 1995 A
5461429 Konishi et al. Oct 1995 A
5463728 Blahut Oct 1995 A
5463729 Kitaguchi Oct 1995 A
5465133 Aoki et al. Nov 1995 A
5467152 Wilson et al. Nov 1995 A
5467288 Fasciano et al. Nov 1995 A
5473370 Moronaga et al. Dec 1995 A
5473371 Choi Dec 1995 A
5475428 Hintz et al. Dec 1995 A
5475441 Parulski et al. Dec 1995 A
5475812 Corona et al. Dec 1995 A
5477264 Sarbadhikari et al. Dec 1995 A
5479206 Ueno et al. Dec 1995 A
5481330 Yamasaki Jan 1996 A
5481667 Bieniek et al. Jan 1996 A
5485200 Shimizu Jan 1996 A
5486853 Baxter Jan 1996 A
5488414 Hirasawa Jan 1996 A
5489945 Kannegundla Feb 1996 A
5489955 Satoh Feb 1996 A
5493332 Dalton et al. Feb 1996 A
5493335 Parulski et al. Feb 1996 A
5495342 Harigaya Feb 1996 A
5495559 Makino Feb 1996 A
5496106 Anderson Mar 1996 A
5497193 Mitsuhashi Mar 1996 A
5497490 Harada et al. Mar 1996 A
5500936 Allen et al. Mar 1996 A
5502486 Ueda Mar 1996 A
5504550 Takagi et al. Apr 1996 A
5506617 Parulski et al. Apr 1996 A
5510830 Ohia et al. Apr 1996 A
5512941 Takahashi Apr 1996 A
5513306 Mills Apr 1996 A
5513342 Leong et al. Apr 1996 A
5515101 Yoshida May 1996 A
5517606 Matheny et al. May 1996 A
5519815 Klassen May 1996 A
5521639 Tomura May 1996 A
5521663 Norris May 1996 A
5521717 Maeda May 1996 A
5521841 Arman et al. May 1996 A
5523786 Parulski Jun 1996 A
5523857 Fukushima Jun 1996 A
5525957 Tanaka Jun 1996 A
5526812 Dumoulin et al. Jun 1996 A
5528293 Watanabe Jun 1996 A
5528315 Sugiyama Jun 1996 A
5530235 Stefik et al. Jun 1996 A
5530517 Patton et al. Jun 1996 A
5532740 Wakui Jul 1996 A
5534975 Stefik et al. Jul 1996 A
5537151 Orr Jul 1996 A
5537530 Edgar Jul 1996 A
5539528 Tawa Jul 1996 A
5539535 Aizawa et al. Jul 1996 A
5539658 Mccullough Jul 1996 A
5541656 Kare et al. Jul 1996 A
5543925 Timmermans Aug 1996 A
5548371 Kawahara Aug 1996 A
5548409 Ohta et al. Aug 1996 A
5550646 Hassan et al. Aug 1996 A
5550938 Hayakawa et al. Aug 1996 A
5552806 Lenchik Sep 1996 A
5553277 Hirano et al. Sep 1996 A
5555193 Tsinberg et al. Sep 1996 A
5557329 Lim Sep 1996 A
5559554 Uekane et al. Sep 1996 A
5560022 Dunstan et al. Sep 1996 A
5561493 Takahashi Oct 1996 A
5563655 Lathrop Oct 1996 A
5565957 Goto Oct 1996 A
5566098 Lucente et al. Oct 1996 A
5568167 Galbi Oct 1996 A
5568192 Hannah Oct 1996 A
5572233 Kakegawa Nov 1996 A
5574933 Horst Nov 1996 A
5576757 Roberts et al. Nov 1996 A
5576759 Kawamura et al. Nov 1996 A
5577190 Peters Nov 1996 A
5577220 Combs et al. Nov 1996 A
5578757 Roth Nov 1996 A
5579029 Arai et al. Nov 1996 A
5579048 Hirasawa Nov 1996 A
5579450 Hanyu Nov 1996 A
5581311 Kuroiwa Dec 1996 A
5585845 Kawamura Dec 1996 A
5587740 Brennan Dec 1996 A
5589902 Gruel et al. Dec 1996 A
5590306 Watanabe et al. Dec 1996 A
5592301 Shimada Jan 1997 A
5594524 Sasagaki Jan 1997 A
5597193 Conner Jan 1997 A
5598181 Kermisch Jan 1997 A
5600371 Arai et al. Feb 1997 A
5602566 Motosyuku et al. Feb 1997 A
5606365 Maurinus Feb 1997 A
5608490 Ogawa Mar 1997 A
5608491 Sasagaki Mar 1997 A
5610653 Abecassis Mar 1997 A
5610654 Parulski Mar 1997 A
5614981 Bryant Mar 1997 A
5619738 Petruchik Apr 1997 A
5621459 Ueda Apr 1997 A
5621906 O'Neill Apr 1997 A
5625412 Aciu et al. Apr 1997 A
5627623 Sasagaki et al. May 1997 A
5630017 Gasper et al. May 1997 A
5630185 Kawamura May 1997 A
5631701 Miyake May 1997 A
5631871 Park et al. May 1997 A
5633573 Van Phuoc et al. May 1997 A
5633678 Parulski et al. May 1997 A
5633976 Ogino May 1997 A
5634000 Wicht May 1997 A
5634144 Mauro May 1997 A
5634154 Sasagaki May 1997 A
5635983 Ohmori Jun 1997 A
5635984 Lee Jun 1997 A
5637871 Piety et al. Jun 1997 A
5638123 Yamaguchi Jun 1997 A
5638498 Tyler et al. Jun 1997 A
5638501 Gough et al. Jun 1997 A
5640193 Wellner Jun 1997 A
5640202 Kondo Jun 1997 A
5640204 Tsutsui Jun 1997 A
5640627 Nakano Jun 1997 A
5640635 Fullam Jun 1997 A
5644653 Sunakawa et al. Jul 1997 A
5644694 Appleton Jul 1997 A
5648816 Wakui Jul 1997 A
5649032 Burt et al. Jul 1997 A
5649186 Ferguson Jul 1997 A
5649245 Inoue Jul 1997 A
5651107 Frank et al. Jul 1997 A
5656804 Barkan et al. Aug 1997 A
5656957 Marlow Aug 1997 A
5659547 Scarr et al. Aug 1997 A
5659729 Nielsen Aug 1997 A
5659805 Furlani et al. Aug 1997 A
5661519 Franetzki Aug 1997 A
5661632 Register Aug 1997 A
5664087 Tani et al. Sep 1997 A
5666580 Ito et al. Sep 1997 A
5668639 Martin Sep 1997 A
5671378 Acker et al. Sep 1997 A
5671440 Curry Sep 1997 A
5672840 Sage et al. Sep 1997 A
5673304 Connor et al. Sep 1997 A
5674003 Andersen Oct 1997 A
5675139 Fama Oct 1997 A
5675358 Bullock et al. Oct 1997 A
5675752 Scott et al. Oct 1997 A
5680533 Yamato Oct 1997 A
5680534 Yamato et al. Oct 1997 A
5682197 Moghadam et al. Oct 1997 A
5682207 Takeda et al. Oct 1997 A
5682326 Klingler et al. Oct 1997 A
5682441 Ligtenberg et al. Oct 1997 A
5684511 Westerink et al. Nov 1997 A
5684542 Tsukagoshi Nov 1997 A
5687408 Park Nov 1997 A
5697004 Saegusa Dec 1997 A
5699109 Nishimura et al. Dec 1997 A
5701900 Shehada Dec 1997 A
5703644 Mori et al. Dec 1997 A
5706049 Moghadam et al. Jan 1998 A
5706097 Schelling et al. Jan 1998 A
5706457 Dwyer et al. Jan 1998 A
5708561 Huilgol et al. Jan 1998 A
5708810 Kern et al. Jan 1998 A
5710572 Nihei Jan 1998 A
5711330 Nelson Jan 1998 A
5714973 Takahashi et al. Feb 1998 A
5715524 Jambhekar et al. Feb 1998 A
5719967 Sekine Feb 1998 A
5719978 Kakii et al. Feb 1998 A
5719987 Kawamura et al. Feb 1998 A
5721908 Lagarde Feb 1998 A
5721909 Oulid-Aissa et al. Feb 1998 A
5724070 Denninghoff et al. Mar 1998 A
5724475 Kirsten Mar 1998 A
5724579 Suzuki Mar 1998 A
5727112 Kellar et al. Mar 1998 A
5727159 Kikinis Mar 1998 A
5729289 Etoh Mar 1998 A
5734425 Takizawa et al. Mar 1998 A
5734427 Hayashi Mar 1998 A
5734436 Abe Mar 1998 A
5734875 Cheng Mar 1998 A
5734915 Roewer Mar 1998 A
5737032 Stenzel Apr 1998 A
5737476 Kim Apr 1998 A
5737491 Allen et al. Apr 1998 A
5740267 Echerer Apr 1998 A
5740436 Davis et al. Apr 1998 A
5740801 Branson Apr 1998 A
5742331 Uomori et al. Apr 1998 A
5742339 Wakui Apr 1998 A
5742435 Nagashima et al. Apr 1998 A
5742436 Furter Apr 1998 A
5742475 Riddiford Apr 1998 A
5742504 Meyer et al. Apr 1998 A
5742659 Atac Apr 1998 A
5742698 Minami et al. Apr 1998 A
5745097 Cappels Apr 1998 A
5745175 Anderson Apr 1998 A
5745808 Tintera Apr 1998 A
5748326 Thompson-Bell et al. May 1998 A
5748831 Kubo May 1998 A
5751350 Tanaka May 1998 A
5752089 Miyazawa et al. May 1998 A
5752244 Rose May 1998 A
5754227 Fukuoka May 1998 A
5754873 Nolan May 1998 A
5757354 Kawamura May 1998 A
5757418 Inagaki May 1998 A
5757427 Miyaguchi May 1998 A
5757468 Patton et al. May 1998 A
5758180 Duffy et al. May 1998 A
5760767 Shore et al. Jun 1998 A
5761655 Hoffman Jun 1998 A
5761686 Bloomberg Jun 1998 A
5764276 Martin et al. Jun 1998 A
5764278 Nagao Jun 1998 A
5764291 Fullam Jun 1998 A
5767897 Howell Jun 1998 A
5767904 Miyake Jun 1998 A
5769713 Katayama Jun 1998 A
5771034 Gibson Jun 1998 A
5773810 Hussey Jun 1998 A
5774131 Kim Jun 1998 A
5774233 Sakamoto Jun 1998 A
5777876 Beauchesne Jul 1998 A
5781175 Hara Jul 1998 A
5781650 Lobo Jul 1998 A
5781798 Beatty et al. Jul 1998 A
5784177 Sanchez et al. Jul 1998 A
5784525 Bell Jul 1998 A
5784629 Anderson Jul 1998 A
5786851 Kondo Jul 1998 A
D396853 Cooper et al. Aug 1998 S
5790094 Tanigawa et al. Aug 1998 A
5790193 Ohmori Aug 1998 A
5790418 Roberts Aug 1998 A
5790800 Gauvin et al. Aug 1998 A
5790878 Anderson Aug 1998 A
5796428 Matsumoto et al. Aug 1998 A
5796875 Read Aug 1998 A
5797051 Mcintyre Aug 1998 A
5798750 Ozaki Aug 1998 A
5801685 Miller et al. Sep 1998 A
5801770 Paff et al. Sep 1998 A
5801773 Ikeda Sep 1998 A
5803565 McIntyre et al. Sep 1998 A
5805153 Nielsen Sep 1998 A
5805163 Bagnas Sep 1998 A
5805829 Cohen et al. Sep 1998 A
5806005 Hull Sep 1998 A
5806072 Kuba et al. Sep 1998 A
5809345 Numako Sep 1998 A
5815160 Kikuchi Sep 1998 A
5815201 Hashimoto Sep 1998 A
5815205 Hashimoto et al. Sep 1998 A
5818925 Anders et al. Oct 1998 A
5818977 Tansley Oct 1998 A
5819103 Endoh et al. Oct 1998 A
5819107 Lichtman et al. Oct 1998 A
5821997 Kawamura Oct 1998 A
5822492 Wakui et al. Oct 1998 A
5822581 Christeson Oct 1998 A
5825675 Want et al. Oct 1998 A
5828406 Parulski Oct 1998 A
5828793 Mann Oct 1998 A
5831590 Ikedo Nov 1998 A
5831872 Pan Nov 1998 A
5835761 Ishii et al. Nov 1998 A
5835772 Thurlo Nov 1998 A
5838325 Deen et al. Nov 1998 A
5841422 Shyu Nov 1998 A
5841471 Endsley et al. Nov 1998 A
5845166 Fellegara Dec 1998 A
5847698 Reavey Dec 1998 A
5847706 Kingsley Dec 1998 A
5848193 Garcia Dec 1998 A
5848420 Xu Dec 1998 A
5850483 Takabatake et al. Dec 1998 A
5852502 Beckett Dec 1998 A
5854641 Howard et al. Dec 1998 A
5861918 Anderson Jan 1999 A
5862218 Steinberg Jan 1999 A
5862297 Timmermans Jan 1999 A
5867214 Anderson Feb 1999 A
5867686 Conner et al. Feb 1999 A
5870143 Suzuki Feb 1999 A
5870464 Brewster et al. Feb 1999 A
5870756 Nakata Feb 1999 A
5873007 Ferrada suarez Feb 1999 A
5874959 Rowe Feb 1999 A
5876351 Rohde Mar 1999 A
5877214 Kim Mar 1999 A
5877746 Parks et al. Mar 1999 A
5881205 Andrew Mar 1999 A
5883610 Jeon Mar 1999 A
5890014 Long Mar 1999 A
5892511 Gelsinger et al. Apr 1999 A
5892847 Johnson Apr 1999 A
5896131 Alexander Apr 1999 A
5896166 D'Alfonso et al. Apr 1999 A
5896203 Shibata Apr 1999 A
5898434 Small et al. Apr 1999 A
5898779 Squilla et al. Apr 1999 A
5898833 Kidder Apr 1999 A
5899851 Koninckx May 1999 A
5900909 Parulski et al. May 1999 A
5901303 Chew May 1999 A
5903309 Anderson May 1999 A
5903700 Fukushima May 1999 A
5903786 Goto May 1999 A
5907315 Vlahos et al. May 1999 A
5910805 Hickey Jun 1999 A
5917488 Anderson et al. Jun 1999 A
5920688 Cooper et al. Jul 1999 A
5920726 Anderson Jul 1999 A
5926208 Noonen et al. Jul 1999 A
5929904 Uchida Jul 1999 A
5933137 Anderson Aug 1999 A
5935259 Anderson Aug 1999 A
5936619 Nagasaki et al. Aug 1999 A
5937106 Murayama Aug 1999 A
5937213 Wakabayashi et al. Aug 1999 A
5938764 Klein Aug 1999 A
5938766 Anderson Aug 1999 A
5940080 Ruehle Aug 1999 A
5940121 Mcintyre Aug 1999 A
5943050 Bullock et al. Aug 1999 A
5943093 Anderson et al. Aug 1999 A
5943332 Liu et al. Aug 1999 A
5948091 Kerigan et al. Sep 1999 A
5949408 Kang et al. Sep 1999 A
5949432 Gough et al. Sep 1999 A
5949474 Gerszberg et al. Sep 1999 A
5949496 Kim Sep 1999 A
5949950 Kubo Sep 1999 A
5956049 Cheng Sep 1999 A
5956084 Moronaga et al. Sep 1999 A
5963255 Anderson et al. Oct 1999 A
5963670 Lipson et al. Oct 1999 A
5966116 Wakeland Oct 1999 A
5966122 Itoh Oct 1999 A
5969718 Mills Oct 1999 A
5969761 Takahashi et al. Oct 1999 A
5973664 Badger Oct 1999 A
5973691 Servan-Schreiber Oct 1999 A
5973694 Steele et al. Oct 1999 A
5973734 Anderson Oct 1999 A
5974386 Ejima et al. Oct 1999 A
5977975 Mugura et al. Nov 1999 A
5977976 Maeda Nov 1999 A
5977985 Ishii Nov 1999 A
5978016 Lourette et al. Nov 1999 A
5978020 Watanabe et al. Nov 1999 A
5978607 Teremy Nov 1999 A
5982350 Hekmatpour et al. Nov 1999 A
5982429 Kamamoto et al. Nov 1999 A
5983297 Noble et al. Nov 1999 A
5986634 Alioshin et al. Nov 1999 A
5986701 Anderson Nov 1999 A
5987223 Narukawa et al. Nov 1999 A
5991465 Anderson Nov 1999 A
5991515 Fall et al. Nov 1999 A
5993137 Harr Nov 1999 A
5999173 Ubillos Dec 1999 A
5999191 Frank et al. Dec 1999 A
5999207 Rodriguez et al. Dec 1999 A
5999213 Tsushima et al. Dec 1999 A
5999740 Rowley Dec 1999 A
5999989 Patel Dec 1999 A
6003093 Kester Dec 1999 A
6005613 Endsley et al. Dec 1999 A
6005618 Fukui Dec 1999 A
6006039 Steinberg et al. Dec 1999 A
6009336 Harris et al. Dec 1999 A
6011585 Anderson Jan 2000 A
6011926 Cockell Jan 2000 A
6012088 Li et al. Jan 2000 A
6015093 Barrett Jan 2000 A
6020920 Anderson Feb 2000 A
6020982 Yamauchi et al. Feb 2000 A
6022315 Iliff Feb 2000 A
6023241 Clapper Feb 2000 A
6023697 Bates et al. Feb 2000 A
6025827 Bullock et al. Feb 2000 A
6028603 Wang et al. Feb 2000 A
6028611 Anderson et al. Feb 2000 A
6031964 Anderson Feb 2000 A
6035323 Narayen et al. Mar 2000 A
6035359 Enoki Mar 2000 A
6037972 Horiuchi et al. Mar 2000 A
6038545 Mandeberg et al. Mar 2000 A
6052555 Ferguson Apr 2000 A
6052692 Anderson Apr 2000 A
6058268 Maeno May 2000 A
6058428 Wang et al. May 2000 A
6072479 Ogawa Jun 2000 A
6072480 Gorbet et al. Jun 2000 A
6072489 Gough et al. Jun 2000 A
6075905 Herman et al. Jun 2000 A
6078005 Kurakake Jun 2000 A
6078756 Squilla et al. Jun 2000 A
6082827 McFall Jul 2000 A
6084990 Suzuki et al. Jul 2000 A
6091377 Kawai Jul 2000 A
6091846 Lin et al. Jul 2000 A
6091956 Hollenberg Jul 2000 A
6094221 Andersion Jul 2000 A
6097389 Morris et al. Aug 2000 A
6097423 Mattsson-Boze et al. Aug 2000 A
6097430 Komiya et al. Aug 2000 A
6097431 Anderson Aug 2000 A
6097855 Levien Aug 2000 A
6104430 Fukuoka Aug 2000 A
6111604 Hashimoto Aug 2000 A
6115025 Buxton et al. Sep 2000 A
6118480 Anderson et al. Sep 2000 A
6122003 Anderson Sep 2000 A
6122005 Sasaki Sep 2000 A
6122409 Boggs et al. Sep 2000 A
6128013 Prabhu Oct 2000 A
6128413 Benamara Oct 2000 A
6131125 Rostoker et al. Oct 2000 A
6134606 Anderson et al. Oct 2000 A
6137468 Martinez Oct 2000 A
6137534 Anderson Oct 2000 A
6141044 Anderson Oct 2000 A
6141052 Fukumitsu et al. Oct 2000 A
6144362 Kawai Nov 2000 A
6147703 Miller Nov 2000 A
6147709 Martin et al. Nov 2000 A
6148149 Kagle Nov 2000 A
6151450 Numako Nov 2000 A
6154210 Anderson Nov 2000 A
6154576 Anderson et al. Nov 2000 A
6157394 Anderson Dec 2000 A
6161131 Garfinkle Dec 2000 A
6163722 Magin Dec 2000 A
6163816 Anderson et al. Dec 2000 A
6167469 Safai Dec 2000 A
6169575 Anderson Jan 2001 B1
6169725 Gibbs et al. Jan 2001 B1
6175663 Huang Jan 2001 B1
6177956 Anderson et al. Jan 2001 B1
6177957 Anderson Jan 2001 B1
6177958 Anderson Jan 2001 B1
6188431 Oie Feb 2001 B1
6188432 Ejima Feb 2001 B1
6188782 Le beux Feb 2001 B1
6204877 Kiyokawa Mar 2001 B1
6205485 Kikinis Mar 2001 B1
6208429 Anderson Mar 2001 B1
6209048 Wolff Mar 2001 B1
6211870 Foster Apr 2001 B1
6212632 Surine Apr 2001 B1
6215523 Anderson Apr 2001 B1
6222538 Anderson Apr 2001 B1
6222584 Pan Apr 2001 B1
6223190 Aihara et al. Apr 2001 B1
6226449 Inoue et al. May 2001 B1
6229566 Matsumoto et al. May 2001 B1
6230307 Davis et al. May 2001 B1
6232932 Thorner May 2001 B1
6233015 Miller May 2001 B1
6233016 Anderson May 2001 B1
6237010 Hui May 2001 B1
6239794 Yuen et al. May 2001 B1
6239837 Yamada et al. May 2001 B1
6246430 Peters Jun 2001 B1
6249316 Anderson Jun 2001 B1
6256063 Saito et al. Jul 2001 B1
6260102 Robinson Jul 2001 B1
6262769 Anderson Jul 2001 B1
6263421 Anderson Jul 2001 B1
6263453 Anderson Jul 2001 B1
6275260 Anderson Aug 2001 B1
6275622 Krtolica Aug 2001 B1
6278447 Anderson Aug 2001 B1
6285398 Shinsky et al. Sep 2001 B1
6292215 Vincent Sep 2001 B1
6292218 Parulski et al. Sep 2001 B1
RE37431 Lanier et al. Oct 2001 E
6300950 Clark et al. Oct 2001 B1
6304851 Kmack et al. Oct 2001 B1
6307544 Harding Oct 2001 B1
6310647 Parulski et al. Oct 2001 B1
6310648 Miller et al. Oct 2001 B1
6317141 Pavley Nov 2001 B1
6334025 Yamagami Dec 2001 B1
6353848 Morris Mar 2002 B1
6356281 Isenman Mar 2002 B1
6356357 Anderson Mar 2002 B1
6362850 Alsing Mar 2002 B1
6370282 Pavley et al. Apr 2002 B1
6377302 Ozaki Apr 2002 B1
6380972 Suga et al. Apr 2002 B1
6400375 Okudaira Jun 2002 B1
6400471 Kuo et al. Jun 2002 B1
6426771 Kosugi Jul 2002 B1
6429896 Aruga Aug 2002 B1
6437829 Webb Aug 2002 B1
6441828 Oba et al. Aug 2002 B1
6441854 Fellegara et al. Aug 2002 B2
6441927 Dow et al. Aug 2002 B1
6445412 Shiohara Sep 2002 B1
6473123 Anderson Oct 2002 B1
6483602 Haneda Nov 2002 B1
6486914 Anderson Nov 2002 B1
6493028 Anderson Dec 2002 B1
6504575 Ramirez et al. Jan 2003 B1
6507362 Akerib Jan 2003 B1
6507363 Anderson Jan 2003 B1
6512548 Anderson Jan 2003 B1
6515704 Sato Feb 2003 B1
6532039 Anderson Mar 2003 B2
6536357 Hiestand Mar 2003 B1
6538698 Anderson Mar 2003 B1
6546430 Gray, III et al. Apr 2003 B2
6563535 Anderson May 2003 B1
6563542 Hatakenaka et al. May 2003 B1
6563961 Murayama May 2003 B1
6567101 Thomas May 2003 B1
6567122 Anderson et al. May 2003 B1
6571271 Savitzky et al. May 2003 B1
6587119 Anderson et al. Jul 2003 B1
6597384 Harrison Jul 2003 B1
6597817 Silverbrook Jul 2003 B1
6608650 Torres Aug 2003 B1
6624824 Tognazzini et al. Sep 2003 B1
6654050 Karube et al. Nov 2003 B2
6657667 Anderson Dec 2003 B1
6680749 Anderson et al. Jan 2004 B1
6682207 Weber et al. Jan 2004 B2
6683649 Anderson Jan 2004 B1
6700612 Anderson Mar 2004 B1
6738075 Torres et al. May 2004 B1
6738091 Eouzan May 2004 B1
6747692 Patel et al. Jun 2004 B2
6765581 Cheng Jul 2004 B2
6765612 Anderson et al. Jul 2004 B1
6779153 Kagle Aug 2004 B1
6785019 Anderson Aug 2004 B2
6803945 Needham Oct 2004 B1
6803950 Miyamoto et al. Oct 2004 B2
6806906 Soga et al. Oct 2004 B1
6809737 Lee et al. Oct 2004 B1
6833867 Anderson Dec 2004 B1
6847388 Anderson Jan 2005 B2
6873357 Fuchimukai Mar 2005 B2
6897891 Itsukaichi May 2005 B2
6903762 Prabhu et al. Jun 2005 B2
6906751 Norita et al. Jun 2005 B1
6937356 Ito et al. Aug 2005 B1
RE38896 Anderson Nov 2005 E
6965400 Haba et al. Nov 2005 B1
7039873 Morris May 2006 B2
7050143 Silverbrook May 2006 B1
7079177 Okazaki et al. Jul 2006 B2
RE39213 Anderson Aug 2006 E
7092024 Kawamura et al. Aug 2006 B2
7106376 Anderson Sep 2006 B1
7107516 Anderson Sep 2006 B1
7113208 Saga Sep 2006 B1
7215371 Fellegara et al. May 2007 B2
7259783 Anderson Aug 2007 B2
7262769 Hoppe et al. Aug 2007 B2
7292267 Prentice et al. Nov 2007 B2
7337403 Pavley Feb 2008 B2
7379097 Anderson May 2008 B2
RE40865 Anderson Aug 2009 E
RE41014 Anderson Nov 2009 E
RE41088 Anderson Jan 2010 E
20010010543 Ward et al. Aug 2001 A1
20010012062 Anderson Aug 2001 A1
20010014910 Bobo Aug 2001 A1
20010014968 Mohammed Aug 2001 A1
20010049758 Shigetomi et al. Dec 2001 A1
20010050711 Karube et al. Dec 2001 A1
20020054116 Pavley et al. May 2002 A1
20020105582 Ikeda Aug 2002 A1
20020109782 Ejima Aug 2002 A1
20030169350 Wiezel Sep 2003 A1
20060174326 Ginter et al. Aug 2006 A1
20060200260 Hoffberg et al. Sep 2006 A1
20070061594 Ginter et al. Mar 2007 A1
Foreign Referenced Citations (229)
Number Date Country
3518887 Sep 1986 DE
0059435 Sep 1982 EP
0122094 Oct 1984 EP
0149196 Jul 1985 EP
0361739 Apr 1990 EP
0421769 Apr 1991 EP
0422447 Apr 1991 EP
0431581 Jun 1991 EP
0439087 Jul 1991 EP
0463856 Jan 1992 EP
0481145 Apr 1992 EP
0519379 Jun 1992 EP
0528084 Feb 1993 EP
0542377 May 1993 EP
0543414 May 1993 EP
0555048 Aug 1993 EP
0568468 Nov 1993 EP
0587161 Mar 1994 EP
0617542 Sep 1994 EP
0650125 Apr 1995 EP
0651553 May 1995 EP
0659017 Jun 1995 EP
0661658 Jul 1995 EP
0664475 Jul 1995 EP
0664526 Jul 1995 EP
0664527 Jul 1995 EP
0675648 Oct 1995 EP
0549689 Dec 1995 EP
0729271 Aug 1996 EP
0730368 Sep 1996 EP
0736841 Oct 1996 EP
0738075 Oct 1996 EP
0449106 Dec 1996 EP
0549684 Feb 1997 EP
0786688 Jul 1997 EP
0817476 Jan 1998 EP
0821522 Jan 1998 EP
0835011 Apr 1998 EP
0851277 Jul 1998 EP
0851675 Jul 1998 EP
0860735 Aug 1998 EP
0860982 Aug 1998 EP
0767941 Oct 1998 EP
0890919 Jan 1999 EP
0600410 Jun 2001 EP
2211707 Jul 1989 GB
2245749 Jan 1992 GB
2289555 Nov 1995 GB
2307371 May 1997 GB
S54-087128 Jul 1979 JP
55-142470 Nov 1980 JP
55-142471 Nov 1980 JP
S57-013479 Jan 1982 JP
S58-182976 Oct 1983 JP
S58-222382 Dec 1983 JP
S59-062891 Apr 1984 JP
S60-053379 Mar 1985 JP
S60-067981 Apr 1985 JP
S61-062281 Mar 1986 JP
S62-067981 Mar 1987 JP
S62-173509 Jul 1987 JP
62-271178 Nov 1987 JP
S62-299881 Dec 1987 JP
S63-303583 Dec 1988 JP
1-132173 May 1989 JP
H01-130675 May 1989 JP
H01-180532 Jul 1989 JP
H01-277285 Jul 1989 JP
1-238382 Sep 1989 JP
H01-306973 Nov 1989 JP
1-319870 Dec 1989 JP
H01-314382 Dec 1989 JP
2-42489 Feb 1990 JP
H02-056532 Feb 1990 JP
H02-058737 Feb 1990 JP
2-162420 Jun 1990 JP
2-257262 Oct 1990 JP
2-280484 Nov 1990 JP
H02-278973 Nov 1990 JP
3-117181 May 1991 JP
3-231574 Oct 1991 JP
H03-222582 Oct 1991 JP
3-246766 Nov 1991 JP
3-506111 Dec 1991 JP
H04-036644 Feb 1992 JP
4-115788 Apr 1992 JP
4-120889 Apr 1992 JP
H04-120889 Apr 1992 JP
4-230517 Aug 1992 JP
H04-236588 Aug 1992 JP
H04-243487 Aug 1992 JP
4-302886 Oct 1992 JP
4-506144 Oct 1992 JP
4-372070 Dec 1992 JP
5-14847 Jan 1993 JP
H05-037887 Feb 1993 JP
H05-064062 Mar 1993 JP
H05-073011 Mar 1993 JP
5-91452 Apr 1993 JP
5-108785 Apr 1993 JP
5-115027 Apr 1993 JP
5-131779 May 1993 JP
5-150308 Jun 1993 JP
5-207343 Aug 1993 JP
H05-219422 Aug 1993 JP
H05-219429 Aug 1993 JP
H05-219430 Aug 1993 JP
5-260351 Oct 1993 JP
H05-260398 Oct 1993 JP
5-289838 Nov 1993 JP
5-290143 Nov 1993 JP
5-308617 Nov 1993 JP
5-314093 Nov 1993 JP
6-57612 Mar 1994 JP
6-60078 Mar 1994 JP
6-78260 Mar 1994 JP
6-103352 Apr 1994 JP
6-105266 Apr 1994 JP
6-178261 Jun 1994 JP
6-197299 Jul 1994 JP
6-265794 Sep 1994 JP
H06-273819 Sep 1994 JP
6-290103 Oct 1994 JP
H06-301341 Oct 1994 JP
6-348467 Dec 1994 JP
6-350949 Dec 1994 JP
7-6028 Jan 1995 JP
H07-005601 Jan 1995 JP
H07-023280 Jan 1995 JP
H07-028757 Jan 1995 JP
H07-036422 Feb 1995 JP
H07-075048 Mar 1995 JP
H07-079375 Mar 1995 JP
H07-095466 Apr 1995 JP
H07-104889 Apr 1995 JP
H07-128702 May 1995 JP
H07-128792 May 1995 JP
7-160842 Jun 1995 JP
H07-143434 Jun 1995 JP
7-168852 Jul 1995 JP
7-184160 Jul 1995 JP
H07-168529 Jul 1995 JP
7-221911 Aug 1995 JP
7-245723 Sep 1995 JP
7-274060 Oct 1995 JP
7-274108 Oct 1995 JP
H07-284050 Oct 1995 JP
H07-287689 Oct 1995 JP
7-295873 Nov 1995 JP
H07-311402 Nov 1995 JP
H07-311403 Nov 1995 JP
H08-019023 Jan 1996 JP
8-32847 Feb 1996 JP
H08-056323 Feb 1996 JP
8-502840 Mar 1996 JP
8-111845 Apr 1996 JP
H08-088870 Apr 1996 JP
H08-095111 Apr 1996 JP
H08-097854 Apr 1996 JP
8-114849 May 1996 JP
8-116476 May 1996 JP
8-140025 May 1996 JP
H08-129216 May 1996 JP
H08-129438 May 1996 JP
H08-129557 May 1996 JP
8-147952 Jun 1996 JP
H08-184892 Jul 1996 JP
H08-190145 Jul 1996 JP
8-205014 Aug 1996 JP
8-223524 Aug 1996 JP
H08-223520 Aug 1996 JP
8-249450 Sep 1996 JP
8-279034 Oct 1996 JP
H08-256325 Oct 1996 JP
H08-317276 Nov 1996 JP
8-331495 Dec 1996 JP
8-339297 Dec 1996 JP
H08-336069 Dec 1996 JP
9-27939 Jan 1997 JP
H09-018813 Jan 1997 JP
H09-027939 Jan 1997 JP
9-37139 Feb 1997 JP
H09-044143 Feb 1997 JP
H09-046776 Feb 1997 JP
H09-065345 Mar 1997 JP
H09-069972 Mar 1997 JP
H09-083853 Mar 1997 JP
H09-083981 Mar 1997 JP
H09-098373 Apr 1997 JP
9-163275 Jun 1997 JP
9-171213 Jun 1997 JP
H09-197547 Jul 1997 JP
H09-307803 Nov 1997 JP
H09-307804 Nov 1997 JP
9-311850 Dec 1997 JP
10-4535 Jan 1998 JP
10-162020 Jun 1998 JP
H10-164401 Jun 1998 JP
H10-164426 Jun 1998 JP
H10-336503 Jul 1998 JP
H10-210405 Aug 1998 JP
10-243331 Sep 1998 JP
11032173 Feb 1999 JP
H11-191858 Jul 1999 JP
H11-196397 Jul 1999 JP
2000-92439 Mar 2000 JP
2000-510616 Aug 2000 JP
2000-287110 Oct 2000 JP
2001-501416 Jan 2001 JP
9009717 Aug 1990 WO
9100586 Jan 1991 WO
9114334 Sep 1991 WO
9205652 Apr 1992 WO
9205655 Apr 1992 WO
9209169 May 1992 WO
9210063 Jun 1992 WO
9220186 Nov 1992 WO
9423375 Oct 1994 WO
9532583 Nov 1995 WO
9600952 Jan 1996 WO
9602106 Jan 1996 WO
9624216 Aug 1996 WO
9629818 Sep 1996 WO
9717669 May 1997 WO
9728516 Aug 1997 WO
9738510 Oct 1997 WO
9814863 Apr 1998 WO
9814887 Apr 1998 WO
Related Publications (1)
Number Date Country
20080115066 A1 May 2008 US
Continuations (2)
Number Date Country
Parent 09973128 Oct 2001 US
Child 11963018 US
Parent 09223960 Dec 1998 US
Child 09973128 US