In conventional systems, a user has to log in or otherwise select an account identity on their user equipment device in order to receive personalized content. For example, the user may log in to their set-top box in order to receive recommended content for viewing, including one or more of television broadcasts, streaming internet media, and on-demand programs.
The recommended content may be selected based on the user's profile associated with his or her login or account identity. For example, the user's profile may include viewing history and related viewing preferences for the user. However, if the user is not logged in, he or she may not receive the appropriate content recommendations. Additionally, if the user is not logged in, the user's viewing history or related viewing preferences may not be recorded in his or her user profile for future content recommendations.
Accordingly, there is a need for systems and methods that will alleviate the issues described above and aid in making better content recommendations for the user.
Systems and methods for recommending content to a user are described. A user equipment device may be equipped with a built-in or separately connected image capturing device which may be used to pinpoint the location of the remote control when it is activated, thereby defining the specific location of the remote and the user. The relative location of the user in the room where the television is located may be associated with that particular user and used to automatically log in and access their user profile. This information may be used to enhance the television experience for the user by, e.g., recommending content.
In some implementations, when a button on the user's remote control device is pressed, the infrared (IR) device placed in the remote control device activates to send a signal to a sensor connected to the user equipment device. Upon receipt of the signal, the user equipment device may activate a built-in or separately connected camera to detect the location of the remote within the user's viewing area. As used herein, a “viewing area” refers to a finite distance from a display device typically associated with an area in which a user may be capable of viewing media assets and/or advertisements on the display device. In some embodiments, the size of the viewing area may vary depending on the particular display device. For example, a display device with a large screen size may have a greater viewing area than a display device with a small screen size. In some embodiments, the viewing area may correspond to the range of the image capturing device associated with the media application. For example, if the image capturing device can detect a user only within five feet of a display device, the viewing area associated with the display device may be only five feet. Various systems and methods for detecting users within a range of a media device are discussed in, for example, Shimy et al., U.S. patent application Ser. No. 12/565,486, filed Sep. 23, 2009, which is hereby incorporated by reference herein in its entirety.
The user equipment may detect the location of the remote by identifying the location of the pixels within the camera sensor's field of view that are activated by the IR signal. The user equipment device may also approximate the distance of the remote from the user equipment device through the number of pixels that are triggered. The identification may be further, or alternatively, enhanced by placing two IR devices in the remote control device such that they are mounted at a known distance apart. The user equipment device may determine the location and distance accurately and additionally determine the orientation (angle) of the remote control device based on the location of the pixels activated by each IR signal and the known distance between the two IR devices.
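For illustration only, the geometry just described can be sketched in a few lines of Python. This is a hedged sketch under a pinhole-camera assumption, not the disclosed implementation; the function name, the focal length expressed in pixels, and the requirement that the remote roughly face the camera are all assumptions introduced here.

import math

def estimate_remote_pose(p1, p2, baseline_m, focal_px):
    """Estimate remote depth and in-plane tilt from two detected IR blobs.

    p1, p2     -- (x, y) pixel centers of the two captured IR signals
    baseline_m -- known physical spacing of the two IR devices, in meters
    focal_px   -- camera focal length expressed in pixels (assumed known)
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    separation_px = math.hypot(dx, dy)
    if separation_px == 0:
        return None  # blobs coincide; no usable geometry in this frame
    # Pinhole model: apparent separation shrinks linearly with distance.
    # A remote turned away from the camera would foreshorten the baseline
    # and bias this estimate, so the facing assumption matters.
    depth_m = focal_px * baseline_m / separation_px
    # In-plane orientation is the angle of the line joining the two blobs.
    tilt_rad = math.atan2(dy, dx)
    return depth_m, tilt_rad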
Typically, the viewing areas where the user's television and/or set-top box are located have a consistent and unchanging arrangement of furniture. The number of possible seating positions within the viewing area is generally small, e.g., 4 to 6 and typically less than or equal to 10. In such situations, most users tend to sit in the same position or a few limited positions and operate their remote control device from their usual position. Therefore, the number of positions in which the remote control device may be detected when a button is pressed may correspond to the number of users in the viewing area. For example, if a user is always seated on his or her preferred couch seat, the location of the remote control device in the viewing area may be associated with the particular user. By tracking the detected locations of the remote control device over multiple activations, the user equipment device may determine the number of users and their positions in the viewing area.
In some implementations, the user equipment device may automatically detect and define user areas in the viewing area based on the positions where the remote control device is activated. Each user area may be associated with a user profile including viewing history and viewing preferences, such as volume, color, brightness, video quality, etc. The user equipment device may “learn” the correlation between a user area and the content selected from that user area by keeping track of the viewing history and related viewing preferences, e.g., by monitoring and storing the content selections the user makes and/or other interactions the user may have with the guidance application. The user equipment device may enhance the viewing experience for the user by retrieving their user profile and making personalized content recommendations to the user associated with the particular user area.
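As an illustration of this “learning” step, a minimal Python sketch is shown below. The per-area profile layout and the helper names are hypothetical, introduced only to make the bookkeeping concrete.

from collections import defaultdict

# Hypothetical per-area profile store: each user area accumulates a
# viewing history and the most recently observed playback preferences.
profiles = defaultdict(lambda: {"history": [], "preferences": {}})

def record_selection(area_id, content_id, preferences):
    """Associate a content selection and settings with a user area."""
    profile = profiles[area_id]
    profile["history"].append(content_id)
    profile["preferences"].update(preferences)  # e.g., volume, brightness

def preferred_genres(area_id, genre_of):
    """Rank genres by how often they were watched from this user area."""
    counts = {}
    for content_id in profiles[area_id]["history"]:
        g = genre_of(content_id)  # genre_of is a hypothetical lookup
        counts[g] = counts.get(g, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)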
For example, a user “John” may always sit in a particular user area and mostly view action movies. When the remote control device is detected in that position, the user equipment device may make personalized content recommendations including action movies. In another example, a child “Mike” may mostly view content from another position and the user equipment device may recommend children's content when the remote control device is detected in that position. In some implementations, users may adapt to the user profile generation process and consciously use a fixed seating position in the viewing area to further refine their user profile and receive better content recommendations.
In some implementations, the user area defined for a particular user may be further refined based on time-based restrictions. For example, a given user area may be associated with user “John” during weekends but with his wife “Marie” during weekdays. The user equipment device may make better content recommendations to the user activating the remote control device by taking the day into account. When the remote is detected in the given user area on a weekday, the user equipment device may retrieve Marie's profile and make personalized content recommendations including soap operas of interest to Marie. However, when the remote is detected in the given user area on a weekend, the user equipment device may instead retrieve John's profile and make personalized content recommendations including football games of interest to John.
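One plausible way to realize such a time-based restriction is sketched below; the schedule table, the profile names, and the weekday/weekend split are illustrative assumptions rather than the disclosed design.

from datetime import datetime

# Hypothetical schedule: one user area maps to different profiles by day type.
AREA_SCHEDULE = {
    "couch_left": {"weekday": "marie", "weekend": "john"},
}

def profile_for(area_id, when=None):
    """Pick the profile for a user area, taking the day into account."""
    when = when or datetime.now()
    day_type = "weekend" if when.weekday() >= 5 else "weekday"  # Sat/Sun
    return AREA_SCHEDULE.get(area_id, {}).get(day_type)

A finer-grained scheme could key the schedule on hour of day as well, so that, for example, a morning viewer and an evening viewer sharing one seat receive different recommendations.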
In some implementations, the systems and methods described herein include a method for recommending content. The method includes receiving input at a user equipment device from a remote control device operated by a user positioned in a viewing area. In response to receiving the input, an instruction is transmitted to activate an image capturing device. The method further includes receiving from the image capturing device an image of the viewing area. The method further includes analyzing the received image to detect a position of the remote control device. The method further includes retrieving from a storage device a profile for the user based on the determined position. The method further includes determining media content of interest to the user based on the retrieved profile. The method further includes generating a display recommending the media content to the user.
In some embodiments, the retrieved profile includes a viewing history of media content selected when the remote control device is positioned in a vicinity of the determined position in the viewing area. For example, the remote control device may be positioned within 0.1 cm, 1 cm, 10 cm, or any other suitable distance in the vicinity of the determined position in the viewing area. In some embodiments, the viewing history includes the media content selected in a vicinity of a particular time and/or the media content selected on a particular day. For example, the viewing history includes the media content selected within one second, one minute, one hour, or any other suitable time period in the vicinity of the particular time.
In some embodiments, analyzing the received image to detect a position of the remote control device includes receiving a signal from the remote control device that is captured in the image. The method further includes identifying the location of one or more pixels that are activated in the image due to the signal from the remote control device, e.g., identifying the location of pixels that have particular values corresponding to an infrared (IR) signal. The method further includes determining the position of the remote control device based on the identified location.
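A minimal sketch of this pixel-identification step follows, assuming the frame arrives as a 2-D grayscale array in which the IR signal drives a small cluster of pixels toward saturation; the threshold value and function name are assumptions.

import numpy as np

def locate_ir_signal(frame, threshold=240):
    """Return the (x, y) centroid of pixels activated by the IR signal.

    frame     -- 2-D grayscale image from the image capturing device
    threshold -- pixel value treated as "activated" by the IR emitter
    """
    ys, xs = np.nonzero(frame >= threshold)
    if xs.size == 0:
        return None  # no IR signal captured in this frame
    return float(xs.mean()), float(ys.mean())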
In some embodiments, the method further includes receiving two signals simultaneously from the remote control device at the image capturing device. The method further includes identifying a first location of pixels and a second location of pixels that are activated in the image due to the two signals from the remote control device. The method further includes retrieving a distance associated with the two signals and calculating a distance of the remote control device from the image capturing device and its location relative to the image capturing device in the viewing area based on the identified first and second locations and the retrieved distance. In some embodiments, the image capturing device is built into the display device or separately connected but adjacent to the display device. In such embodiments, the calculated distance may be used to approximate the distance of the remote control device from the display device. The method further includes determining the position of the remote control device based on the identified first and second locations and the calculated distance. In some embodiments, the two signals are received simultaneously from two infrared (IR) devices included in the remote control device.
In some embodiments, the method further includes determining a location of a second user in the viewing area. The method further includes retrieving from the storage device a profile for the second user based on the location. The method further includes determining media content of interest to both users based on their profiles. In some embodiments, determining the location of the second user in the viewing area further includes analyzing the received image to detect silhouettes corresponding to one or more users. The method further includes retrieving from the storage device a silhouette for the second user. The method further includes comparing the retrieved silhouette for the second user to the silhouettes detected in the image. The method further includes determining the location of the second user in the viewing area based on the comparison.
In some embodiments, the method further includes receiving a plurality of inputs from the remote control device over a period of time. The method further includes receiving from the image capturing device a plurality of images of the viewing area over the period of time. The method further includes clustering a plurality of positions in the viewing area to define a user area based on the position of the remote control device in each captured image.
In some embodiments, the method further includes receiving a plurality of inputs from the remote control device over a period of time. The method further includes receiving from the image capturing device a plurality of images of the viewing area over the period of time. The method further includes determining a plurality of locations in the viewing area based on the position of the remote control device in each captured image. The method further includes determining a number of users and/or user areas in the viewing area over the period of time based on the plurality of locations.
In some embodiments, the method further includes receiving the image of the viewing area from the image capturing device. The method further includes analyzing the received image to detect the position of the remote control device and determining the user area associated with the detected position. The method further includes retrieving the profile for the user from the storage device based on the determined user area.
In some aspects, the systems and methods described herein include a system or apparatus for recommending content configured to execute the functionality described above.
It should be noted that the systems and/or methods described above may be applied to, or used in accordance with, other systems, methods, and/or apparatuses.
The above and other objects and advantages of the disclosure will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
Systems and methods are described herein for recommending content to a user. In some implementations, the systems and methods utilize a built-in or separately connected image capturing device to locate the position of the remote control device when it is activated in the viewing area, thereby defining the specific position of the remote and the user. As used herein, a “viewing area” refers to a finite distance from a display device typically associated with an area in which a user may be capable of viewing media assets and/or advertisements on the display device. In some embodiments, the size of the viewing area may vary depending on the particular display device. For example, a display device with a large screen size may have a greater viewing area than a display device with a small screen size. In some embodiments, the viewing area may correspond to the range of the detection modules associated with the media application. For example, if the detection module can detect a user only within five feet of a display device, the viewing area associated with the display device may be only five feet. Various systems and methods for detecting users within a range of a media device are discussed in, for example, Shimy et al., U.S. patent application Ser. No. 12/565,486, filed Sep. 23, 2009, which is hereby incorporated by reference herein in its entirety.
The relative position of the user in the viewing area where the user equipment device is located may be associated with that particular user and used to automatically log in and access their user profile. This information may be used to enhance the television experience for the user by, e.g., recommending content.
In some implementations, a user equipment device including control circuitry recommends content to a user. The control circuitry receives input at the user equipment device from a remote control device operated by a user positioned in a viewing area. For example, the input may be an infrared (IR) signal received from an IR device placed in the remote control device. In response to receiving the input, control circuitry transmits an instruction to activate an image capturing device, e.g., a built-in or separately connected camera. The control circuitry receives from the image capturing device an image of the viewing area and analyzes the image to detect a position of the remote control device. For example, the control circuitry may compare the image to one or more previously or subsequently captured images to determine the pixels activated in the image due to the IR signal. For instance, in response to detecting the signal with a sensor, the camera may rapidly take a series of pictures and then analyze them to determine the portion that is changing across the images.
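The burst-and-compare approach described above might look like the following sketch, which attributes to the IR signal any pixels that change across a rapid series of grayscale frames; the frame format, threshold, and function name are assumptions.

import numpy as np

def changing_region(frames, diff_threshold=40):
    """Centroid of the pixels that change across a burst of frames."""
    base = frames[0].astype(np.int16)  # widen to avoid uint8 wraparound
    changed = np.zeros(base.shape, dtype=bool)
    for frame in frames[1:]:
        # Mark pixels whose value moved by more than the threshold.
        changed |= np.abs(frame.astype(np.int16) - base) > diff_threshold
    ys, xs = np.nonzero(changed)
    if xs.size == 0:
        return None  # nothing changed; the remote may be out of view
    return float(xs.mean()), float(ys.mean())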
The control circuitry retrieves from a storage device a profile for the user based on the determined position and determines media content of interest to the user based on the retrieved profile. For example, the control circuitry may determine that the user “Mike” may find the television program “Korra: The Next Hero” of interest (as shown in
In some embodiments, the retrieved profile is based on a viewing history of media content selected when the remote control device is positioned in a vicinity of the determined position in the viewing area. For example, the remote control device may be positioned within 0.1 cm, 1 cm, 10 cm, or any other suitable distance in the vicinity of the determined position in the viewing area. In another example, the determined position may be associated with a user profile including viewing history and/or viewing preferences, such as volume, color, brightness, video quality, etc. The user equipment device may “learn” the correlation between the user area and the content viewed and may generate a profile for the user area.
In some embodiments, the viewing history includes the media content selected in a vicinity of a particular time and/or the media content selected on a particular day. For example, viewing history includes the media content selected within one second, one minute, one hour, or any other suitable time period in the vicinity of the particular time. In another example, a given user area may be associated with user “John” during weekends but with his wife “Marie” during weekdays. The user equipment device may make better content recommendations to the user activating the remote control device by taking the day into account. When the remote is detected in the given user area on a weekday, the user equipment device may retrieve Marie's profile and make personalized content recommendations including soap operas. However, when the remote is detected in the given user area on a weekend, the user equipment device may instead retrieve John's profile and make personalized content recommendations including football games.
In some embodiments, control circuitry analyzes the received image to detect a position of the remote control device by locating a signal from the remote control device that is captured in the image. For example, the received image may include a captured infrared (IR) signal received from the remote control device. The control circuitry identifies the location of one or more pixels that are activated in the image due to the IR signal from the remote control device. The control circuitry determines the position of the remote control device based on the identified location. For example, the control circuitry may compare the image to a previously captured image to determine the pixels activated in the image due to the IR signal.
In some embodiments, the control circuitry receives an indication of two signals simultaneously from the remote control device. For example, the remote control device may include two IR devices that send signals simultaneously to an IR sensor connected to the control circuitry. The control circuitry instructs the image capture device to capture an image and, on receipt, analyzes the image to identify a first location of pixels and a second location of pixels that are activated in the image due to the two signals from the remote control device.
The control circuitry retrieves a distance between the two IR devices placed in the remote control device. The distance is a fixed measure of the spacing between the two IR devices and may be used to calculate the distance of the remote control device from the image capturing device and its location relative to the image capturing device in the viewing area. The control circuitry calculates the distance of the remote control device from the image capturing device in the viewing area based on the first and second locations of the two IR signals in the captured image as well as the retrieved distance associated with the two IR signals. In some embodiments, the image capturing device is built into the display device or separately connected but adjacent to the display device. In such embodiments, the calculated distance may be used to approximate the distance of the remote control device from the display device. The control circuitry determines the position of the remote control device in the viewing area based on the first and second locations corresponding to the pixels activated in the captured image due to the two IR signals and the calculated distance. The position in this case may be a 3-dimensional value indicating the position of the user in the viewing area.
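Once the depth has been recovered from the known emitter spacing, the 3-dimensional position follows from standard pinhole back-projection. The sketch below is illustrative only; the camera intrinsics (focal length and principal point) are assumed to be known from calibration, and the function name is hypothetical.

def back_project(u, v, depth_m, focal_px, cx, cy):
    """Convert the pixel midpoint of the two IR blobs into a 3-D position.

    (u, v)   -- pixel coordinates of the midpoint between the two blobs
    depth_m  -- remote-to-camera distance derived from the IR baseline
    focal_px -- focal length in pixels; (cx, cy) is the principal point
    """
    x = (u - cx) * depth_m / focal_px  # lateral offset in the viewing area
    y = (v - cy) * depth_m / focal_px  # vertical offset
    return x, y, depth_m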
In some embodiments, the control circuitry further determines a location of a second user in the viewing area. The control circuitry retrieves from the storage device a profile for the second user based on the location. The control circuitry makes media content recommendations such that they are of interest to both users based on their profiles. For example, the control circuitry may determine the location of the second user in the viewing area by analyzing the image from the image capture device to detect silhouettes corresponding to the users in the viewing area. The silhouettes may be detected using edge detection, corner detection, blob detection, or other such suitable image processing techniques. Such techniques are fundamental tools in image processing, machine vision and computer vision, particularly in the areas of feature detection and feature extraction. The control circuitry may identify points in an image at which the image brightness changes sharply or has discontinuities to determine silhouettes for the users in the viewing area. The control circuitry retrieves from the storage device a silhouette for the second user and compares the retrieved silhouette for the second user to the silhouettes detected in the image. If there is a match, the control circuitry associates the location in the viewing area with the profile of the second user.
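One plausible realization of the silhouette step uses OpenCV's contour utilities, as sketched below; the edge thresholds, the match-score cutoff, and the function name are assumptions, not the disclosed implementation.

import cv2

def locate_user_by_silhouette(image, stored_contour, max_score=0.2):
    """Find the stored silhouette in the image; return its centroid or None."""
    edges = cv2.Canny(image, 50, 150)  # points of sharp brightness change
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for contour in contours:
        # Lower matchShapes score means a closer silhouette match.
        score = cv2.matchShapes(contour, stored_contour,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < max_score and (best is None or score < best[1]):
            best = (contour, score)
    if best is None:
        return None  # the second user's silhouette was not found
    m = cv2.moments(best[0])
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]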
In some embodiments, the control circuitry receives a plurality of inputs from the remote control device over a period of time and receives from the image capturing device a plurality of images of the viewing area over the same period of time. The control circuitry clusters the plurality of positions in the viewing area to define a user area based on the position of the remote control device in each captured image. The control circuitry associates the defined user area with the user. Any future remote control signals from the defined user area may be used to construct or retrieve the user's profile including a viewing history and/or viewing preferences, e.g., volume, brightness, video quality, etc.
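A simple clustering sketch follows, assuming remote positions expressed as 2-D coordinates in the viewing area and a hypothetical radius that bounds a single seating position.

import math

def cluster_user_areas(positions, radius=0.5):
    """Greedily group remote-control positions into user areas.

    positions -- (x, y) remote locations collected over a period of time
    radius    -- maximum spread (meters, assumed) of one seating position
    """
    areas = []  # each area: {"center": (x, y), "points": [...]}
    for p in positions:
        for area in areas:
            cx, cy = area["center"]
            if math.hypot(p[0] - cx, p[1] - cy) <= radius:
                area["points"].append(p)
                n = len(area["points"])
                # Update the running centroid of the user area.
                area["center"] = (cx + (p[0] - cx) / n, cy + (p[1] - cy) / n)
                break
        else:
            areas.append({"center": p, "points": [p]})
    return areas

The number of clusters returned also serves as an estimate of the number of users in the viewing area, which is the determination described in the next paragraph.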
In some embodiments, the control circuitry receives a plurality of inputs from the remote control device over a period of time and receives from the image capturing device a plurality of images of the viewing area over the same period of time. The control circuitry determines a plurality of locations in the viewing area based on the position of the remote control device in each captured image. The control circuitry determines a number of users in the viewing area over the period of time based on the plurality of locations. Any future remote control signals from the defined user areas may be used to construct or retrieve corresponding user profiles including a viewing history and/or viewing preferences, e.g., volume, brightness, video quality, etc.
In some embodiments, the control circuitry receives the image of the viewing area from the image capturing device. The control circuitry analyzes the received image to detect the position of the remote control device and determines the user area associated with the detected position. Finally, the control circuitry retrieves the profile for the user from the storage device based on the determined user area.
The amount of content available to users in any given content delivery system can be substantial. Consequently, many users desire a form of media guidance through an interface that allows users to efficiently navigate content selections and easily identify content that they may desire. An application that provides such guidance is referred to herein as an interactive media guidance application or, sometimes, a media guidance application or a guidance application.
Media applications may take various forms depending on their function. Some media applications generate graphical user interface screens (e.g., that enable a user to navigate among, locate and select content), and some media applications may operate without generating graphical user interface screens (e.g., while still issuing instructions related to the transmission of media assets and advertisements).
As referred to herein, the terms “media asset” and “content” should be understood to mean an electronically consumable user asset, such as television programming, as well as pay-per-view programs, on-demand programs (as in video-on-demand (VOD) systems), Internet content (e.g., streaming content, downloadable content, Webcasts, etc.), video clips, audio, content information, pictures, rotating images, documents, playlists, websites, articles, books, electronic books, blogs, advertisements, chat sessions, social media, applications, games, and/or any other media or multimedia and/or combination of the same. As referred to herein, the term “multimedia” should be understood to mean content that utilizes at least two different content forms described above, for example, text, audio, images, video, or interactivity content forms. Content may be recorded, played, displayed or accessed by user equipment devices, but can also be part of a live performance.
With the advent of the Internet, mobile computing, and high-speed wireless networks, users are accessing media on user equipment devices on which they traditionally did not. As referred to herein, the phrase “display device,” “user equipment device,” “user equipment,” “user device,” “electronic device,” “electronic equipment,” “media equipment device,” or “media device” should be understood to mean any device for accessing the content described above, such as a television, a Smart TV, a set-top box, an integrated receiver decoder (IRD) for handling satellite television, a digital storage device, a digital media receiver (DMR), a digital media adapter (DMA), a streaming media device, a DVD player, a DVD recorder, a connected DVD, a local media server, a BLU-RAY player, a BLU-RAY recorder, a personal computer (PC), a laptop computer, a tablet computer, a WebTV box, a personal computer television (PC/TV), a PC media server, a PC media center, a hand-held computer, a stationary telephone, a personal digital assistant (PDA), a mobile telephone, a portable video player, a portable music player, a portable gaming machine, a smart phone, or any other television equipment, computing equipment, or wireless device, and/or combination of the same.
In some embodiments, the user equipment device may have a front facing screen and a rear facing screen, multiple front screens, or multiple angled screens. In some embodiments, the user equipment device may have a front facing camera and/or a rear facing camera. These cameras, for example, can be used to capture an image of an IR signal from a remote control device, as discussed further herein. On these user equipment devices, users may be able to navigate among and locate the same content available through a television. Consequently, media guidance may be available on these devices, as well. The guidance provided may be for content available only through a television, for content available only through one or more of other types of user equipment devices, or for content available both through a television and one or more of the other types of user equipment devices. The media applications may be provided as on-line applications (i.e., provided on a web-site), or as stand-alone applications or clients on user equipment devices. Various devices and platforms that may implement media applications are described in more detail below.
One of the functions of the media guidance application is to provide media guidance data to users. As referred to herein, the phrase, “media guidance data” or “guidance data” should be understood to mean any data related to content, such as media listings, media-related information (e.g., broadcast times, broadcast channels, titles, descriptions, ratings information (e.g., parental control ratings, critic's ratings, etc.), genre or category information, actor information, logo data for broadcasters' or providers' logos, etc.), media format (e.g., standard definition, high definition, 3D, etc.), advertisement information (e.g., text, images, media clips, etc.), on-demand information, blogs, websites, and any other type of guidance data that is helpful for a user to navigate among and locate desired content selections.
In addition to providing access to linear programming (e.g., content that is scheduled to be transmitted to a plurality of user equipment devices at a predetermined time and is provided according to a schedule), the media guidance application also provides access to non-linear programming (e.g., content accessible to a user equipment device at any time and is not provided according to a schedule). Non-linear programming may include content from different content sources including on-demand content (e.g., VOD), Internet content (e.g., streaming media, downloadable media, etc.), locally stored content (e.g., content stored on any user equipment device described above or other storage device), or other time-independent content. On-demand content may include movies or any other content provided by a particular content provider (e.g., HBO On Demand providing “The Sopranos” and “Curb Your Enthusiasm”). HBO ON DEMAND is a service mark owned by Time Warner Company L. P. et al. and THE SOPRANOS and CURB YOUR ENTHUSIASM are trademarks owned by the Home Box Office, Inc. Internet content may include web events, such as a chat session or Webcast, or content available on-demand as streaming content or downloadable content through an Internet web site or other Internet access (e.g., FTP).
Grid 102 may provide media guidance data for non-linear programming including on-demand listing 114, recorded content listing 116, and Internet content listing 118. A display combining media guidance data for content from different types of content sources is sometimes referred to as a “mixed-media” display. Various permutations of the types of media guidance data that may be displayed that are different than display 100 may be based on user selection or guidance application definition (e.g., a display of only recorded and broadcast listings, only on-demand and broadcast listings, etc.). As illustrated, listings 114, 116, and 118 are shown as spanning the entire time block displayed in grid 102 to indicate that selection of these listings may provide access to a display dedicated to on-demand listings, recorded listings, or Internet listings, respectively. In some embodiments, listings for these content types may be included directly in grid 102. Additional media guidance data may be displayed in response to the user selecting one of the navigational icons 120. (Pressing an arrow key on a user input device may affect the display in a similar manner as selecting navigational icons 120.)
Display 100 may also include video region 122, advertisement 124, and options region 126. Video region 122 may allow the user to view and/or preview programs that are currently available, will be available, or were available to the user. The content of video region 122 may correspond to, or be independent from, one of the listings displayed in grid 102. Grid displays including a video region are sometimes referred to as picture-in-guide (PIG) displays. PIG displays and their functionalities are described in greater detail in Satterfield et al. U.S. Pat. No. 6,564,378, issued May 13, 2003 and Yuen et al. U.S. Pat. No. 6,239,794, issued May 29, 2001, which are hereby incorporated by reference herein in their entireties. PIG displays may be included in other media guidance application display screens of the embodiments described herein.
Advertisement 124 may provide an advertisement for content that, depending on a viewer's access rights (e.g., for subscription programming), is currently available for viewing, will be available for viewing in the future, or may never become available for viewing, and may correspond to, or be unrelated to, one or more of the content listings in grid 102. Advertisement 124 may also be for products or services related or unrelated to the content displayed in grid 102. Advertisement 124 may be selectable and provide further information about content, provide information about a product or a service, enable purchasing of content, a product, or a service, provide content relating to the advertisement, etc. Advertisement 124 may be targeted based on a user's profile/preferences, monitored user activity, the type of display provided, or on other suitable targeted advertisement bases.
While advertisement 124 is shown as rectangular or banner shaped, advertisements may be provided in any suitable size, shape, and location in a guidance application display. For example, advertisement 124 may be provided as a rectangular shape that is horizontally adjacent to grid 102. This is sometimes referred to as a panel advertisement. In addition, advertisements may be overlaid over content or a guidance application display or embedded within a display. Advertisements may also include text, images, rotating images, video clips, or other types of content described above. Advertisements may be stored in a user equipment device having a guidance application, in a database connected to the user equipment, in a remote location (including streaming media servers), or on other storage means, or a combination of these locations. Providing advertisements in a media guidance application is discussed in greater detail in, for example, Knudson et al., U.S. Patent Application Publication No. 2003/0110499, filed Jan. 17, 2003; Ward, III et al. U.S. Pat. No. 6,756,997, issued Jun. 29, 2004; and Schein et al. U.S. Pat. No. 6,388,714, issued May 14, 2002, which are hereby incorporated by reference herein in their entireties. It will be appreciated that advertisements may be included in other media guidance application display screens of the embodiments described herein.
Options region 126 may allow the user to access different types of content, media guidance application displays, and/or media guidance application features. Options region 126 may be part of display 100 (and other display screens described herein), or may be invoked by a user by selecting an on-screen option or pressing a dedicated or assignable button on a user input device. The selectable options within options region 126 may concern features related to program listings in grid 102 or may include options available from a main menu display. Features related to program listings may include searching for other air times or ways of receiving a program, recording a program, enabling series recording of a program, setting program and/or channel as a favorite, purchasing a program, or other features. Options available from a main menu display may include search options, VOD options, parental control options, Internet options, cloud-based options, device synchronization options, second screen device options, options to access various types of media guidance data displays, options to subscribe to a premium service, options to edit a user's profile, options to access a browse overlay, options to view related content that provides background information or context for a selected media content, options to view the related content on a second screen device, options to view additional related content, options to add related content to a queue for later viewing, options to resume playback of the selected media content, options to specify an ordering scheme and/or criteria for the ordering scheme, or other suitable options.
The media guidance application may be personalized based on a user's preferences. A personalized media guidance application allows a user to customize displays and features to create a personalized “experience” with the media guidance application. This personalized experience may be created by allowing a user to input these customizations and/or by the media guidance application monitoring user activity to determine various user preferences. Users may access their personalized guidance application by logging in or otherwise identifying themselves to the guidance application. Customization of the media guidance application may be made in accordance with a user profile. The customizations may include varying presentation schemes (e.g., color scheme of displays, font size of text, etc.), aspects of content listings displayed (e.g., only HDTV or only 3D programming, user-specified broadcast channels based on favorite channel selections, re-ordering the display of channels, recommended content, etc.), desired recording features (e.g., recording or series recordings for particular users, recording quality, etc.), parental control settings, customized presentation of Internet content (e.g., presentation of social media content, e-mail, electronically delivered articles, etc.) and other desired customizations.
The media guidance application may allow a user to provide user profile information or may automatically compile user profile information. The media guidance application may, for example, monitor the content the user accesses and/or other interactions the user may have with the guidance application. In some embodiments, the user profile information may be associated with input received from a remote control device over a period of time. Control circuitry may determine the position of the remote control device in the user's viewing area and cluster the positions to define a user area. The control circuitry may associate the defined user area with the user and store the interactions including viewing history and/or viewing preferences in the user profile information. Any future remote control signals from the defined user area may be used to construct or retrieve the user's profile information.
Additionally, the media guidance application may obtain all or part of other user profiles that are related to a particular user (e.g., from other web sites on the Internet the user accesses, such as www.allrovi.com, from other media guidance applications the user accesses, from other interactive applications the user accesses, from another user equipment device of the user, etc.), and/or obtain information about the user from other sources that the media guidance application may access. As a result, a user can be provided with a unified guidance application experience across the user's different user equipment devices. This type of user experience is described in greater detail below in connection with
Another display arrangement for providing media guidance is shown in
The listings in display 200 are of different sizes (i.e., listing 206 is larger than listings 208, 210, and 212), but if desired, all the listings may be the same size. Listings may be of different sizes or graphically accentuated to indicate degrees of interest to the user or to emphasize certain content, as desired by the content provider or based on user preferences. Various systems and methods for graphically accentuating content listings are discussed in, for example, Yates, U.S. Patent Application Publication No. 2010/0153885, filed Dec. 29, 2005, which is hereby incorporated by reference herein in its entirety.
Control circuitry 304 may be based on any suitable processing circuitry such as processing circuitry 306. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiples of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor). In some embodiments, control circuitry 304 executes instructions for a media application stored in memory (i.e., storage 308). Specifically, control circuitry 304 may be instructed by the media application to perform the functions discussed above and below. For example, the media application may provide instructions to control circuitry 304 to generate the media guidance displays. In some implementations, any action performed by control circuitry 304 may be based on instructions received from the media application.
In client-server based embodiments, control circuitry 304 may include communications circuitry suitable for communicating with a media-application server or other networks or servers. The instructions for carrying out the above mentioned functionality may be stored on the media application server. Communications circuitry may include a cable modem, an integrated services digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the Internet or any other suitable communications networks or paths (which is described in more detail in connection with
Memory may be an electronic storage device provided as storage 308 that is part of control circuitry 304. As referred to herein, the phrase “electronic storage device” or “storage device” should be understood to mean any device for storing electronic data, computer software, or firmware, such as random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same. Storage 308 may be used to store various types of content described herein as well as media guidance information, described above, and media application data, described above. Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions). Cloud-based storage, described in relation to
Control circuitry 304 may include video generating circuitry and tuning circuitry, such as one or more analog tuners, one or more MPEG-2 decoders or other digital decoding circuitry, high-definition tuners, or any other suitable tuning or video circuits or combinations of such circuits. Encoding circuitry (e.g., for converting over-the-air analog, or digital signals to MPEG signals for storage) may also be provided. Control circuitry 304 may also include scaler circuitry for upconverting and downconverting content into the preferred output format of the user equipment 300. Circuitry 304 may also include digital-to-analog converter circuitry and analog-to-digital converter circuitry for converting between digital and analog signals. The tuning and encoding circuitry may be used by the user equipment device to receive and to display, to play, or to record content. The tuning and encoding circuitry may also be used to receive advertisement data. The circuitry described herein, including for example, the tuning, video generating, encoding, decoding, encrypting, decrypting, scaler, and analog/digital circuitry, may be implemented using software running on one or more general purpose or specialized processors. Multiple tuners may be provided to handle simultaneous tuning functions (e.g., watch and record functions, picture-in-picture (PIP) functions, multiple-tuner recording, etc.). If storage 308 is provided as a separate device from user equipment 300, the tuning and encoding circuitry (including multiple tuners) may be associated with storage 308.
A user may send instructions to control circuitry 304 using a user input interface, e.g., remote control device 310. Remote control device 310 may be any suitable user interface, such as a remote control, mouse, trackball, keypad, keyboard, touch screen, touchpad, stylus input, joystick, voice recognition interface, or other user input interfaces. Display 312 may be provided as a stand-alone device or integrated with other elements of user equipment device 300. Display 312 may be one or more of a monitor, a television, a liquid crystal display (LCD) for a mobile device, or any other suitable equipment for displaying visual images. In some embodiments, display 312 may be HDTV-capable. In some embodiments, display 312 may be a 3D display, and the interactive media application and any suitable content may be displayed in 3D. A video card or graphics card may generate the output to the display 312. The video card may offer various functions such as accelerated rendering of 3D scenes and 2D graphics, MPEG-2/MPEG-4 decoding, TV output, or the ability to connect multiple monitors. The video card may be any processing circuitry described above in relation to control circuitry 304. The video card may be integrated with the control circuitry 304. Sensor device 314 may be provided as integrated with other elements of user equipment device 300 or may be a stand-alone unit. Sensor device 314 may receive signals from remote control device 310 and indicate receipt of such signals to control circuitry 304. In some embodiments, sensor device 314 is an infrared (IR) receiver that receives IR signals from remote control device 310. In some embodiments, sensor device 314 includes multiple IR receivers that receive IR signals from remote control device 310. Image capture device 316 may be provided as integrated with other elements of user equipment device 300 or may be a stand-alone unit. Image capture device 316 may capture images of a viewing area where the user is positioned when a signal is received from remote control device 310. The captured images may be used to determine the location of the user in the viewing area.
The media application may be implemented using any suitable architecture. For example, it may be a stand-alone application wholly implemented on user equipment device 300. In such an approach, instructions of the application are stored locally, and data for use by the application is downloaded on a periodic basis (e.g., from an out-of-band feed, from an Internet resource, or using another suitable approach). In some embodiments, the media application is a client-server based application. Data for use by a thick or thin client implemented on user equipment device 300 is retrieved on-demand by issuing requests to a server remote to the user equipment device 300. In one example of a client-server based media application, control circuitry 304 runs a web browser that interprets web pages provided by a remote server.
In some embodiments, the media application is downloaded and interpreted or otherwise run by an interpreter or virtual machine (run by control circuitry 304). In some embodiments, the media application may be encoded in the ETV Binary Interchange Format (EBIF), received by control circuitry 304 as part of a suitable feed, and interpreted by a user agent running on control circuitry 304. For example, the media application may be an EBIF application. In some embodiments, the media application may be defined by a series of JAVA-based files that are received and run by a local virtual machine or other suitable middleware executed by control circuitry 304. In some of such embodiments (e.g., those employing MPEG-2 or other digital media encoding schemes), the media application may be, for example, encoded and transmitted in an MPEG-2 object carousel with the MPEG audio and video packets of a program.
User equipment device 300 of
A user equipment device utilizing at least some of the system features described above in connection with
In system 400, there is typically more than one of each type of user equipment device but only one of each is shown in
In some embodiments, a user equipment device (e.g., user television equipment 402, user computer equipment 404, wireless user communications device 406) may be referred to as a “second screen device.” For example, a second screen device may supplement content presented on a first user equipment device. The content presented on the second screen device may be any suitable content that supplements the content presented on the first device. In some embodiments, the second screen device provides an interface for adjusting settings and display preferences of the first device. In some embodiments, the second screen device is configured for interacting with other second screen devices or for interacting with a social network. The second screen device can be located in the same room as the first device, a different room from the first device but in the same house or building, or in a different building from the first device.
The user may also set various settings to maintain consistent media application settings across in-home devices and remote devices. Settings include those described herein, as well as channel and program favorites, programming preferences that the media application utilizes to make programming recommendations, display preferences, and other desirable guidance settings. For example, if a user sets a channel as a favorite on, for example, the website www.allrovi.com on their personal computer at their office, the same channel would appear as a favorite on the user's in-home devices (e.g., user television equipment and user computer equipment) as well as the user's mobile devices, if desired. Therefore, changes made on one user equipment device can change the guidance experience on another user equipment device, regardless of whether they are the same or a different type of user equipment device. In addition, the changes made may be based on settings input by a user, as well as user activity monitored by the media application.
The user equipment devices may be coupled to communications network 414. Namely, user television equipment 402, user computer equipment 404, and wireless user communications device 406 are coupled to communications network 414 via communications paths 408, 410, and 412, respectively. Communications network 414 may be one or more networks including the Internet, a mobile phone network, mobile voice or data network (e.g., a 4G or LTE network), cable network, public switched telephone network, or other types of communications network or combinations of communications networks. Paths 408, 410, and 412 may separately or together include one or more communications paths, such as, a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths. Path 412 is drawn with dotted lines to indicate that in the exemplary embodiment shown in
Although communications paths are not drawn between user equipment devices, these devices may communicate directly with each other via communication paths, such as those described above in connection with paths 408, 410, and 412, as well as other short-range point-to-point communication paths, such as USB cables, IEEE 1394 cables, wireless paths (e.g., Bluetooth, infrared, IEEE 802-11x, etc.), or other short-range communication via wired or wireless paths. BLUETOOTH is a certification mark owned by Bluetooth SIG, INC. The user equipment devices may also communicate with each other through an indirect path via communications network 414.
System 400 includes content source 416 and media guidance data source 418 coupled to communications network 414 via communication paths 420 and 422, respectively. Paths 420 and 422 may include any of the communication paths described above in connection with paths 408, 410, and 412. Communications with the content source 416 and media guidance data source 418 may be exchanged over one or more communications paths, but are shown as a single path in
Content source 416 may include one or more types of content distribution equipment including a television distribution facility, cable system headend, satellite distribution facility, programming sources (e.g., television broadcasters, such as NBC, ABC, HBO, etc.), intermediate distribution facilities and/or servers, Internet providers, on-demand media servers, and other content providers. NBC is a trademark owned by the National Broadcasting Company, Inc., ABC is a trademark owned by the American Broadcasting Company, Inc., and HBO is a trademark owned by the Home Box Office, Inc. Content source 416 may be the originator of content (e.g., a television broadcaster, a Webcast provider, etc.) or may not be the originator of content (e.g., an on-demand content provider, an Internet provider of content of broadcast programs for downloading, etc.). Content source 416 may include cable sources, satellite providers, on-demand providers, Internet providers, over-the-top content providers, or other providers of content. Content source 416 may also include a remote media server used to store different types of content (including video content selected by a user), in a location remote from any of the user equipment devices. Systems and methods for remote storage of content, and providing remotely stored content to user equipment are discussed in greater detail in connection with Ellis et al., U.S. Pat. No. 7,761,892, issued Jul. 20, 2010, which is hereby incorporated by reference herein in its entirety.
Media guidance data source 418 may provide media guidance data, such as the media guidance data described above. Media guidance application data may be provided to the user equipment devices using any suitable approach. In some embodiments, the guidance application may be a stand-alone interactive television program guide that receives program guide data via a data feed (e.g., a continuous feed or trickle feed). Program schedule data and other guidance data may be provided to the user equipment on a television channel sideband, using an in-band digital signal, using an out-of-band digital signal, or by any other suitable data transmission technique. Program schedule data and other media guidance data may be provided to user equipment on multiple analog or digital television channels.
In some embodiments, guidance data from media guidance data source 418 may be provided to users' equipment using a client-server approach. For example, a user equipment device may pull advertisement data from a server, or a server may push advertisement data to a user equipment device. In some embodiments, a media application client residing on the user's equipment may initiate sessions with source 418 to obtain advertisement data when needed, e.g., when the advertisement data is out of date or when the user equipment device receives a request from the user to receive data. Media guidance may be provided to the user equipment with any suitable frequency (e.g., continuously, daily, a user-specified period of time, a system-specified period of time, in response to a request from user equipment, etc.). Media guidance data source 418 may provide user equipment devices 402, 404, and 406 the media application itself or software updates for the media application.
Media applications may be, for example, stand-alone applications implemented on user equipment devices. For example, the media application may be implemented as software or a set of executable instructions which may be stored in storage 308, and executed by control circuitry 304 of a user equipment device 300. In some embodiments, media applications may be client-server applications where only a client application resides on the user equipment device, and a server application resides on a remote server. For example, media applications may be implemented partially as a client application on control circuitry 304 of user equipment device 300 and partially on a remote server as a server application (e.g., media guidance data source 418) running on control circuitry of the remote server. When executed by control circuitry of the remote server (such as media guidance data source 418), the media application may instruct the control circuitry to generate the media application displays and transmit the generated displays to the user equipment devices. The server application may instruct the control circuitry of the media guidance data source 418 to transmit data for storage on the user equipment. The client application may instruct control circuitry of the receiving user equipment to generate the media application displays.
Content and/or advertisement data delivered to user equipment devices 402, 404, and 406 may be over-the-top (OTT) content. OTT content delivery allows Internet-enabled user devices, including any user equipment device described above, to receive content that is transferred over the Internet, including any content described above, in addition to content received over cable or satellite connections. OTT content is delivered via an Internet connection provided by an Internet service provider (ISP), but a third party distributes the content. The ISP may not be responsible for the viewing abilities, copyrights, or redistribution of the content, and may transfer only IP packets provided by the OTT content provider. Examples of OTT content providers include YOUTUBE, NETFLIX, and HULU, which provide audio and video via IP packets. YOUTUBE is a trademark owned by Google Inc., NETFLIX is a trademark owned by Netflix Inc., and HULU is a trademark owned by Hulu, LLC. OTT content providers may additionally or alternatively provide advertisement data described above. In addition to content and/or advertisement data, providers of OTT content can distribute media applications (e.g., web-based applications or cloud-based applications), or the content can be displayed by media applications stored on the user equipment device.
Media guidance system 400 is intended to illustrate a number of approaches, or network configurations, by which user equipment devices and sources of content and advertisement data may communicate with each other for the purpose of accessing content and providing media guidance. The embodiments described herein may be applied in any one or a subset of these approaches, or in a system employing other approaches for delivering content and providing media guidance. The following four approaches provide specific illustrations of the generalized example of
In one approach, user equipment devices may communicate with each other within a home network. User equipment devices can communicate with each other directly via short-range point-to-point communication schemes described above, via indirect paths through a hub or other similar device provided on a home network, or via communications network 414. Each of the multiple individuals in a single home may operate different user equipment devices on the home network. As a result, it may be desirable for various media guidance information or settings to be communicated between the different user equipment devices. For example, it may be desirable for users to maintain consistent media application settings on different user equipment devices within a home network, as described in greater detail in Ellis et al., U.S. patent application Ser. No. 11/179,410, filed Jul. 11, 2005. Different types of user equipment devices in a home network may also communicate with each other to transmit content. For example, a user may transmit content from user computer equipment to a portable video player or portable music player.
In a second approach, users may have multiple types of user equipment by which they access content and obtain media guidance. For example, some users may have home networks that are accessed by in-home and mobile devices. Users may control in-home devices via a media application implemented on a remote device. For example, users may access an online media application on a website via personal computers at their offices, or via mobile devices such as a PDA or web-enabled mobile telephone. The user may set various settings (e.g., recordings, reminders, or other settings) on the online media application to control the user's in-home equipment. The online guide may control the user's equipment directly, or by communicating with a media application on the user's in-home equipment. Various systems and methods for user equipment devices communicating, where the user equipment devices are in locations remote from each other, are discussed in, for example, Ellis et al., U.S. Pat. No. 8,046,801, issued Oct. 25, 2011, which is hereby incorporated by reference herein in its entirety.
In a third approach, users of user equipment devices inside and outside a home can use their media application to communicate directly with content source 416 to access content. Specifically, within a home, users of user television equipment 402 and user computer equipment 404 may access the media application to navigate among and locate desirable content. Users may also access the media application outside of the home using wireless user communications devices 406 to navigate among and locate desirable content.
In a fourth approach, user equipment devices may operate in a cloud computing environment to access cloud services. In a cloud computing environment, various types of computing services for content sharing, storage, or distribution (e.g., video sharing sites or social networking sites) are provided by a collection of network-accessible computing and storage resources, referred to as “the cloud.” For example, the cloud can include a collection of server computing devices, which may be located centrally or at distributed locations, that provide cloud-based services to various types of users and devices connected via a network such as the Internet (e.g., communications network 414). These cloud resources may include one or more content sources 416 and one or more media guidance data sources 418. In addition, or in the alternative, the remote computing sites may include other user equipment devices, such as user television equipment 402, user computer equipment 404, and wireless user communications device 406. For example, the other user equipment devices may provide access to a stored copy of a video or a streamed video. In such embodiments, user equipment devices may operate in a peer-to-peer manner without communicating with a central server.
The cloud provides access to services, such as content storage, content sharing, or social networking services, among other examples, as well as access to any content described above, for user equipment devices. Services can be provided in the cloud through cloud computing service providers, or through other providers of online services. For example, the cloud-based services can include a content storage service, a content sharing site, a social networking site, or other services via which user-sourced content is distributed for viewing by others on connected devices. These cloud-based services may allow a user equipment device to store content to the cloud and to receive content from the cloud rather than storing content locally and accessing locally stored content.
The media application may incorporate, or have access to, one or more content capture devices or applications, such as camcorders, digital cameras with video mode, audio recorders, mobile phones, and handheld computing devices, to generate data describing a user's attentiveness level. The user can upload the attentiveness data to a content storage service on the cloud directly, for example, from user computer equipment 404 or wireless user communications device 406 having a content capture feature. Alternatively, the user can first transfer the data to a user equipment device, such as user computer equipment 404. The user equipment device storing the attentiveness data then uploads it to the cloud using a data transmission service on communications network 414. In some embodiments, the user equipment device itself is a cloud resource, and other user equipment devices can access the data directly from the user equipment device on which the user stored it.
Cloud resources may be accessed by a user equipment device using, for example, a web browser, a media application, a desktop application, a mobile application, and/or any combination of access applications of the same. The user equipment device may be a cloud client that relies on cloud computing for application delivery, or the user equipment device may have some functionality without access to cloud resources. For example, some applications running on the user equipment device may be cloud applications, i.e., applications delivered as a service over the Internet, while other applications may be stored and run on the user equipment device. In some embodiments, a user device may receive content from multiple cloud resources simultaneously. For example, a user device can stream audio from one cloud resource while downloading content from a second cloud resource. Or a user device can download content from multiple cloud resources for more efficient downloading. In some embodiments, user equipment devices can use cloud resources for processing operations such as the processing operations performed by processing circuitry described in relation to
For example, control circuitry 304 may recommend to user “Mike” the television program “Korra: The Next Hero” as shown in
At step 710, control circuitry 304 retrieves a profile for the user based on the determined position in the viewing area. The user profile may include viewing history and/or viewing preferences for the user who typically uses remote control device 316 from the determined position in the viewing area. At step 712, control circuitry 304 determines media content of interest to the user based on the user's profile. For example, control circuitry 304 may analyze the viewing history to determine programs similar to those previously viewed by the user. At step 714, control circuitry 304 generates a display for recommending the media content to the user.
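The following minimal Python sketch illustrates one possible reading of steps 710-714. All names and data shapes (user_areas, profiles, catalog) are hypothetical; the disclosure does not prescribe a particular matching or ranking method, so nearest-area lookup and genre-frequency ranking are used here purely as stand-ins.

```python
from collections import Counter


def recommend(position, user_areas, profiles, catalog, top_n=3):
    """Sketch of steps 710-714: position -> profile -> recommendations.

    user_areas maps area ids to (x, y, z) centers; profiles maps area ids
    to viewing histories (lists of (title, genre) pairs); catalog is a
    list of (title, genre) candidates. All names are illustrative.
    """
    # Step 710: retrieve the profile for the area nearest the detected position.
    def dist2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))

    area = min(user_areas, key=lambda aid: dist2(user_areas[aid], position))
    history = profiles.get(area, [])

    # Step 712: infer interest from the genres the user has watched most.
    genre_counts = Counter(genre for _title, genre in history)

    # Rank unseen catalog items by how often their genre appears in the history.
    seen = {title for title, _genre in history}
    ranked = sorted(
        (item for item in catalog if item[0] not in seen),
        key=lambda item: genre_counts[item[1]],
        reverse=True,
    )

    # Step 714: the caller would render these in a recommendation display.
    return ranked[:top_n]
```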
It is contemplated that the steps or descriptions of
It is contemplated that the steps or descriptions of
At step 856, control circuitry 304 retrieves a distance between the two IR devices placed in remote control device 316. The distance is a fixed measure of the spacing between the two IR devices and may be used to calculate the distance of remote control device 316 from the image capture device and its location relative to the image capture device in the viewing area. At step 858, control circuitry 304 calculates the distance of remote control device 316 from the image capture device in the viewing area based on the first and second locations of the two IR signals in the captured image as well as the retrieved distance between the two IR devices. In some embodiments, the image capture device is built in to the display device or separately connected but adjacent to the display device. In such embodiments, the calculated distance may be used to approximate the distance of remote control device 316 from the display device. At step 860, control circuitry 304 determines the position of remote control device 316 in the viewing area based on the first and second positions corresponding to the pixels activated in the captured image by the two IR signals and the calculated distance. The position in this case is a three-dimensional value indicating the position of the user in the viewing area.
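Under a pinhole-camera assumption, steps 856-860 reduce to similar-triangle geometry: the known emitter spacing divided by its apparent pixel separation scales with distance. The Python sketch below illustrates this; the function name and the calibration parameters (focal length in pixels, principal point) are assumptions, and the estimate degrades if the remote is tilted away from the camera, since tilt shrinks the apparent separation.

```python
def remote_position(u1, v1, u2, v2, baseline_m, focal_px, cx, cy):
    """Estimate the remote's 3-D position from two IR spots in one image.

    (u1, v1) and (u2, v2) are the pixel centroids of the two IR signals,
    baseline_m is the known physical spacing of the IR emitters (step 856),
    and focal_px, cx, cy are camera intrinsics from calibration. Assumes a
    pinhole camera and a remote roughly facing the camera.
    """
    pixel_sep = ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5
    if pixel_sep == 0:
        raise ValueError("IR spots coincide; cannot estimate distance")

    # Similar triangles (step 858): baseline_m / z == pixel_sep / focal_px.
    z = focal_px * baseline_m / pixel_sep

    # Back-project the midpoint of the two spots to get x and y (step 860).
    mu, mv = (u1 + u2) / 2.0, (v1 + v2) / 2.0
    x = (mu - cx) * z / focal_px
    y = (mv - cy) * z / focal_px
    return (x, y, z)
```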
It is contemplated that the steps or descriptions of
At step 906, control circuitry 304 analyzes the image to identify the position of remote control device 316 in the viewing area according to one of the implementations described above and stores the position for later retrieval. At step 908, control circuitry 304 checks whether a sufficient number of positions have been stored to perform clustering. For example, control circuitry 304 may check against a predetermined threshold or determine a threshold for the number of positions based on the size of the viewing area. If more positions are needed, control circuitry 304 returns to step 902. If sufficient positions have been stored, control circuitry 304 proceeds to step 910. At step 910, control circuitry 304 performs clustering on the stored user positions to define one or more user areas.
Clustering is the task of grouping a set of objects in such a way that objects in the same cluster are more similar to each other than to objects in other clusters. It is a common technique for statistical data analysis used in many fields, including machine learning, pattern recognition, image analysis, information retrieval, and bioinformatics. Cluster analysis may be performed by various algorithms that differ significantly in their notion of what constitutes a cluster and in how to find clusters efficiently. Popular notions of clusters include groups with small distances among the cluster members, dense areas of the data space, intervals, or particular statistical distributions. For example, if most of the stored positions are concentrated in a particular portion of the viewing area, control circuitry 304 may cluster the stored positions into one user area. At step 912, control circuitry 304 associates the defined user area with a particular user. Any future remote control signals from the defined user area may be used to construct or retrieve a user profile including a viewing history and/or viewing preferences, e.g., volume, color, brightness, video quality, etc.
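As one concrete possibility, the stored positions could be grouped with plain k-means; the disclosure does not name a particular clustering algorithm, so the Python sketch below (and its choice of k) is illustrative only.

```python
import numpy as np


def cluster_user_areas(positions, k=2, iters=50, seed=0):
    """Group stored remote positions into k user areas via plain k-means.

    positions is an (n, 3) array of the 3-D positions stored at step 906.
    Returns (centers, labels); each center defines one user area.
    """
    pts = np.asarray(positions, dtype=float)
    rng = np.random.default_rng(seed)
    centers = pts[rng.choice(len(pts), size=k, replace=False)]
    for _ in range(iters):
        # Assign each position to its nearest center.
        d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Move each center to the mean of its assigned positions.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pts[labels == j].mean(axis=0)
    # Final assignment against the converged centers.
    labels = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
    return centers, labels


# e.g., centers, labels = cluster_user_areas(stored_positions, k=2)
```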
It is contemplated that the steps or descriptions of
At step 956, control circuitry 304 analyzes the image to identify the position of remote control device 316 in the viewing area according to one of the implementations described above. At step 958, control circuitry 304 stores the position for later retrieval. At step 960, control circuitry 304 checks whether additional images remain to be analyzed from the images that have been received over the given period of time. If additional images remain to be analyzed, control circuitry 304 returns to step 952. If all images have been analyzed, control circuitry 304 proceeds to step 962. At step 962, control circuitry 304 performs clustering on the stored user positions to define multiple user areas. Control circuitry 304 then associates each defined user area with a different user. Any future remote control signals from the defined user areas may be used to construct or retrieve corresponding user profiles including a viewing history and/or viewing preferences, e.g., volume, brightness, video quality, etc.
It is contemplated that the steps or descriptions of
In some implementations, the user equipment device may automatically detect and define user areas in the viewing area based on the positions where the remote control device is activated during particular time periods, e.g., particular blocks of time and/or particular days. The user equipment device may “learn” the correlation between a user area and the content selected from that user area by keeping track of the viewing history and related viewing preferences, e.g., by monitoring and storing the content selections the user makes and/or other interactions the user may have with the guidance application. Each user area may be associated with a user profile including viewing history and viewing preferences for the given time period.
For example, when the remote is detected in the user area on a weekday, the user equipment device may retrieve profile A and make personalized content recommendations including soap operas. However, when the remote is detected in the user area on a weekend, the user equipment device may instead retrieve profile B and make personalized content recommendations including football games. In another example, when the remote is detected in the user area between 9 am and 12 pm, the user equipment device may retrieve profile C and make personalized content recommendations including daytime shows. However, when the remote is detected in the user area between 6 pm and 9 pm, the user equipment device may instead retrieve profile D and make personalized content recommendations including prime-time shows.
The user equipment device may determine the time period automatically, or it may receive user input defining the time period. In some embodiments, the user equipment device analyzes the viewing history and related viewing preferences and clusters them to determine time periods. For example, the user equipment device may analyze the viewing history and cluster the soap operas into cluster A′ to form profile A for weekdays. The user equipment device may analyze the viewing history and cluster the football games into cluster B′ to form profile B for weekends. In another example, the user equipment device may analyze the viewing history and cluster the daytime shows into cluster C′ to form profile C for the time period between 9 am and 12 pm. The user equipment device may analyze the viewing history and cluster the prime-time shows into cluster D′ to form profile D for the time period between 6 pm and 9 pm.
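A minimal Python sketch of this time-period lookup follows, with hard-coded weekday/weekend and 9 am-12 pm / 6 pm-9 pm boundaries mirroring the examples above; a deployed system might instead learn these boundaries by clustering the viewing history, as the preceding paragraph describes. All names are hypothetical.

```python
from datetime import datetime


def profile_key(area_id, when):
    """Map a (user area, activation time) pair to a profile bucket.

    The weekday/weekend split and the daytime/prime-time blocks mirror the
    examples in the text and are assumptions, not learned boundaries.
    """
    day_type = "weekend" if when.weekday() >= 5 else "weekday"
    if 9 <= when.hour < 12:
        block = "daytime"
    elif 18 <= when.hour < 21:
        block = "primetime"
    else:
        block = "other"
    return (area_id, day_type, block)


# profiles[profile_key("area_1", datetime(2015, 6, 1, 10))] would select the
# weekday daytime bucket (profile C in the example above).
```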
The above-described embodiments of the present disclosure are presented for purposes of illustration and not of limitation, and the present disclosure is limited only by the claims which follow. Furthermore, it should be noted that the features and limitations described in any one embodiment may be applied to any other embodiment herein, and flowcharts or examples relating to one embodiment may be combined with any other embodiment in a suitable manner, performed in different orders, or performed in parallel. In addition, the systems and methods described herein may be performed in real time. It should also be noted that the systems and/or methods described above may be applied to, or used in accordance with, other systems and/or methods.
Number | Name | Date | Kind |
---|---|---|---|
4288078 | Lugo | Sep 1981 | A |
4355415 | George et al. | Oct 1982 | A |
4429385 | Cichelli et al. | Jan 1984 | A |
4488179 | Krüger et al. | Dec 1984 | A |
4602279 | Freeman | Jul 1986 | A |
4605964 | Chard | Aug 1986 | A |
4625080 | Scott | Nov 1986 | A |
4627620 | Yang | Dec 1986 | A |
4630910 | Ross et al. | Dec 1986 | A |
4645458 | Williams | Feb 1987 | A |
4694490 | Harvey et al. | Sep 1987 | A |
4695953 | Blair et al. | Sep 1987 | A |
4702475 | Elstein et al. | Oct 1987 | A |
4704725 | Harvey et al. | Nov 1987 | A |
4706121 | Young | Nov 1987 | A |
4711543 | Blair et al. | Dec 1987 | A |
4718107 | Hayes | Jan 1988 | A |
4745549 | Hashimoto | May 1988 | A |
4751578 | Reiter et al. | Jun 1988 | A |
4751642 | Silva et al. | Jun 1988 | A |
4761684 | Clark et al. | Aug 1988 | A |
4787063 | Muguet | Nov 1988 | A |
4796997 | Svetkoff et al. | Jan 1989 | A |
4809065 | Harris et al. | Feb 1989 | A |
4817950 | Goo | Apr 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
4847698 | Freeman | Jul 1989 | A |
4857999 | Welsh | Aug 1989 | A |
4893183 | Nayar | Jan 1990 | A |
4901362 | Terzian | Feb 1990 | A |
4908707 | Kinghorn | Mar 1990 | A |
4925189 | Braeunig | May 1990 | A |
4930158 | Vogel | May 1990 | A |
4959720 | Duffield et al. | Sep 1990 | A |
4963994 | Levine | Oct 1990 | A |
4965825 | Harvey et al. | Oct 1990 | A |
4977455 | Young | Dec 1990 | A |
5027400 | Baji et al. | Jun 1991 | A |
5036314 | Barillari et al. | Jul 1991 | A |
5047867 | Strubbe et al. | Sep 1991 | A |
5089885 | Clark | Feb 1992 | A |
5101444 | Wilson et al. | Mar 1992 | A |
5109279 | Ando | Apr 1992 | A |
5109414 | Harvey et al. | Apr 1992 | A |
5113259 | Romesburg et al. | May 1992 | A |
5132992 | Yurt et al. | Jul 1992 | A |
5134719 | Mankovitz | Jul 1992 | A |
5148154 | MacKay et al. | Sep 1992 | A |
5151789 | Young | Sep 1992 | A |
5155591 | Wachob | Oct 1992 | A |
5172413 | Bradley et al. | Dec 1992 | A |
5184295 | Mann | Feb 1993 | A |
5200822 | Bronfin et al. | Apr 1993 | A |
5223924 | Strubbe | Jun 1993 | A |
5229754 | Aoki et al. | Jul 1993 | A |
5229756 | Kosugi et al. | Jul 1993 | A |
5231493 | Apitz | Jul 1993 | A |
5233423 | Jernigan et al. | Aug 1993 | A |
5233654 | Harvey et al. | Aug 1993 | A |
5239463 | Blair et al. | Aug 1993 | A |
5239464 | Blair et al. | Aug 1993 | A |
5249043 | Grandmougin | Sep 1993 | A |
5253066 | Vogel | Oct 1993 | A |
5288078 | Capper et al. | Feb 1994 | A |
5295491 | Gevins | Mar 1994 | A |
5299006 | Kim | Mar 1994 | A |
5320538 | Baum | Jun 1994 | A |
5335277 | Harvey et al. | Aug 1994 | A |
5339434 | Rusis | Aug 1994 | A |
5341350 | Frank et al. | Aug 1994 | A |
5347306 | Nitta | Sep 1994 | A |
5353121 | Young et al. | Oct 1994 | A |
5357276 | Banker et al. | Oct 1994 | A |
5359367 | Stockill | Oct 1994 | A |
5382983 | Kwoh et al. | Jan 1995 | A |
5385519 | Hsu et al. | Jan 1995 | A |
5404567 | DePietro et al. | Apr 1995 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5410326 | Goldstein | Apr 1995 | A |
5410344 | Graves et al. | Apr 1995 | A |
5412720 | Hoarty | May 1995 | A |
5414756 | Levine | May 1995 | A |
5417210 | Funda et al. | May 1995 | A |
5423554 | Davis | Jun 1995 | A |
5426699 | Wunderlich et al. | Jun 1995 | A |
5442389 | Blahut et al. | Aug 1995 | A |
5442390 | Hooper et al. | Aug 1995 | A |
5453779 | Dan et al. | Sep 1995 | A |
5454043 | Freeman | Sep 1995 | A |
5455570 | Cook et al. | Oct 1995 | A |
5461415 | Wolf et al. | Oct 1995 | A |
5465113 | Gilboy | Nov 1995 | A |
5465385 | Ohga et al. | Nov 1995 | A |
5469206 | Strubbe et al. | Nov 1995 | A |
5469740 | French et al. | Nov 1995 | A |
5479266 | Young et al. | Dec 1995 | A |
5479268 | Young et al. | Dec 1995 | A |
5479302 | Haines | Dec 1995 | A |
5481296 | Cragun et al. | Jan 1996 | A |
5483278 | Strubbe et al. | Jan 1996 | A |
5485197 | Hoarty | Jan 1996 | A |
5495576 | Ritchey | Feb 1996 | A |
5502504 | Marshall et al. | Mar 1996 | A |
5506932 | Holmes et al. | Apr 1996 | A |
5509908 | Hillstead et al. | Apr 1996 | A |
5516105 | Eisenbrey et al. | May 1996 | A |
5517254 | Monta et al. | May 1996 | A |
5517257 | Dunn et al. | May 1996 | A |
5523794 | Mankovitz et al. | Jun 1996 | A |
5523796 | Marshall et al. | Jun 1996 | A |
5524195 | Clanton, III et al. | Jun 1996 | A |
5524271 | Hollmann et al. | Jun 1996 | A |
5524637 | Erickson | Jun 1996 | A |
5526034 | Hoarty et al. | Jun 1996 | A |
5528304 | Cherrick et al. | Jun 1996 | A |
5528513 | Vaitzblit et al. | Jun 1996 | A |
5534911 | Levitan | Jul 1996 | A |
5534917 | MacDougall | Jul 1996 | A |
5537141 | Harper et al. | Jul 1996 | A |
5539449 | Blahut et al. | Jul 1996 | A |
5539880 | Lakhani | Jul 1996 | A |
5541638 | Story | Jul 1996 | A |
5541738 | Mankovitz | Jul 1996 | A |
5548338 | Ellis et al. | Aug 1996 | A |
5550576 | Klosterman | Aug 1996 | A |
5550825 | McMullan, Jr. et al. | Aug 1996 | A |
5557338 | Maze et al. | Sep 1996 | A |
5557724 | Sampat et al. | Sep 1996 | A |
5559548 | Davis et al. | Sep 1996 | A |
5559549 | Hendricks et al. | Sep 1996 | A |
5559550 | Mankovitz | Sep 1996 | A |
5563988 | Maes et al. | Oct 1996 | A |
5570295 | Isenberg et al. | Oct 1996 | A |
5572442 | Schulhof et al. | Nov 1996 | A |
5574778 | Ely et al. | Nov 1996 | A |
5576755 | Davis et al. | Nov 1996 | A |
5577981 | Jarvik | Nov 1996 | A |
5579239 | Freeman et al. | Nov 1996 | A |
5580249 | Jacobsen et al. | Dec 1996 | A |
5583560 | Florin et al. | Dec 1996 | A |
5583561 | Baker et al. | Dec 1996 | A |
5583563 | Wanderscheid et al. | Dec 1996 | A |
5585838 | Lawler et al. | Dec 1996 | A |
5585858 | Harper et al. | Dec 1996 | A |
5585865 | Amano et al. | Dec 1996 | A |
5585866 | Miller et al. | Dec 1996 | A |
5586264 | Belknap et al. | Dec 1996 | A |
5589892 | Knee et al. | Dec 1996 | A |
5592551 | Lett et al. | Jan 1997 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5594509 | Florin et al. | Jan 1997 | A |
5594779 | Goodman | Jan 1997 | A |
5596373 | White et al. | Jan 1997 | A |
5597309 | Riess | Jan 1997 | A |
5600364 | Hendricks et al. | Feb 1997 | A |
5602582 | Wanderscheid et al. | Feb 1997 | A |
5606374 | Bertram | Feb 1997 | A |
5606642 | Stautner et al. | Feb 1997 | A |
5610653 | Abecassis | Mar 1997 | A |
5616078 | Oh | Apr 1997 | A |
5616876 | Cluts | Apr 1997 | A |
5617312 | Iura et al. | Apr 1997 | A |
5617526 | Oran et al. | Apr 1997 | A |
5619247 | Russo | Apr 1997 | A |
5619249 | Billock et al. | Apr 1997 | A |
5619274 | Roop et al. | Apr 1997 | A |
5621456 | Florin et al. | Apr 1997 | A |
5623613 | Rowe et al. | Apr 1997 | A |
5625678 | Blomfield-Brown | Apr 1997 | A |
5629733 | Youman et al. | May 1997 | A |
5629867 | Goldman | May 1997 | A |
5630060 | Tang et al. | May 1997 | A |
5630119 | Aristides et al. | May 1997 | A |
5631995 | Weissensteiner et al. | May 1997 | A |
5632007 | Freeman | May 1997 | A |
5635978 | Alten et al. | Jun 1997 | A |
5635979 | Kostreski et al. | Jun 1997 | A |
5635987 | Park et al. | Jun 1997 | A |
5638300 | Johnson | Jun 1997 | A |
5640484 | Mankovitz | Jun 1997 | A |
5641288 | Zaenglein, Jr. | Jun 1997 | A |
5648824 | Dunn et al. | Jul 1997 | A |
5652613 | Lazarus et al. | Jul 1997 | A |
5654748 | Matthews, III | Aug 1997 | A |
5654886 | Zereski, Jr. et al. | Aug 1997 | A |
5657072 | Aristides et al. | Aug 1997 | A |
5666293 | Metz et al. | Sep 1997 | A |
5666645 | Thomas et al. | Sep 1997 | A |
5671377 | Bleidt et al. | Sep 1997 | A |
5673089 | Yuen et al. | Sep 1997 | A |
5675390 | Schindler et al. | Oct 1997 | A |
5675743 | Mavity | Oct 1997 | A |
5682196 | Freeman | Oct 1997 | A |
5682206 | Wehmeyer et al. | Oct 1997 | A |
5682229 | Wangler | Oct 1997 | A |
5684525 | Klosterman | Nov 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5692214 | Levine | Nov 1997 | A |
5694163 | Harrison | Dec 1997 | A |
5696765 | Safadi | Dec 1997 | A |
5699107 | Lawler et al. | Dec 1997 | A |
5703367 | Hashimoto et al. | Dec 1997 | A |
5704837 | Iwasaki et al. | Jan 1998 | A |
5708961 | Hylton et al. | Jan 1998 | A |
5710601 | Marshall et al. | Jan 1998 | A |
5715834 | Bergamasco et al. | Feb 1998 | A |
5717452 | Janin et al. | Feb 1998 | A |
5721829 | Dunn et al. | Feb 1998 | A |
5727060 | Young | Mar 1998 | A |
5732216 | Logan et al. | Mar 1998 | A |
5734119 | France et al. | Mar 1998 | A |
5734853 | Hendricks et al. | Mar 1998 | A |
5734893 | Li et al. | Mar 1998 | A |
5742443 | Tsao et al. | Apr 1998 | A |
5745710 | Clanton, III et al. | Apr 1998 | A |
5748191 | Rozak et al. | May 1998 | A |
5748716 | Levine | May 1998 | A |
5751282 | Girard et al. | May 1998 | A |
5751672 | Yankowski et al. | May 1998 | A |
5752159 | Faust et al. | May 1998 | A |
5752160 | Dunn | May 1998 | A |
5754771 | Epperson et al. | May 1998 | A |
5758257 | Herz et al. | May 1998 | A |
5758258 | Shoff et al. | May 1998 | A |
5758259 | Lawler | May 1998 | A |
5760821 | Ellis et al. | Jun 1998 | A |
5761417 | Henley et al. | Jun 1998 | A |
5761607 | Gudesen et al. | Jun 1998 | A |
5768528 | Stumm | Jun 1998 | A |
5771354 | Crawford et al. | Jun 1998 | A |
5774170 | Hite et al. | Jun 1998 | A |
5778181 | Hidary et al. | Jul 1998 | A |
5778182 | Cathey et al. | Jul 1998 | A |
5778187 | Monteiro et al. | Jul 1998 | A |
5781226 | Sheehan | Jul 1998 | A |
5781227 | Goode et al. | Jul 1998 | A |
5781228 | Sposato | Jul 1998 | A |
5781246 | Alten et al. | Jul 1998 | A |
5787259 | Haroun et al. | Jul 1998 | A |
5788507 | Redford et al. | Aug 1998 | A |
5790198 | Roop et al. | Aug 1998 | A |
5790202 | Kummer et al. | Aug 1998 | A |
5790423 | Lau et al. | Aug 1998 | A |
5793366 | Mano et al. | Aug 1998 | A |
5793412 | Asamizuya | Aug 1998 | A |
5793964 | Rogers et al. | Aug 1998 | A |
5793971 | Fujita et al. | Aug 1998 | A |
5794217 | Allen | Aug 1998 | A |
5796952 | Davis et al. | Aug 1998 | A |
5798785 | Hendricks et al. | Aug 1998 | A |
5798921 | Johnson et al. | Aug 1998 | A |
5802284 | Karlton et al. | Sep 1998 | A |
5805154 | Brown | Sep 1998 | A |
5805763 | Lawler et al. | Sep 1998 | A |
5805804 | Laursen et al. | Sep 1998 | A |
5805806 | McArthur | Sep 1998 | A |
5808608 | Young et al. | Sep 1998 | A |
5808694 | Usui et al. | Sep 1998 | A |
5809246 | Goldman | Sep 1998 | A |
5812123 | Rowe et al. | Sep 1998 | A |
5812205 | Milnes et al. | Sep 1998 | A |
5812931 | Yuen | Sep 1998 | A |
5815146 | Youden et al. | Sep 1998 | A |
5815297 | Ciciora | Sep 1998 | A |
5818438 | Howe et al. | Oct 1998 | A |
5819019 | Nelson | Oct 1998 | A |
5819160 | Foladare et al. | Oct 1998 | A |
5822530 | Brown | Oct 1998 | A |
5828420 | Marshall et al. | Oct 1998 | A |
RE35954 | Levine | Nov 1998 | E |
5835126 | Lewis | Nov 1998 | A |
5841979 | Schulhof et al. | Nov 1998 | A |
5844620 | Coleman et al. | Dec 1998 | A |
5850218 | LaJoie et al. | Dec 1998 | A |
5858866 | Berry et al. | Jan 1999 | A |
5861906 | Dunn et al. | Jan 1999 | A |
5867233 | Tanaka et al. | Feb 1999 | A |
5872588 | Aras et al. | Feb 1999 | A |
5875108 | Hoffberg et al. | Feb 1999 | A |
5877803 | Wee et al. | Mar 1999 | A |
5880768 | Lemmons et al. | Mar 1999 | A |
5881245 | Thompson | Mar 1999 | A |
5883621 | Iwamura | Mar 1999 | A |
5884028 | Kindell et al. | Mar 1999 | A |
5884141 | Inoue et al. | Mar 1999 | A |
5886707 | Berg | Mar 1999 | A |
5886732 | Humpleman | Mar 1999 | A |
5886746 | Yuen et al. | Mar 1999 | A |
5887243 | Harvey et al. | Mar 1999 | A |
5892915 | Duso et al. | Apr 1999 | A |
5894589 | Reber et al. | Apr 1999 | A |
5896414 | Meyer et al. | Apr 1999 | A |
5898441 | Flurry | Apr 1999 | A |
5898456 | Wahl | Apr 1999 | A |
5899582 | DuLac | May 1999 | A |
5900904 | Okada et al. | May 1999 | A |
5903234 | Kimura | May 1999 | A |
5903263 | Emura | May 1999 | A |
5903264 | Moeller et al. | May 1999 | A |
5905522 | Lawler | May 1999 | A |
5905847 | Kobayashi et al. | May 1999 | A |
5907323 | Lawler et al. | May 1999 | A |
5909638 | Allen | Jun 1999 | A |
5911046 | Amano | Jun 1999 | A |
5913039 | Nakamura et al. | Jun 1999 | A |
5913727 | Ahdoot | Jun 1999 | A |
5914746 | Matthews, III et al. | Jun 1999 | A |
5914941 | Janky | Jun 1999 | A |
5915090 | Joseph et al. | Jun 1999 | A |
5915094 | Kouloheris et al. | Jun 1999 | A |
5916303 | Scott | Jun 1999 | A |
5917538 | Asamizuya | Jun 1999 | A |
5917835 | Barrett et al. | Jun 1999 | A |
5920702 | Bleidt et al. | Jul 1999 | A |
5920800 | Schafer | Jul 1999 | A |
5922045 | Hanson | Jul 1999 | A |
5922048 | Emura | Jul 1999 | A |
5923361 | Sutton, Jr. | Jul 1999 | A |
5926204 | Mayer | Jul 1999 | A |
5926205 | Krause et al. | Jul 1999 | A |
5926624 | Katz et al. | Jul 1999 | A |
5928327 | Wang et al. | Jul 1999 | A |
5929849 | Kikinis | Jul 1999 | A |
5929850 | Broadwin et al. | Jul 1999 | A |
5930473 | Teng et al. | Jul 1999 | A |
5930493 | Ottesen et al. | Jul 1999 | A |
5931901 | Wolfe et al. | Aug 1999 | A |
5933125 | Fernie et al. | Aug 1999 | A |
5933603 | Vahalia et al. | Aug 1999 | A |
5933835 | Adams et al. | Aug 1999 | A |
5935206 | Dixon et al. | Aug 1999 | A |
5936569 | Ståhle et al. | Aug 1999 | A |
5940071 | Treffers et al. | Aug 1999 | A |
5940073 | Klosterman et al. | Aug 1999 | A |
5943046 | Cave et al. | Aug 1999 | A |
5943047 | Suzuki | Aug 1999 | A |
5945987 | Dunn | Aug 1999 | A |
5945988 | Williams et al. | Aug 1999 | A |
5947746 | Tsai | Sep 1999 | A |
5949411 | Doerr et al. | Sep 1999 | A |
5956482 | Agraharam et al. | Sep 1999 | A |
5956716 | Kenner et al. | Sep 1999 | A |
5959659 | Dokic | Sep 1999 | A |
5961603 | Kunkel et al. | Oct 1999 | A |
5963202 | Polish | Oct 1999 | A |
5964455 | Catanzarite et al. | Oct 1999 | A |
5969283 | Looney et al. | Oct 1999 | A |
5969714 | Butcher | Oct 1999 | A |
5973680 | Ueda | Oct 1999 | A |
5973722 | Wakai et al. | Oct 1999 | A |
5974217 | Haraguchi | Oct 1999 | A |
5977963 | Gaughan et al. | Nov 1999 | A |
5977964 | Williams et al. | Nov 1999 | A |
5978567 | Rebane et al. | Nov 1999 | A |
5978843 | Wu et al. | Nov 1999 | A |
5980256 | Carmein | Nov 1999 | A |
5986650 | Ellis et al. | Nov 1999 | A |
5988078 | Levine | Nov 1999 | A |
5989157 | Walton | Nov 1999 | A |
5990881 | Inoue et al. | Nov 1999 | A |
5995649 | Marugame | Nov 1999 | A |
5999970 | Krisbergh et al. | Dec 1999 | A |
6002394 | Schein et al. | Dec 1999 | A |
6002720 | Yurt et al. | Dec 1999 | A |
6005548 | Latypov et al. | Dec 1999 | A |
6005564 | Ahmad et al. | Dec 1999 | A |
6005600 | Hill | Dec 1999 | A |
6008802 | Iki et al. | Dec 1999 | A |
6009210 | Kang | Dec 1999 | A |
6009465 | Decker et al. | Dec 1999 | A |
6012089 | Hasegawa | Jan 2000 | A |
6012091 | Boyce | Jan 2000 | A |
6014184 | Knee et al. | Jan 2000 | A |
6014381 | Troxel et al. | Jan 2000 | A |
6014693 | Ito et al. | Jan 2000 | A |
6014694 | Aharoni et al. | Jan 2000 | A |
6014706 | Cannon et al. | Jan 2000 | A |
6016141 | Knudson et al. | Jan 2000 | A |
6018359 | Kermode et al. | Jan 2000 | A |
6018765 | Durana et al. | Jan 2000 | A |
6020912 | De Lang | Feb 2000 | A |
6022223 | Taniguchi et al. | Feb 2000 | A |
6023725 | Ozawa et al. | Feb 2000 | A |
6025837 | Matthews, III et al. | Feb 2000 | A |
6025868 | Russo | Feb 2000 | A |
6028600 | Rosin et al. | Feb 2000 | A |
6029064 | Farris et al. | Feb 2000 | A |
6032202 | Lea et al. | Feb 2000 | A |
6038591 | Wolfe et al. | Mar 2000 | A |
6038614 | Chan et al. | Mar 2000 | A |
6049831 | Gardell et al. | Apr 2000 | A |
6052145 | Macrae et al. | Apr 2000 | A |
6054991 | Crane et al. | Apr 2000 | A |
6057909 | Yahav et al. | May 2000 | A |
6066075 | Poulton | May 2000 | A |
6072494 | Nguyen | Jun 2000 | A |
6073489 | French et al. | Jun 2000 | A |
6077201 | Cheng | Jun 2000 | A |
6085236 | Lea | Jul 2000 | A |
6091883 | Artigalas et al. | Jul 2000 | A |
6097441 | Allport | Aug 2000 | A |
6098082 | Gibbon et al. | Aug 2000 | A |
6098458 | French et al. | Aug 2000 | A |
6100896 | Strohecker et al. | Aug 2000 | A |
6101289 | Kellner | Aug 2000 | A |
6104334 | Allport | Aug 2000 | A |
6111677 | Shintani et al. | Aug 2000 | A |
6112186 | Bergh et al. | Aug 2000 | A |
6118450 | Proehl et al. | Sep 2000 | A |
6125230 | Yaginuma | Sep 2000 | A |
6128003 | Smith et al. | Oct 2000 | A |
6130677 | Kunz | Oct 2000 | A |
6130726 | Darbee et al. | Oct 2000 | A |
6141003 | Chor | Oct 2000 | A |
6141463 | Covell et al. | Oct 2000 | A |
6141488 | Knudson et al. | Oct 2000 | A |
6147678 | Kumar et al. | Nov 2000 | A |
6147715 | Yuen et al. | Nov 2000 | A |
6152856 | Studor et al. | Nov 2000 | A |
6154206 | Ludtke | Nov 2000 | A |
6159100 | Smith | Dec 2000 | A |
6160546 | Thompson et al. | Dec 2000 | A |
6160796 | Zou | Dec 2000 | A |
6166730 | Goode et al. | Dec 2000 | A |
6167188 | Young et al. | Dec 2000 | A |
6169543 | Wehmeyer | Jan 2001 | B1 |
6169725 | Gibbs et al. | Jan 2001 | B1 |
6170006 | Namba | Jan 2001 | B1 |
6173066 | Peurach et al. | Jan 2001 | B1 |
6177931 | Alexander et al. | Jan 2001 | B1 |
6181343 | Lyons | Jan 2001 | B1 |
6182094 | Humpleman et al. | Jan 2001 | B1 |
6184878 | Alonso et al. | Feb 2001 | B1 |
6188777 | Darrell et al. | Feb 2001 | B1 |
6208335 | Gordon et al. | Mar 2001 | B1 |
6208341 | van Ee et al. | Mar 2001 | B1 |
6208384 | Schultheiss | Mar 2001 | B1 |
6215890 | Matsuo et al. | Apr 2001 | B1 |
6215898 | Woodfill et al. | Apr 2001 | B1 |
6219839 | Sampsell | Apr 2001 | B1 |
6226396 | Marugame | May 2001 | B1 |
6226444 | Goldschmidt Iki et al. | May 2001 | B1 |
6226618 | Downs et al. | May 2001 | B1 |
6229913 | Nayar et al. | May 2001 | B1 |
6232539 | Looney et al. | May 2001 | B1 |
6233734 | Macrae et al. | May 2001 | B1 |
6236395 | Sezan et al. | May 2001 | B1 |
6237049 | Ludtke | May 2001 | B1 |
6239794 | Yuen et al. | May 2001 | B1 |
6243104 | Murray | Jun 2001 | B1 |
6243707 | Humpleman et al. | Jun 2001 | B1 |
6243725 | Hempleman et al. | Jun 2001 | B1 |
6256033 | Nguyen | Jul 2001 | B1 |
6256400 | Takata et al. | Jul 2001 | B1 |
6263501 | Schein et al. | Jul 2001 | B1 |
6263503 | Margulis | Jul 2001 | B1 |
6268849 | Boyer et al. | Jul 2001 | B1 |
6283860 | Lyons et al. | Sep 2001 | B1 |
6289112 | Jain et al. | Sep 2001 | B1 |
6289165 | Abecassis | Sep 2001 | B1 |
6299308 | Voronka et al. | Oct 2001 | B1 |
6308565 | French et al. | Oct 2001 | B1 |
6310886 | Barton | Oct 2001 | B1 |
6314575 | Billock et al. | Nov 2001 | B1 |
6316934 | Amorai-Moriya et al. | Nov 2001 | B1 |
6324338 | Wood et al. | Nov 2001 | B1 |
6331877 | Bennington et al. | Dec 2001 | B1 |
6353700 | Zhou | Mar 2002 | B1 |
6354378 | Patel | Mar 2002 | B1 |
6356971 | Katz et al. | Mar 2002 | B1 |
6357043 | Ellis et al. | Mar 2002 | B1 |
6359661 | Nickum | Mar 2002 | B1 |
6363160 | Bradski et al. | Mar 2002 | B1 |
6384819 | Hunter | May 2002 | B1 |
6388714 | Schein et al. | May 2002 | B1 |
6393430 | Van Ryzin | May 2002 | B1 |
6411744 | Edwards | Jun 2002 | B1 |
6418556 | Bennington et al. | Jul 2002 | B1 |
6430997 | French et al. | Aug 2002 | B1 |
6441832 | Tao et al. | Aug 2002 | B1 |
6456621 | Wada et al. | Sep 2002 | B1 |
RE37881 | Haines | Oct 2002 | E |
6463585 | Hendricks et al. | Oct 2002 | B1 |
6466080 | Kawai et al. | Oct 2002 | B2 |
6473559 | Knudson et al. | Oct 2002 | B1 |
6476834 | Doval et al. | Nov 2002 | B1 |
6483986 | Krapf | Nov 2002 | B1 |
6487145 | Berhan | Nov 2002 | B1 |
6487362 | Yuen et al. | Nov 2002 | B1 |
6496598 | Harman | Dec 2002 | B1 |
6496981 | Wistendahl et al. | Dec 2002 | B1 |
6498895 | Young et al. | Dec 2002 | B2 |
6503195 | Keller et al. | Jan 2003 | B1 |
6505348 | Knowles et al. | Jan 2003 | B1 |
6522342 | Gagnon et al. | Feb 2003 | B1 |
6536041 | Knudson et al. | Mar 2003 | B1 |
6539931 | Trajkovic et al. | Apr 2003 | B2 |
6564378 | Satterfield et al. | May 2003 | B1 |
6570555 | Prevost et al. | May 2003 | B1 |
6577735 | Bharat | Jun 2003 | B1 |
6611654 | Shteyn | Aug 2003 | B1 |
6614987 | Ismail et al. | Sep 2003 | B1 |
6633294 | Rosenthal et al. | Oct 2003 | B1 |
6640202 | Dietz et al. | Oct 2003 | B1 |
6647417 | Hunter et al. | Nov 2003 | B1 |
6651253 | Dudkiewicz et al. | Nov 2003 | B2 |
6657116 | Gunnerson | Dec 2003 | B1 |
6661918 | Gordon et al. | Dec 2003 | B1 |
6671882 | Murphy et al. | Dec 2003 | B1 |
6681031 | Cohen et al. | Jan 2004 | B2 |
6714665 | Hanna et al. | Mar 2004 | B1 |
6731799 | Sun et al. | May 2004 | B1 |
6738066 | Nguyen | May 2004 | B1 |
6741617 | Rosengren et al. | May 2004 | B2 |
6751402 | Elliott et al. | Jun 2004 | B1 |
6756997 | Ward, III et al. | Jun 2004 | B1 |
6760412 | Loucks | Jul 2004 | B1 |
6760537 | Mankovitz | Jul 2004 | B2 |
6765726 | French et al. | Jul 2004 | B2 |
6788809 | Grzeszczuk et al. | Sep 2004 | B1 |
6801637 | Voronka et al. | Oct 2004 | B2 |
6813775 | Finseth et al. | Nov 2004 | B1 |
6813777 | Weinberger et al. | Nov 2004 | B1 |
6816175 | Hamp et al. | Nov 2004 | B1 |
6816904 | Ludwig et al. | Nov 2004 | B1 |
6820278 | Ellis | Nov 2004 | B1 |
6826512 | Dara-Abrams et al. | Nov 2004 | B2 |
6837789 | Garahi et al. | Jan 2005 | B2 |
6839769 | Needham et al. | Jan 2005 | B2 |
6868225 | Brown et al. | Mar 2005 | B1 |
6873710 | Cohen-Solal et al. | Mar 2005 | B1 |
6873723 | Aucsmith et al. | Mar 2005 | B1 |
6876496 | French et al. | Apr 2005 | B2 |
6882793 | Fu et al. | Apr 2005 | B1 |
6897904 | Potrebic et al. | May 2005 | B2 |
6901603 | Ziedler et al. | May 2005 | B2 |
6931593 | Grooters | Aug 2005 | B1 |
6937742 | Roberts et al. | Aug 2005 | B2 |
6938101 | Hayes et al. | Aug 2005 | B2 |
6950534 | Cohen et al. | Sep 2005 | B2 |
6950624 | Kim et al. | Sep 2005 | B2 |
6972680 | Yui et al. | Dec 2005 | B2 |
6973474 | Hatayama | Dec 2005 | B2 |
7003134 | Covell et al. | Feb 2006 | B1 |
7003791 | Mizutani | Feb 2006 | B2 |
7013478 | Hendricks et al. | Mar 2006 | B1 |
7020704 | Lipscomb et al. | Mar 2006 | B1 |
7028269 | Cohen-Solal et al. | Apr 2006 | B1 |
7036094 | Cohen et al. | Apr 2006 | B1 |
7038855 | French et al. | May 2006 | B2 |
7039643 | Sena et al. | May 2006 | B2 |
7039676 | Day et al. | May 2006 | B1 |
7042440 | Pryor et al. | May 2006 | B2 |
7047377 | Elder et al. | May 2006 | B2 |
7050606 | Paul et al. | May 2006 | B2 |
7055166 | Logan et al. | May 2006 | B1 |
7058204 | Hildreth et al. | Jun 2006 | B2 |
7058635 | Shah-Nazaroff et al. | Jun 2006 | B1 |
7060957 | Lange et al. | Jun 2006 | B2 |
7086077 | Giammaressi | Aug 2006 | B2 |
7088952 | Saito et al. | Aug 2006 | B1 |
7098958 | Wredenhagen et al. | Aug 2006 | B2 |
7103906 | Katz et al. | Sep 2006 | B1 |
7113918 | Ahmad et al. | Sep 2006 | B1 |
7120925 | D'Souza et al. | Oct 2006 | B2 |
7121946 | Paul et al. | Oct 2006 | B2 |
7134130 | Thomas | Nov 2006 | B1 |
7143432 | Brooks et al. | Nov 2006 | B1 |
7159235 | Son et al. | Jan 2007 | B2 |
7165098 | Boyer et al. | Jan 2007 | B1 |
7170492 | Bell | Jan 2007 | B2 |
7178161 | Fristoe et al. | Feb 2007 | B1 |
7184048 | Hunter | Feb 2007 | B2 |
7200852 | Block | Apr 2007 | B1 |
7202898 | Braun et al. | Apr 2007 | B1 |
7206892 | Kim et al. | Apr 2007 | B2 |
7213071 | DeLima et al. | May 2007 | B2 |
7213089 | Hatakenaka | May 2007 | B2 |
7222078 | Abelow | May 2007 | B2 |
7224889 | Takasu et al. | May 2007 | B2 |
7227526 | Hildreth et al. | Jun 2007 | B2 |
7231175 | Ellis | Jun 2007 | B2 |
7237253 | Blackketter et al. | Jun 2007 | B1 |
7240356 | Iki et al. | Jul 2007 | B2 |
7248778 | Anderson et al. | Jul 2007 | B1 |
7259747 | Bell | Aug 2007 | B2 |
7260461 | Rao et al. | Aug 2007 | B2 |
7268833 | Park et al. | Sep 2007 | B2 |
7269733 | O'Toole, Jr. | Sep 2007 | B1 |
7296284 | Price et al. | Nov 2007 | B1 |
7308112 | Fujimura et al. | Dec 2007 | B2 |
7317836 | Fujimura et al. | Jan 2008 | B2 |
7340490 | Teloh et al. | Mar 2008 | B2 |
7344084 | DaCosta | Mar 2008 | B2 |
7346920 | Lamkin et al. | Mar 2008 | B2 |
7348963 | Bell | Mar 2008 | B2 |
7359121 | French et al. | Apr 2008 | B2 |
7367887 | Watabe et al. | May 2008 | B2 |
7379563 | Shamaie | May 2008 | B2 |
7379566 | Hildreth | May 2008 | B2 |
7386871 | Knudson et al. | Jun 2008 | B1 |
7389591 | Jaiswal et al. | Jun 2008 | B2 |
7412077 | Li et al. | Aug 2008 | B2 |
7421093 | Hildreth et al. | Sep 2008 | B2 |
7428744 | Ritter et al. | Sep 2008 | B1 |
7430312 | Gu | Sep 2008 | B2 |
7436496 | Kawahito | Oct 2008 | B2 |
7450736 | Yang et al. | Nov 2008 | B2 |
7452275 | Kuraishi | Nov 2008 | B2 |
7458093 | Dukes et al. | Nov 2008 | B2 |
7460690 | Cohen et al. | Dec 2008 | B2 |
7483964 | Jackson et al. | Jan 2009 | B1 |
7489812 | Fox et al. | Feb 2009 | B2 |
7518503 | Peele | Apr 2009 | B2 |
7536032 | Bell | May 2009 | B2 |
7536704 | Pierre et al. | May 2009 | B2 |
7555142 | Hildreth et al. | Jun 2009 | B2 |
7560701 | Oggier et al. | Jul 2009 | B2 |
7570805 | Gu | Aug 2009 | B2 |
7574020 | Shamaie | Aug 2009 | B2 |
7574723 | Putterman et al. | Aug 2009 | B2 |
7576727 | Bell | Aug 2009 | B2 |
7590262 | Fujimura et al. | Sep 2009 | B2 |
7593552 | Higaki et al. | Sep 2009 | B2 |
7598942 | Underkoffler et al. | Oct 2009 | B2 |
7603685 | Knudson et al. | Oct 2009 | B2 |
7607509 | Schmiz et al. | Oct 2009 | B2 |
7620202 | Fujimura et al. | Nov 2009 | B2 |
7624337 | Sull et al. | Nov 2009 | B2 |
7624416 | Vandermolen et al. | Nov 2009 | B1 |
7650621 | Thomas et al. | Jan 2010 | B2 |
7668340 | Cohen et al. | Feb 2010 | B2 |
7680298 | Roberts et al. | Mar 2010 | B2 |
7683954 | Ichikawa et al. | Mar 2010 | B2 |
7684592 | Paul et al. | Mar 2010 | B2 |
7684673 | Monroe | Mar 2010 | B2 |
7689510 | Lamkin et al. | Mar 2010 | B2 |
7689556 | Garg et al. | Mar 2010 | B2 |
7701439 | Hillis et al. | Apr 2010 | B2 |
7702130 | Im et al. | Apr 2010 | B2 |
7704135 | Harrison, Jr. | Apr 2010 | B2 |
7710391 | Bell et al. | May 2010 | B2 |
7729530 | Antonov et al. | Jun 2010 | B2 |
7746345 | Hunter | Jun 2010 | B2 |
7760182 | Ahmad et al. | Jul 2010 | B2 |
7783632 | Richardson et al. | Aug 2010 | B2 |
7809167 | Bell | Oct 2010 | B2 |
7834846 | Bell | Nov 2010 | B1 |
7840977 | Walker | Nov 2010 | B2 |
7852262 | Namineni et al. | Dec 2010 | B2 |
7852416 | Bennett et al. | Dec 2010 | B2 |
RE42256 | Edwards | Mar 2011 | E |
7898522 | Hildreth et al. | Mar 2011 | B2 |
7907213 | Biere et al. | Mar 2011 | B1 |
7917933 | Thomas et al. | Mar 2011 | B2 |
7929551 | Dietrich | Apr 2011 | B2 |
8029359 | Cheng | Oct 2011 | B2 |
8035612 | Bell et al. | Oct 2011 | B2 |
8035614 | Bell et al. | Oct 2011 | B2 |
8035624 | Bell et al. | Oct 2011 | B2 |
8060399 | Ullah | Nov 2011 | B2 |
8072470 | Marks | Dec 2011 | B2 |
8104066 | Colsey et al. | Jan 2012 | B2 |
8122491 | Yee et al. | Feb 2012 | B2 |
8266666 | Friedman | Sep 2012 | B2 |
8331987 | Rosenblatt | Dec 2012 | B2 |
8539357 | Hildreth | Sep 2013 | B2 |
8601526 | Nishimura et al. | Dec 2013 | B2 |
8667519 | Small et al. | Mar 2014 | B2 |
8791787 | Hardacker et al. | Jul 2014 | B2 |
20010004338 | Yankowski | Jun 2001 | A1 |
20010007147 | Goldschmidt Iki et al. | Jul 2001 | A1 |
20010026287 | Watanabe | Oct 2001 | A1 |
20010039660 | Vasilevsky et al. | Nov 2001 | A1 |
20010042107 | Palm | Nov 2001 | A1 |
20010043700 | Shima et al. | Nov 2001 | A1 |
20010044726 | Li et al. | Nov 2001 | A1 |
20020010652 | Deguchi | Jan 2002 | A1 |
20020032907 | Daniels | Mar 2002 | A1 |
20020043700 | Sasaki et al. | Apr 2002 | A1 |
20020046315 | Miller et al. | Apr 2002 | A1 |
20020056087 | Berezowski et al. | May 2002 | A1 |
20020056119 | Moynihan | May 2002 | A1 |
20020059588 | Huber et al. | May 2002 | A1 |
20020059610 | Ellis | May 2002 | A1 |
20020059642 | Russ et al. | May 2002 | A1 |
20020062481 | Slaney et al. | May 2002 | A1 |
20020069218 | Sull et al. | Jun 2002 | A1 |
20020069746 | Taira et al. | Jun 2002 | A1 |
20020070982 | Hill et al. | Jun 2002 | A1 |
20020075402 | Robson et al. | Jun 2002 | A1 |
20020078293 | Kou et al. | Jun 2002 | A1 |
20020078453 | Kuo | Jun 2002 | A1 |
20020082901 | Dunning et al. | Jun 2002 | A1 |
20020087588 | McBride et al. | Jul 2002 | A1 |
20020088011 | Lamkin et al. | Jul 2002 | A1 |
20020090203 | Mankovitz | Jul 2002 | A1 |
20020091568 | Kraft et al. | Jul 2002 | A1 |
20020104091 | Prabhu et al. | Aug 2002 | A1 |
20020113824 | Myers, Jr. | Aug 2002 | A1 |
20020120935 | Huber et al. | Aug 2002 | A1 |
20020157099 | Schrader et al. | Oct 2002 | A1 |
20020161579 | Saindon et al. | Oct 2002 | A1 |
20020165751 | Upadhya | Nov 2002 | A1 |
20020165770 | Khoo et al. | Nov 2002 | A1 |
20020166123 | Schrader et al. | Nov 2002 | A1 |
20020174430 | Ellis et al. | Nov 2002 | A1 |
20020174444 | Gatto et al. | Nov 2002 | A1 |
20020180803 | Kaplan et al. | Dec 2002 | A1 |
20020188735 | Needham et al. | Dec 2002 | A1 |
20020194011 | Boies et al. | Dec 2002 | A1 |
20020194586 | Gutta et al. | Dec 2002 | A1 |
20020194600 | Ellis et al. | Dec 2002 | A1 |
20030005440 | Axelsson et al. | Jan 2003 | A1 |
20030005446 | Jaff et al. | Jan 2003 | A1 |
20030026592 | Kawahara et al. | Feb 2003 | A1 |
20030035404 | Ozluturk et al. | Feb 2003 | A1 |
20030046437 | Eytchison et al. | Mar 2003 | A1 |
20030066084 | Kaars | Apr 2003 | A1 |
20030066085 | Boyer et al. | Apr 2003 | A1 |
20030066092 | Wagner | Apr 2003 | A1 |
20030068154 | Zylka | Apr 2003 | A1 |
20030070177 | Kondo et al. | Apr 2003 | A1 |
20030078784 | Jordan et al. | Apr 2003 | A1 |
20030093803 | Ishikawa et al. | May 2003 | A1 |
20030097408 | Kageyama et al. | May 2003 | A1 |
20030105813 | Mizutani | Jun 2003 | A1 |
20030110499 | Knudson et al. | Jun 2003 | A1 |
20030135860 | Dureau | Jul 2003 | A1 |
20030140343 | Falvo et al. | Jul 2003 | A1 |
20030149621 | Shteyn | Aug 2003 | A1 |
20030149980 | Ellis | Aug 2003 | A1 |
20030149988 | Ellis et al. | Aug 2003 | A1 |
20030152096 | Chapman | Aug 2003 | A1 |
20030156134 | Kim | Aug 2003 | A1 |
20030162096 | Michot et al. | Aug 2003 | A1 |
20030163813 | Klosterman et al. | Aug 2003 | A1 |
20030163832 | Tsuria et al. | Aug 2003 | A1 |
20030164858 | Klosterman et al. | Sep 2003 | A1 |
20030188310 | Klosterman et al. | Oct 2003 | A1 |
20030188311 | Yuen et al. | Oct 2003 | A1 |
20030194260 | Ward et al. | Oct 2003 | A1 |
20030196201 | Schein et al. | Oct 2003 | A1 |
20030206710 | Ferman et al. | Nov 2003 | A1 |
20030208756 | Macrae et al. | Nov 2003 | A1 |
20030214955 | Kim | Nov 2003 | A1 |
20030237093 | Marsh | Dec 2003 | A1 |
20040008972 | Haken | Jan 2004 | A1 |
20040023810 | Ignatiev et al. | Feb 2004 | A1 |
20040064835 | Bellwood et al. | Apr 2004 | A1 |
20040070491 | Huang et al. | Apr 2004 | A1 |
20040088731 | Putterman et al. | May 2004 | A1 |
20040100088 | Tellenbach et al. | May 2004 | A1 |
20040103434 | Ellis | May 2004 | A1 |
20040104806 | Yui et al. | Jun 2004 | A1 |
20040117831 | Ellis et al. | Jun 2004 | A1 |
20040128686 | Boyer et al. | Jul 2004 | A1 |
20040139233 | Kellerman et al. | Jul 2004 | A1 |
20040156614 | Bumgardner et al. | Aug 2004 | A1 |
20040177370 | Dudkiewicz | Sep 2004 | A1 |
20040181814 | Ellis et al. | Sep 2004 | A1 |
20040184763 | DiFrancesco | Sep 2004 | A1 |
20040193425 | Tomes | Sep 2004 | A1 |
20040193648 | Lai et al. | Sep 2004 | A1 |
20040210926 | Francis et al. | Oct 2004 | A1 |
20040210932 | Mori et al. | Oct 2004 | A1 |
20040220091 | Adam et al. | Nov 2004 | A1 |
20040226034 | Kaczowka et al. | Nov 2004 | A1 |
20040237104 | Cooper et al. | Nov 2004 | A1 |
20040255326 | Hicks, III et al. | Dec 2004 | A1 |
20040259537 | Ackley | Dec 2004 | A1 |
20040267965 | Vasudevan et al. | Dec 2004 | A1 |
20040268403 | Krieger et al. | Dec 2004 | A1 |
20050021397 | Cui et al. | Jan 2005 | A1 |
20050028208 | Ellis et al. | Feb 2005 | A1 |
20050039208 | Veeck et al. | Feb 2005 | A1 |
20050071876 | van Beek | Mar 2005 | A1 |
20050091680 | Kondo | Apr 2005 | A1 |
20050102324 | Spring et al. | May 2005 | A1 |
20050120373 | Thomas et al. | Jun 2005 | A1 |
20050125718 | Van Doorn | Jun 2005 | A1 |
20050132264 | Joshi et al. | Jun 2005 | A1 |
20050138137 | Encarnacion et al. | Jun 2005 | A1 |
20050138658 | Bryan | Jun 2005 | A1 |
20050138660 | Boyer et al. | Jun 2005 | A1 |
20050149966 | Fairhurst et al. | Jul 2005 | A1 |
20050160461 | Baumgartner et al. | Jul 2005 | A1 |
20050204381 | Ludvig et al. | Sep 2005 | A1 |
20050204388 | Knudson et al. | Sep 2005 | A1 |
20050227611 | Ellis | Oct 2005 | A1 |
20050246393 | Coates et al. | Nov 2005 | A1 |
20050246746 | Yui et al. | Nov 2005 | A1 |
20050251827 | Ellis et al. | Nov 2005 | A1 |
20050259963 | Sano et al. | Nov 2005 | A1 |
20050285966 | Bamji et al. | Dec 2005 | A1 |
20060004685 | Pyhalammi et al. | Jan 2006 | A1 |
20060007479 | Henry et al. | Jan 2006 | A1 |
20060015888 | Shih | Jan 2006 | A1 |
20060026635 | Potrebic et al. | Feb 2006 | A1 |
20060026665 | Rodriguez et al. | Feb 2006 | A1 |
20060031883 | Ellis et al. | Feb 2006 | A1 |
20060037054 | McDowell et al. | Feb 2006 | A1 |
20060053449 | Gutta | Mar 2006 | A1 |
20060064728 | Son et al. | Mar 2006 | A1 |
20060080707 | Laksono | Apr 2006 | A1 |
20060083301 | Nishio | Apr 2006 | A1 |
20060085835 | Istvan et al. | Apr 2006 | A1 |
20060095942 | van Beek | May 2006 | A1 |
20060098221 | Ferlitsch | May 2006 | A1 |
20060101492 | Lowcock | May 2006 | A1 |
20060173838 | Garg et al. | Aug 2006 | A1 |
20060190978 | Russ et al. | Aug 2006 | A1 |
20060212900 | Ismail et al. | Sep 2006 | A1 |
20060215650 | Wollmershauser et al. | Sep 2006 | A1 |
20060218573 | Proebstel | Sep 2006 | A1 |
20060218604 | Riedl et al. | Sep 2006 | A1 |
20060238648 | Wogsberg | Oct 2006 | A1 |
20060248570 | Witwer | Nov 2006 | A1 |
20060253874 | Stark et al. | Nov 2006 | A1 |
20060259949 | Schaefer et al. | Nov 2006 | A1 |
20060263758 | Crutchfield et al. | Nov 2006 | A1 |
20060265427 | Cohen et al. | Nov 2006 | A1 |
20060271953 | Jacoby et al. | Nov 2006 | A1 |
20060277579 | Inkinen | Dec 2006 | A1 |
20060294574 | Cha | Dec 2006 | A1 |
20070011709 | Katz et al. | Jan 2007 | A1 |
20070022442 | Gil et al. | Jan 2007 | A1 |
20070028267 | Ostojic et al. | Feb 2007 | A1 |
20070033607 | Bryan | Feb 2007 | A1 |
20070036303 | Lee et al. | Feb 2007 | A1 |
20070050242 | Kralik | Mar 2007 | A1 |
20070055980 | Megeid et al. | Mar 2007 | A1 |
20070055989 | Shanks et al. | Mar 2007 | A1 |
20070061022 | Hoffberg-Borghesani et al. | Mar 2007 | A1 |
20070074245 | Nyako et al. | Mar 2007 | A1 |
20070076665 | Nair et al. | Apr 2007 | A1 |
20070078708 | Yu et al. | Apr 2007 | A1 |
20070078709 | Rajaram | Apr 2007 | A1 |
20070089132 | Qureshey et al. | Apr 2007 | A1 |
20070089160 | Ando | Apr 2007 | A1 |
20070094702 | Khare et al. | Apr 2007 | A1 |
20070100690 | Hopkins | May 2007 | A1 |
20070113246 | Xiong | May 2007 | A1 |
20070124781 | Casey et al. | May 2007 | A1 |
20070130089 | Chiu | Jun 2007 | A1 |
20070130283 | Klein et al. | Jun 2007 | A1 |
20070147351 | Dietrich et al. | Jun 2007 | A1 |
20070157234 | Walker | Jul 2007 | A1 |
20070157237 | Cordray et al. | Jul 2007 | A1 |
20070157240 | Walker | Jul 2007 | A1 |
20070157241 | Walker | Jul 2007 | A1 |
20070157242 | Cordray et al. | Jul 2007 | A1 |
20070157260 | Walker | Jul 2007 | A1 |
20070157266 | Ellis et al. | Jul 2007 | A1 |
20070161402 | Ng et al. | Jul 2007 | A1 |
20070162661 | Fu et al. | Jul 2007 | A1 |
20070162850 | Adler et al. | Jul 2007 | A1 |
20070167689 | Ramadas et al. | Jul 2007 | A1 |
20070171286 | Ishii et al. | Jul 2007 | A1 |
20070174774 | Lerman et al. | Jul 2007 | A1 |
20070186240 | Ward et al. | Aug 2007 | A1 |
20070198659 | Lam | Aug 2007 | A1 |
20070214471 | Rosenberg | Sep 2007 | A1 |
20070214489 | Kwong et al. | Sep 2007 | A1 |
20070220024 | Putterman et al. | Sep 2007 | A1 |
20070220580 | Putterman et al. | Sep 2007 | A1 |
20070229300 | Masato | Oct 2007 | A1 |
20070243930 | Zalewski et al. | Oct 2007 | A1 |
20070282969 | Dietrich et al. | Dec 2007 | A1 |
20070283046 | Dietrich et al. | Dec 2007 | A1 |
20070283395 | Wezowski | Dec 2007 | A1 |
20080004959 | Tunguz-Zawislak et al. | Jan 2008 | A1 |
20080026838 | Dunstan et al. | Jan 2008 | A1 |
20080030621 | Ciudad et al. | Feb 2008 | A1 |
20080033826 | Maislos et al. | Feb 2008 | A1 |
20080046930 | Smith et al. | Feb 2008 | A1 |
20080046935 | Krakirian | Feb 2008 | A1 |
20080059988 | Lee et al. | Mar 2008 | A1 |
20080060001 | Logan et al. | Mar 2008 | A1 |
20080074546 | Momen | Mar 2008 | A1 |
20080077952 | St. Jean et al. | Mar 2008 | A1 |
20080077965 | Kamimaki et al. | Mar 2008 | A1 |
20080102947 | Hays et al. | May 2008 | A1 |
20080120112 | Jordan et al. | May 2008 | A1 |
20080120668 | Yau | May 2008 | A1 |
20080126919 | Uskali et al. | May 2008 | A1 |
20080127253 | Zhang et al. | May 2008 | A1 |
20080130951 | Wren et al. | Jun 2008 | A1 |
20080134256 | DaCosta | Jun 2008 | A1 |
20080147501 | Gilliam | Jun 2008 | A1 |
20080155585 | Craner et al. | Jun 2008 | A1 |
20080169929 | Albertson et al. | Jul 2008 | A1 |
20080184294 | Lemmons et al. | Jul 2008 | A1 |
20080196068 | Tseng | Aug 2008 | A1 |
20080204450 | Dawson et al. | Aug 2008 | A1 |
20080262909 | Li et al. | Oct 2008 | A1 |
20080263227 | Roberts et al. | Oct 2008 | A1 |
20080278635 | Hardacker | Nov 2008 | A1 |
20080282288 | Heo | Nov 2008 | A1 |
20080300985 | Shamp et al. | Dec 2008 | A1 |
20080301729 | Broos et al. | Dec 2008 | A1 |
20090019492 | Grasset | Jan 2009 | A1 |
20090025024 | Beser et al. | Jan 2009 | A1 |
20090052859 | Greenberger et al. | Feb 2009 | A1 |
20090059175 | Le Quesne et al. | Mar 2009 | A1 |
20090087039 | Matsuura | Apr 2009 | A1 |
20090118002 | Lyons et al. | May 2009 | A1 |
20090125971 | Belz et al. | May 2009 | A1 |
20090133051 | Hildreth | May 2009 | A1 |
20090138805 | Hildreth | May 2009 | A1 |
20090150925 | Henderson | Jun 2009 | A1 |
20090158170 | Narayanan et al. | Jun 2009 | A1 |
20090163139 | Wright-Riley | Jun 2009 | A1 |
20090165046 | Stallings et al. | Jun 2009 | A1 |
20090174658 | Blatchley et al. | Jul 2009 | A1 |
20090183208 | Christensen et al. | Jul 2009 | A1 |
20090192874 | Powles et al. | Jul 2009 | A1 |
20090195392 | Zalewski | Aug 2009 | A1 |
20090199235 | Surendran et al. | Aug 2009 | A1 |
20090210898 | Childress et al. | Aug 2009 | A1 |
20090217315 | Malik et al. | Aug 2009 | A1 |
20090217335 | Wong et al. | Aug 2009 | A1 |
20090222874 | White et al. | Sep 2009 | A1 |
20090248505 | Finkelstein et al. | Oct 2009 | A1 |
20090248914 | Choi et al. | Oct 2009 | A1 |
20090249391 | Klein et al. | Oct 2009 | A1 |
20090271829 | Larsson et al. | Oct 2009 | A1 |
20090288131 | Kandekar et al. | Nov 2009 | A1 |
20090288132 | Hegde | Nov 2009 | A1 |
20090292671 | Ramig et al. | Nov 2009 | A1 |
20090298535 | Klein et al. | Dec 2009 | A1 |
20090299843 | Shkedi | Dec 2009 | A1 |
20090300144 | Marr et al. | Dec 2009 | A1 |
20090313658 | Nishimura et al. | Dec 2009 | A1 |
20090325661 | Gross | Dec 2009 | A1 |
20090327073 | Li et al. | Dec 2009 | A1 |
20090328087 | Higgins et al. | Dec 2009 | A1 |
20100053458 | Anglin et al. | Mar 2010 | A1 |
20100070987 | Amento et al. | Mar 2010 | A1 |
20100086204 | Lessing | Apr 2010 | A1 |
20100107046 | Kang et al. | Apr 2010 | A1 |
20100107194 | McKissick et al. | Apr 2010 | A1 |
20100145797 | Hamilton, II et al. | Jun 2010 | A1 |
20100146445 | Kraut | Jun 2010 | A1 |
20100146560 | Bonfrer | Jun 2010 | A1 |
20100146573 | Richardson et al. | Jun 2010 | A1 |
20100154021 | Howarter et al. | Jun 2010 | A1 |
20100169072 | Zaki et al. | Jul 2010 | A1 |
20100177751 | Fischer et al. | Jul 2010 | A1 |
20100199313 | Rhim | Aug 2010 | A1 |
20100205562 | de Heer | Aug 2010 | A1 |
20100262987 | Imanilov | Oct 2010 | A1 |
20100269145 | Ingrassia et al. | Oct 2010 | A1 |
20100310234 | Sigvaldason | Dec 2010 | A1 |
20110016492 | Morita | Jan 2011 | A1 |
20110026384 | Ryu | Feb 2011 | A1 |
20110029922 | Hoffberg et al. | Feb 2011 | A1 |
20110069940 | Shimy et al. | Mar 2011 | A1 |
20110070819 | Shimy et al. | Mar 2011 | A1 |
20110072452 | Shimy et al. | Mar 2011 | A1 |
20110078731 | Nishimura | Mar 2011 | A1 |
20110106375 | Gurusamy Sundaram | May 2011 | A1 |
20110107388 | Lee et al. | May 2011 | A1 |
20110131607 | Thomas et al. | Jun 2011 | A1 |
20110134320 | Daly | Jun 2011 | A1 |
20110157368 | Jo | Jun 2011 | A1 |
20110163939 | Tam et al. | Jul 2011 | A1 |
20110164175 | Chung et al. | Jul 2011 | A1 |
20110167447 | Wong | Jul 2011 | A1 |
20110184945 | Das et al. | Jul 2011 | A1 |
20110185392 | Walker | Jul 2011 | A1 |
20110258211 | Kalisky et al. | Oct 2011 | A1 |
20110309933 | Marino | Dec 2011 | A1 |
20120008917 | Katz et al. | Jan 2012 | A1 |
20120011226 | Katz et al. | Jan 2012 | A1 |
20120011454 | Droz et al. | Jan 2012 | A1 |
20120047166 | Katz et al. | Feb 2012 | A1 |
20120072964 | Walter et al. | Mar 2012 | A1 |
20120077574 | Walker et al. | Mar 2012 | A1 |
20120105720 | Chung et al. | May 2012 | A1 |
20120114303 | Chung et al. | May 2012 | A1 |
20120150650 | Zahand | Jun 2012 | A1 |
20120327123 | Felt | Dec 2012 | A1 |
20130047189 | Raveendran et al. | Feb 2013 | A1 |
20130191875 | Morris et al. | Jul 2013 | A1 |
20140002619 | Morohoshi | Jan 2014 | A1 |
20140250447 | Schink | Sep 2014 | A1 |
20140375752 | Shoemake et al. | Dec 2014 | A1 |
Number | Date | Country |
---|---|---|
2635571 | May 2009 | CA |
101254344 | Sep 2008 | CN |
31 51 492 | Jul 1983 | DE |
195 31 121 | Feb 1997 | DE |
197 40 079 | Mar 1999 | DE |
0 424 469 | May 1991 | EP |
2 256 115 | Nov 1992 | EP |
0 535 749 | Apr 1993 | EP |
0 572 090 | Dec 1993 | EP |
0 583 061 | Feb 1994 | EP |
0 605 115 | Jul 1994 | EP |
0 624 039 | Nov 1994 | EP |
0 662 771 | Jul 1995 | EP |
0 682 452 | Nov 1995 | EP |
0 711 076 | May 1996 | EP |
0 725 539 | Aug 1996 | EP |
0 753 964 | Jan 1997 | EP |
0 758 833 | Feb 1997 | EP |
0 762 756 | Mar 1997 | EP |
0 763 938 | Mar 1997 | EP |
0 774 853 | May 1997 | EP |
0 793 225 | Sep 1997 | EP |
0 805 594 | Nov 1997 | EP |
0 854 645 | Jul 1998 | EP |
0 673 160 | Aug 1998 | EP |
0 874 524 | Oct 1998 | EP |
0 924 927 | Jun 1999 | EP |
0 932 275 | Jul 1999 | EP |
0 940 985 | Sep 1999 | EP |
0 944 253 | Sep 1999 | EP |
0 944 257 | Sep 1999 | EP |
0 986 046 | Mar 2000 | EP |
1 217 787 | Jun 2002 | EP |
1 363 452 | Nov 2003 | EP |
1 427 148 | Jun 2006 | EP |
2 129 113 | Dec 2009 | EP |
2 154 882 | Feb 2010 | EP |
2 299 711 | Mar 2011 | EP |
2 325 670 | May 2011 | EP |
2 256 115 | Nov 1992 | GB |
2 265 792 | Oct 1993 | GB |
2 458 727 | Oct 2009 | GB |
60-061935 | Sep 1985 | JP |
03-022770 | Jan 1991 | JP |
06111413 | Apr 1994 | JP |
07-336318 | Dec 1995 | JP |
08-056352 | Feb 1996 | JP |
08044490 | Feb 1996 | JP |
09-102827 | Apr 1997 | JP |
09-214873 | Aug 1997 | JP |
10-65978 | Mar 1998 | JP |
11 032272 | Feb 1999 | JP |
11-177962 | Jul 1999 | JP |
11-205711 | Jul 1999 | JP |
11-341040 | Dec 1999 | JP |
2000-004272 | Jan 2000 | JP |
2001084662 | Mar 2001 | JP |
2002153684 | May 2002 | JP |
2003162444 | Jun 2003 | JP |
2003209893 | Jul 2003 | JP |
2004080382 | Mar 2004 | JP |
2004215126 | Jul 2004 | JP |
2004304372 | Oct 2004 | JP |
2005150831 | Jun 2005 | JP |
2006101261 | Apr 2006 | JP |
2006324809 | Nov 2006 | JP |
2007036911 | Feb 2007 | JP |
2007081719 | Mar 2007 | JP |
2007524316 | Aug 2007 | JP |
2007274246 | Oct 2007 | JP |
2008035533 | Feb 2008 | JP |
2008079039 | Apr 2008 | JP |
2009060487 | Mar 2009 | JP |
2009081877 | Apr 2009 | JP |
2009111817 | May 2009 | JP |
2009130866 | Jun 2009 | JP |
2009164655 | Jul 2009 | JP |
2010510696 | Apr 2010 | JP |
1999-0086454 | Dec 1999 | KR |
20090064814 | Jun 2009 | KR |
20100076498 | Jul 2010 | KR |
247388 | Oct 1994 | RO |
WO-8703766 | Jun 1987 | WO |
WO-8804507 | Jun 1988 | WO |
WO-8903085 | Apr 1989 | WO |
WO-8912370 | Dec 1989 | WO |
WO-9000847 | Jan 1990 | WO |
WO-9100670 | Jan 1991 | WO |
WO-9107050 | May 1991 | WO |
WO-9204801 | Mar 1992 | WO |
WO-9308542 | Apr 1993 | WO |
WO-9310708 | Jun 1993 | WO |
WO-9322877 | Nov 1993 | WO |
WO-9414282 | Jun 1994 | WO |
WO-9501056 | Jan 1995 | WO |
WO-9501058 | Jan 1995 | WO |
WO-9501059 | Jan 1995 | WO |
WO-9504431 | Feb 1995 | WO |
WO-9510910 | Apr 1995 | WO |
WO-9515658 | Jun 1995 | WO |
WO-9528055 | Oct 1995 | WO |
WO-9531069 | Nov 1995 | WO |
WO-9532583 | Nov 1995 | WO |
WO-9532584 | Nov 1995 | WO |
WO-9532585 | Nov 1995 | WO |
WO-9532587 | Nov 1995 | WO |
WO-9607270 | Mar 1996 | WO |
WO-9609721 | Mar 1996 | WO |
WO-9613932 | May 1996 | WO |
WO-9617467 | Jun 1996 | WO |
WO-9620555 | Jul 1996 | WO |
WO-9625821 | Aug 1996 | WO |
WO-9631980 | Oct 1996 | WO |
WO-9633572 | Oct 1996 | WO |
WO-9634467 | Oct 1996 | WO |
WO-9634491 | Oct 1996 | WO |
WO-9641472 | Dec 1996 | WO |
WO-9641478 | Dec 1996 | WO |
WO-9713368 | Apr 1997 | WO |
WO-9717598 | May 1997 | WO |
WO-9721291 | Jun 1997 | WO |
WO-9731480 | Aug 1997 | WO |
WO-9732434 | Sep 1997 | WO |
WO-9734413 | Sep 1997 | WO |
WO-9734414 | Sep 1997 | WO |
WO-9736422 | Oct 1997 | WO |
WO-9737500 | Oct 1997 | WO |
WO-9742763 | Nov 1997 | WO |
WO-9746016 | Dec 1997 | WO |
WO-9746943 | Dec 1997 | WO |
WO-9747106 | Dec 1997 | WO |
WO-9747124 | Dec 1997 | WO |
WO-9747143 | Dec 1997 | WO |
WO-9748228 | Dec 1997 | WO |
WO-9749237 | Dec 1997 | WO |
WO-9750251 | Dec 1997 | WO |
WO-9801995 | Jan 1998 | WO |
WO-9807277 | Feb 1998 | WO |
WO-9810589 | Mar 1998 | WO |
WO-9812872 | Mar 1998 | WO |
WO-9816062 | Apr 1998 | WO |
WO-9817033 | Apr 1998 | WO |
WO-9817064 | Apr 1998 | WO |
WO-9818260 | Apr 1998 | WO |
WO-9819459 | May 1998 | WO |
WO-9826528 | Jun 1998 | WO |
WO-9826584 | Jun 1998 | WO |
WO-9826596 | Jun 1998 | WO |
WO-9831115 | Jul 1998 | WO |
WO-9831116 | Jul 1998 | WO |
WO-9834405 | Aug 1998 | WO |
WO-9838831 | Sep 1998 | WO |
WO-9847279 | Oct 1998 | WO |
WO-9847283 | Oct 1998 | WO |
WO-9848566 | Oct 1998 | WO |
WO-9853611 | Nov 1998 | WO |
WO-9903267 | Jan 1999 | WO |
WO-9904561 | Jan 1999 | WO |
WO-9911060 | Mar 1999 | WO |
WO-9912320 | Mar 1999 | WO |
WO-9914945 | Mar 1999 | WO |
WO-9914947 | Mar 1999 | WO |
WO-9927681 | Jun 1999 | WO |
WO-9928897 | Jun 1999 | WO |
WO-9930491 | Jun 1999 | WO |
WO-9935753 | Jul 1999 | WO |
WO-9939466 | Aug 1999 | WO |
WO-9944698 | Sep 1999 | WO |
WO-9956473 | Nov 1999 | WO |
WO-9960790 | Nov 1999 | WO |
WO-9964969 | Dec 1999 | WO |
WO-9965244 | Dec 1999 | WO |
WO-9966725 | Dec 1999 | WO |
WO-0004706 | Jan 2000 | WO |
WO-0005885 | Feb 2000 | WO |
WO-0011869 | Mar 2000 | WO |
WO-0016548 | Mar 2000 | WO |
WO-0017738 | Mar 2000 | WO |
WO-0030345 | May 2000 | WO |
WO-0033208 | Jun 2000 | WO |
WO-0033560 | Jun 2000 | WO |
WO-0033565 | Jun 2000 | WO |
WO-0033576 | Jun 2000 | WO |
WO-0058967 | Oct 2000 | WO |
WO-0059230 | Oct 2000 | WO |
WO-0101677 | Jan 2001 | WO |
WO-0101689 | Jan 2001 | WO |
WO-0135662 | May 2001 | WO |
WO-0150743 | Jul 2001 | WO |
WO-0191458 | Nov 2001 | WO |
WO-0191460 | Nov 2001 | WO |
WO-03046727 | Jun 2003 | WO |
WO-2004023810 | Mar 2004 | WO |
WO-2004032511 | Apr 2004 | WO |
WO-2004061699 | Jul 2004 | WO |
WO-2007036891 | Apr 2007 | WO |
WO-2007078503 | Jul 2007 | WO |
WO-2008042267 | Apr 2008 | WO |
WO-2008047184 | Apr 2008 | WO |
WO-2009067670 | May 2009 | WO |
WO-2009067676 | May 2009 | WO |
WO-2009079560 | Jun 2009 | WO |
WO-2009130862 | Oct 2009 | WO |
WO-2009148056 | Dec 2009 | WO |
WO-2009148833 | Dec 2009 | WO |
WO-2009151635 | Dec 2009 | WO |
WO-2011008638 | Jan 2011 | WO |
WO-2011037761 | Mar 2011 | WO |
WO-2011037781 | Mar 2011 | WO |
WO-2011084950 | Jul 2011 | WO |
Entry |
---|
U.S. Appl. No. 11/324,202, filed Dec. 29, 2005, Yates, Douglas. |
“Addressable Converters: A New Development at CableData,” Via Cable, vol. 1, No. 12 (Dec. 1981). |
“Audio Advertisement Recognition,” SIGNALogic [online]. Retrieved from the Internet on Sep. 8, 2010: URL: <http://www.signalogic.com/index.pl?p.=ad—recog>. Two pages. |
“Digital Video Broadcasting (DVB); DVB specification for data broadcasting”, European Telecommunications Standards Institute, Draft EN 301 192 V1.2.1 (Jan. 1999). |
“Electronic Programme Guide (EPG); Protocol for a TV Guide using electronic data transmission” by European Telecommunication Standards Institute, May 1997, Valbonne, France, publication No. ETS 300 707. |
“Honey, is there anything good on the remote tonight?”, advertisement from Multichannel News, Broadband Week Section, Nov. 30, 1998, p. 168. |
“How Evolve Works,” from the Internet at http://www.evolveproducts.com/network.html, printed on Dec. 28, 1998. |
“Jini Architecture Overview,” by Jim Waldo, from the Internet at http://Java.sun.com/products/jini/whitepapers/architectureoverview.pdf, printed on Jan. 25, 1999. The document bears a copyright date of 1998. |
“Reaching your subscribers is a complex and costly process-until now,” from the Internet at http://www.evolveproducts.com/info.html, printed on Dec. 28, 1998. |
“Simulation and Training”, 1994, 6 pages, HP Division Incorporated. |
“Sun's Next Steps in Digital Set-Tops,” article in Cablevision, Nov. 16, 1998, p. 56. |
“The Evolve EZ Guide. The Remote Control,” from the Internet at http://www.evolveproducts.com/display2.html, printed on Dec. 28, 1998. |
“Using StarSight 2,” published before Apr. 19, 1995. |
“Virtual High Anxiety”, Tech Update, Aug. 1995, p. 22. |
“What is Jini?”, from the Internet at http://java.sun.com/products/jini/whitepapers/whatsjini.pdf, printed on Jan. 25, 1999. |
“Why Jini Now?”, from the Internet at http://java.sun.com/products/jini/whitepapers/whyjininow.pdf, printed on Jan. 25, 1999. The document bears a copyright date of 1998. |
A. C. Snoeren et al., “An End-to-End Approach to Host Mobility” 6th ACM-IEEE International Conference on Mobile Computing and Networking (MOBICOM 2000), Boston, MA, USA, Aug. 2000, pp. 1-12. |
Aggarwal et al., “Human Motion Analysis: A Review”, IEEE Nonrigid and Articulated Motion Workshop, 1997, pp. 90-102, University of Texas at Austin, Austin, TX. |
Arango et al., “The Touring Machine System,” Communications of the ACM, Jan. 1993, vol. 36, No. 1, pp. 68-77. |
Article: “Windows 98 Feature Combines TV, Terminal and the Internet”, New York Times, Aug. 18, 1998. |
Azarbayejani et al., “Visually Controlled Graphics”, Jun. 1993, pp. 602-605, vol. 15, No. 6, IEEE Transactions on Pattern Analysis and Machine Intelligence. |
Breen et al., “Interactive Occlusion and Collision of Real and Virtual Objects in Augmented Reality”, Technical Report ECRC-95-02, 1995, European Computer-Industry Research Center GmbH, Munich, Germany. |
Brogan et al., “Dynamically Simulated Characters in Virtual Environments”, Sep./Oct. 1998, pp. 58-69, vol. 18, Issue 5, IEEE Computer Graphics and Applications. |
Brugliera, V., “Digital On-Screen Display—A New Technology for the Consumer Interface,” Symposium Record Cable Sessions, 18th International Television Symposium and Technical Exhibition, Montreux, Switzerland, Jun. 10-15, 1993, pp. 571-586 (Jun. 11, 1993). |
CableData brochure, “A New Approach to Addressability” (undated). |
Chang, Y., et al., “An Open-Systems Approach to Video on Demand,” IEEE Communications Magazine, vol. 32, No. 5, pp. 68-80 (May 1994). |
Darrow, A. et al., “Design Guidelines for Technology-Mediated Social Interaction in a Presence Sensing Physical Space,” Carnegie Mellon University Research Showcase, Carnegie Institute of Technology, Jan. 1, 2007, pp. 1-9. |
David M. Rudnick, U.S. Appl. No. 09/283,681, filed Apr. 1, 1999, entitled Interactive Television Program Guide System Having Graphic Arrangements of Program Event Regions. |
Dimitrova, et al. “Personalizing Video Recorders in Multimedia Processing and Integration.” ACM 2001. |
Eitz, Gerhard, “Zukünftige Informations- und Datenangebote beim digitalen Fernsehen—EPG und Lesezeichen,” Rundfunktechnische Mitteilungen, vol. 41, pp. 67-72, Apr. 30, 1997. (English language translation attached.) |
F. Teraoka et al., “Host Migration Transparency in IP networks: The VIP Approach” ACM SIGCOMM—Computer Communication Review, ACM Press, New York, NY, USA, Jan. 1993, pp. 45-65. |
Fisher et al., “Virtual Environment Display System”, ACM Workshop on Interactive 3D Graphics, Oct. 1986, 12 pages, Chapel Hill, NC. |
Fortino et al., A Cooperative Playback System for On-Demand Multimedia Sessions over Internet, 2000 IEEE, pp. 41-44. |
Freeman et al., “Television Control by Hand Gestures”, Dec. 1994, Mitsubishi Electric Research Laboratories, TR94-24, Cambridge, MA. |
Gondow, S., et al., “The Architecture of Communication Migration and Media State Management for Distributed Applications on Wearable Networks,” Information Processing Society of Japan (National Conference Lecture Collected Paper), Tokyo, Japan, Oct. 3, 2000, pp. 1-2. |
Granieri et al., “Simulating Humans in VR”, The British Computer Society, Oct. 1994, pp. 1-4, 6-9, 12, 15-16, and 18-21 (15 pages), Academic Press. |
Haas et al., “Personalized News Through Content Augmentation and Profiling,” Proceedings of ICIP 2002, Rochester, NY, Sep. 2002. |
Han et al., “Dynamic Adaptation in an Image Transcoding Proxy for Mobile Web Browsing,” IEEE Personal Communications, Dec. 1998, pp. 8-17. |
Hasegawa et al., “Human-Scale Haptic Interaction with a Reactive Virtual Human in a Real-Time Physics Simulator”, Jul. 2006, 12 pages, vol. 4, No. 3, Article 6C, ACM Computers in Entertainment, New York, NY. |
He, “Generation of Human Body Models”, Apr. 2005, 111 pages, University of Auckland, New Zealand. |
Hofmann, et al., “Videotext Programmiert Videorecorder,” Rundfunktechnische Mitteilungen, Nov.-Dec. 1982, pp. 254-257 (translation abstract attached). |
Hongo et al., “Focus of Attention for Face and Hand Gesture Recognition Using Multiple Cameras”, Mar. 2000, pp. 156-161, 4th IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France. |
Hsieh, C., “Personalized Advertising Strategy for Integrated Social Networking Websites”, IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, Dec. 2008, pp. 369-372, IEEE Computer Society, Washington, DC, USA. |
IBM Corporation, “IBM Content Manager VideoCharger: New dimensions for enterprise content, DB2 Data Management Software,” pp. 1-4, Mar. 2002. |
IBM Corporation, “IBM Content Manager VideoCharger, Version 8: New dimensions for enterprise content, DB2 Data Management Software,” pp. 1-4, May 2002. Visit IBM Web site at ibm.com/software/data/videocharger. |
IBM Corporation, “IBM VideoCharger for AIX Version 2.0: Streaming the power of video to your desktop,” pp. 1-5. Visit the IBM VideoCharger Website at: www.software.ibm.com/data/videocharger/. |
IBM Corporation, “IBM Video Charger Server”, pp. 102, Jun. 1998. |
Index Systems Inc., “Gemstar Service Object Model,” Data Format Specification, Ver. 2.0.4, pp. 58-59, Dec. 20, 2002. |
International Preliminary Report on Patentability dated Mar. 26, 2013 in International Patent Application No. PCT/US2011/048706, 5 pages. |
International Search Report dated Apr. 10, 2012 in International Patent Application No. PCT/US2011/048706, 3 pages. |
Isard et al., “CONDENSATION—Conditional Density Propagation for Visual Tracking”, 1998, pp. 5-28, International Journal of Computer Vision 29(1), Netherlands. |
Jaidev, “EXSLT—Wired and Wireless Case Study,” http://csharpcomputing.com/XMLTutorial/Lession15.htm. |
Kanade et al., “A Stereo Machine for Video-rate Dense Depth Mapping and Its New Applications”, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1996, pp. 196-202, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA. |
Kim, G., “IAB Social Advertising Best Practices”, The IAB User-Generated Content & Social Media Committee, May 2009, [online], [retrieved on Jun. 22, 2010]. Retrieved from the Social Media section of IAB using the Internet, <URL: http://www.iab.net/media/file/Social-Advertising-Best-Practices-0509.pdf>, 19 pages. |
Kohler, “Special Topics of Gesture Recognition Applied in Intelligent Home Environments”, In Proceedings of the Gesture Workshop, 1998, pp. 285-296, Germany. |
Kohler, “Technical Details and Ergonomical Aspects of Gesture Recognition applied in Intelligent Home Environments”, 1997, 35 pages, Germany. |
Kohler, “Vision Based Remote Control in Intelligent Home Environments”, University of Erlangen-Nuremberg, Germany, 1996, pp. 147-154. |
Li, et al., “Distributed Multimedia Systems,” Proceedings of the IEEE, vol. 85, No. 7, pp. 1063-1108 (Jul. 1997). |
Livingston, “Vision-based Tracking with Dynamic Structured Light for Video See-through Augmented Reality”, 1998, 145 pages, University of North Carolina at Chapel Hill, North Carolina, USA. |
Mah et al., “Providing Network Video Service to Mobile Clients,” 1993 IEEE, pp. 48-54. |
Miller, M. D. “A Scenario for the Deployment of Interactive Multimedia Cable Television Systems in the United States in the 1990's,” Proceedings of the IEEE, vol. 82, No. 4, pp. 585-589 (Apr. 1994). |
Miyagawa et al., “CCD-Based Range Finding Sensor”, October 1997, pp. 1648-1652, vol. 44, No. 10, IEEE Transactions on Electron Devices. |
Neumann, Andreas, “WDR Online: Aufbau und Perspektiven automatisierter Online-Dienste im WDR,” Rundfunktechnische Mitteilungen, vol. 41, Jan. 21, 1997, pp. 56-66. (English language translation attached.) |
Owyang, J., “Contextual Ads Based Off Social Network Profile: Twitter and Facebook”, Web Strategy [online], Jun. 18, 2009 [retrieved on Jun. 22, 2010]. Retrieved from the Internet: <URL: http://www.web-strategist.com/blog/2009/06/18/contextual-ads-based-off-social-network-profile-twitter-and-facebook/>, 10 pages. |
Papers Delivered (Part 1), 61st National Conference, Information Processing Society of Japan, Oct. 3-5, 2000. |
Pavlovic et al., “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review”, Jul. 1997, pp. 677-695, vol. 19, No. 7, IEEE Transactions on Pattern Analysis and Machine Intelligence. |
Pham et al., “Exploiting Location-Based Composite Devices to Support and Facilitate Situated Ubiquitous Computing,” HUC 2000, LNCS 1927, pp. 143-156. |
Pogue, D., “State of the Art: For TiVo and Replay, New Reach,” N.Y. Times, May 29, 2003. |
Qian et al., “A Gesture-Driven Multimodal Interactive Dance System”, Jun. 2004, pp. 1579-1582, IEEE International Conference on Multimedia and Expo (ICME), Taipei, Taiwan. |
Randerson, J., “Let Software Catch the Game for You,” New Scientist, Jul. 3, 2004. |
Realplayer 8 Plus User Manual, Rev. 1, Real Networks, Inc. p. 32 (2000). |
Rogers, Curt, “Telcos vs. Cable TV: The Global View,” Data Communications, No. 13, New York, Sep. 1995, pp. 75, 76, 78 and 80. |
Rosenhahn et al., “Automatic Human Model Generation”, 2005, pp. 41-48, University of Auckland (CITR), New Zealand. |
Seles, Sheila Murphy, “Audience Research for Fun and Profit: Rediscovering the Value of Television Audiences”, Submitted to the program in Comparative Media Studies, Jun. 2010, 128 pages, Massachusetts Institute of Technology. |
Shao et al., “An Open System Architecture for a Multimedia and Multimodal User Interface”, Aug. 24, 1998, 8 pages, Japanese Society for Rehabilitation of Persons with Disabilities (JSRPD), Japan. |
Sheridan et al., “Virtual Reality Check”, Technology Review, Oct. 1993, pp. 21-28, vol. 96, No. 7. |
Sorce, J. et al., “Designing a Broadband Residential Entertainment Service: A Case Study,” 13th International Symposium on Human Factors in Telecommunications, Torino, Italy, Sep. 10-14, 1990, pp. 141-148. |
Stevens, “Flights into Virtual Reality Treating Real-World Disorders”, The Washington Post, Mar. 27, 1995, Science Psychology, 2 pages. |
The New York Times Website Article, “2 Makers Plan Introductions of Digital VCR”, by John Markoff, Mar. 29, 1999. |
Toyama et al., “Probabilistic Tracking in a Metric Space,” Eighth International Conference on Computer Vision, Vancouver, Canada, vol. 2, Jul. 2001, 8 pages. |
Wren et al., “Pfinder: Real-Time Tracking of the Human Body”, MIT Media Laboratory Perceptual Computing Section Technical Report No. 353, Jul. 1997, vol. 19, No. 7, pp. 780-785, IEEE Transactions on Pattern Analysis and Machine Intelligence, Cambridge, MA. |
Zhao, “Dressed Human Modeling, Detection, and Parts Localization”, 2001, 121 pages, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA. |
Number | Date | Country | |
---|---|---|
20150128158 A1 | May 2015 | US |